Add optimizer configuration

E. Madison Bray requested to merge embray/optimizers into master

Part of #68 (closed) (also relates to #37).

Adds fully configurable optimizers in the training config, like:

```yaml
name: Adam
params:
    learning_rate: 0.1
    weight_decay: 0.01
```

All other parameters of torch.optim.Adam are also supported, and likewise for all the other optimizers built into PyTorch.
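For context, here is a minimal sketch of how a config block like the one above could be resolved into a torch.optim optimizer. The `build_optimizer` name and the `learning_rate` → `lr` key translation are illustrative assumptions, not the code in this MR:

```python
import torch
import torch.optim


def build_optimizer(model: torch.nn.Module, config: dict) -> torch.optim.Optimizer:
    """Look up the optimizer class named in the config and construct it."""
    optimizer_cls = getattr(torch.optim, config["name"])  # e.g. torch.optim.Adam
    params = dict(config.get("params", {}))               # copy; don't mutate the config
    # torch.optim spells the learning rate "lr"; the config above spells it
    # "learning_rate", so this sketch assumes the framework translates that key.
    if "learning_rate" in params:
        params["lr"] = params.pop("learning_rate")
    return optimizer_cls(model.parameters(), **params)


# Example usage:
# model = torch.nn.Linear(10, 2)
# opt = build_optimizer(model, {"name": "Adam",
#                               "params": {"learning_rate": 0.1, "weight_decay": 0.01}})
```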

Some of this code is admittedly very confusing, because it's trying to automatically adapt all the built-in Optimizers from PyTorch into our framework, and generate config schemas for their parameters. It's probably overkill, but I wanted to see if it could be done.
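As a rough illustration of the kind of introspection this involves (not the actual code in this MR), the constructor signatures of torch.optim's Optimizer subclasses can be read with `inspect` to derive a per-optimizer parameter schema:

```python
import inspect
import torch.optim


def optimizer_schemas() -> dict:
    """Map each built-in optimizer name to its constructor parameters and defaults."""
    schemas = {}
    for name in dir(torch.optim):
        cls = getattr(torch.optim, name)
        if not (inspect.isclass(cls) and issubclass(cls, torch.optim.Optimizer)):
            continue
        if cls is torch.optim.Optimizer:  # skip the abstract base class
            continue
        sig = inspect.signature(cls.__init__)
        schemas[name] = {
            param.name: (None if param.default is inspect.Parameter.empty
                         else param.default)
            for param in sig.parameters.values()
            if param.name not in ("self", "params")  # skip the model-parameters argument
        }
    return schemas


# optimizer_schemas()["Adam"] would then include lr, betas, eps, weight_decay, ...
```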

In any case, all built-in optimizers from torch.optim are now supported, at least in principle.

It does not yet support per-parameter optimizer options (for example, PyTorch lets you set a different learning rate for different subsets of model parameters), as illustrated below. Supporting this will require further enhancement to the config format.
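For reference, per-parameter options in plain PyTorch are expressed by passing parameter groups (a list of dicts) to the optimizer. Something along these lines is what the config format would eventually need to be able to express:

```python
import torch

model = torch.nn.Sequential(torch.nn.Linear(10, 20), torch.nn.Linear(20, 2))
optimizer = torch.optim.Adam(
    [
        {"params": model[0].parameters()},              # uses the default lr below
        {"params": model[1].parameters(), "lr": 1e-3},  # overrides lr for this group
    ],
    lr=1e-1,
)
```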
