Add learning rate scheduler in config file
It would be nice to support a learning rate scheduler, as implemented in PyTorch.
In terms of code modification, it seems similar to the optimizer configuration, given the short example the PyTorch documentation gives:
```python
import torch
from torch.optim import SGD
from torch.optim.lr_scheduler import ExponentialLR

# placeholder model, loss, and data just to make the snippet runnable
model = torch.nn.Linear(2, 2)
loss_fn = torch.nn.MSELoss()
dataset = [(torch.randn(4, 2), torch.randn(4, 2))]

optimizer = SGD(model.parameters(), lr=0.1)
scheduler = ExponentialLR(optimizer, gamma=0.9)

for epoch in range(20):
    for input, target in dataset:
        optimizer.zero_grad()
        output = model(input)
        loss = loss_fn(output, target)
        loss.backward()
        optimizer.step()
    # decay the learning rate once per epoch
    scheduler.step()
```
I think @j.guez started to look at that.
Could you, @embray, give some hints on how we can tackle this kind of issue (in particular, adding options to the config files)? A rough sketch of one possible approach is below.
Thanks
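
For reference, here is a minimal sketch of how a scheduler section of the config could be mapped to a PyTorch scheduler object, mirroring the optimizer setup. The `build_scheduler` helper and the dict layout are just illustrative assumptions, not the project's actual config API:

```python
import torch

def build_scheduler(optimizer, cfg):
    # cfg is assumed to be the scheduler section of the config file,
    # e.g. {"type": "ExponentialLR", "gamma": 0.9} (hypothetical layout)
    cfg = dict(cfg)
    scheduler_cls = getattr(torch.optim.lr_scheduler, cfg.pop("type"))
    return scheduler_cls(optimizer, **cfg)

# usage, analogous to how the optimizer would be built from its own section
model = torch.nn.Linear(2, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = build_scheduler(optimizer, {"type": "ExponentialLR", "gamma": 0.9})
```

Looking the class up by name in `torch.optim.lr_scheduler` would keep the config declarative and let any built-in scheduler be selected without extra code, but I am not sure how well that fits the existing config machinery, hence the question above.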