Add LRScheduler plugin interface

E Madison Bray requested to merge embray/scheduler-plugins into master

This is a prototype for #126, demonstrating how we could go about adding learning rate schedulers.

This turns out not to work as well as it does for Optimizers when it comes to generating configuration schemas for the built-in schedulers from PyTorch, but that is not critical: it only means slightly weaker validation of users' scheduler configs.

Otherwise it works, and almost any LR scheduler from PyTorch can be used, as well as any user-created scheduler. There are some exceptions, notably LambdaLR and MultiplicativeLR, which take arbitrary callables that cannot be expressed directly in a config file.
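To illustrate, looking up a scheduler class by name from a config could be sketched roughly like this (the `class`/`kwargs` config shape and the `resolve_scheduler` helper are assumptions for illustration, not this MR's actual schema; `fractions.Fraction` stands in for a real scheduler class such as `torch.optim.lr_scheduler.StepLR` so the snippet is self-contained):

```python
import importlib

def resolve_scheduler(config):
    """Resolve a dotted class path named in a user's config to the class itself."""
    module_name, _, class_name = config["class"].rpartition(".")
    module = importlib.import_module(module_name)
    return getattr(module, class_name), config.get("kwargs", {})

# Stand-in for e.g. {"class": "torch.optim.lr_scheduler.StepLR",
#                    "kwargs": {"step_size": 10, "gamma": 0.5}}
cls, kwargs = resolve_scheduler({"class": "fractions.Fraction", "kwargs": {}})
```

The returned class would then be instantiated with the optimizer plus the config kwargs; LambdaLR-style schedulers break this pattern because one of their kwargs is a function, not a plain value.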

My idea for this is to add a new, "generic" plugin interface that simply lets users provide their own functions in a plugin module, which can then be referenced by name in the config file.

I might also add a built-in function they can use for this purpose that allows basic arithmetic expressions in the config file (no arbitrary code, but if you want to perform a basic arithmetic operation, that should cover the large majority of realistic use cases).
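One common way to get arithmetic-only expressions without arbitrary code is to walk the `ast` and whitelist operators; a rough sketch under that assumption (`safe_eval` and its exact behavior are illustrative, not what this MR implements):

```python
import ast
import operator

# Only these node/operator types are allowed; anything else is rejected.
_OPS = {
    ast.Add: operator.add, ast.Sub: operator.sub,
    ast.Mult: operator.mul, ast.Div: operator.truediv,
    ast.Pow: operator.pow, ast.USub: operator.neg,
}

def safe_eval(expr, variables=None):
    """Evaluate a basic arithmetic expression from a config string."""
    variables = variables or {}

    def _eval(node):
        if isinstance(node, ast.Expression):
            return _eval(node.body)
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.Name) and node.id in variables:
            return variables[node.id]
        if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](_eval(node.left), _eval(node.right))
        if isinstance(node, ast.UnaryOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](_eval(node.operand))
        raise ValueError("disallowed expression")

    return _eval(ast.parse(expr, mode="eval"))

# e.g. an LR multiplier written directly in the config:
safe_eval("0.95 ** epoch", {"epoch": 2})
```

Function calls, attribute access, and unknown names all fall through to the `ValueError`, so expressions like `__import__('os')` are rejected rather than executed.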
