Implement loss regularizers & revise Optimizer API.
The main objective of this MR is to add a new type of optimizer plug-in to declearn, implementing loss-regularization terms through gradient correction (i.e. computing the gradients of the actual regularization term and adding them to the base gradients output by the framework-specific `Model` code).
One of the outcomes of this new API will be to add support for FedProx, which consists in adding a proximal term to the loss, regularizing it by the (scaled) squared difference between local and global weights (recomputed at each step).
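The gradient-correction approach to FedProx can be sketched as follows. This is a minimal numpy illustration of the math described above (the function name, signature and the `alpha` scaling parameter are hypothetical, not declearn's actual API): the proximal term is `(alpha / 2) * ||w - w_global||^2`, so its derivative with respect to the local weights is simply added to the base gradients.

```python
import numpy as np

def fedprox_correction(gradients, weights, global_weights, alpha=0.1):
    """Add the gradient of a FedProx-style proximal term to base gradients.

    The proximal loss term is (alpha / 2) * ||w - w_global||^2, whose
    derivative with respect to w is alpha * (w - w_global).
    """
    return gradients + alpha * (weights - global_weights)

# Example: local weights that have drifted away from the global ones.
grads = np.array([0.5, -0.2])
local = np.array([1.0, 2.0])
glob = np.array([0.8, 2.4])
corrected = fedprox_correction(grads, local, glob, alpha=0.5)
# corrected = grads + 0.5 * (local - glob) = [0.6, -0.4]
```

Since the correction only needs the current and global weights, it fits naturally as a plug-in applied to gradients at each step, without touching the framework-specific loss computation.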
As a side objective, this MR partially revises the `OptiModule` (and, consequently, `Optimizer`) configuration / (de)serialization API, to make it easier for end-users to write down and/or edit full optimizer configurations manually.
Tasks:
- Implement a `Regularizer` API and add it to `Optimizer`.
- Revise the `OptiModule` (de)serialization format.
- Document `Regularizer` and update the `OptiModule` documentation (notably in the README).
- Deploy and document the revised `Optimizer` configuration syntax (in examples and the README).
Activity
added feature-request label
assigned to @paandrey
added 1 commit
- 688e2ed4 - Use framework-specific Vector type as `Model` weights.
added 6 commits
- 0a1d967f - Revise `create_types_registry`.
- bde421d1 - Rename private `optimizer.modules._base` to `_api` for readability.
- 53dd9eaf - Implement `Regularizer` optimizer plug-ins API.
- e1479287 - Implement FedProx, Lasso and Ridge loss regularizers.
- fd1c012d - Added loss regularizers to `Optimizer`.
- 61d730e4 - Use framework-specific Vector type as `Model` weights.
added 2 commits
added 1 commit
- f00fa033 - Move `NumpyVector` from `model.api` to `model.sklearn` submodule.
added 5 commits
- 0887fdd3 - Revise `OptiModule` API.
- 2ed8e450 - Modularize `regularizers` and `modules` input types for `Optimizer`.
- c88c122b - Apply minor docstring and type-hint corrections.
- a63c27b4 - Refactor `OptiModule` unit tests and implement `Regularizer` ones.
- 2fa68022 - Move `NumpyVector` from `model.api` to `model.sklearn` submodule.
@nbigaud When you have a moment, could you please review this MR?
Summary of changes:

- New `Regularizer` API
  - `Regularizer` is a new type of `Optimizer` plug-in to implement loss regularization terms. Technically, it computes the derivative of the regularization term based on the current model weights (and/or loss-derived gradients).
  - Automated type-registration of subclasses using `__init_subclass__`, relying on the `name` class attribute.
  - Implemented a `from_specs` generic constructor to instantiate any registered regularizer from its `name` and config dict.
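The subclass auto-registration and `from_specs` mechanism can be sketched generically as follows. This is an illustrative pattern built on `__init_subclass__` and a `name` class attribute, not declearn's actual implementation (the `Ridge` subclass and its `alpha` parameter are hypothetical placeholders):

```python
class Regularizer:
    """Base class sketch: subclasses self-register under their `name`."""

    name: str = ""          # to be defined by concrete subclasses
    _registry: dict = {}    # maps name -> subclass

    def __init_subclass__(cls, **kwargs):
        # Called automatically whenever a subclass is defined.
        super().__init_subclass__(**kwargs)
        if cls.name:
            Regularizer._registry[cls.name] = cls

    @classmethod
    def from_specs(cls, name: str, config: dict) -> "Regularizer":
        """Instantiate a registered subclass from its name and config dict."""
        try:
            subclass = cls._registry[name]
        except KeyError:
            raise KeyError(f"No registered regularizer under name '{name}'.")
        return subclass(**config)

class Ridge(Regularizer):
    name = "ridge"

    def __init__(self, alpha: float = 0.01):
        self.alpha = alpha

# Any registered subclass can now be rebuilt from (name, config) specs.
reg = Regularizer.from_specs("ridge", {"alpha": 0.1})
```

The benefit of this pattern is that registration happens at class-definition time, so serialized `(name, config)` pairs remain valid constructors without any manual registry bookkeeping.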
- Revised `OptiModule` API
  - Deprecated the use of `OptiModule.serialize` and `OptiModule.deserialize`.
  - Implemented `__init_subclass__` tricks and a `from_specs` method, similarly to `Regularizer`.
  - Added an optional `aux_name` class attribute (default=None) to label auxiliary variables when required (e.g. for Scaffold).
- Revised `Optimizer` API
  - Added a `regularizers` parameter and attribute to use a pipeline of loss regularizers (similarly to `modules`).
  - Changed `to_config` and `from_config` to use a more human-readable format.
  - Modularized the type of input `regularizers` and `modules`, so that users may pass instantiated objects, class `name` keywords or `(name, config)` tuples.
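The modularized input handling could look like the following sketch, where each spec entry is normalized into a plug-in instance. The `Plugin` base class, the `Lasso` subclass and the `parse_plugin_specs` helper are all hypothetical stand-ins for the declearn classes, used only to illustrate the three accepted spec forms:

```python
class Plugin:
    """Minimal base class with a name-based subclass registry."""

    name: str = ""
    _registry: dict = {}

    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        if cls.name:
            Plugin._registry[cls.name] = cls

    @classmethod
    def from_specs(cls, name, config):
        return Plugin._registry[name](**config)

class Lasso(Plugin):
    name = "lasso"

    def __init__(self, alpha: float = 0.01):
        self.alpha = alpha

def parse_plugin_specs(specs, base_cls=Plugin):
    """Normalize specs into instances: accept an instance,
    a registered `name` string, or a `(name, config)` tuple."""
    plugins = []
    for spec in specs:
        if isinstance(spec, base_cls):
            plugins.append(spec)
        elif isinstance(spec, str):
            plugins.append(base_cls.from_specs(spec, {}))
        elif isinstance(spec, tuple) and len(spec) == 2:
            plugins.append(base_cls.from_specs(spec[0], spec[1]))
        else:
            raise TypeError(f"Unsupported plug-in spec: {spec!r}")
    return plugins

# All three spec forms resolve to Lasso instances.
plugins = parse_plugin_specs([Lasso(alpha=0.05), "lasso", ("lasso", {"alpha": 0.2})])
```

Accepting all three forms keeps programmatic use (instances) ergonomic while letting hand-written config files stick to plain names and `(name, config)` pairs.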
"Out of Scope" API changes:
- Modified
declearn.utils.create_types_registry
to enable its use as a class decorator. - Modified
Model.get_weights
andModel.set_weights
: use subclass-specificVector
rather thanNumpyVector
. - Moved
NumpyVector
fromdeclearn.model.api
todeclearn.model.sklearn
submodule as it is no longer central.
- Modified
-
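Making a registration function usable both as a plain call and as a class decorator typically follows the dual-use pattern sketched below. This is an illustrative sketch of the pattern, not declearn's actual `create_types_registry` implementation (the `REGISTRIES` dict and `name` keyword are assumptions for the example):

```python
REGISTRIES: dict = {}

def create_types_registry(cls=None, *, name=None):
    """Register a base type; usable as a plain call or a class decorator.

    Supports three call forms:
      create_types_registry(SomeClass)          # plain call
      @create_types_registry                    # bare decorator
      @create_types_registry(name="custom")     # decorator with arguments
    """
    def register(cls):
        REGISTRIES[name or cls.__name__] = cls
        return cls
    if cls is None:
        # Called with keyword arguments only: return the actual decorator.
        return register
    # Called directly on a class (plain call or bare decorator).
    return register(cls)

@create_types_registry
class Model:
    pass

@create_types_registry(name="custom")
class Other:
    pass
```

Returning `cls` from the inner `register` function is what makes the decorator transparent: the decorated class is left unchanged apart from being recorded in the registry.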
- Maintenance operations:
  - Revised and added unit tests for optimizer plug-in classes.
  - Updated the README and the Heart-UCI example code based on API changes.
  - Bumped the version number to 2.0.0.beta2 due to API-breaking changes.
requested review from @nbigaud