
Implement loss regularizers & revise Optimizer API.

ANDREY Paul requested to merge regularizers into main

The main objective of this MR is to add a new type of optimizer plug-in to declearn, implementing loss-regularization terms through gradient correction (i.e. computing the gradients of the regularization term itself and adding them to the base gradients output by the framework-specific Model code).
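The gradient-correction idea can be sketched as follows. This is an illustrative example, not the declearn API: the class name, the `run` signature, and the dict-of-arrays representation of gradients and weights are all assumptions made for the sake of demonstration.

```python
import numpy as np


class L2Regularizer:
    """Hypothetical plug-in adding the gradient of (alpha / 2) * ||w||^2."""

    def __init__(self, alpha: float = 0.01) -> None:
        self.alpha = alpha

    def run(self, gradients: dict, weights: dict) -> dict:
        # The gradient of (alpha / 2) * ||w||^2 is alpha * w;
        # add it to the base gradients computed by the model.
        return {
            name: grad + self.alpha * weights[name]
            for name, grad in gradients.items()
        }


reg = L2Regularizer(alpha=0.1)
grads = {"w": np.array([1.0, 2.0])}
weights = {"w": np.array([10.0, 20.0])}
corrected = reg.run(grads, weights)  # [1 + 1, 2 + 2] = [2.0, 4.0]
```

The point of this design is that the regularization term never needs to be injected into the framework-specific loss function: it is applied uniformly at the gradients level, regardless of the backend computing them.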

One of the outcomes of this new API will be to add support for FedProx, which consists in adding a proximal term to the loss: the (scaled) squared difference between the local and global weights, recomputed at each step.
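Under the same gradient-correction scheme, the FedProx proximal term (mu / 2) * ||w - w_global||^2 contributes the gradient mu * (w - w_global). The function below is a hypothetical sketch (the name `fedprox_correct` and its signature are not declearn's), showing that correction against the current local weights:

```python
import numpy as np


def fedprox_correct(gradients, weights, global_weights, mu=0.1):
    """Add the FedProx proximal-term gradient to the base gradients.

    The proximal term (mu / 2) * ||w - w_global||^2 has gradient
    mu * (w - w_global), recomputed at each local step.
    """
    return {
        name: grad + mu * (weights[name] - global_weights[name])
        for name, grad in gradients.items()
    }


grads = {"w": np.zeros(2)}
local = {"w": np.array([1.0, 3.0])}
glob = {"w": np.array([0.0, 1.0])}
out = fedprox_correct(grads, local, glob, mu=0.5)
# 0.5 * ([1, 3] - [0, 1]) = [0.5, 1.0]
```

Note that the correction pulls local updates back toward the global weights, which is precisely FedProx's intent of limiting client drift in federated learning.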

As a side objective, this MR partially revises the OptiModule (and, consequently, the Optimizer) configuration and (de)serialization API, to make it easier for end-users to write and/or edit full optimizer configurations by hand.

Tasks:

  • Implement a Regularizer API and add it to Optimizer.
  • Revise OptiModule (de)serialization format.
  • Document Regularizer and update the OptiModule documentation (notably in the README).
  • Deploy and document the revised Optimizer configuration syntax (in the examples and the README).
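As a rough illustration of what a hand-editable configuration syntax might look like, consider specifying each plug-in either by name alone or as a (name, config) pair, so that users only spell out the options they want to override. The structure, plug-in names, and `normalize` helper below are all hypothetical, not the actual declearn specification:

```python
# Hypothetical hand-written optimizer configuration: plug-ins may be
# given as a bare name string or as a (name, config) pair.
config = {
    "lrate": 0.01,
    "modules": ["adam", ("momentum", {"beta": 0.9})],
    "regularizers": [("fedprox", {"alpha": 0.1})],
}


def normalize(specs):
    """Expand name-only entries into (name, {}) pairs with default config."""
    return [(s, {}) if isinstance(s, str) else s for s in specs]


modules = normalize(config["modules"])
# → [("adam", {}), ("momentum", {"beta": 0.9})]
```

The benefit of such a scheme is that a full optimizer setup can be serialized to, and read back from, a plain JSON-compatible structure that remains short and legible for manual edits.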
Edited by ANDREY Paul
