Feature/447 Add researcher-side (optional) Optimizer

ANDREY Paul requested to merge feature/447-researcher-optimizer into develop

This MR adds the possibility to set up an Optimizer on the researcher side, so as to refine the aggregated model updates received from nodes prior to applying them to the global model. See related issue #447.

Adding an Optimizer on the researcher side makes it possible to plug the following features into an Experiment:

  • Set up some momentum or an adaptive optimizer, effectively implementing the so-called FedAvgM or FedOpt algorithms (see the first sketch after this list).
  • Set up a server-side learning rate, some weight decay, or any of the previous algorithms in a modular way. Most notably, the choice of using such refinements is decoupled from the choice of aggregation rule, so that any combination of recipes inspired by the literature may be set up.
  • Set up the Scaffold algorithm via the declearn backend (see the second sketch after this list):
    • Plug a ScaffoldClient module into the Optimizer used by nodes (defined in the experiment's TrainingPlan).
    • Plug a ScaffoldServer module into the Optimizer used by the researcher (directly in Experiment).
    • This improves over the legacy implementation of Scaffold in the following ways:
      • It covers both scikit-learn and torch (and would cover tensorflow and jax/haiku if Fed-BioMed supported them).
      • It is more generic (notably enabling nodes to run distinct numbers of local optimization steps).
      • It is decoupled from the choice of aggregation rule (so one could for example use GradientMaskedAveraging and Scaffold together).
      • Its backend mechanisms may be re-used to implement other state-synchronization-based algorithms, such as FedDyn.
    • Warning: This does not work yet as this MR does not implement the sharing of Optimizer auxiliary variables (see issue #467).
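
For the momentum / adaptive-optimizer recipes in the first two items, a minimal sketch using the declearn backend (which this MR builds on) could look as follows. The module and parameter names are those of declearn's Optimizer API; how the resulting object would be handed to an Experiment is sketched further below.

```python
# Minimal sketch: FedAvgM-style researcher optimizer, built directly on
# the declearn backend wrapped by this MR.
from declearn.optimizer import Optimizer
from declearn.optimizer.modules import AdamModule, MomentumModule

# Server-side learning rate of 1.0 with heavy-ball momentum: applying
# this to the aggregated model updates yields the FedAvgM algorithm.
fedavgm_optim = Optimizer(lrate=1.0, modules=[MomentumModule(beta=0.9)])

# FedOpt-style variant: server-side Adam, plus some weight decay.
fedadam_optim = Optimizer(
    lrate=0.01,
    w_decay=0.001,
    modules=[AdamModule(beta_1=0.9, beta_2=0.99)],
)
```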
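The Scaffold client/server pairing described above could similarly be set up as in the sketch below. How the node-side optimizer is declared within the TrainingPlan is an assumption here, and (per the warning above) this pairing is non-functional until issue #467 implements the sharing of auxiliary variables.

```python
# Minimal sketch of the Scaffold client/server pairing via declearn.
from declearn.optimizer import Optimizer
from declearn.optimizer.modules import ScaffoldClientModule, ScaffoldServerModule

# Node-side: to be used as the Optimizer of the experiment's TrainingPlan.
node_optim = Optimizer(lrate=0.001, modules=[ScaffoldClientModule()])

# Researcher-side: to be passed to the Experiment (see the usage sketch
# below). Note: not yet functional, as auxiliary variables are not shared
# between nodes and researcher until issue #467 is addressed.
server_optim = Optimizer(lrate=1.0, modules=[ScaffoldServerModule()])
```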

This MR goes for the following implementation:

  • Add an optional researcher_optimizer instantiation parameter to Experiment, defaulting to None (see the usage sketch after this list).
  • Add the associated getter and setter methods.
  • When a researcher optimizer is set, perform the associated update steps as part of the Experiment.run_once method.
  • Include the researcher optimizer as part of the Experiment breakpoint system.
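
A hypothetical usage sketch of this interface follows. The researcher_optimizer parameter name comes from this MR, while the getter/setter names and the other Experiment arguments are assumptions modelled on the existing Experiment API.

```python
from fedbiomed.researcher.experiment import Experiment

exp = Experiment(
    tags=["my-data-tag"],                # hypothetical dataset tags
    training_plan_class=MyTrainingPlan,  # hypothetical user-defined plan
    researcher_optimizer=fedavgm_optim,  # new parameter; defaults to None
    round_limit=10,
)

# Assumed getter / setter, following Experiment's naming conventions:
exp.set_researcher_optimizer(server_optim)
optim = exp.researcher_optimizer()

# Each run_once call then applies the researcher optimizer to the
# aggregated updates before setting them on the global model; the
# optimizer's state is saved and restored through the breakpoint system.
exp.run_once()
```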

Closes issue #447
