FedProx included in torchnn
The FedProx optimization scheme is included in the Torch training plan and can be enabled by passing the FedProx_mu
parameter in model_args, as shown in the notebook pytorch-MNIST-FedProx.ipynb
with the basic MNIST example. If FedProx_mu is not provided, the standard optimization scheme is used.
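To illustrate the idea, here is a minimal sketch of what FedProx adds to the local loss: a proximal term (mu / 2) * ||w_local - w_global||^2 that keeps local updates close to the global model. The function below is purely illustrative (plain Python on flat lists of weights, not Fed-BioMed internals), and the mu value shown is an arbitrary example:

```python
# Hypothetical example: enabling FedProx via model_args (value of mu is illustrative)
model_args = {"FedProx_mu": 0.1}  # omit this key to fall back to the standard scheme

def fedprox_penalty(local_weights, global_weights, mu):
    """Return the FedProx proximal term (mu / 2) * ||w_local - w_global||^2,
    summed over all parameters (here represented as flat lists of floats)."""
    return 0.5 * mu * sum((wl - wg) ** 2 for wl, wg in zip(local_weights, global_weights))

# During local training, the penalty is added to the task loss:
base_loss = 1.0                                   # loss from the usual objective
penalty = fedprox_penalty([1.0, 2.0], [0.5, 1.5], mu=model_args["FedProx_mu"])
total_loss = base_loss + penalty                  # -> 1.0 + 0.025 = 1.025
```

Setting mu to 0 recovers the standard (FedAvg-style) local objective, which is consistent with the fallback behavior described above.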