`weight_decay` option is accepted in the training config file but has no effect
The `weight_decay` parameter from the config file is never forwarded to the optimizer, which is constructed as:
self._optimizer = optim.Adam(self._net.parameters(),
                             lr=self.learning_rate)
We should pass `weight_decay` there as well (training.py, L.285).
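A minimal sketch of the proposed fix, assuming the config value is exposed on the trainer the same way `self.learning_rate` is (the attribute name `weight_decay` and the standalone variables below are illustrative, not the project's actual code). `torch.optim.Adam` accepts a `weight_decay` keyword that applies L2 regularization:

```python
import torch.nn as nn
import torch.optim as optim

# Stand-ins for self._net / self.learning_rate / the config value
net = nn.Linear(4, 2)
learning_rate = 1e-3
weight_decay = 1e-4  # value read from the training config file

# Proposed change: forward weight_decay to the optimizer
optimizer = optim.Adam(net.parameters(),
                       lr=learning_rate,
                       weight_decay=weight_decay)
```

With this change, a nonzero `weight_decay` in the config actually affects the parameter updates; the current code silently uses Adam's default of `0.0`.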