Enable skipping frozen weights when using `Model.get_weights` and `Model.set_weights`
This MR tackles issue #15 (closed), namely the fact that the current support for frozen neural network weights is imperfect.
Currently:

- `Model.compute_batch_gradients` properly ignores frozen (i.e. non-trainable) weights.
- `Model.get_weights` / `Model.set_weights` return/expect all weights, including frozen ones.

This causes a core issue: weight-decay and loss `Regularizer` instances cause bugs when using models with frozen weights, as illustrated by the sketch below.
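Here is a minimal sketch of the mismatch, using a hypothetical dict-based toy setup rather than declearn's actual classes: decay terms derived from `get_weights` cover frozen weights, while gradients do not.

```python
# Hypothetical toy setup: two trainable weights, one frozen weight.
model_weights = {"kernel": 1.0, "bias": 2.0, "frozen": 3.0}
trainable_keys = {"kernel", "bias"}

# Model.compute_batch_gradients only yields gradients for trainable weights.
gradients = {key: 0.1 for key in trainable_keys}

# A weight-decay step built on Model.get_weights, which currently
# returns *all* weights, frozen ones included.
decay = 0.01
all_weights = dict(model_weights)  # stands in for model.get_weights()
try:
    updates = {
        key: gradients[key] + decay * value
        for key, value in all_weights.items()
    }
except KeyError as exc:
    # Fails on 'frozen': the decay term covers a weight for which
    # no gradient exists, since gradients skip frozen weights.
    print(f"weights/gradients mismatch on {exc}")
```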
This merge request:

- Adds an option to `Model.get_weights` and `Model.set_weights` that enables skipping frozen weights (see the API sketch below the tasklist).

Implementation tasklist:
- [ ] Add `Model.get_weights(trainable: bool = False)`.
- [ ] Use it in the `Optimizer` backend, fixing the current bug.
- [ ] Add `Model.set_weights(trainable: bool = False)`.
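For reference, a minimal sketch of the target behaviour, assuming a simplified dict-based model rather than declearn's actual classes:

```python
from typing import Dict, Set


class ToyModel:
    """Hypothetical stand-in for a model with named, possibly frozen weights."""

    def __init__(self, weights: Dict[str, float], trainable: Set[str]) -> None:
        self.weights = weights
        self.trainable = trainable

    def get_weights(self, trainable: bool = False) -> Dict[str, float]:
        # With trainable=True, frozen (non-trainable) weights are skipped.
        if trainable:
            return {k: v for k, v in self.weights.items() if k in self.trainable}
        return dict(self.weights)

    def set_weights(
        self, weights: Dict[str, float], trainable: bool = False
    ) -> None:
        # With trainable=True, only trainable weights are expected;
        # otherwise the full set of weights must be provided.
        expected = self.trainable if trainable else set(self.weights)
        if set(weights) != expected:
            raise KeyError("received weights do not match the expected set")
        self.weights.update(weights)
```

With `trainable=True`, the `Optimizer` backend can derive weight-decay and `Regularizer` terms from exactly the same set of weights as the computed gradients, removing the mismatch described above.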
Closes #15 (closed)