Commit 9571ad56 authored by ANDREY Paul's avatar ANDREY Paul

Update RMSprop-addition example.

parent 168781e2
@@ -27,10 +27,6 @@ in the code in the `__init__` and `get_config` method.
 * The transformations applied to the gradients, corresponding to the `run`
   method.
-**Also make sure** to register your new `OptiModule` subtype, as demonstrated
-below. This is what makes your module (de)serializable using `declearn`'s
-internal tools.
 **If you are contributing** to `declearn`, please write your code to an appropriate
 file under `declearn.optimizer.modules`, include it to the `__all__` global
 variable and import it as part of the `__init__.py` file at its import level.
@@ -61,21 +57,17 @@ added to `_adaptative.py` file.
 ```python
 from declearn.optimizer.modules import OptiModule
-from declearn.utils import register_type
-
-# Start by registering the new optimzer using the dedicated decorator
-@register_type(name="RMSProp", group="OptiModule")
 class RMSPropModule(OptiModule):
     """[Docstring removed for conciseness]"""
-    # Convention, used when a module uses synchronized server
-    # and client elements, that need to share the same name
+    # Identifier, that must be unique across modules for type-registration
+    # purposes. This enables specifying the module in configuration files.
     name = "rmsprop"
-    # Define optimizer parameter, here beta and eps
+    # Define optimizer parameters, here beta and eps
     def __init__(self, beta: float = 0.9, eps: float = 1e-7) -> None:
         """Instantiate the RMSProp gradients-adaptation module.
...
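To make the gradient transformation behind this example concrete, here is a minimal, framework-independent sketch of the RMSprop adaptation rule the module implements: an exponential moving average of squared gradients, controlled by `beta`, whose square root (offset by `eps` for numerical stability) rescales each incoming gradient. The class name `RMSPropSketch` and its NumPy-based `run` signature are illustrative assumptions for this sketch, not declearn's actual `OptiModule` API.

```python
import numpy as np


class RMSPropSketch:
    """Standalone sketch of the RMSprop gradients-adaptation rule."""

    def __init__(self, beta: float = 0.9, eps: float = 1e-7) -> None:
        self.beta = beta  # decay rate of the squared-gradients moving average
        self.eps = eps    # numerical-stability term added to the denominator
        self.v = None     # moving average state, lazily initialized to zeros

    def run(self, gradients) -> np.ndarray:
        """Scale gradients by the root of their running squared average."""
        grads = np.asarray(gradients, dtype=float)
        if self.v is None:
            self.v = np.zeros_like(grads)
        # v <- beta * v + (1 - beta) * g^2
        self.v = self.beta * self.v + (1.0 - self.beta) * grads ** 2
        # adapted gradient: g / (sqrt(v) + eps)
        return grads / (np.sqrt(self.v) + self.eps)
```

Note that, unlike a full optimizer, this sketch only transforms gradients; applying a learning rate and updating the model weights is left to the surrounding training loop, which mirrors how `OptiModule` instances are meant to be composed.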