Commit 92e82dbf authored by BIGAUD Nathan

Reviewing doc

parent 0077dde9
Merge request !23: Release version 2.0
Pipeline #751126 waiting for manual action
@@ -15,21 +15,18 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-"""Declearn - a python package for decentralized learning.
+"""Declearn - a python package for private decentralized learning.
-Declearn is a framework providing with tools to set up and
-run Federated Learning processes. It is being developed by
-the MAGNET team of INRIA Lille, with the aim of providing
-users with a modular and extensible framework to implement
-federated learning algorithms and apply them to real-world
-(or simulated) data using any model-defining framework one
-might want to use.
+Declearn is a modular framework to set up and run federated learning
+processes. It is being developed by the MAGNET team of INRIA Lille,
+with the aim of providing users with a modular and extensible framework
+to implement federated learning algorithms and apply them to real-world
+(or simulated) data using any common machine learning framework.
-Declearn provides with abstractions that enable algorithms
-to be written agnostic to the actual computation framework
-as well as with workable interfaces that cover some of the
-most popular frameworks, such as Scikit-Learn, TensorFlow
-and PyTorch.
+Declearn provides abstractions that enable algorithms to be written
+agnostic to the actual computation framework, as well as workable
+interfaces that cover some of the most popular frameworks, such as
+Scikit-Learn, TensorFlow and PyTorch.
The package is organized into the following submodules:
* aggregator:
@@ -54,15 +51,17 @@ The package is organized into the following submodules:
    Shared utils used (extensively) across all of declearn.
"""
-from . import typing
-from . import utils
-from . import communication
-from . import data_info
-from . import dataset
-from . import metrics
-from . import model
-from . import optimizer
-from . import aggregator
-from . import main
+from . import (
+    aggregator,
+    communication,
+    data_info,
+    dataset,
+    main,
+    metrics,
+    model,
+    optimizer,
+    typing,
+    utils,
+)
__version__ = "2.0.0"
@@ -15,7 +15,17 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-"""Framework-agnostic Vector aggregation API and tools."""
+"""Model updates aggregating API and implementations.
+
+An Aggregator is typically meant to be used on a round-wise basis by the
+orchestrating server of a centralized federated learning process, to
+aggregate the client-wise model updates into a Vector that may then be
+used as "gradients" by the server's Optimizer to update the global model.
+
+This declearn submodule provides:
+
+* Aggregator : abstract class defining an API for Vector aggregation
+* AveragingAggregator : average-based-aggregation Aggregator subclass
+* GradientMaskedAveraging : gradient Masked Averaging Aggregator subclass
+"""
from ._api import Aggregator
from ._base import AveragingAggregator
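The round-wise averaging behaviour described in the docstring above can be illustrated with a small, framework-free sketch. The function name and the plain float-list "vectors" below are hypothetical stand-ins, not the declearn API:

```python
from typing import Dict, List


def average_updates(
    updates: List[Dict[str, List[float]]]
) -> Dict[str, List[float]]:
    """Average client-wise model updates coordinate-wise (uniform weights)."""
    n_clients = len(updates)
    return {
        key: [sum(values) / n_clients for values in zip(*(u[key] for u in updates))]
        for key in updates[0]
    }


# Two clients each send updates for a single parameter tensor "weights".
client_a = {"weights": [1.0, 2.0]}
client_b = {"weights": [3.0, 4.0]}
aggregated = average_updates([client_a, client_b])  # {'weights': [2.0, 3.0]}
```

A weighted variant (e.g. by client sample counts, as in FedAvg) only changes the per-client coefficients; the round-wise usage pattern stays the same.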
@@ -15,7 +15,10 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-"""Submodule implementing client/server communications.
+"""Submodule implementing client/server communications. This is done by
+defining server-side and client-side network communication endpoints for
+federated learning processes, as well as suitable messages to be transmitted,
+and the available communication protocols.
This module contains the following core submodules:
* api:
@@ -24,7 +27,6 @@ This module contains the following core submodules:
    Message dataclasses defining information containers to be exchanged
    between communication endpoints.
It also exposes the following core utility functions:
* build_client:
    Instantiate a NetworkClient, selecting its subclass based on protocol name.
@@ -34,7 +36,6 @@ It also exposes the following core utility functions:
    List the protocol names for which both a NetworkClient and NetworkServer
    classes are registered (hence available to `build_client`/`build_server`).
Finally, it defines the following protocol-specific submodules, provided
the associated third-party dependencies are available:
* grpc:
@@ -46,15 +47,14 @@
"""
# Messaging and Communications API and base tools:
-from . import messaging
-from . import api
+from . import api, messaging
from ._build import (
+    _INSTALLABLE_BACKENDS,
    NetworkClientConfig,
    NetworkServerConfig,
    build_client,
    build_server,
    list_available_protocols,
-    _INSTALLABLE_BACKENDS,
)
# Concrete implementations using various protocols:
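The protocol-selection mechanism behind `build_client` and `list_available_protocols` can be sketched as a name-to-class registry. The registry dicts, decorator, and `DemoClient` class below are illustrative assumptions, not declearn's actual internals:

```python
from typing import Any, Callable, Dict, List

# Hypothetical registries mapping protocol names to endpoint classes.
CLIENT_CLASSES: Dict[str, Callable[..., Any]] = {}
SERVER_CLASSES: Dict[str, Callable[..., Any]] = {}


def register_client(name: str) -> Callable:
    """Class decorator registering a client endpoint under a protocol name."""
    def decorator(cls):
        CLIENT_CLASSES[name] = cls
        return cls
    return decorator


def build_client(protocol: str, server_uri: str) -> Any:
    """Instantiate a client endpoint, selecting its class by protocol name."""
    try:
        cls = CLIENT_CLASSES[protocol]
    except KeyError as exc:
        raise KeyError(f"No client registered for protocol '{protocol}'.") from exc
    return cls(server_uri)


def list_available_protocols() -> List[str]:
    """List protocols for which both a client and a server class exist."""
    return sorted(CLIENT_CLASSES.keys() & SERVER_CLASSES.keys())


@register_client("demo")
class DemoClient:
    def __init__(self, server_uri: str) -> None:
        self.server_uri = server_uri


client = build_client("demo", "wss://localhost:8765")
```

Registering classes at import time is what lets optional protocol backends (grpc, websockets) appear in the registry only when their third-party dependencies are installed.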
@@ -28,8 +28,7 @@ writing specifications for expected 'data_info' fields, and automating
their use to validate and combine individual 'data_info' dicts into an
aggregated one.
-DataInfoField API tools
------------------------
+DataInfoField API tools:
* DataInfoField:
    Abstract class defining an API to write field-wise specifications.
* register_data_info_field:
@@ -39,8 +38,7 @@ DataInfoField API tools
* get_data_info_fields_documentation:
    Gather documentation for all fields that have registered specs.
-Field specifications
---------------------
+Field specifications:
* ClassesField:
    Specification for the 'classes' field.
* InputShapeField:
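The validate-and-combine workflow these fields automate can be sketched as follows; the combiner functions and `COMBINERS` registry are hypothetical stand-ins for the registered DataInfoField specifications:

```python
from typing import Any, Dict, List


def combine_n_samples(values: List[Any]) -> int:
    """Validate client-wise 'n_samples' values and sum them."""
    if not all(isinstance(v, int) and v > 0 for v in values):
        raise ValueError("'n_samples' values must be positive ints.")
    return sum(values)


def combine_classes(values: List[Any]) -> List[Any]:
    """Combine client-wise 'classes' values by taking their sorted union."""
    return sorted(set().union(*values))


# Hypothetical registry of field-wise specifications.
COMBINERS = {"n_samples": combine_n_samples, "classes": combine_classes}


def aggregate_data_info(client_infos: List[Dict[str, Any]]) -> Dict[str, Any]:
    """Validate and combine individual 'data_info' dicts into one."""
    keys = set().union(*(info.keys() for info in client_infos))
    return {key: COMBINERS[key]([info[key] for info in client_infos]) for key in keys}


merged = aggregate_data_info(
    [{"n_samples": 100, "classes": [0, 1]}, {"n_samples": 50, "classes": [1, 2]}]
)  # {'n_samples': 150, 'classes': [0, 1, 2]}
```

Each field thus declares both its validation rule and its combination rule in one place, which is what lets the server aggregate heterogeneous client metadata safely.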
@@ -15,7 +15,16 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-"""Dataset-interface API and actual implementations module."""
+"""Dataset-interface API and actual implementations module.
+
+A 'Dataset' is an interface towards data that exposes methods to query
+batched data samples and key metadata while remaining agnostic of the way
+the data is actually being loaded (from a source file, a database,
+another API...).
+
+This declearn submodule provides:
+
+* Dataset : abstract class defining an API to access training or testing data
+* InMemoryDataset : Dataset subclass serving numpy(-like) memory-loaded data
+    arrays
+"""
from ._base import Dataset, DataSpecs, load_dataset_from_json
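A minimal sketch of such a source-agnostic batching interface follows; the class and method names are illustrative, not the actual declearn signatures:

```python
from abc import ABC, abstractmethod
from typing import Iterator, List, Sequence, Tuple


class DatasetSketch(ABC):
    """Interface to data: batch queries, agnostic of the storage backend."""

    @abstractmethod
    def get_batches(self, batch_size: int) -> Iterator[List[Tuple[float, int]]]:
        """Yield batched (sample, label) pairs."""


class InMemorySketch(DatasetSketch):
    """Serve (sample, label) pairs already loaded in memory."""

    def __init__(self, samples: Sequence[float], labels: Sequence[int]) -> None:
        self.records = list(zip(samples, labels))

    def get_batches(self, batch_size: int) -> Iterator[List[Tuple[float, int]]]:
        for index in range(0, len(self.records), batch_size):
            yield self.records[index:index + batch_size]


dataset = InMemorySketch(samples=[0.1, 0.2, 0.3], labels=[0, 1, 0])
batches = list(dataset.get_batches(batch_size=2))  # two batches: sizes 2 and 1
```

Because training code only calls the abstract batching method, a file-backed or database-backed subclass can be swapped in without touching the federated learning loop.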
@@ -19,7 +19,8 @@
This declearn submodule provides with:
* Model and Vector abstractions, used as an API to design FL algorithms
-* Submodules implementing interfaces to various frameworks and models.
+* Submodules implementing interfaces to currently supported frameworks
+  and models.
"""
from . import api
@@ -15,8 +15,19 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-"""Framework-agnostic optimizer tools, both generic or FL-specific."""
-
-from . import modules
-from . import regularizers
+"""Framework-agnostic optimizer tools, both generic and FL-specific.
+
+In more detail, we here define an `Optimizer` class that wraps together
+a set of modules, used to implement various optimization and
+regularization techniques.
+
+Main class:
+* Optimizer: Base class to define gradient-descent-based optimizers.
+
+This module also implements the following submodules, used by the former:
+* modules: gradients-alteration algorithms, implemented as plug-in modules.
+* regularizers: loss-regularization algorithms, implemented as plug-in modules.
+"""
+
+from . import modules, regularizers
from ._base import Optimizer
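The plug-in composition idea behind this Optimizer design can be sketched as follows; the class names are hypothetical, and gradients are plain float lists rather than framework-specific Vectors:

```python
from typing import List


class ScaleByLearningRate:
    """Plug-in module scaling input gradients by a fixed learning rate."""

    def __init__(self, lrate: float) -> None:
        self.lrate = lrate

    def run(self, gradients: List[float]) -> List[float]:
        return [self.lrate * grad for grad in gradients]


class OptimizerSketch:
    """Pipe gradients through plug-in modules, then flip the sign for descent."""

    def __init__(self, modules: list) -> None:
        self.modules = modules

    def compute_updates(self, gradients: List[float]) -> List[float]:
        for module in self.modules:
            gradients = module.run(gradients)
        return [-grad for grad in gradients]


optim = OptimizerSketch([ScaleByLearningRate(0.1)])
updates = optim.compute_updates([1.0, -2.0])  # [-0.1, 0.2]
```

Chaining modules this way lets momentum, clipping, or FL-specific corrections be stacked in any order without changing the optimizer's own loop.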