Discover FLeet, a system developed by EPFL and INRIA researchers to improve federated learning. FLeet enables asynchronous model updates in privacy-preserving distributed learning, improving model quality while reducing energy consumption, especially on smartphones.

Researchers at EPFL and INRIA have developed a solution to improve the performance of federated learning, a technique for training a shared, centrally coordinated model in a distributed way without exposing users' personal data.
The concept of federated learning, introduced by Google in 2016, makes it possible to protect privacy while using users' local data to improve intelligent tools. It is often combined with other techniques, such as secure multi-party computation or differential privacy.
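The core mechanism behind this concept is often federated averaging: each device computes an update on its own data, and only the model updates, never the raw data, are sent to the server for aggregation. The sketch below illustrates the idea with a toy linear model in NumPy; all names, the one-local-step schedule, and the data-size weighting are illustrative assumptions, not Google's or FLeet's actual code.

```python
import numpy as np

def local_update(weights, data, labels, lr=0.1):
    """One gradient step on a device's private data
    (linear regression with squared loss, for illustration)."""
    preds = data @ weights
    grad = data.T @ (preds - labels) / len(labels)
    return weights - lr * grad  # raw data never leaves the device

def federated_average(global_weights, devices):
    """Server aggregates the devices' updates, weighted by data size."""
    updates, sizes = [], []
    for data, labels in devices:
        updates.append(local_update(global_weights.copy(), data, labels))
        sizes.append(len(labels))
    return np.average(updates, axis=0, weights=np.array(sizes, float))

# Simulate five devices, each holding private data from the same task.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
devices = []
for _ in range(5):
    X = rng.normal(size=(20, 2))
    devices.append((X, X @ true_w))

w = np.zeros(2)
for _ in range(100):  # rounds of federated training
    w = federated_average(w, devices)
# w now approximates true_w, learned without pooling the raw data
```

In a real deployment the server would sample a subset of available devices each round and each device would run several local steps, but the privacy property is the same: only weights travel.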
The most common example of federated learning is found in Android keyboards such as Gboard: users benefit from more accurate text predictions while their sensitive typing data stays on the device, and energy consumption is kept low.
However, federated learning has its limitations, notably device availability (connectivity and battery), which restricts training to infrequent, typically daily, updates. That cadence suits keyboards but falls short for services requiring near-real-time updates, such as recommendations on social networks.
To solve this problem, the researchers developed FLeet, a middleware system that enables asynchronous updates of learning models. FLeet integrates two key components: the first profiles each device's capabilities and energy consumption to predict when it can compute without draining the battery. The second, a gradient descent algorithm that tolerates asynchronous updates, minimizes the impact of stale contributions on model convergence and improves the efficiency of federated learning.
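The second component's idea can be sketched as staleness-aware gradient descent: a gradient computed on an out-of-date model version is still applied, but with reduced weight. The damping rule below (scale by 1/(1 + staleness)) is a common generic scheme used here for illustration; it is an assumption, not FLeet's exact published algorithm.

```python
import numpy as np

def stale_sgd_step(weights, grad, staleness, lr=0.1):
    """Apply a possibly stale gradient: the older the model version it
    was computed on, the smaller its influence on the new model."""
    return weights - lr * grad / (1.0 + staleness)

# Toy convex objective f(w) = ||w - target||^2 / 2, so grad(w) = w - target.
target = np.array([1.0, 3.0])
w = np.zeros(2)
history = [w.copy()]  # past model versions, to simulate slow devices
rng = np.random.default_rng(1)

for _ in range(200):
    staleness = int(rng.integers(0, 4))  # device reports a few versions late
    old_w = history[max(0, len(history) - 1 - staleness)]
    grad = old_w - target                # gradient at the stale model
    w = stale_sgd_step(w, grad, staleness)
    history.append(w.copy())
# w converges to target despite the asynchronous, stale gradients
```

Without damping, very stale gradients can push the model in outdated directions and slow or destabilize convergence; shrinking their step size keeps asynchronous devices useful at low cost.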
This solution improves the quality of federated learning while reducing battery consumption, paving the way for more efficient and sustainable applications.
Source: ICTjournal