In the 1990s and early 2000s, privacy was so commonplace that you practically could not escape it. Advances in interactive technology, not least social media, have rapidly eroded that privacy, to the point where it is now one of the leading motivations for strengthening information security. Privacy has come to be viewed as a luxury, and laws have slowly expanded into the field of privacy to protect users who increasingly see it violated.
Machine learning has responded to the demand for a return to privacy by birthing a new method of data analysis: federated learning.
This article will provide a high-level exploration of this emerging method of machine learning by examining what federated learning is, how it works, its unique characteristics, its applications, and differential privacy.
What is federated learning?
Conventional machine learning algorithms are designed for highly controlled environments (data centers, for example) with balanced data distributed across a large number of machines connected by high-speed, high-capacity networks. The data used for machine learning is stored on either a central server or in the cloud. This comes with a high price tag, not to mention user data privacy concerns.
Federated learning is a family of algorithms that offers an alternative approach to machine learning. It helps address these privacy and cost concerns by using a central server to coordinate a federation of participating devices.
How does federated learning work?
The central server provides the model to participating devices, but most of the learning work, including training the model itself, is performed by the federated users. There are different forms of federated learning, but they all have the following in common — a central server coordinates federated devices, or nodes, and initializes a model for each node.
Each node holds its own training data and trains a local model on it. These local models are then shared with the central server (sometimes only a random sample of nodes participates in a given round). After the initial local training, each further round of communication in which nodes send their locally trained models to the server is called a node update. Nodes keep sending updates to the central server until the model stabilizes.
Ideally, this stabilization process completes in as few rounds as possible. When the central server receives these updates, it averages the model weights across them to produce a more accurate global model.
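The round structure described above can be sketched in a few lines of Python. This is an illustrative toy, not any framework's real API: NumPy arrays stand in for model weights, local training is plain gradient descent on a noiseless linear model, and the server averages updates weighted by each node's sample count (as in federated averaging):

```python
import numpy as np

def local_train(weights, data, lr=0.1, epochs=5):
    """Hypothetical local step: one node fits a linear model y = X @ w
    to its own data by gradient descent."""
    w = weights.copy()
    X, y = data
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def federated_round(global_w, node_data):
    """One communication round: each node trains locally, then the
    server averages the returned weights, weighting each node by
    its sample count."""
    local_weights = [local_train(global_w, d) for d in node_data]
    sizes = [len(d[1]) for d in node_data]
    return np.average(local_weights, axis=0, weights=sizes)

# Toy setup: three nodes with unbalanced local datasets, all drawn
# from the same underlying relationship y = 2*x0 - 1*x1.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
node_data = []
for n in (5, 20, 50):                      # unbalanced sample counts
    X = rng.normal(size=(n, 2))
    node_data.append((X, X @ true_w))

w = np.zeros(2)
for _ in range(20):   # update rounds continue until the model stabilizes
    w = federated_round(w, node_data)
```

After a couple of dozen rounds the averaged weights settle near `true_w`. Note that only trained weights travel to the server in this sketch — each node's raw data never leaves it.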
Unique characteristics of federated learning
Those familiar with machine learning have probably already noticed the marked differences between federated learning and conventional machine learning. There are several distinct characteristics that make federated learning unique, including:
- More distributed than distributed machine learning: While distributed machine learning may involve many machines within a single data center, federated learning is not bound by the confines of one location. Federated nodes can be anywhere, making the system potentially far more distributed than conventional distributed machine learning
- More users: Federated learning can involve far more participants than other machine learning methods
- Unbalanced samples: In data centers, it may be possible to have users with generally the same amount of machine-learning data points. With federated learning, some users may have few data points and others may have far more
- Non-identical distributions of data: The data center setting makes it possible to ensure that all user updates look alike. With federated learning, this is not guaranteed or even expected
- Slow and unstable communication: Unlike in a data center, where quick and efficient communication is expected, communication can be much different in federated learning. It’s like working with remote users — their device may be turned off, it may not be on a reliable network, their connection may be slow and so on. All of these issues can impact federated learning
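These constraints shape the protocol itself. One common mitigation, sketched here with hypothetical helper names, is to sample only a fraction of nodes each round and simply tolerate dropouts, so slow or offline devices do not stall training:

```python
import random

def select_participants(nodes, fraction=0.1, rng=random.Random(42)):
    """Sample a subset of nodes for this round (hypothetical helper)."""
    k = max(1, int(len(nodes) * fraction))
    return rng.sample(nodes, k)

def run_round(nodes, is_reachable, train_fn):
    """Collect local updates from sampled nodes, silently skipping any
    that drop out. `is_reachable` and `train_fn` stand in for real
    network calls and on-device training."""
    results = []
    for node in select_participants(nodes):
        if not is_reachable(node):     # device off or on a bad network
            continue                   # tolerate the dropout this round
        results.append(train_fn(node))
    return results

# Toy run: 100 nodes, with devices unreachable 30% of the time.
nodes = list(range(100))
flaky = random.Random(7)
updates = run_round(nodes,
                    is_reachable=lambda n: flaky.random() > 0.3,
                    train_fn=lambda n: f"update-from-{n}")
```

The server only ever waits on the sampled subset, so a single unreliable device costs at most one missed contribution rather than a stalled round.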
Application of federated learning
Federated learning is intended to be used as a machine learning method for mobile devices. As the use of federated learning spreads to other parts of the tech world, it is likely to find its legs.
It is best suited to situations where users both generate and label their own data. A concrete example is predictive text: as users type queries into a search engine, their mobile device learns to better predict which word they will enter next. Developers and researchers have only just begun to scratch the surface of the user benefits federated learning can offer. As mobile devices become even more integrated into everyday life, its uses will increase as well.
Although federated learning was created with privacy in mind, there are situations where it could compromise user privacy. The leading scenario involves an attacker analyzing a user's updates to determine which ones carry the most weight, allowing the attacker to make certain inferences about the underlying data — including its value.
To remedy this, researchers have proposed applying differential privacy. Users are sampled more randomly, and noise is added to each update before it is sent to the central server, obscuring individual contributions from attackers. Experiments have shown this method to be about as accurate as previous forms of federated learning, but the downsides are an increase in computational cost and a slower convergence rate.
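A minimal sketch of that idea, assuming NumPy arrays as updates — the clipping bound and noise scale below are illustrative placeholders, not tuned privacy parameters:

```python
import numpy as np

def privatize_update(update, clip_norm=1.0, noise_std=0.5,
                     rng=np.random.default_rng()):
    """Clip the update's L2 norm, then add Gaussian noise, so any
    single node's contribution to the server-side average is both
    bounded and partially masked."""
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip_norm / max(norm, 1e-12))
    return clipped + rng.normal(scale=noise_std, size=update.shape)

# The server still averages many privatized updates, so the zero-mean
# noise largely cancels out while each individual update stays obscured.
rng = np.random.default_rng(0)
raw = rng.normal(size=(1000, 4))                  # 1000 node updates
noisy = np.array([privatize_update(u, rng=rng) for u in raw])
```

The averaging step is what keeps accuracy usable: noise that swamps any one update shrinks roughly with the square root of the number of participants, which is also why convergence slows when fewer nodes report per round.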
Federated learning is a new approach to machine learning for mobile devices that offers some distinct benefits over distributed machine learning. User privacy is protected by not having to upload massive amounts of personal data to a central server, and cost is brought down because devices do not have to be in a central data center location.
It will be interesting to see how federated learning evolves over time to account for both information security attacks and changes in mobile device technology.