In the marketplace for artificial intelligence technology, giant companies like Google, Amazon, and Microsoft offer a powerful, centralized approach: They sell access to platforms for machine learning that hoover up vast amounts of users’ personal and proprietary information and use that data to train AI models.
A new development called federated learning offers an alternative to the centralized model. It promises to distribute the power of machine learning to mobile phones, IoT devices, and other equipment on the network edge. The payoff: Better performance and enhanced data security.
By distributing AI training to the edge, “you speed up the training process significantly, and you get better accuracy,” says Marcin Rojek, co‑founder at byteLAKE, a Poland‑based company working on federated learning solutions using Internet of Things (IoT) devices.
Federated learning is a variation of traditional machine learning, in which powerful computers run algorithms that identify patterns in data and apply what they learn to make predictions. The systems are trained by being fed vast quantities of information—studying millions of credit card transactions to learn when a purchase might be fraudulent, for instance. The garbage‑in, garbage‑out principle applies: Higher quality data yields better predictions.
This approach has its drawbacks. Training the models requires companies to move mountains of data to central servers or data centers. Often that means collecting a user's sensitive data: a mobile phone's location history, a hospital's medical information, or a manufacturer's proprietary operating data.
That puts centralized machine learning out of reach for many businesses. “Bringing all the data together is very expensive and cumbersome,” says Karl Freund, consulting lead for deep learning at Moor Insights & Strategy.
Centralized machine learning also requires moving those mountains of data. Even minuscule delays in communicating between a device and the cloud make it impractical for cases where rapid, real‑time responses are required, such as with self‑driving cars.
Wisdom of the crowds
Federated learning aims to solve the problems that centralized machine learning creates. Algorithm training moves to the edge of the network, so the raw data never leaves the device, whether it's a mobile phone or a bank branch's servers. Once a local model "learns" from that data, only its results are uploaded and aggregated with the updates from all the other devices on the network. The improved model is then shared with the entire network.
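The aggregation step is commonly implemented as federated averaging: the server combines the devices' locally trained models, weighting each one by how much data it trained on. A minimal sketch, assuming a toy linear model represented as a NumPy weight vector (all names and data here are illustrative, not a production implementation):

```python
import numpy as np

def local_update(global_weights, local_data, lr=0.1):
    """One round of local training: the device nudges the shared model
    toward its own data; only the updated weights leave the device."""
    X, y = local_data
    grad = X.T @ (X @ global_weights - y) / len(y)  # mean-squared-error gradient
    return global_weights - lr * grad, len(y)

def federated_average(updates):
    """Server-side aggregation: average the device models, weighting
    each by how many local samples it trained on."""
    total = sum(n for _, n in updates)
    return sum(w * (n / total) for w, n in updates)

# Three simulated edge devices, each holding private data it never uploads.
rng = np.random.default_rng(0)
true_weights = np.array([2.0, -1.0])
devices = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    devices.append((X, X @ true_weights))

global_weights = np.zeros(2)
for _ in range(200):  # 200 communication rounds
    updates = [local_update(global_weights, d) for d in devices]
    global_weights = federated_average(updates)
```

After enough rounds, the shared model converges toward the weights that fit all three devices' data, even though the server only ever saw weight vectors, never the data itself.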
Google pioneered federated learning in 2017 as a way to provide enhanced personalization services on mobile phones, and uses the approach today with its Gboard virtual keyboard on Android phones, which predicts your next words while you’re typing. Type the phrase “I’ll call you,” for example, and the system will suggest “tomorrow” or “later” as the next word. Each phone learns about its user’s unique texting habits without giving away any private information. At the same time, the aggregated results improve the keyboard’s predictive accuracy for all users.
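Gboard's production models are neural networks trained with federated averaging; the count-based toy below only illustrates the key privacy property: each phone contributes aggregate statistics, never raw text, yet the combined model improves suggestions for everyone. All names and messages are invented for illustration:

```python
from collections import Counter

def local_bigram_counts(messages):
    """Each phone counts word pairs locally; raw text stays on the device."""
    counts = Counter()
    for msg in messages:
        words = msg.lower().split()
        for a, b in zip(words, words[1:]):
            counts[(a, b)] += 1
    return counts

def aggregate(all_counts):
    """The server sees only summed counts, not anyone's messages."""
    total = Counter()
    for counts in all_counts:
        total.update(counts)
    return total

def suggest(model, prev_word):
    """Suggest the word most often seen after prev_word across all phones."""
    candidates = {b: n for (a, b), n in model.items() if a == prev_word}
    return max(candidates, key=candidates.get) if candidates else None

phones = [
    ["i'll call you tomorrow", "call you later"],
    ["i'll call you later", "see you later"],
]
model = aggregate(local_bigram_counts(msgs) for msgs in phones)
print(suggest(model, "you"))  # prints "later"
```

The suggestion "later" wins because it follows "you" most often across the whole fleet, a pattern no single phone's data would reveal as strongly.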
Other tech vendors are introducing federated learning in their AI offerings.
Owkin, a New York–based startup, is bringing federated learning to the sensitive field of healthcare, building machine‑learning models for medical research.
With its software, Owkin can train AI algorithms locally to look for biomarkers in medical data. The models it produces are valuable to its main customers—pharmaceutical companies conducting biological research—because privacy regulations make it difficult for hospitals and imaging centers to send this data to the cloud for training. And some hospitals don’t produce enough images to yield accurate AI models.
“To reach 99.9% accuracy, you can’t train an algorithm on 100 images,” says Gabriel de Vinzelles, a senior associate at Otium Capital, a French early‑stage venture company and an investor in Owkin.
Owkin’s technology uses federated learning to train on medical data behind the hospital’s firewall. Only the results are aggregated and used to refine the model, which can then assist in the various stages of drug development, from identifying target molecules to recruiting patients for clinical trials.
Federated learning also addresses a common problem with cloud‑based AI: communication delays, or latency, between remote devices and the central machine‑learning system. Reducing latency is critical for AI‑powered IoT devices, such as industrial equipment, where even brief delays between identifying a problem and responding to it can lead to significant damage.
ByteLAKE demonstrated a federated‑learning system aimed at manufacturers at last year's AI Summit in San Francisco. The system used small Lenovo computers attached to a manufacturing device: in the demo, a thick, transparent pipe with fans on each end, filled with Styrofoam balls. The computers monitored pressure data from the device and used that information to train a local AI model. The results were then aggregated in a data center. Finally, the improved model was returned to the device.
Since the intelligence in the device is held locally, manufacturers can use federated learning to bring AI to environments with limited or nonexistent network connections. Reduced latency also means an industrial turbine can shut down in milliseconds when it recognizes an imminent problem, preventing significant damage and saving businesses expensive downtime.
“Processing data in place provides a near real‑time experience for AI,” says Rojek.
Federated learning is still relatively new and limited to a few specific use cases. But experts see broad potential for applications using the technology. For example, insurance companies could use federated learning to collaborate on predictive tools without having to share competitively sensitive data. Freund envisions driverless cars using the technique to make snap decisions using real‑time data.
The promise of federated learning goes far beyond delivering better mobile experiences. One thing’s for sure: As billions of new IoT devices and sensors get connected each year, federated learning will get plenty of opportunities to prove its value.