- The AI boom is breathing life into edge computing, which moves data processing away from the cloud.
- The edge-computing boom could reduce costs and the environmental impact of powering AI.
- This article is part of "5G and Connectivity Playbook," a series exploring some of the most important tech innovations of our time.
Artificial intelligence is driving us into the era of edge computing — two words you should expect to hear more in the coming months and years.
Tech giants have poured billions of dollars into the cloud and spent years trying to get customers to move data onto remote servers. Now they're expanding to edge computing, which refers to moving more of the computation closer to the user (the "edge" of the network).
Whether it's your smartphone, an autonomous car, or a security device in your home, edge computing means more of the heavy lifting happens on or near the device. That change could lead to lower latency, lower energy costs, and improved privacy and security as less sensitive information gets beamed to some far-flung server.
The concept behind edge computing is nothing new, but the AI gold rush and improvements to 5G make this a prime moment for the space to take off. 5G lets devices on the edge communicate with one another and with the cloud. While 5G had a slow and messy start, recent advancements could prove a shot in the arm for edge computing, which will need seamless data connections to work.
Amazon has eyed edge computing as a billion-dollar business, Business Insider previously reported. The buzz around AI, and what it could mean for edge computing, was also a hot topic at Mobile World Congress, held in Barcelona last month.
Jim Poole, the vice president of global business development at Equinix, said at an MWC panel that the move toward edge computing is accelerating in part because of the AI boom, which requires much more data to be processed.
"Data gravity is a real thing," Poole said. "At some point, it becomes financially and physically impossible to transmit that data all the way back to someplace else."
The latency benefits of edge computing are crucial for technologies such as driverless cars, which need to make split-second decisions. That's why autonomous vehicles carry powerful computers on the vehicle itself. The same goes for medical equipment or devices used in dangerous types of manufacturing, where more computing needs to be done on the fly.
AI accelerates edge computing
The industry has already seen some of the benefits of edge computing in smartphones, with a push to create better chips and software that pack more AI horsepower onto devices. This will also require AI companies to launch smaller language models that can run on less powerful hardware.
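For a sense of what that shift looks like in practice, here is a minimal sketch of on-device inference with a small, quantized language model, using the open-source llama-cpp-python bindings. The model file, prompt, and settings are illustrative assumptions, not anything the companies above have described.

```python
from llama_cpp import Llama

# Load a small, quantized model file stored on the device itself.
# "models/small-model.Q4_K_M.gguf" is a placeholder path, not a real artifact.
llm = Llama(
    model_path="models/small-model.Q4_K_M.gguf",
    n_ctx=2048,   # modest context window to fit in on-device memory
    n_threads=4,  # run on the device's own CPU cores; no cloud round trip
)

# Generate text entirely on-device, so the prompt never leaves the hardware.
result = llm("Summarize today's sensor readings in one sentence:", max_tokens=64)
print(result["choices"][0]["text"])
```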
The Taiwanese chip company MediaTek had one of the most impressive demos at MWC this year: a smartphone-like device running a generative image maker powered by a Stable Diffusion AI model, which created and edited pictures in real time.
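MediaTek's demo ran on its own chip-tuned software stack, but for a rough idea of what local image generation involves, here is a hedged sketch using the open-source diffusers library; the checkpoint name and hardware settings are assumptions for illustration, not MediaTek's setup.

```python
import torch
from diffusers import StableDiffusionPipeline

# Download the weights once, then generate images without calling a cloud API.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # a public Stable Diffusion checkpoint
    torch_dtype=torch.float16,         # half precision to fit smaller accelerators
)
pipe = pipe.to("cuda")  # assumes a local GPU; use "mps" on Apple silicon

# The prompt and the generated image both stay on the local machine.
image = pipe("a watercolor sketch of a mountain village at dusk").images[0]
image.save("village.png")
```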
Lenovo, which was also at the show, is making a big enterprise play by selling its "edge AI" servers to businesses and plans to make similar moves in consumer edge computing, Tom Butler, the executive director of Lenovo's laptop line, told BI.
"If you think of bringing in generative workloads to device, first of all, I solve for time, security, and privacy, because I'm not pushing up to cloud and back down," he said.
Edge computing could save energy
Shifting AI closer to the user could have cost benefits for tech companies like OpenAI, Google, and Amazon, which run AI models on their servers at great expense.
Edge computing could also have environmental benefits. The data centers that power AI in the cloud use an enormous amount of water and energy.
Jillian Kaplan, the head of global 5G at Dell, said during an MWC panel that edge computing will be a "huge energy saver."
" I think the sustainability topic has come and gone throughout the years," Kaplan said.
"I don't think it's going to fluctuate again," she added. "I think, where we are, it has to stay top of mind, and these edge and AI capabilities are going to help us keep our equipment extremely energy efficient, which we have to do with the massive amounts of data coming in."