Existing on-premises and centralized cloud infrastructure cannot support many of the computing requirements of these powerful applications, which demand low latency (minimal delay in data transfer) for smooth, real-time data access. To reduce latency, bandwidth usage, and the costs they incur, computing power and processing need to sit closer to the physical location of the data. The solution? Shift computing from remote data centers to local infrastructure at the “edge” of the network.
A whopping 90% of industrial enterprises will use edge computing by 2022, according to Frost & Sullivan, while a recent IDC report (registration required) found that 40% of all organizations will invest in edge computing in the coming year. “Edge computing is needed to accelerate the next-generation revolution in the industry,” said Bike Xie, vice president of engineering at AI technology vendor Kneron. The future of AI and other automation technologies depends on the decentralized edge, he explained, whether through Internet of Things devices connected to distributed network nodes or through embedded AI-powered chips that can run algorithmic models independently.
“Edge computing is complementary to the cloud,” Xie said. “Like the cloud, edge technology allows applications to both acquire and apply data-driven insights that will empower smart factories and smart products.”
Manufacturing has moved to the edge
The move toward edge computing is the result of a sea change in manufacturing over the past two decades. Manufacturers, whether they make industrial products, electronics, or consumer goods, have slowly but steadily added automated and self-monitoring systems and processes to improve manufacturing efficiency, product quality, equipment maintenance, and the optimization of every link in the supply chain.
As manufacturers have deployed more sensor-based devices, they are also producing more data than ever before. But the flow of data from sensor-based devices to centralized systems can quickly become unwieldy, slowing down automation and rendering real-time applications unusable.
Edge computing allows manufacturers to be selective about where data is processed, eliminating downtime and reducing bandwidth usage, and about which data can be discarded after it is processed, according to Xie. “Manufacturers can process data quickly on the edge if carrying data to the cloud is a bottleneck, or transfer specific data to the cloud if latency and bandwidth are not issues.” Not only does processing data close to where it is used save bandwidth and reduce costs, he added, but the data is also more secure because it is processed immediately.
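Xie’s rule of thumb (process at the edge when the cloud link is a bottleneck, otherwise ship data to the cloud) can be sketched as a simple routing decision. The function and its parameters below are invented for illustration, not taken from any real Kneron or manufacturing system:

```python
# Hypothetical sketch of edge-vs-cloud routing for sensor data.
# All names and flags are illustrative assumptions, not a real API.

def route_reading(reading: dict, latency_sensitive: bool,
                  cloud_link_congested: bool) -> str:
    """Decide where a sensor reading should be processed.

    Keep processing at the edge when the application is latency-sensitive
    or the uplink to the cloud is a bottleneck; otherwise forward it to
    the cloud, where long-term storage and heavy analytics live.
    """
    if latency_sensitive or cloud_link_congested:
        return "edge"
    return "cloud"

# A latency-sensitive vibration reading stays at the edge:
print(route_reading({"sensor": "vibration", "value": 0.42},
                    latency_sensitive=True,
                    cloud_link_congested=False))  # edge
```

In practice the decision would also weigh which readings can be discarded after local processing, so only aggregates or anomalies consume uplink bandwidth.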
IDC predicts that by 2023, more than 50% of new enterprise IT infrastructure will be deployed at the edge rather than in corporate data centers, up from less than 10% in 2020.
An example of shifting from cloud to edge comes from Paul Savill, senior vice president for product and service management at Lumen, a technology company that offers an edge computing platform. Lumen recently deployed its platform in a newly built, million-square-foot factory. There, robotic systems from about 50 different manufacturers rely on edge computing “because it needs to be within 5 milliseconds of latency to accurately control the robots,” Savill said. The deployment provides guaranteed connectivity from the factory’s internal applications to the robot manufacturers’ data centers, “where they collect information in real time.”
But long-term data storage, along with machine learning and analytics applications, is all handled in the public cloud, according to Savill. Those heavier workloads are processed in large data centers that “have a lot of computational power” and can easily process large volumes of data.
“That chain from the public cloud to edge computing to on-premises computing is very important,” Savill said. “It gives customers the ability to take advantage of the latest advanced technologies in a way that saves them money and drives extreme efficiency.”