An interesting paradox has emerged during the past few years. Ever more computing infrastructure is moving into the cloud, which puts the computing smarts farther away from users. At the same time, an increasing portion of the services and applications that consumers and businesses use requires instantaneous reactions that far-away clouds cannot deliver.
That can be a big problem. Autonomous vehicles, for instance, have to react to road conditions in real time, and the cloud is not built for that level of interaction. The time it takes for signals to travel from the vehicle to the control equipment and back is almost certainly too long. Beyond the sheer distance the signals must traverse, time is lost to network congestion and other impediments. And the longer the path, the greater the chance that something goes wrong along the way. That is not a good thing when an AV needs to know whether to brake.
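Some back-of-the-envelope arithmetic shows why the distance matters. The sketch below compares round-trip times to a distant cloud region and a nearby edge site, then translates the delay into meters traveled by a car at highway speed. The distances, hop counts, and per-hop delays are illustrative assumptions, not measurements:

```python
# Rough latency comparison: distant cloud vs. nearby edge.
# All figures are illustrative assumptions, not measurements.

SPEED_IN_FIBER_KM_S = 200_000  # light in optical fiber, roughly 2/3 of c

def round_trip_ms(distance_km: float, hops: int, per_hop_ms: float = 0.5) -> float:
    """Propagation delay there and back, plus per-hop queuing/switching time."""
    propagation_ms = 2 * distance_km / SPEED_IN_FIBER_KM_S * 1000
    return propagation_ms + hops * per_hop_ms

cloud_rtt = round_trip_ms(distance_km=1_500, hops=20)  # far-away cloud region
edge_rtt = round_trip_ms(distance_km=15, hops=3)       # nearby edge site

vehicle_m_s = 100 / 3.6  # a vehicle traveling 100 km/h
print(f"cloud: {cloud_rtt:.1f} ms -> car travels {vehicle_m_s * cloud_rtt / 1000:.2f} m")
print(f"edge:  {edge_rtt:.1f} ms -> car travels {vehicle_m_s * edge_rtt / 1000:.2f} m")
```

Under these assumptions the cloud round trip costs tens of milliseconds, during which the car covers the better part of a meter, while the edge round trip costs a millisecond or two. And that is before congestion, retransmissions, or outages are factored in.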
The response to this challenge is to create a new layer of computing at the edge of the network, far closer to where the processing is needed. Edge computing also cuts down on the flood of traffic that will arrive as the Internet of Things (IoT) and 5G roll out and general broadband use continues to rise.
The edge and the cloud will be complementary as the industry evolves.
“IoT, and more specifically Industrial IoT (IIoT), promises to enable us to respond faster with better business outcomes provided we can effectively process the information,” wrote Richard Soley, the executive director of the Industrial Internet Consortium (IIC), in response to questions emailed by IT Business Edge. “To make that happen requires quite a lot of processing horsepower, likely distributed among potentially a large number of processors. As the number of processors proliferates, processing power swamps often undersupplied communications capability. Therefore, it just makes sense to do as much data reduction and analytics as is possible ‘at the edge,’ that is, where the data is collected.”
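Soley's point about data reduction can be made concrete with a small sketch. The hypothetical gateway below collapses a window of raw sensor readings into a compact summary before anything goes upstream; the function names and thresholds are my own illustration, not IIC code:

```python
from statistics import mean

def summarize_window(readings: list[float], alarm_threshold: float) -> dict:
    """Reduce a window of raw sensor samples to a compact summary,
    so only the summary (not every sample) is sent to the cloud."""
    peak = max(readings)
    return {
        "count": len(readings),
        "mean": mean(readings),
        "max": peak,
        "alarm": peak > alarm_threshold,  # only anomalies demand urgency
    }

# A thousand raw vibration samples collapse into one small record.
window = [0.02 * (i % 50) for i in range(1000)]
print(summarize_window(window, alarm_threshold=0.9))
```

The communications savings is the point: a thousand samples become one record, and the constrained link between the factory floor and the cloud carries analytics results rather than raw data.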
To an outsider, it would seem difficult to make this decentralized mix of cloud and edge computing work together effectively and efficiently. Handling huge loads of data in one place is hard enough; divvying them up for processing in such a distributed fashion sounds harder still.
That, however, isn’t the case, according to Philip DesAutels, senior director of IoT for The Linux Foundation and executive director of EdgeX Foundry. He argues that network needs and complexity actually are reduced by cloud/edge cooperation. “I am taking the mass of data points created at the edge and processing it into actionable information that I then share with the cloud,” he wrote. “In this way the edge becomes, to use Cisco’s phrase, a fog – that is a soft, physical embodiment of the cloud. Also, it makes development and support easier. The cloud model can be mirrored at the edge. We have lots of experience with the cloud model now and are successful with it.”
The distinction between the edge and the cloud will eventually fade. The result will be a pervasive blanket of computing with tasks performed where it makes sense to do so. The deciding factors will include the timeliness required, the level of criticality, the intensiveness of the computing task, and the willingness of users to pay.
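One way to picture those deciding factors is as inputs to a placement function that routes each task to the tier where it makes sense to run. The factors come from the paragraph above; the weights and rules are purely illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class Task:
    max_latency_ms: float   # the timeliness required
    safety_critical: bool   # the level of criticality
    compute_heavy: bool     # the intensiveness of the computing task
    budget_sensitive: bool  # the willingness of users to pay

def place(task: Task) -> str:
    """Crude edge-vs-cloud placement based on the four factors above."""
    if task.safety_critical or task.max_latency_ms < 20:
        return "edge"   # tight deadlines and safety stay local
    if task.compute_heavy:
        return "cloud"  # heavy batch work goes to cheap, elastic compute
    return "edge" if task.budget_sensitive else "cloud"  # backhaul costs money too

print(place(Task(max_latency_ms=5, safety_critical=True,
                 compute_heavy=False, budget_sensitive=False)))   # -> edge
print(place(Task(max_latency_ms=500, safety_critical=False,
                 compute_heavy=True, budget_sensitive=False)))    # -> cloud
```

In a real blanket of computing, that decision would be made dynamically and continuously rather than by a handful of if-statements, but the inputs are the same.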
The two layers will merge. “If it is data intensive, time sensitive, or highly automatable, it’s inevitable it will live at the edge,” wrote DesAutels. “The edge will soon stop being separate and distinct from the cloud … but instead will be more like a piece of the cloud. I have compute that runs remotely, most likely on someone else’s infrastructure, that I scale up and down to meet the demands of business and I have infrastructure in my business that uses the same models (albeit scaled down a bit) that looks and acts like cloud infrastructure.”
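DesAutels's "same models, scaled down a bit" idea can be sketched as one service definition deployed at two scales. The deployment structure, image name, and numbers below are my own hypothetical illustration:

```python
from dataclasses import dataclass

@dataclass
class Deployment:
    """One service description used at both tiers; only the scale differs."""
    image: str
    replicas: int
    cpu_cores: float

SERVICE = "telemetry-analytics:1.0"  # hypothetical container image

cloud = Deployment(image=SERVICE, replicas=40, cpu_cores=8)  # elastic core
edge = Deployment(image=SERVICE, replicas=2, cpu_cores=2)    # scaled-down twin

for tier, d in (("cloud", cloud), ("edge", edge)):
    print(f"{tier}: {d.replicas} x {d.cpu_cores} cores of {d.image}")
```

The appeal of this model is operational: the tooling, packaging, and scaling habits the industry has built for the cloud carry over to the edge unchanged.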
One of the biggest projects of the next few years may be creating this pervasive infrastructure, which must accommodate Big Data, artificial intelligence and other emerging data-hungry technologies.
“The need to intricately choreograph the interactions between the edge and the core/cloud/whatever (and it’s not a simple two-tier architecture, it’s actually a complex, interwoven technology stack) is what makes edge computing a tough problem to solve,” wrote Charles Araujo, a principal analyst for Intellyx LLC.
In other words, the age of blanket computing is coming. “It’s not as simple as just doing some processing here and other processing there,” Araujo wrote. “While there will be a lot of transient data, there is an expectation of data continuity, so it will be like an elaborate dance that seamlessly moves data through a complex system and dynamically transforming it and combining it with other data to derive insights and achieve the intended business results.”
Carl Weinschenk covers telecom for IT Business Edge. He writes about wireless technology, disaster recovery/business continuity, cellular services, the Internet of Things, machine-to-machine communications and other emerging technologies and platforms. He also covers net neutrality and related regulatory issues. Weinschenk has written about the phone companies, cable operators and related companies for decades and is senior editor of Broadband Technology Report. He can be reached at cweinsch@optonline.net and on Twitter at @DailyMusicBrk.