Edge computing is hot right now, but not everyone understands why so many people are focused on keeping their data in gateways or on on-premises servers instead of sending it to the cloud. While it may seem like a huge shift to bring processing to the edge of the network rather than sending all of the data to the cloud, for many IoT use cases the cloud was never going to be a viable solution.
There are five primary reasons why the edge is winning when it comes to the internet of things. Three of them are technical limitations on cloud data transfers, and two stem from business culture and the perception of cloud security. Let's cover them.
1. Security – This is one of the favored reasons among big industrial companies. They don't want to connect their processes to the internet because doing so exposes their operations to hackers and data breaches. For example, at the Honeywell User Group meeting I attended last year, most of the customers of Honeywell's industrial automation products were loath to even put wireless infrastructure in their plants for fear of security breaches. Some of this is a matter of perceived risk, but thanks to a variety of hacks, from Target's breach (which began in its HVAC system and ended up compromising customers' credit cards) to growing concerns over attacks on infrastructure, this is a legitimate fear for certain types of industrial processes.
2. IP – Related to security are concerns over proprietary data and intellectual property. High-quality sensor data can be used to infer important information, such as the details of a refinery process that count as trade secrets. Jaganath Rao, SVP of IoT Strategy at Siemens, says that food companies are particularly sensitive to these issues. Imagine if the recipe for Coke could be inferred from its industrial data, for example.
3. Latency and resiliency – Latency is the time it takes information to travel across a network. Whether you are waiting for a Netflix movie to load or playing "Call of Duty," latency matters. And when you translate digital bits into electrons or machinery, it matters even more. In the home, for example, cloud-to-cloud services can add a second or two of delay when I'm turning on my lights from an app. That's irritating. But in an industrial process, sending data from a machine to the cloud and back again can cost a lot of money, or even lives.
One of the more popular arguments for edge computing is autonomous cars. The idea is that a car going 60 miles an hour needs to be able to identify a threat and stop the car instantly, not wait a few seconds to make a round trip to the cloud. In the industrial world, a machine that is in danger of failing might only have a few seconds or a minute of warning. A sensor might pick up the new vibration signature that signals a failure and then send that to a local gateway for processing. The gateway needs to have the ability to recognize the failure and either alert someone or send back instructions to shut off the machine within milliseconds or seconds.
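The gateway logic described above can be sketched in a few lines. This is a minimal illustration, not any vendor's actual API: the machine name, the vibration threshold, and the shutdown message are all assumptions made for the example.

```python
# Hypothetical edge-gateway check. The threshold and the shutdown
# mechanism are illustrative assumptions, not a real vendor interface.
VIBRATION_LIMIT_MM_S = 8.0  # assumed failure threshold (RMS velocity)

def on_sensor_reading(machine_id: str, vibration_mm_s: float) -> str:
    """Decide locally on the gateway, with no cloud round trip."""
    if vibration_mm_s > VIBRATION_LIMIT_MM_S:
        # In a real deployment this would drive a relay or PLC command.
        return f"SHUTDOWN {machine_id}"
    return "OK"

print(on_sensor_reading("pump-7", 11.2))  # exceeds limit, so: SHUTDOWN pump-7
```

The point is that the decision path is entirely local: the only thing that needs to leave the gateway is the alert itself, after the machine is already safe.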
This also ties into resiliency. Network coverage can falter and the internet can go down. When that happens, cars, heavy industrial machinery, and manufacturing operations still need to work. Edge computing enables them to do that.
4. Bandwidth costs – Some connected sensors, such as cameras or aggregated sensors working in an engine, produce a lot of data. As in multiple gigabytes of data every hour or, in some cases, every minute. In those cases, sending all of that information to the cloud would take a long time and be prohibitively expensive. That’s why local image processing or using local analytics to detect patterns makes so much sense. Instead of sending terabytes of raw image data from a connected streetlight, a local gateway can process that data and then send the relevant information.
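To make the bandwidth math concrete, here is a toy sketch of edge filtering. The frame size and motion flags are invented for illustration; the idea is simply that a batch of heavy raw frames collapses into one small event record before anything crosses the network.

```python
# Illustrative edge-filtering sketch: the frame size and motion flags
# are assumptions, not measurements from a real camera.
RAW_FRAME_BYTES = 2_000_000  # assume roughly 2 MB per camera frame

def summarize(motion_flags: list) -> dict:
    """Reduce a batch of frames to a single small event record."""
    return {"events": sum(motion_flags), "frames": len(motion_flags)}

batch = [0, 0, 1, 0, 1]                   # motion detected in frames 3 and 5
raw_bytes = len(batch) * RAW_FRAME_BYTES  # what uploading raw frames would cost
summary = summarize(batch)                # what the edge path actually sends
print(raw_bytes, summary)                 # 10000000 {'events': 2, 'frames': 5}
```

Ten megabytes of raw imagery becomes a record of a few dozen bytes; at cellular data rates, that difference is the whole business case.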
5. Autonomy – The problems of latency and resiliency bring us to the final reason the edge will flourish in the internet of things: autonomous decision-making can't rely on the cloud. For many, the promise of connected plants or offices is that a large number of processes can become automated. If a machine can monitor itself and the process it's performing, then it can eventually be programmed to take the right action when problems occur. For example, if a sensor detects a pressure buildup, it can release a valve farther down the line to relieve that pressure. But once a process relies on a given level of automation, it's imperative that the automation act in time, every time.
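The pressure-and-valve example above can be written as a minimal closed loop. The pressure limit and the valve interface are assumptions for the sketch, not a real control system; the detail that matters is that nothing in the loop depends on a network connection.

```python
# Minimal closed-loop sketch of the pressure example. The limit and the
# valve interface are hypothetical, not a real control system's API.
PRESSURE_LIMIT_KPA = 500.0  # assumed safe operating limit

class ReliefValve:
    def __init__(self):
        self.open = False
    def release(self):
        self.open = True

def control_step(pressure_kpa: float, valve: ReliefValve) -> None:
    # Decided locally, so it keeps working when the internet does not.
    if pressure_kpa > PRESSURE_LIMIT_KPA:
        valve.release()

valve = ReliefValve()
control_step(520.0, valve)
print(valve.open)  # True: valve opened to relieve the buildup
```

In practice this loop would run on a PLC or gateway every few milliseconds; the cloud's role, if any, is to receive a log entry afterward, not to sit in the decision path.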
Most of these are fairly common sense, but what many in the traditional IT world miss is that when you start moving real-world machinery around instead of just bits, it’s no longer good enough to provide 99.99% reliability or millisecond latency. When challenges in the digital world meet the physical world they are magnified; real people’s lives or production processes are on the line, with real-world consequences.
That's not to say the cloud won't pick up more IoT work over time, but right now, it's a pretty scary proposition for a lot of IoT use cases.