Featured

AI is driving computing to the edge

When it comes to the buzzwords from the past few years, edge computing is right up there with 5G and AI. But the results of a survey by the Eclipse Foundation show that edge computing is actually gaining ground in the enterprise and is much faster to deploy than either of them.

The Eclipse Foundation surveyed 301 people in the first three months of 2021 and discovered that 38% of enterprises were already implementing some form of edge computing strategy, while 44% were planning to implement edge computing within the subsequent 24 months. Of those planning deployments, 44% expect them to take less than 12 months.

Companies are adopting edge computing strategies because sending their ever-growing piles of data to the cloud, and keeping it there, has become too expensive. Moreover, the time it takes to move data to the cloud, analyze it, and send an insight back to the originating device is too long for many jobs.

For example, if a sensor on a factory machine detects an anomaly, the machine's operator wants to know right away so she can stop the machine (or have a controller stop it). A round trip to the cloud and back takes too long. That is why many of the top edge workloads seen in the slide above involve machine learning or analysis at the edge.
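
To make that round-trip argument concrete, here is a minimal, hypothetical sketch of what the local decision might look like. The sensor and controller calls (read_vibration, stop_machine) and the threshold are stand-ins rather than any particular vendor's API; the point is simply that the stop decision never has to leave the device.

```python
# Hypothetical edge-side anomaly handling. In a cloud-only design, each
# reading would travel to the cloud and back before a stop command fires;
# here the decision is made locally, so the round trip disappears.
import random
import time

VIBRATION_LIMIT_MM_S = 8.0  # invented threshold for this example


def read_vibration() -> float:
    """Stand-in for a real sensor driver; returns vibration in mm/s."""
    return random.uniform(0.0, 10.0)


def stop_machine() -> None:
    """Stand-in for a call to the machine's controller (e.g. a PLC)."""
    print("STOP signal sent to controller")


def monitor(poll_seconds: float = 0.5) -> None:
    while True:
        reading = read_vibration()
        if reading > VIBRATION_LIMIT_MM_S:
            stop_machine()  # act immediately, on the device
            break
        # Non-urgent readings can still be batched to the cloud later
        # for trend analysis, which is where cloud analytics earns its keep.
        time.sleep(poll_seconds)


if __name__ == "__main__":
    monitor()
```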

Control logic for factories and sensor fusion need to happen quickly to be valuable, while data analytics and video processing generate so much data that shipping it to the cloud and working on it there can get expensive. Latency matters in both of those use cases as well.

But a couple of other workloads on the slide indicate where the next big challenge in computing will come from: both involve data exchanges between multiple nodes. Indeed, we are moving from an era in which computing happens on a single cloud or client device to one in which it is distributed across multiple clouds and client devices.

This model of computing poses a new challenge: system architects will have to adapt to myriad edge computing environments and to the security and latency requirements of the apps running in each of them. Configuration and management thus become a sticking point.

Instead of choosing a server or a phone for computation, the brave new world of distributed edge computing might offer an application the chance to run on light switches, industrial programmable logic controllers, edge gateways on the factory floor, or a handheld tablet. This will mean multiple processor types, multiple levels of available memory, and different levels of network access.
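
As a rough illustration of what that heterogeneity means for whoever has to place workloads, here is a hypothetical sketch of the core matching decision. The Device and Workload fields and the fleet below are invented for the example; real orchestrators juggle far more constraints, but the shape of the problem is similar.

```python
# Hypothetical placement decision: given edge devices with different
# processors, memory, and network access, pick one that satisfies a
# workload's requirements.
from dataclasses import dataclass
from typing import Optional


@dataclass
class Device:
    name: str
    arch: str                    # e.g. "arm32", "arm64", "x86_64"
    memory_mb: int
    latency_to_sensor_ms: float


@dataclass
class Workload:
    name: str
    arch: str
    min_memory_mb: int
    max_latency_ms: float


def place(workload: Workload, fleet: list[Device]) -> Optional[Device]:
    candidates = [
        d for d in fleet
        if d.arch == workload.arch
        and d.memory_mb >= workload.min_memory_mb
        and d.latency_to_sensor_ms <= workload.max_latency_ms
    ]
    # Prefer the smallest device that fits, leaving bigger nodes for bigger jobs.
    return min(candidates, key=lambda d: d.memory_mb, default=None)


fleet = [
    Device("light-switch", "arm32", 64, 2.0),
    Device("plc-gateway", "arm64", 2048, 5.0),
    Device("floor-tablet", "arm64", 4096, 20.0),
]

anomaly_detector = Workload("anomaly-detector", "arm64", 512, 10.0)
print(place(anomaly_detector, fleet))  # lands on the plc-gateway
```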

Moving data and computing around in this way is new, and it's going to require new architectures, new databases, and new ways of handling security and device management. As the survey results make clear, edge computing is already being deployed, and deployment itself isn't the hard part. What is challenging is figuring out how to build new applications that take advantage of the computers around us.

This point was driven home especially well in an interview from Deloitte about these new architectures. In it, David Linthicum, a managing director at the firm, addresses the difficult questions ahead for CISOs, CTOs, and others trying to figure out how best to use their new edge computing systems:

The challenge that we see organizations running into is architectural. Essentially, they need help with dividing the architecture into tiers. As technologists, we hear questions such as: Where do you keep the data? Where do you keep the knowledge? What's the most efficient way to do it? Moving forward, whether it's for robots or edge-based private cloud services, organizations will need to create a configuration management system that can dynamically manage information flow across architecture tiers.

The biggest fear I hear about when I talk to executives moving their organizations to edge computing is not that edge computing can't be done but whether it's able to operate the infrastructure or tiers in a way that allows for dynamic changes.
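
As a hypothetical illustration of the tiering question Linthicum raises, the sketch below hard-codes a tiny policy for deciding which tier holds a piece of data, based on how quickly it must be acted on and how long it must be kept. The thresholds and names are invented; his point is that a real configuration management system would make and update this kind of decision dynamically rather than bake it in.

```python
# Hypothetical data-tiering policy across device, edge gateway, and cloud.
from enum import Enum


class Tier(Enum):
    DEVICE = "device"          # acted on in milliseconds
    EDGE_GATEWAY = "gateway"   # aggregated locally, seconds of latency OK
    CLOUD = "cloud"            # long-term storage and heavy analytics


def choose_tier(required_response_ms: float, retention_days: int) -> Tier:
    if required_response_ms < 50:
        return Tier.DEVICE
    if retention_days <= 7:
        return Tier.EDGE_GATEWAY
    return Tier.CLOUD


print(choose_tier(required_response_ms=10, retention_days=1))     # Tier.DEVICE
print(choose_tier(required_response_ms=500, retention_days=2))    # Tier.EDGE_GATEWAY
print(choose_tier(required_response_ms=500, retention_days=365))  # Tier.CLOUD
```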

So building out an edge infrastructure may only take a year, but figuring out how to best use that architecture will take longer, and will likely evolve over time.

Stacey Higginbotham
