This week, Cloudflare introduced its Workers platform to the world as a new form of edge computing. The news is worth taking a closer look at given all the intense focus on edge computing today. For example, the telcos are all pushing forward with their version of edge computing, contained on servers at the edge of their cellular networks.
And not a week goes by without some startup claiming it has a new edge computing platform or tool. Part of the ubiquity of the phrase “edge computing” comes from the fact that every player in the IoT space thinks of the edge in a different way.
Sensor companies think of the edge as tiny, battery-powered devices that gather data, while industrial manufacturers consider it a computer on a machine that gathers data from multiple sensors. Intel and Dell think of the edge as a gateway, or as servers on a factory floor. The telcos, meanwhile, along with content delivery and internet security provider Cloudflare, view the edge as the limits of their own networks.
For Matthew Prince, CEO of Cloudflare, the edge touted by industrialists and sensor folks will eventually disappear. “Any on-premise devices are going away,” he says. Instead, he sees a future where there is device-side computing, back-end computing in the cloud, and what he calls the “third place” of computing, which happens in between those two.
The benefits of such an architecture are that a company can take advantage of computing power that’s geographically closer to the device, and can build cheaper edge devices because they have no need for big CPUs. As an added bonus, because those devices connect through Cloudflare’s network, they aren’t directly exposed to the public internet and so have some security protection. The downside to this architecture is that when the internet connection fails, so do all the programs you have running in the cloud. Basically, one trades putting expensive compute chips in an edge device for putting in dual forms of connectivity.
I’m not sure all on-premise devices will go away, especially not in the next five to 10 years, but I do think the idea of having a third place for computing makes sense. Some of the examples Prince offered by way of customer stories really resonate. For example, a company building an edge device designed to take in constant data, such as a thermometer, could send the data to a Cloudflare Worker program that aggregates it and then sends a sample to the cloud for storage or for processing later on. But if the temperature data spikes, the Worker program can take action and send an alert to the end user.
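To make the thermometer scenario concrete, here is a minimal sketch of the kind of down-sampling and spike-detection logic such a Worker might run. The function name, threshold, and sampling rate are all illustrative assumptions, not part of Cloudflare’s API; a real Worker would wrap this in a request handler and actually forward data or fire the alert.

```typescript
// Illustrative values, not from Cloudflare or any real deployment.
const SPIKE_THRESHOLD_C = 40; // alert on readings above this temperature
const SAMPLE_EVERY = 10;      // forward one in every ten readings to the cloud

type Action = { forward: boolean; alert: boolean };

let readingCount = 0;

// Decide, at the edge, what to do with a single temperature reading:
// most readings are dropped, a periodic sample is forwarded for storage,
// and a spike triggers an immediate alert without a round trip to the cloud.
function handleReading(tempC: number): Action {
  readingCount += 1;
  return {
    forward: readingCount % SAMPLE_EVERY === 0,
    alert: tempC > SPIKE_THRESHOLD_C,
  };
}
```

The design point is that the aggregation state lives in the “third place” rather than on the sensor, which is why the sensor itself can stay cheap.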
And ideally, that alert would take less time to reach the end user and would be more resilient than a function hosted on the cloud that’s dependent on a single data center location. Another advantage of this approach is that it makes managing the equipment a bit easier. In the temperature sensing example, for instance, the end user just has to buy the sensors tied to the Cloudflare Worker program and put them in his or her location.
As those sensors age, they can be updated remotely and even replaced without having to futz with a gateway box. One of the more challenging aspects of deploying IoT offerings is that provisioning connected devices can be a nightmare of typing in passwords or snapping pictures of QR codes. In this case, devices can arrive pre-provisioned.
What I’d like to see is a robust discussion of the merits of each approach and a clear understanding of their related trade-offs. There’s obviously an opportunity for this version of edge computing with some connected devices, especially those that need to be cheap and easily deployed.