Analysis

Project Alvarium aims to bring trust to the IoT

This week, a group of companies including Arm, Dell, OSIsoft, and IBM, along with the IOTA Foundation, launched Project Alvarium at The Linux Foundation to help build trust in the authenticity of data from connected devices.

While it’s still in the formation phase, Project Alvarium aims to create a framework for sharing information about individual pieces of data so companies can determine whether that data is trustworthy. The current framework involves several layers of trustworthiness that culminate in each piece of data receiving a trustworthiness score. I’ll dig into how the framework determines the score in a bit, but first let’s talk about why this project is so important.

Dell’s version of how to calculate a data confidence score.

You can’t talk to anyone about the IoT without hearing the phrase “data is the new oil.” The idea is that a company’s data is the next big commodity that it will use to improve operations, but also potentially share or sell to other companies or organizations to help improve their operations. Or as Jason Shepherd, global CTO of IoT and edge computing at Dell Technologies, puts it, “This isn’t really an IoT story. It’s a data story.”

For example, automakers have tons of sensors inside their cars that can track performance, location, and more. Perhaps they could put sensors into those cars’ tires to infer road conditions. That same data might also be of use to mechanics doing work on the car later on. Or maybe a government would want that data to help understand how the cars influence local air quality or traffic patterns. Right now, many of these use cases are theoretical because we don’t have the technical or legal infrastructure to share this data securely.

Instead, we have one-off contracts between organizations to share specific data. Which is fine, up to a point, but it doesn’t allow us to really embrace the promise of the IoT, where sensors can be harnessed by multiple organizations to glean relevant information — even somewhat trivial information that might boost the performance of an algorithm. To build that infrastructure, we have to start with a scalable way to measure trust.

Most experts I talk to in the IoT recommend the blockchain as a way to establish that trust, but a distributed ledger is only part of the solution. Project Alvarium starts with trust at the device level, requiring the use of a trusted compute module on the hardware to certify that the data-gathering device is legit and that no one has tampered with it.

From there, it has a test for handling the data ingestion, and then the option to attach metadata to whatever information was gathered. That metadata might include elements such as who can access the core data and where it came from, as well as compliance-related information. Shepherd calls it a manifest for the individual data.
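To make the manifest idea concrete, here’s a minimal sketch of what that per-datum metadata might look like. The field names and values are purely illustrative assumptions on my part, not the actual Alvarium specification:

```python
from dataclasses import dataclass

# Hypothetical sketch of a "manifest" attached to one piece of sensor
# data: provenance, access control, and compliance tags. Field names
# are illustrative, not drawn from the real Alvarium spec.
@dataclass
class DataManifest:
    data_id: str                  # identifier for the underlying reading
    source_device: str            # which device produced the data
    provenance: list              # chain of custody so far
    allowed_consumers: list       # who may access the core data
    compliance_tags: list         # e.g. ["GDPR:personal-data"]

manifest = DataManifest(
    data_id="tire-sensor-0042",
    source_device="vehicle-17/tire-front-left",
    provenance=["device", "gateway", "cloud-ingest"],
    allowed_consumers=["automaker", "mechanic-network"],
    compliance_tags=["GDPR:personal-data"],
)
print(manifest.compliance_tags)
```

The point is that the manifest travels with the data, so a downstream consumer can check access rights and compliance obligations without contacting the original source.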

Then, in the fourth stage, we can use a blockchain. The framework calls for some form of immutable ledger so the data, the manifest, and any other information can be stored and tracked. This is where something like IOTA or Hyperledger might come in. The idea of having some kind of secured root of trust combined with an immutable ledger isn’t new. Other companies, such as Xage, are promoting it in their own products.

The fifth layer of this framework is where you find the information about where the data is located, as well as the confidence score that was calculated at each layer. From there a business can decide if it wants to use the data and, if so, how much it is willing to pay for it. This sort of framework also helps ensure compliance with regulations such as GDPR: if data is tagged to a particular individual in a manifest, then when that individual asks for their data to be deleted, the system can find it and delete it.
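The article doesn’t spell out how the per-layer scores roll up into one confidence score, but one simple way to picture it is a weighted average across layers. The layer names and weights below are my own assumptions for illustration, not Alvarium’s actual algorithm:

```python
# Hypothetical illustration of combining per-layer trust scores
# (each between 0.0 and 1.0) into a single confidence score via a
# weighted average. Layers and weights are made up for the sketch.
def confidence_score(layer_scores, weights):
    """Return the weighted average of per-layer trust scores."""
    total_weight = sum(weights.values())
    return sum(layer_scores[name] * w for name, w in weights.items()) / total_weight

scores = {
    "device_root_of_trust": 1.0,  # trusted compute module verified
    "ingestion": 0.9,             # ingestion checks mostly passed
    "manifest": 0.8,              # metadata partially complete
    "ledger": 1.0,                # entry confirmed on the ledger
}
weights = {name: 1.0 for name in scores}  # equal weighting for this sketch

print(round(confidence_score(scores, weights), 3))  # 0.925
```

A buyer could then set a threshold (say, only purchase data scoring above 0.9) or scale what they pay to the score itself.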

Obviously, the launch of an open-source project designed to build scalable models of trust for the IoT doesn’t mean that scalable data sharing will actually come together anytime soon. Shepherd and I last talked at the launch of EdgeX Foundry, another project Dell has helped create with the Linux Foundation. That project is designed to create an open middleware layer that can translate different machine data protocols. The idea is that we can build out the industrial IoT faster if we no longer have to worry about the 1,001 protocols already in place.

Several years after launch, EdgeX Foundry is gaining ground with roughly a million downloads, so it’s safe to say that we’re still a few years away from seeing any significant movement with Project Alvarium. Still, it’s tackling a big problem and already has a variety of players from the enterprise and industrial world behind it, so I think it’s worth paying attention to.

Stacey Higginbotham

