What to expect with 6G and the industrial IoT

It’s 2035 and you’re a plant manager overseeing a manufacturing line that produces electric vehicles. Or maybe it’s Twinkies. Either way, someone today needs to predict what you, inside that factory a dozen years from now, might want to know.

If this seems like a tough job, it is. But that’s exactly what the folks who are building the 6G cellular communications standard are trying to do today. Expected to launch in 2030 and extend through 2040, the 6G communications standard will be the improved version of 5G, but with a few new tricks up its sleeve for the industrial IoT, overall network infrastructure, and even consumer features.

Creating digital twins that reflect real-time operations is one potential use case for 6G networks.

It may feel somewhat absurd to be discussing a telecommunications standard that will improve on the current 5G standard, which is only this year beginning to hit its stride in the industrial IoT. But that’s how these sorts of widespread standardization efforts work.

The folks thinking up use cases and jobs for 6G have actually been at it for a few years, focused initially on AR and VR, both of which would be best able to take advantage of the massive capacity (data speeds) and low latency offered by 6G. But with the metaverse disappearing in a hype cloud generated by AI, the brains at big telcos and telecommunications equipment companies have refined their use cases. And many of those use cases are primed for the industrial IoT, with the two most interesting being the evolution of digital twins to become real-time simulacrums, and the concept of the network as a sensor.

Digital twins are nothing new, but today the phrase is used to mean any sort of digital model of a physical item or physical space. The idea is that with today’s sensor technology and appropriate software, people can create digital twins of buildings that can accurately reflect their current state, build digital twins of manufacturing equipment to understand how that equipment is performing, and even use these models to make predictions.

Currently, the biggest challenges for folks building digital twins have nothing to do with networking technology. Instead, they center on figuring out how to easily map physical infrastructure into a digital twin, how to make data models that transfer between different types of hardware, and how to build data formats that let a variety of sensors report into digital twins.
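To make that data-format problem concrete, here’s a minimal sketch in Python of a common envelope that heterogeneous sensors could report through. All of the field names and the vendor payload shape here are hypothetical — no standards body has settled on a schema like this:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class TwinReading:
    """A sensor-agnostic envelope: each device maps its native
    output into these fields before reporting to the twin."""
    twin_id: str      # which digital twin this reading feeds
    sensor_type: str  # e.g. "vibration", "temperature"
    value: float
    unit: str
    timestamp_ms: int

def normalize(raw: dict) -> TwinReading:
    """Adapter for one (made-up) vendor's payload shape."""
    return TwinReading(
        twin_id=raw["asset"],
        sensor_type=raw["kind"],
        value=float(raw["reading"]),
        unit=raw.get("unit", "unknown"),
        timestamp_ms=int(raw["ts"]),
    )

payload = {"asset": "press-07", "kind": "vibration",
           "reading": "0.42", "ts": 1735689600000}
print(json.dumps(asdict(normalize(payload))))
```

The point isn’t this particular schema; it’s that every sensor maker would need to agree on (or adapt to) some shared envelope before twins can ingest data from a mixed fleet of hardware.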

Presumably within the next seven years we’ll have largely worked out the creation and sharing of digital twins, so that by the time 6G starts getting adopted, the challenge it will solve will be bringing data to where it’s needed and computing on that data in real time. And I do mean in real time, not within 15 minutes or an hour of something changing, as can happen today.

John Smee, VP of engineering and wireless research with Qualcomm, said that the combination of edge computing, cloud computing, and artificial intelligence that will be built into the 6G network is what will enable digital twins to reflect the real-time status of their real-world counterparts. The embedding of AI models within the 6G network will also allow engineers to model different scenarios on a digital twin quickly so as to predict problems or anticipate necessary changes in settings or statuses.

If this sounds vague, it’s because engineers are still working out how exactly some of this information exchanging, model sharing, and computing will be handled across the network. But even in the 5G standard, engineers are working on solving problems such as how to bring AI models into the over-the-air network, which is something expected in the upcoming release 19 of the 5G standard. (For those who are curious, engineers are finalizing Release 18 at the moment.)

Peter Vetter, president of Bell Labs Core Research with Nokia, called this concept Massive Digital Twinning, and believes that eventually entire factory floors or smart cities will have digital twins running in simulation alongside their physical counterparts. Closely tied to the creation of these massive digital twins, and a beneficiary of them, is the second concept: using the network as a sensor.

There are two components to the network as a sensor, said Vetter. The first is simply using an understanding of disruptions in radio frequencies to map out the presence of people and machines, which we’re seeing today with both Wi-Fi sensing and millimeter wave sensing. But it gets much more interesting when you combine that concept with an understanding of what the physical infrastructure actually is.
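For a feel of that first component, here’s a toy Python sketch of presence detection from signal-strength variance. It’s a deliberately simplified stand-in — real Wi-Fi and millimeter wave sensing work on much richer channel data than bare RSSI numbers, and the readings below are invented:

```python
from statistics import pstdev

def presence_detected(rssi_window, baseline_sigma, factor=3.0):
    """Flag motion when signal strength varies far more than its
    empty-room baseline (a crude thresholding illustration)."""
    return pstdev(rssi_window) > factor * baseline_sigma

quiet = [-60.1, -60.0, -59.9, -60.2, -60.0]   # empty room (dBm)
busy  = [-60.0, -55.2, -63.8, -52.5, -61.1]   # someone walking through
baseline = pstdev(quiet)

print(presence_detected(quiet, baseline))  # → False
print(presence_detected(busy, baseline))   # → True
```

A body moving through the radio path scatters and blocks the signal, so variance jumps well above the quiet-room baseline — that disruption is the "sensor reading."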

Once 6G radios are in an area and attached to equipment, sensors, people, and roving robots, the network will understand not only that something is moving within a location, but also what it is, and potentially how it relates to the other objects around it. The ways in which companies might use this technology could vary, but manufacturing clients are already eager to use the precise positioning features coming out in 5G networks to locate equipment and people. So having not just an object’s location but an understanding of how it relates to the things around it is going to be of interest.
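Here’s a toy sketch of that second idea, using made-up positions and object names: once the network knows where tagged objects are, relating them becomes a simple spatial query, which is where the safety and workflow applications come from:

```python
from math import dist

# Hypothetical (x, y) positions in meters from a positioning feed.
objects = {
    "worker-12":  (4.0, 2.5),
    "forklift-3": (5.1, 2.9),
    "press-07":   (20.0, 8.0),
}

def nearby(name, radius=2.0):
    """Which tracked objects sit within `radius` meters of `name` —
    the kind of relational query a network-as-sensor could answer."""
    here = objects[name]
    return [k for k, pos in objects.items()
            if k != name and dist(here, pos) <= radius]

print(nearby("worker-12"))  # → ['forklift-3']
```

Knowing that the nearby object is a forklift rather than, say, a parts bin is what the "understanding of what the physical infrastructure actually is" adds on top of raw positioning.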

Vetter said he is also looking forward to other features in 6G, such as the creation of digital trust zones where reliability, security, and communications are preset for a physical area or between a specific cluster of devices; the use of 6G for communication between robots, especially as they also interact with people on factory floors; and a focus on sustainability.

Underlying these use cases are innovations in data processing both at the edge and in the cloud (and somehow linking the two in a manner that is as close to simultaneous as possible). These innovations will require an evolution of computing and in the air interface, which is how cellular technology delivers zeros and ones crammed into a hertz of spectrum. We’ll also need policy changes regarding spectrum sharing and the allocation of new spectrum, plus a continued focus on the security of the equipment that will be delivering these bytes of data over the airwaves.

It’s a lot, so it’s good we have a few more years to get it together.

Stacey Higginbotham
