Analysis

We’re making sensors smarter — and more complicated

During the pandemic, we’ve seen major sensor vendors add more intelligence to their sensors. In 2020 Sony launched a vision sensor that can perform people tracking or object detection locally, and last year Bosch announced its BHI260AP sensor, which has accelerometers and gyroscopes along with a processor pre-loaded with activity-tracking algorithms that can actively learn to track new motions.

There’s a clear desire from sensor manufacturers to boost the price and capabilities of their sensors, along with demand from customers who want to embed a smarter sensor inside a product without requiring more parts and integration on their end. And making all of this possible is TinyML.


From left to right: Abbas Ataya, TDK InvenSense; Victor Pankratius, Bosch Sensortec; Andrea Onetti, STMicroelectronics. Image courtesy of Ira Feldman at the tinyML Foundation.

Speaking at the tinyML Summit held this week in San Francisco, Victor Pankratius of Bosch Sensortec explained that the demand for more energy efficiency across entire systems means deeper integration between sensors and processors makes sense. He called for computer science programs, which currently dedicate time to the concept of a memory hierarchy, to add an energy hierarchy as well.

With a memory hierarchy, designers allocate memory resources based on the response times of different types of memory. With an energy hierarchy, designers would make trade-offs about where to process data based on how much energy each location consumes. According to Pankratius, each step closer to the sensor that data gets processed results in a 10x reduction in energy consumption.
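To make that rule of thumb concrete, here is a minimal sketch of what an energy-hierarchy calculation might look like. The tier names and the baseline energy figure are illustrative assumptions, not measured numbers; the only thing taken from the talk is the rough 10x saving per step closer to the sensor.

```python
# Illustrative sketch of an "energy hierarchy": each tier one step closer to
# the sensor is assumed to cost roughly 10x less energy per inference, per
# Pankratius's rule of thumb. The tier list and baseline number are made up
# for illustration; real budgets depend on the silicon, model, and radio.

TIERS = ["cloud", "gateway", "host MCU", "on-sensor"]
CLOUD_ENERGY_UJ = 100_000.0  # hypothetical energy per inference at the cloud tier

def energy_per_inference(tier: str) -> float:
    """Return the assumed energy (microjoules) to run one inference at a tier."""
    steps_from_cloud = TIERS.index(tier)
    return CLOUD_ENERGY_UJ / (10 ** steps_from_cloud)

if __name__ == "__main__":
    for tier in TIERS:
        print(f"{tier:>10}: {energy_per_inference(tier):>10.1f} µJ per inference")
```

Under these assumptions, processing on the sensor itself comes out three orders of magnitude cheaper than shipping raw data to the cloud, which is the whole argument for pushing models outward.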

This is a radically different way of thinking about systems architecture, but Pankratius isn’t alone. His co-panelist, Andrea Onetti of STMicroelectronics, agreed. Onetti’s company is also pushing intelligence into the sensor, with the goal of moving processing out to the furthest node. STMicro has created a new line of sensors called intelligent sensor processing units, or ISPUs, that are designed for its vision of the connected world.

It wasn’t just that panel touting smarter sensors. In another presentation, a speaker laid out plans for TinyML algorithms that would run on a sensor and then, based on the results, wake up a second, higher-power system to provide more detail. An example might be a security camera that wakes up only if an image sensor recognizes a person; if it does, it might wake a larger processor to figure out whether the person is a stranger or not.
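A rough sketch of that cascade might look like the following. The function names, threshold, and stub logic are hypothetical placeholders, not any vendor's actual API; the point is simply that a tiny always-on model gates a larger, power-hungry one.

```python
# Minimal sketch of the two-stage wake-up cascade described above: a tiny
# always-on model runs on (or near) the image sensor and only wakes a larger
# model on the application processor when it sees something interesting.
# All names and logic here are illustrative placeholders.

WAKE_THRESHOLD = 0.8  # assumed confidence needed before waking the big processor

def on_sensor_person_score(frame: bytes) -> float:
    """Stand-in for the tiny on-sensor model's 'is there a person?' score (0-1)."""
    return 0.9 if b"person" in frame else 0.1  # placeholder heuristic

def identify_person(frame: bytes) -> str:
    """Stand-in for the heavyweight model on the application processor."""
    return "stranger"  # placeholder result

def handle_frame(frame: bytes) -> None:
    # Stage 1: always-on, low-power check at the sensor.
    if on_sensor_person_score(frame) >= WAKE_THRESHOLD:
        # Stage 2: wake the larger processor only when stage 1 fires.
        print("Waking main processor:", identify_person(frame))
    # Otherwise the main processor stays asleep and the frame is dropped.

if __name__ == "__main__":
    handle_frame(b"...person at the door...")
    handle_frame(b"...empty hallway...")
```

The design choice is that the expensive model never runs on the vast majority of frames, which is where the power savings come from, but it also means two models whose behavior has to be managed together.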

This would reduce power consumption, but such designs also introduce complexity. A gentleman from Microsoft pointed that out during the Q&A after the smart sensor panel. He said that when working with machine learning he already has to worry about model creation and drift (that’s when a machine learning model becomes less accurate over time) in addition to managing the model itself. But with a sensor running its own model, he said, “suddenly my life becomes much more complicated.”

He’s not wrong. Not only would application and product developers have to manage more models, but they would also have to design a system that handles the potential interactions between the two models. Plus, smarter sensors are also more expensive sensors. Theoretically, if a designer can cut power consumption 10x by buying a smarter sensor, they could also spend less on a smaller battery. But that’s not usually how those trade-offs work.

I can see why sensor makers want to turn what has been something of a commodity into a high-value part, but I can also see why people will push back against that. It will be more expensive, for one thing. It will also be more complicated. But the energy efficiency, privacy, and lower latency arguments for processing at the furthest edge are compelling. And in addition to large sensor manufacturers, I see other companies pursuing this model. Swim.ai is a startup that has been pushing this idea for quite some time. SensiML is another company built on the idea that data should be processed on the sensor itself.

So while it’s more complicated, the history of computing is full of once-siloed hardware and software getting integrated into more complex systems. And when that happens, costs shrink, energy usage drops, and performance improves.

Stacey Higginbotham
