During the pandemic, we’ve seen major sensor vendors add more intelligence to their sensors. In 2020 Sony launched a vision sensor that can perform people tracking or object detection locally, and last year Bosch announced its BHI260AP sensor, which has accelerometers and gyroscopes along with a processor pre-loaded with activity-tracking algorithms that can actively learn to track new motions.
There’s a clear desire from sensor manufacturers to boost the price and capabilities of their sensors, along with demand from customers that want to embed a smarter sensor inside a product without requiring more parts and integration on their end. And making all of this possible is TinyML.
Speaking at the tinyML Summit held this week in San Francisco, Victor Pankratius of Bosch Sensortec explained that the demand for more energy efficiency in overall systems means that deeper integrations between sensors and processors make sense. He called for computer science programs, which currently dedicate time to the concept of a memory hierarchy, to add an energy hierarchy as well.
With a memory hierarchy, designers allocate memory resources based on the response times of different memory formats. With an energy hierarchy, designers would make trade-offs about where to process data based on how much energy each location consumes. According to Pankratius, each step closer to the sensor where data gets processed results in a 10x reduction in energy consumption.
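To make that rule of thumb concrete, here is a minimal sketch of what an energy hierarchy looks like in numbers. The tier names and the baseline energy figure are my own illustrative assumptions, not figures from Pankratius; only the 10x-per-hop relationship comes from his talk.

```python
# Illustrative "energy hierarchy": each hop away from the sensor costs
# roughly 10x more energy per processed sample (per Pankratius's rule
# of thumb). Tier names and the baseline figure are assumptions.

TIERS = ["sensor", "microcontroller", "application processor", "cloud"]
BASE_ENERGY_UJ = 1.0  # assumed energy to process one sample on the sensor

def energy_per_sample(tier: str) -> float:
    """Energy (in microjoules) to process one sample at a given tier,
    applying the 10x-per-hop rule."""
    hops = TIERS.index(tier)
    return BASE_ENERGY_UJ * (10 ** hops)

for tier in TIERS:
    print(f"{tier}: {energy_per_sample(tier):.0f} uJ per sample")
```

Under these assumed numbers, pushing processing from the cloud all the way down to the sensor cuts the energy cost per sample by a factor of 1,000, which is why the panelists treat sensor-level intelligence as an efficiency play rather than just a feature.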
This is a radically different way of thinking about systems architecture, but Pankratius isn’t alone. His co-panelist, Andrea Onetti of STMicroelectronics, agreed. Onetti’s company is also pushing more intelligence into the sensor itself, with the goal of moving processing out to the furthest node. STMicro has created a new line of sensors called intelligent sensor processing units, or ISPUs, that are designed for its vision of the connected world.
It wasn’t just that panel touting smarter sensors. In another presentation, a speaker laid out plans for TinyML algorithms that would run on a sensor and then, based on the results, wake up a second, higher-power system to provide more detail. An example might be a security camera that only wakes up if an image sensor recognizes a person. If it does, then it might wake up a larger processor to figure out whether the person is a stranger.
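The security-camera example above is a two-stage cascade, and the control logic is simple enough to sketch. Everything here is a stand-in: the function names, the threshold, and the dictionary-based "frames" are assumptions for illustration, not any vendor's actual API.

```python
# Hypothetical two-stage wake-up pipeline: a tiny always-on model on the
# sensor decides whether to wake a larger, power-hungry processor.
# All names, thresholds, and data shapes are illustrative assumptions.

def tiny_person_detector(frame: dict) -> float:
    """Stand-in for a TinyML model running on the image sensor:
    returns a person-probability score for a frame."""
    return frame.get("person_score", 0.0)

def big_face_recognizer(frame: dict) -> str:
    """Stand-in for the higher-power model that only runs when woken."""
    return "known" if frame.get("face_id") in {"alice", "bob"} else "stranger"

WAKE_THRESHOLD = 0.8  # assumed confidence needed to wake the big processor

def process(frame: dict) -> str:
    # Stage 1: runs on every frame, at sensor-level power.
    if tiny_person_detector(frame) < WAKE_THRESHOLD:
        return "asleep"  # the larger processor never wakes up
    # Stage 2: runs only on the rare frames that pass stage 1.
    return big_face_recognizer(frame)

print(process({"person_score": 0.2}))                     # asleep
print(process({"person_score": 0.95, "face_id": "eve"}))  # stranger
```

The power win comes from the branch in `process`: the expensive model is gated behind the cheap one, so its cost is paid only on the small fraction of frames that actually contain a person.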
This would reduce power consumption, but such designs also introduce complexity. A gentleman from Microsoft pointed that out during Q&A after the smart sensor panel. He said that when working with machine learning he already has to worry about the model creation and drift (that’s when a machine learning model becomes less accurate over time) in addition to managing the model itself. But with a sensor running its own model, he said, “suddenly my life becomes much more complicated.”
He’s not wrong. Not only would application and product developers have to manage more models, but they would also have to design a system that handles the potential interactions between the two models. Plus, smarter sensors are also more expensive sensors. Theoretically, if a designer can cut power consumption 10x by buying a smarter sensor, they could also spend less on a smaller battery. But that’s not usually how those trade-offs work.
I can see why sensor makers want to turn what has been somewhat of a commodity into a high-value part, but I also can see why people will push back against that. It will be more expensive, for one thing. It will also be more complicated. But the energy efficiency, privacy, and lowered latency arguments for processing at the furthest edge are compelling. And in addition to large sensor manufacturers, I see other companies pursuing this model. Swim.ai is a startup that has been pushing this idea for quite some time. SensiML is another company built on the idea that data should be processed on the sensor itself.
So while it’s more complicated, the history of computing is full of once-siloed hardware and software getting packaged up into more integrated systems. And in doing so, costs shrink, energy usage drops, and performance improves.