Google’s Project Soli, an effort to put small radar sensors into consumer devices, officially launched in 2015. And in the six-plus years since, we’ve seen few products that actually use it. Now, Google has helped create an open-source API effort for the Soli technology called Ripple. I can’t help but wonder if Soli is simply a solution in search of a problem or if the world just isn’t ready for mini-radar chips to track gestures and presence.
In the three years following the introduction of Soli, there weren’t any devices that integrated it. That I can understand: It was a new concept from Google’s Advanced Technology and Projects (ATAP) team. And it takes time to fully bake new technologies, especially one that promises sub-millimeter tracking of movements and objects.

In 2019, the Soli chip was a highlight of Google’s Pixel 4 phone launch. The feature was called Motion Sense, and it didn’t get the warm reception Google probably hoped for. Promising a way to “Control your Pixel without touching it”, it turned out to be more of a tech gimmick than a transformative feature. Essentially, Motion Sense lets you wave your hand in front of the Pixel 4 to mute notifications or play, pause, and skip music tracks.
I like the overall idea of Soli on a phone, but these interactions are pretty limited. And they can easily be done in other ways, such as by voice. I say “meh” on Motion Sense.
Google may have felt the same way, because it followed up the Pixel 4 with the 4a, 5, 5a, 6, and 6 Pro. Guess what they’re all missing? The Project Soli chip. The only other Google product with the motion-sensing tech is the latest Google Nest Hub, and there it’s used for one specific feature: sleep tracking.

All this is to say that Project Soli might be impressive technology, but it’s not technology that has found a home. And that brings me to the Ripple announcement.
There’s not much to go on yet, save for a few tweets from Ivan Poupyrev, Director of Engineering and Technical Projects Lead at Google ATAP. And those point you to a rather sparse Ripple page on the Consumer Technology Association (CTA) website. The CTA is the group that puts on the Consumer Electronics Show each year, so I find it odd that it, rather than Google, is publishing information on Ripple.
Oh, there’s a GitHub code repository for Ripple too. I took a look, and it’s pretty sparse as well. And relatively new.
The repository was created on December 7, 2021, and only a few small code files were added on January 5, 2022. There are basically three APIs in the repository and little else beyond a description of Ripple as an “open radar API standard to enable hardware / software interoperability and accelerate the growth of applications of general purpose consumer radar.”
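For what it’s worth, an “open radar API standard” in this sense usually means a thin hardware-abstraction layer: applications code against a generic radar interface, and each chip vendor supplies the implementation behind it. Here’s a minimal sketch of what that could look like. To be clear, this is my own illustration, and none of these types or function names come from the actual Ripple repository.

```c
/* Hypothetical sketch of a vendor-neutral radar API, in the spirit of what
 * Ripple describes. None of these names come from the real Ripple repo. */
#include <stdint.h>
#include <stddef.h>

/* One burst of raw radar data captured by the sensor. */
typedef struct {
    uint32_t timestamp_ms;   /* capture time in milliseconds */
    size_t   num_samples;    /* number of samples in this burst */
    const int16_t *samples;  /* raw ADC data, vendor-packed */
} radar_burst_t;

/* Callback the application registers to receive bursts. */
typedef void (*radar_burst_cb)(const radar_burst_t *burst, void *user_data);

/* The interoperability layer: each chip vendor fills in this table, so
 * application code never touches vendor-specific drivers directly. */
typedef struct {
    int (*init)(void);
    int (*set_burst_rate_hz)(uint32_t hz);
    int (*start)(radar_burst_cb cb, void *user_data);
    int (*stop)(void);
} radar_api_t;

/* A vendor's driver would export something like this. */
extern const radar_api_t *radar_get_api(void);
```

The payoff, in theory, is that a gesture-recognition or presence-detection library written against an interface like this could run unchanged on silicon from different vendors.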
That sounds to me like Google hasn’t yet found a truly compelling reason for Soli to exist in any of its own devices. No other device maker seems interested in using it either. So, it’s open source to the rescue, in hopes that someone, somewhere, finds a problem that Soli can solve.
And if you were looking for a large set of Ripple partners, you won’t find one yet either. There are six. One is Google, of course. Three are chip makers: Infineon, NXP, and Texas Instruments. And the last two make (or will make) actual end-user products: Blumio is working on smart blood pressure sensors, while Ford makes vehicles. Maybe Ford will find a good use for radar-sensing chips to “see” when drivers aren’t awake or paying attention to the road.
I know this sounds harsh. And I definitely appreciate what Google has tried to do with Project Soli.
However, somewhere between the technology lab and technology products, there’s a disconnect. What problems is the Soli team trying to solve, and are they high-impact problems or simple little problems? Soli only works at very close range to a device, so what’s the benefit over just reaching that extra few inches to tap a screen? What happened to the smartwatch that could read hand gestures, which Google showed off in 2016? And the speakers, demonstrated at the same time, that could be controlled with a wave of the hand?

Those demonstrations amounted to nothing, except maybe additional research data points, and we haven’t seen any practical implementations since then. Well, aside from Motion Sense in the Pixel 4, which quietly disappeared from later phones, and the latest Nest Hub. I wonder how many people bought a 2021 Nest Hub and actually use the sleep tracking feature, to be honest.
Soli is now going on year seven. And as amazing as the technology may be, it’s either way too early for it, or there simply aren’t enough practical applications to take advantage of it. Maybe APIs from the Ripple project will shed light on new Soli solutions that haven’t been envisioned by Google yet. After waiting this long though, I’m now a Soli skeptic.
It’s a shame that in these Covid-y times the technology isn’t being used in more places. I would love to be able to interact with public devices without having to touch screens. Hands-free vending machines, check-in kiosks: plenty of opportunities, I reckon.
It’s sad for me to see mmWave tech wasted. This tech is useful in so many smart home applications. If the chip had better range, like the one in the Aqara FP1, a lot more could be done with it. I, personally, would love to see a Z-Wave capable device used for presence detection and advanced gesture control in homes, something that would allow the level of control so casually used in The Expanse. That kind of thing is the future of the mmWave chip, in my opinion. That and the sleep tracking stuff, but that’s so limited compared with the tech’s capabilities.