Analysis

Project Soli: a fading Ripple in the IoT waters?

Google’s Project Soli, an effort to put small radar sensors into everyday devices, officially launched in 2015. Yet in six years, we’ve seen few products that actually use it. Now, Google has helped create an open-source API effort for the Soli technology called Ripple. I can’t help but wonder if Soli is simply a solution in search of a problem, or if the world just isn’t ready for mini-radar chips that track gestures and presence.

In the first few years following the introduction of Soli, no devices integrated it. That I can understand: It was a new concept from Google’s Advanced Technology and Projects (ATAP) team. And it takes time to fully bake a new technology, especially one that promises sub-millimeter accuracy in tracking movements and objects.

Image courtesy Google

In 2019, the Soli chip was a highlight of Google’s Pixel 4 phone launch. The feature, called Motion Sense, didn’t get the warm reception Google probably hoped for. Promoted as a way to “Control your Pixel without touching it,” Motion Sense turned out to be more of a tech gimmick than a transformative feature. Essentially, it lets you wave your hand in front of the Pixel 4 to mute your phone’s notifications or to play, pause, and skip music tracks.

I like the overall idea of Soli on a phone, but these interactions are pretty limited. And they can easily be done in other ways, such as by voice. I say “meh” on Motion Sense. 

Google may have said the same thing, because it followed up the Pixel 4 with the 4a, 5, 5a, 6, and 6 Pro. Guess what they’re all missing? The Project Soli chip. The only other Google product with the motion-sensing tech is the latest Google Nest Hub, where it’s used for one specific feature: sleep tracking.

Sleep tracking data on the new Nest Hub display and app.

All this is to say that Project Soli might be impressive technology, but it’s not technology that has found a home. And that brings me to the Ripple announcement.

There’s not much to go on yet, save for a few tweets from Ivan Poupyrev, Director of Engineering and Technical Projects Lead at Google ATAP. And those point you to a rather sparse Ripple page on the Consumer Technology Association (CTA) website. The CTA is the group that puts on the Consumer Electronics Show each year, so I find it odd that the CTA, rather than Google, is publishing information on Ripple.

Oh, and there’s a GitHub code repository for Ripple, too. I took a look, and it’s also pretty sparse. And relatively new.

The repository was created on December 7, 2021, and only a few small code files were added on January 5, 2022. There are basically three APIs in the repository and little else other than a description of Ripple as an “open radar API standard to enable hardware / software interoperability and accelerate the growth of applications of general purpose consumer radar.”
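
For a sense of what an “open radar API standard” could mean in practice, here’s a rough sketch in C of a vendor-neutral radar interface. To be clear, every name and parameter below is my own hypothetical illustration of the concept, not code from the actual Ripple repository:

```c
/*
 * Purely hypothetical sketch of a vendor-neutral radar API, in the
 * spirit of Ripple's stated interoperability goal. None of these
 * names come from the actual Ripple headers.
 */
#include <stdint.h>
#include <stddef.h>

/* Opaque handle; each chip vendor supplies its own implementation. */
typedef struct radar_handle radar_handle_t;

/* Common knobs that most radar chips expose in some form. */
typedef struct {
    uint64_t start_frequency_hz;  /* lower edge of the sweep; Soli-class parts run near 60 GHz */
    uint64_t bandwidth_hz;        /* sweep bandwidth, which sets range resolution */
    uint16_t chirps_per_frame;    /* chirps collected per measurement burst */
    uint16_t frame_rate_hz;       /* bursts per second */
} radar_config_t;

/* Apps code against these calls; vendors implement them against
 * their own silicon, so the app stays portable across chips. */
int radar_open(radar_handle_t **out_handle);
int radar_configure(radar_handle_t *handle, const radar_config_t *config);
int radar_read_frame(radar_handle_t *handle, int16_t *iq_samples, size_t max_samples);
int radar_close(radar_handle_t *handle);
```

The appeal of a layer like this is that an app written against the common interface could, at least in theory, run on any participating vendor’s radar chip without a rewrite.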

That sounds to me like Google hasn’t yet found a truly compelling reason for Soli to exist in any of its own devices. No other device maker seems interested in using it either. So it’s open source to the rescue, in hopes that someone, somewhere, finds a problem Soli can solve.

And if you’re looking for a large set of Ripple partners, you won’t find that yet either. There are six. One is Google, of course. Three are chip makers: Infineon, NXP, and Texas Instruments. And the last two make (or will make) actual end-user products: Blumio is working on smart blood-pressure sensors, while Ford makes vehicles. Maybe Ford will find a good use for radar-sensing chips to “see” when drivers aren’t awake or paying attention to the road.

I know this sounds harsh. And I definitely appreciate what Google has tried to do with Project Soli. 

However, somewhere between the technology lab and shipping products, there’s a disconnect. What problems is the Soli team trying to solve, and are they high-impact problems or simple little ones? Soli only works at very close range to a device, so what’s the benefit over simply reaching an extra few inches to tap a screen? What happened to the gesture-reading smartwatch Google showed off in 2016? Or the speakers, demonstrated at the same time, that could be controlled with a wave of the hand?

Image courtesy Google

Those demonstrations amounted to nothing, except maybe some additional research data points, and we haven’t seen any practical implementations since then. Well, aside from Motion Sense in the Pixel 4, which quietly disappeared from later phones, and the latest Nest Hub. To be honest, I wonder how many people bought a 2021 Nest Hub and actually use the sleep tracking feature.

Soli is now going on year seven. And as amazing as the technology may be, either it’s simply way too early, or there aren’t enough practical applications to take advantage of it. Maybe the Ripple APIs will surface Soli use cases that Google hasn’t yet envisioned. After waiting this long, though, I’m a Soli skeptic.

Kevin C. Tofel
