Analysis

How user perception changes the way consumers adopt platforms

Is Alexa confined to a device or do you see it as a presence that spans your smart home? Image courtesy of S. Higginbotham.

This week, I was also at the Smart Kitchen Summit, learning about robotic pizza-making machines for restaurants, chocolate-making machines for the home, and the future of consumer packaged goods as Amazon builds ovens and does deals with the Food Network. I also had an interesting conversation with Jack Sprouse, head of technology at Synapse, about how companies can take their user interfaces beyond voice.

In a presentation, Sprouse, whose firm helps companies turn advanced research technologies into commercial products, focused on computer vision to detect gestures and gaze, electromechanical sensors and motors that physically move knobs to match settings changed in an app, and other novel ways to control connected devices.

I do like the idea of looking at a burner on my stove and telling Alexa to turn it on, thanks to a tiny camera module on the stove detecting which burner I want activated based on my gaze. I also like the idea of having the knobs on my stove move themselves to the right setting even if I have already used my voice or an app to adjust them. It’s nice to have the status of a device mirrored in both the real world and on an app.

To me, the challenge with this concept is that adding so many electronic components and sensors would cost a lot of money, make appliances break down more often, and shorten their useful life. But in a conversation with Sprouse after the panel, I learned that a bigger challenge associated with moving beyond voice, or even with incorporating more voice, is the user. Specifically, how the user views a digital assistant such as Alexa or Google Assistant.

Sprouse says that in focus groups, users fall into two categories: those who view a digital assistant as available to help them with a number of things anywhere in their home, and those who believe that when they are talking to a digital assistant, they are talking to “a little person inside the device.” I don’t think he meant that literally, but he said that for people who think there’s a person in each device, the concept of asking Alexa to do something in a different room is a difficult one to grasp, so designing products for that group of users will require different decisions.

For example, if I am trying to cook dinner as someone who views a digital assistant as a whole-home entity, I can just say, “Hey Alexa, let’s roast a chicken,” and expect that my oven will preheat to 375 degrees and maybe that my kitchen lights will brighten so I can better see what I’m doing. If I’m a device-centric thinker, on the other hand, I will expect to have to tell Alexa to turn the kitchen lights to 100% and to preheat the oven to 375 degrees.
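To make that contrast concrete, here’s a minimal, purely hypothetical sketch in Python. The routine format, device names, and send_command function are invented for illustration and are not any vendor’s actual API; the point is simply that a whole-home assistant can fan a single intent out to several devices, while a device-centric user ends up issuing one command per device.

```python
# Purely hypothetical sketch -- not Amazon's or Google's real API. The routine
# format, device names, and send_command function are invented for illustration.

def send_command(device: str, action: dict) -> None:
    """Stand-in for a smart-home hub call; here it just prints what would be sent."""
    print(f"-> {device}: {action}")

# Whole-home mental model: one spoken intent fans out to several device actions.
ROAST_CHICKEN_ROUTINE = {
    "trigger": "let's roast a chicken",
    "actions": [
        {"device": "kitchen_oven", "command": "preheat", "temperature_f": 375},
        {"device": "kitchen_lights", "command": "set_brightness", "percent": 100},
    ],
}

def run_routine(routine: dict) -> None:
    """Execute every device action attached to a single intent."""
    for action in routine["actions"]:
        send_command(action["device"], action)

# Device-centric mental model: the user addresses each device separately.
def device_centric_session() -> None:
    send_command("kitchen_lights", {"command": "set_brightness", "percent": 100})
    send_command("kitchen_oven", {"command": "preheat", "temperature_f": 375})

if __name__ == "__main__":
    run_routine(ROAST_CHICKEN_ROUTINE)  # one utterance, many devices
    device_centric_session()            # one utterance per device
```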

This may seem like a small thing, but because companies such as Amazon and Google view the digital assistant as a platform that connects devices, services, and different facets of people’s lives, it has an impact on how they design user interfaces and privacy standards, and even on how they integrate products. For example, Google just launched the ability to stream music from one room to another simply by asking the Assistant to move the music to the Kitchen speaker. If you view each device as a standalone entity, that kind of interaction isn’t intuitive, so Google will have to build in ways to educate you.

It also affects privacy. If I have Alexa or Google in my car and ask the assistant to turn on my lights when I get close to home, for example, I’m sending my request over the internet along with my location. But if I think there’s something in my car controlling the lights, I may not realize that the data isn’t being processed locally.

I’m still surprised that there are people who don’t think the way I do about digital assistants, but I found Sprouse’s presentation to be a fascinating look at the challenges designers face when thinking about bringing digital intelligence to everyday devices.

Stacey Higginbotham
