
How IBM’s Watson could bring more smart home intelligence to Siri and HomeKit

Earlier this week, IBM announced a new partnership with Apple, explaining how it would be adding IBM Watson Services to Core ML. Since Watson has already proven its prowess on Jeopardy!, most folks know what Watson is. Core ML is probably less familiar: it’s Apple’s machine learning framework for the company’s software platforms. Specifically, Apple says that Core ML can be used with Siri, Camera and QuickType.
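To make that concrete, here’s a minimal sketch (my own, not from the announcement) of how an iOS app might load a Core ML model, such as one trained with Watson and exported to Core ML format, and run it against an image using Apple’s Vision framework. The model name “WatsonClassifier” is a placeholder:

```swift
import Foundation
import CoreML
import Vision

// Minimal sketch: load a bundled Core ML model (the name "WatsonClassifier"
// is a placeholder) and classify an image via the Vision framework.
func classify(_ image: CGImage) throws {
    guard let modelURL = Bundle.main.url(forResource: "WatsonClassifier",
                                         withExtension: "mlmodelc") else { return }
    let coreMLModel = try MLModel(contentsOf: modelURL)
    let visionModel = try VNCoreMLModel(for: coreMLModel)

    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        // Print the top three labels and their confidence scores.
        guard let results = request.results as? [VNClassificationObservation] else { return }
        for observation in results.prefix(3) {
            print("\(observation.identifier): \(observation.confidence)")
        }
    }
    try VNImageRequestHandler(cgImage: image, options: [:]).perform([request])
}
```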

How Watson works

That reference to Siri jumps out at me. Granted, the IBM-Apple news is really geared toward building apps for enterprises (Apple and IBM have been partners in this area for a few years now), but I’m thinking ahead to other areas where Watson could benefit Apple products. And Siri surely needs help, especially inside the Apple HomePod.

How so? Well, let’s step back a minute and see how Watson works today.

This video provides a fantastic explanation, but if I had to summarize it, here’s how I see it: Watson ingests large amounts of unstructured data, most of which today is written information. After analyzing that data for patterns, Watson attempts to structure it to understand both the content of the data and the intent behind any actions taken upon it.

This is far more advanced than simply scanning a never-ending stream of question-and-answer pairs, because not every question is asked the same way, and the phrasing can change the meaning of the question or the analysis. There’s certainly more to Watson than my limited interpretation, but these are the parts most relevant to my thought process.
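As a toy illustration of that “unstructured in, structured out” idea (my own sketch, not Watson’s actual output format), consider two differently worded requests that should collapse into the same structured record of content and intent:

```swift
// Hypothetical structured result: the "intent" is the action being asked for,
// and the "entities" are the content that the action operates on.
struct Interpretation {
    let intent: String
    let entities: [String: String]
}

// Two phrasings of the same request...
let utterances = [
    "Turn on the office light",
    "Could you switch the light in the office on?"
]

// ...that a Watson-style system would map to one structure. A real system
// learns this mapping from patterns in large corpora; it is hard-coded here
// purely to show the target representation.
let parsed = Interpretation(
    intent: "device.power_on",
    entities: ["device": "light", "room": "office"]
)
```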

Data in the smart home: Context and intent

So what if the unstructured data were human behavior in a smart home? Theoretically, Watson could determine both the context and the intent of the people in that home and, through pattern recognition, possibly anticipate their needs. This is the autonomous level of smart homes that I alluded to last week when discussing routines and automations.

To be more specific, Watson could help make sense of all of the actions we take in, around and near our homes: when we generally wake and leave for work, what we cook and when, who comes and goes, when we sit down to relax, and what we typically do during that time. For a home to be semi-autonomous, certain patterns need to be recognized from these actions. And those patterns can be combined with already available corroborating data such as GPS location, network traffic from Netflix, or music from an online streaming service.
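For illustration, the raw material might look something like this. These types are hypothetical, just my guess at the minimum a pattern-recognition service would need to ingest:

```swift
import Foundation

// Hypothetical records of in-home activity: device and sensor events, plus
// the corroborating signals mentioned above (location, streaming traffic).
struct HomeEvent {
    let timestamp: Date
    let source: String   // e.g. "light.office", "lock.frontdoor", "tv.livingroom"
    let action: String   // e.g. "on", "off", "opened", "playback_started"
}

struct ContextSignal {
    let timestamp: Date
    let kind: String     // e.g. "gps", "netflix_traffic", "music_stream"
    let value: String    // e.g. "home", "stream_started", "playlist:jazz"
}
```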

At that point, a digital assistant such as Siri can begin to anticipate things and make insightful suggestions without any programming or user configuration, the two things that routines and automations rely on today.

For example, I typically retire to the home office at some point after dinner, but not to work. Instead, I turn on a light to read a book or watch TV, and I may play some low-volume music. Now imagine if Siri knew that, thanks to Watson.

I might head upstairs to the office and find the light already turned on to my preferred brightness. Siri could proactively ask if I wanted to catch up on the show I most recently watched on Netflix. Perhaps I respond and say, “No thanks, I’m going to read for a while.” Maybe Siri prompts to see if I want music that’s tailored for light background noise while I read. You see where I’m going.
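To show how little is needed for a first-order version of this, here’s a deliberately naive sketch: count which hour of the day the office light most often gets switched on, then use that learned hour to drive a suggestion. It reuses the hypothetical HomeEvent type from above (repeated so the snippet stands alone); Watson would obviously bring far richer models than a frequency count:

```swift
import Foundation

// Repeated from the earlier sketch so this snippet stands alone.
struct HomeEvent {
    let timestamp: Date
    let source: String
    let action: String
}

// Naive pattern recognition: the hour of day when a given event most often
// occurs in the history, or nil if it never occurs.
func usualHour(for source: String, action: String,
               in history: [HomeEvent]) -> Int? {
    var counts: [Int: Int] = [:]
    for event in history where event.source == source && event.action == action {
        let hour = Calendar.current.component(.hour, from: event.timestamp)
        counts[hour, default: 0] += 1
    }
    return counts.max { $0.value < $1.value }?.key
}

// If the learned hour has arrived, an assistant could act proactively.
func maybeSuggest(history: [HomeEvent]) {
    if let hour = usualHour(for: "light.office", action: "on", in: history),
       Calendar.current.component(.hour, from: Date()) == hour {
        print("Turning on the office light. Resume your show, or queue the reading playlist?")
    }
}
```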

Google is doing this with Docs already

If it sounds impossible that such patterns could be detected or useful, think about Google Drive. Using its own machine learning, Google knows when I typically return to specific documents and it highlights them at the appropriate time. Think context and intent here.

A perfect example is when Stacey and I collaborate on the IoT Podcast show notes. The day and time of that effort varies, but I’d say that 90% of the time, when I open up Google Drive to add topics for the next show, the spreadsheet appears above my Drive contents in the “Quick Access” area. In fact, under the document, it says, “You usually open this Sheet around this time.” It’s a simple example of pattern recognition, but it’s also a powerful one.

If Google can do this with Drive documents and Watson can do it with unstructured written data, it’s just a matter of applying the same approach to a different type of data: objects and their actions in the smart home. There’s no guarantee that Apple is working with IBM on this to make Siri a smarter digital assistant in the home, but if they’re not, I think they should be.

Kevin C. Tofel
