This story was originally published on Friday, May 12, in my weekly newsletter. To receive the newsletter, sign up here.
Apple wants to build a health coach with the ability to track people’s vital signs as well as their emotions. Meanwhile, Amazon is testing a smarter version of its Astro robot, including giving it the ability to track and remember what’s happening in people’s homes so it can better monitor them.
On the web itself, people are becoming aware that their chats with ChatGPT and other generative AIs might accidentally leak private or even competitive intelligence. Even Amazon’s employees were concerned about the privacy implications of the company’s plans to build a health coach. Based on all of this, I have to ask: Do we really want the smarter devices and services these companies are building?

I initially got excited about the potential for the internet of things to add sensing and connectivity to everyday objects and places. With the information those devices gather and access to cheap cloud computing for data analysis, I hoped we’d make the invisible visible. Specifically, I hoped we would surface some of the invisible externalities associated with pollution or industrial processes and use that visibility to effect positive change.
Maybe it would be an NGO holding a factory accountable for air pollution, or a city enacting different zoning laws to prevent people from living on top of toxic areas. Maybe a factory would use the data it gathers to reduce the amount of harmful materials in its product, or to make a new product that lasts longer.
While I’m seeing a bit of the latter, especially when it comes to cutting carbon emissions or reducing waste, I don’t see much of anything related to using sensor data to hold private enterprise accountable. The IoT is making the invisible visible, but so far it’s doing so in a manner that benefits only the bottom line.
And when it comes to the smart home and consumers, I have to ask: Do we really want to make fully visible what we currently keep invisible to tech companies, data brokers, and the companies and governments that buy from them?
With the hype around generative AI such as ChatGPT, big tech companies are embedding their own large language models and other generative models into their products. This week at Google I/O, the search company showed off new models in search, image generation, security, and medicine, as well as a way to help “jumpstart” creativity.
Amazon has talked about improving Alexa with a new large language model, and Business Insider is reporting that it has a project code-named Burnham that will give its Astro robot the ability to remember things and answer questions. But do I want a little robot armed with a camera patrolling my home looking for problems?
Amazon pitches it as a way to make sure the stove is turned off, or a means of spotting things like broken glass and letting you know something is wrong. But it will also be a way for families to surveil folks at home and, depending on the privacy features (it should provide local storage and encryption for cloud data), a way for companies to get far more information about life at home than we can begin to imagine. That information may feel benign, like the type of toilet paper you use or the number of cats in your home, but tech firms can monetize it in so many ways, and none of those ways have the consumer’s interests at heart.
For example, when users asked Alexa about yoga, Amazon offered them ads for its Halo wearable, assuming the person was interested in fitness. With something like a smarter Astro, the consumer doesn’t even have to ask a device they know is connected to the cloud about a yoga mat; the robot can rove the home, note a collection of weights or fitness gear, and send Amazon the data that gets the consumer’s demographic profile tagged for health and wellness.
A more disturbing example is how some of the user data from smart devices gets sold to data brokers. Today the big fuss is over location data from cellphones, especially in the wake of multiple states criminalizing women’s health care. In August, the Federal Trade Commission (FTC) sued a company called Kochava over the sale of location data as part of an overall effort to rein in data collection on consumers.
Last week, a federal judge in Idaho dismissed the FTC’s suit, ruling that the agency hadn’t shown that the collection and sale of the data caused “substantial injury” to consumers. The judge did agree that the collection and sale of such personal data has the potential to cause harm, but said the FTC would need to provide additional facts proving that harm.
This is both a blow and an opportunity. For many, proving substantial injury from a loss of privacy will be challenging. While we will undoubtedly see a few cases where the sale of consumer data leads to substantial harm, such as an arrest after an abortion or actual injury from a violent partner or stalker, the law doesn’t recognize the creeping harms of a life under constant, invisible surveillance. But it could.
Right now, however, we don’t fully understand the risks of feeding our data into some of these newer models, yet companies want to add more cameras and more devices to our homes that aim to “understand” or coach us. Making our full selves that visible is a risk. But that’s exactly what we’ll do with some of these newer gadgets and services.
As much as I like technology and the convenience of some of my smart devices, the combination of smarter services, more cameras, more sensors, and “smarter” AI concerns me. I think it should concern you, too.
I want “Her.” If anyone has watched that movie, they’ll know what I mean: I want an AI that serves only one master… me!
I would gladly feed it all my info, train it to understand every nuance of my voice (to make sure it always understands me), and ask it to automate as much of my daily life as it could by watching my every move:
1) which lights I turn on and off; when and why
2) when I eat dinner, what I like and what I want to avoid or achieve
3) when I go to bed, how well I sleep and how to go about addressing a bad sleep pattern
4) which doctors I go to, so it can manage follow-up appointments and order refills
5) who is in my immediate vs. extended family… and so on, so it can remind me of events and help me order services and gifts
I would gladly meet with my AI a couple of times a week to see what it recommends and approve or disapprove of its suggestions, conclusions, etc.
But this AI would watch only me, serve only me, and protect only me. And then my spouse would have the same thing for them, and my cousin would have the same thing for her… and so on.
To me, that is the future, and I would pay a healthy monthly fee to have that help me navigate my life. Whoever makes it will have me buying their sensors, their cars, their accessories… everything that would make this AI function perfectly and exclusively for me.
Looks like you are promoting, or asking for, https://hu.ma.ne/media/humane-previews-first-product-in-ted-talk
That looked like such vaporware, we didn’t even bother to cover it 🙂
Stacey, you probably already know about this one, but some of your readers may not, and I think it’s interesting.
Back in 2012, news broke that Target had come up with a “pregnancy predictor” that allowed it to identify, with 80% accuracy, when one of its customers was pregnant, even if she hadn’t bought any maternity or baby products. The person just had to be in Target’s loyalty program so the company had purchases to analyze.
This made the news when a father came in to complain to Target because his teenage daughter was receiving coupons for maternity items in the mail. He thought the company had made a mistake, but then, a few days later, he apologized: his daughter was, indeed, pregnant.
Target decided its initial marketing program was too creepy, so it created a new one, which it still uses, that mixes coupons for items someone in early pregnancy might want with unrelated items so the targeting isn’t as obvious. But the goal was still to get that person to buy diapers at Target, “because once they’re buying diapers with us, they’ll buy everything with us.”
It was a lot of math, and a lot of data, but very little tech as we think of it today.
https://bettermarketing.pub/target-knows-youre-pregnant-before-anyone-else-and-it-s-making-them-billions-7c4972a9bfab
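For anyone curious what that kind of “math and data” can look like at its simplest, here is a purely illustrative sketch of a purchase-scoring model. The product names, weights, and threshold below are invented for this example; the reported Target model apparently scored shoppers across a couple dozen products using statistical analysis of historical purchase histories, not hand-picked weights like these.

```python
# Illustrative sketch of a purchase-based "pregnancy prediction score."
# All product names, weights, and the cutoff are made up for this example;
# this is not Target's actual model, only the general shape of the idea:
# products correlated with early pregnancy get weights, and a shopper's
# score is the sum of the weights for products they bought.

SIGNAL_WEIGHTS = {
    "unscented_lotion": 0.30,
    "calcium_supplement": 0.25,
    "magnesium_supplement": 0.20,
    "oversized_tote_bag": 0.15,
    "cotton_balls_bulk": 0.10,
}

COUPON_THRESHOLD = 0.60  # hypothetical cutoff for mailing pregnancy-related coupons


def pregnancy_score(purchases):
    """Sum the weights of the signal products a loyalty-card holder has bought."""
    return sum(w for item, w in SIGNAL_WEIGHTS.items() if item in purchases)


if __name__ == "__main__":
    shopper_purchases = {"unscented_lotion", "calcium_supplement", "magnesium_supplement"}
    score = pregnancy_score(shopper_purchases)
    send_coupons = score >= COUPON_THRESHOLD
    print(f"score={score:.2f}, send_coupons={send_coupons}")  # score=0.75, send_coupons=True
```

Feed something like this enough purchase history and better statistics and you get to the kind of accuracy described above, with no cameras or microphones required.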
I guess the point is that “know your customer” is linked to higher profits for most companies, so there’s always going to be pressure pushing in that direction.