When everything is connected, all kinds of data will be collected. That’s the nature of a connected product. It can even be a benefit. But as a settlement this week between the Federal Trade Commission and smart TV maker Vizio shows, a lack of transparency and overreach by device manufacturers can result in a slap on the wrist and a PR nightmare.
This is our future in a connected world. Without a few changes in the way data is collected, stored and protected, consumers are screwed. And frankly, so are those hoping to sell connected products.
Vizio this week settled with the FTC and New Jersey for $2.2 million after the agency and New Jersey’s attorney general filed suit against the company, arguing that it had tracked consumers’ behavior without their consent. Not only did it track their behavior, but it did so in a way that wasn’t transparent. The company shared not only data about what consumers watched but also their IP addresses, which can be used to tie specific TV shows and movies to a specific home.
The TV also gathered data on the devices that other people had hooked into the Wi-Fi network. That, plus technology that Vizio developed that took what were essentially screenshots and matched them to specific content, makes for a pretty damning lawsuit.
Vizio knew what you were watching on the TV, it knew what other internet-connected devices you had in your home, and it could link all of that information to your IP address. It then sold this package of information to companies that planned to use it to sell products to consumers.
It’s like having access to millions of people as part of a giant focus group. It’s also one of the darker futures I predicted for the internet of things this year. Roughly a month ago I wrote:
The smart home is going to bite companies and consumers in the butt: We think life is so bad now with the millions of hubs, devices that get bricked with no warning and a complete lack of standards. Just wait. Next year, we’re going to see the equivalent of police trying to pull data from someone’s Amazon Echo times 1,000. Absent mores and actual laws, both individuals and companies are going to abuse the power that the internet of things can provide to profit themselves.
Apparently, I was three years too late, since Vizio started doing this all the way back in 2014. I’d love to think that Vizio’s actions are rare, but my hunch is that the allegations made here could be leveled at a large swath of the connected device industry. Fitbit, for example, sells anonymized data from your wearable to third parties. It’s certainly not alone.
Absent a way to make money on a commodity consumer good, many companies will opt for some kind of data grab in the hopes of making more money. Absent clear laws to stop them, companies will attempt to gather as much data as they can and sell that data to others.
Other than buying “dumb” products, there’s not much consumers can do about this. How often has a privacy notice change on a physical product led you to shelve a device?
So where should we start?
We need to know how companies gather information and what they gather. For example, everyone is worried about the always-listening Amazon Echo, but that’s not how the device works. It may always be listening for a wake word, but the actual words spoken the rest of the time are discarded as noise.
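To make the distinction concrete, here is a minimal sketch of how wake-word gating works in principle. This is a conceptual illustration, not Amazon’s actual implementation: the wake word, frame handling and buffer size are all assumptions for the example.

```python
# Conceptual sketch of wake-word gating (NOT Amazon's implementation):
# a device keeps only a tiny rolling buffer of audio while idle, and
# starts retaining audio only after the wake word is detected.
from collections import deque

WAKE_WORD = "alexa"   # hypothetical wake word for this sketch
BUFFER_FRAMES = 3     # frames of idle audio kept at any moment

def process_stream(frames):
    """Return only the frames captured after the wake word fires."""
    rolling = deque(maxlen=BUFFER_FRAMES)  # idle audio is overwritten, never stored
    captured = []
    listening = False
    for frame in frames:
        if listening:
            captured.append(frame)         # only post-wake audio is retained
        else:
            rolling.append(frame)          # pre-wake audio falls out of the buffer
            if WAKE_WORD in frame.lower():
                listening = True           # start capturing the actual request
    return captured

# Everything said before the wake word is dropped; only the request survives.
stream = ["private chat", "more chat", "Alexa", "what's the weather"]
print(process_stream(stream))  # prints ["what's the weather"]
```

The point of the sketch is the asymmetry: pre-wake audio lives only in a fixed-size buffer that is continuously overwritten, so “always listening” does not mean “always recording.”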
What should give people pause is the ability of connected devices on your network to see what else is on the network. This isn’t some nefarious skill, but it is something that many services and physical devices have access to. Of the 45 or more smart devices in my home, I bet all of them have a pretty good idea of what other products are running.
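To show how little this “skill” requires, here is a short Python sketch of how any device on a home LAN could enumerate its neighbors: list every address in the local subnet and probe a common port on each. The subnet and port are assumptions for the example, and real devices more often use discovery protocols like mDNS or SSDP.

```python
# Illustrative sketch: how a device on a home network can discover neighbors.
# The subnet (192.168.1.0/24) and probe port (80) are assumed for the example.
import ipaddress
import socket

def subnet_hosts(cidr):
    """All usable host addresses in a subnet, as strings."""
    return [str(ip) for ip in ipaddress.ip_network(cidr).hosts()]

def is_reachable(ip, port=80, timeout=0.2):
    """Crude liveness check: does anything answer on this port?"""
    try:
        with socket.create_connection((ip, port), timeout=timeout):
            return True
    except OSError:
        return False

hosts = subnet_hosts("192.168.1.0/24")  # hypothetical home subnet
print(len(hosts))                       # 254 candidate addresses
# neighbors = [ip for ip in hosts if is_reachable(ip)]  # slow: probes every host
```

Nothing here needs special privileges: any app or gadget on the network can do the equivalent, which is exactly why the visibility is so routine.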
Given the amount of data that can now be collected about a home or a person, we need to change our definition of personally identifiable information (PII). Broadly, it’s any information that, taken together, can identify who a person is or where they are. Practically, that has included things like full names, addresses, Social Security numbers, phone numbers and your mom’s maiden name.
But an IP address combined with a detailed history of the shows I watch could certainly identify me. And I’m not sure if I’m ready to be targeted based on all of the information my devices know about me. After all, one of those connected devices is an oven. I’m not sure I want Betty Crocker to know how many batches of cookies I bake in a week.
It can get far more nefarious when you combine personally identifiable information with algorithms that can detect things about your health, your credit score or more. For example, a machine learning company called Canary Speech has signed a deal with an undisclosed U.S. insurance company on a project to test for Parkinson’s by analyzing the voices of customers who call the insurer.
To be clear, this insurer isn’t screening callers today, but the existence of this project and the fact that insurers are clearly looking at it in some fashion should give folks pause. Machine learning applied to the data you generate unwittingly or through smart devices is going to become part of corporate (and maybe even professional) decisions made about you.
At its most benign, it may mean more intrusive ads. But it’s possible it may result in a loss of health insurance or assumptions made about you at work. So if you’re tempted to discount Vizio’s TVs as some one-off event, I’d think again. And because machine learning can be applied to things as simple as a phone call or a cluster of Facebook posts, it’s possible that even buying dumb devices may not protect you.
I’d love for Congress to read the 2015 FTC report on this, the White House’s January report on this or this week’s Pew survey about the risk of relying on algorithms…and then start making some new laws.