This post originally ran on Friday, August 12 in my weekly newsletter.
After reading the Tesla stories this week that prompted Kevin’s story above, I tried to talk to my child about our upcoming trip to visit some colleges. In preparation for their junior year, my husband and I read “The Price You Pay for College”, a book that details the astonishing use of data collection by colleges as they seek information to help determine how much a family is willing to pay for college.
A summary of the practices can be found here, but the short of it is that colleges are taking in financial information and even data from email pixel trackers to gauge how interested a child might be in attending their school, as well as the financial resources of the family, then pricing the experience accordingly. After sharing this with my child, they tried to “game the system” by waiting a few hours to open emails from the colleges we planned to visit, hoping to muddy any pricing signals.
Hearing them describe their strategy made me so sad. First, I’m not sure it really helps. And second, that’s only one source of data about my child available to college admissions staff and their army of consultants. Finally, this is only one small area of their life where my child is tracked by an array of sophisticated and largely invisible tools as they go about their life.
These aren’t isolated occurrences. I’ve consented to give away some of my Fitbit data to third parties. My location data is available from Google and my cell phone provider, and tracking pixels follow my emails and my searches around the web. My connected devices regularly out me to my family. When I bake cookies in my connected oven, my husband gets a notification. When I drive my Tesla around town, he can check in on the app to see where I am.
I’ve shown my husband and my child how I can see everything they ask Alexa in the app, something few people realize Alexa shares with the app’s owner. There are two issues here. One is that any connected device can display data about people living in a home with that device or using that device. The other is that much of that data also travels to the device maker and from there can be shared with others, including law enforcement, advertisers and data brokers.
I address the first issue by regularly sharing the sensors and their capabilities with anyone who will use the product, and getting their consent. The second issue requires a regulatory solution. And it seems we might get it.
This week, the Federal Trade Commission said it would start a public process to develop rules that would crack down on “harmful commercial surveillance and lax data security.” On Thursday, the agency said it would issue an Advance Notice of Proposed Rulemaking that would seek public comment on the harms stemming from what the agency calls commercial surveillance and whether new rules are needed to protect people’s privacy and information.
I believe we need new rules desperately. It is painful to watch how constrained my child feels navigating social media and apps that want to know more and more about them, slotting them into a demographic box while they are still trying to figure out who they are. As they realize that such surveillance also affects the cost of goods through discretionary price setting — whether that’s for college or concert tickets — I see them become disillusioned.
Some of the solutions to this problem will come from private companies. Look at how Apple has disabled tracking pixels in emails. I also expect that we’ll see more messaging platforms embrace end-to-end encryption (a fact that will stress out the government to no end) as they seek to avoid giving away incriminating information on users.
But the government has a huge role. We need rules that recognize the scope of the challenges and the existing markets for data. Data brokers are a huge, invisible force aggregating and selling extremely personal information to the highest bidder. Those buyers may merely be annoying, offering ads that are surprisingly well targeted, or they might be incredibly harmful, with the ability to deport a citizen or charge someone with a crime.
Regulations to cut off access to such data will have foes in Washington and in the private market. The FTC wants to hear from consumers about the actual harms created by commercial surveillance and plans to host a public forum on Sept. 8 for additional public comment. This is your chance to share your stories and concerns. Ideally, the FTC can enact more rules, but a better solution will also involve Congress and recognition by technology firms that the business models they have pursued leave their users vulnerable to harm.
Then perhaps Kevin won’t have to trade his privacy for convenience when deciding to buy a new car.