If nothing else, the leaked Supreme Court draft opinion that would end the constitutional right to privacy underpinning abortion access should serve as a scary wake-up call to companies that have vast stores of user data. Technology companies, including app makers and connected device companies, have a stunning array of information about people housed in their data centers. In many cases, that data is just a subpoena or warrant away from government hands.
In other cases, the data is actively harvested and sold to buyers, ranging from governments to advertisers to academic researchers. When you add connected devices into the mix — where remote access to a device and control of that device is possible — companies might be forced to turn over data and turn against customers.
Without decisive regulations on what data should be considered private, how and who can access data, and clear rules about ownership and rights that ensure a buyer has physical control of a purchased device, technology firms are creating new windows into their users’ lives and then being forced to let the government peer in. In the case of connected devices, they may even let the government open the window.
If you think I’m overstating the potential for damage that could occur as technology firms and the state collide, consider some recent examples:
- After Russian troops stole $5 million worth of agricultural equipment from a John Deere dealership in Ukraine, the company apparently bricked the combines and tractors remotely, turning them into bright green and yellow agricultural sculptures. In this case, John Deere was simply remotely disabling inventory stolen from its dealership, but I can see a future where a government pressures John Deere to remotely deactivate customer equipment because it is owned by a wanted criminal, or because doing so would change the direction of an armed conflict. Meanwhile, we are already debating, as part of the infrastructure bill, whether new cars should have some kind of kill switch so that police could stop a car chase.
- The Centers for Disease Control and Prevention (CDC) spent $420,000 to buy location data on millions of Americans from a private company. The CDC wanted to see if people were following COVID curfews but also wanted to track neighbor-to-neighbor visits, visits to church and pharmacies, and other related activities during the pandemic. Much of that data collection could be related to the agency’s auditing of lockdowns to better determine their effectiveness, but the CDC is also using location data for other more nebulous programs. Additionally, because location data can be tied back to addresses where a person lives or works, it’s not anonymous, which means the CDC could theoretically identify the people breaking curfew or quarantine and get law enforcement involved.
- SafeGraph, the company the CDC contracted with to buy that location data, also gave up access — this time to Vice, and for a mere $160 — to location data on people visiting abortion providers. If abortion becomes illegal in certain states, those states could easily buy data to figure out exactly who may have accessed health care services and punish them. SafeGraph’s CEO said he was glad the media called him out; he also said his company has stopped selling that data. He added, however, that researchers trying to gauge the effect of recently enacted laws on abortion providers were frustrated by the company’s reversal.
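To see why “anonymous” location data is nothing of the sort, consider how trivially a home address falls out of a location trace. The sketch below uses entirely made-up ping data and a naive heuristic (the most frequent nighttime location is probably home); real re-identification pipelines are more sophisticated, but the principle is the same.

```python
from collections import Counter

# Hypothetical pings for a single "anonymous" device ID:
# (timestamp, location rounded to a coarse grid cell).
pings = [
    ("2022-05-01 02:10", (40.7128, -74.0060)),  # overnight ping
    ("2022-05-01 03:45", (40.7128, -74.0060)),  # overnight ping
    ("2022-05-01 13:20", (40.7580, -73.9855)),  # daytime, elsewhere
    ("2022-05-02 01:05", (40.7128, -74.0060)),  # overnight ping
    ("2022-05-02 14:40", (40.7614, -73.9776)),  # daytime, elsewhere
]

def infer_home(pings):
    """Guess 'home' as the most common location between 10pm and 6am."""
    night = Counter()
    for timestamp, cell in pings:
        hour = int(timestamp.split()[1].split(":")[0])
        if hour >= 22 or hour < 6:
            night[cell] += 1
    return night.most_common(1)[0][0] if night else None

print(infer_home(pings))  # -> (40.7128, -74.0060), the overnight cluster
```

Once the overnight cluster is matched to a residential address, the device ID, and every other trip in the dataset, is tied to a named person.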
This is why we need to call for technology companies to hold less data, and to hold it for less time. And when it comes to data such as location data, which can easily identify an individual, we need to apply the various protections and standards associated with Personally Identifiable Information, or PII.
PII is a broad term that is defined not by specific pieces of data, but by how data can be used to “distinguish or trace an individual’s identity, either alone or when combined with other information that is linked or linkable to a specific individual.”
There are several different regulatory frameworks for dealing with PII, so laws that govern how companies use it vary based on industry. But the fact is that location data is PII, and tech companies collecting it should treat it as such. Today it is treated as an asset, sold to companies seeking demographic data or other information.
In the future, tech firms should treat location data as something a consumer may choose to share with third parties, and something those third parties must then protect. That would mean we could still enjoy location-based apps, but we would likely have to pay for them, since they would no longer be subsidized by data sales.
We also should recognize that even if law enforcement agencies don’t have a legal right to data, currently they can buy it from third parties. So we either need to eliminate that option or create rules around how, exactly, law enforcement can use privately purchased data.
For years, the tech industry has told users that it shares their data with law enforcement and the state only in exceptional cases. Yet it’s clear that private data brokers are buying data on users from tech companies and selling it to the state, creating a disconnect between legal protections enshrined in the Bill of Rights and the reality of prosecution.
The challenges will only grow as people give more information to more services, and devices collect highly personal information that can be remotely accessed and controlled. Without laws governing data use and collection, tech firms will become complicit in enforcing laws that prosecute women for having abortions or out LGBTQIA+ students to their families, outcomes their employees and many of their users might find unjust. And that’s not a fun (or profitable) position for them to be in.