Analysis

7 principles for regulation in the IoT era

I give about 10 talks a year to various organizations, which gives me the chance to think broadly about trends or holes in the IoT ecosystem that I keep stumbling over in my reporting. Earlier this summer, I gave one at an ACM WiSec security event, and it was one of the scariest experiences I’ve ever had. After all, these people were researching security flaws in the IoT. They were experts.

They wanted me to talk about how normal people evaluate risks in the IoT and what steps they take to protect themselves. Setting aside the question of whether or not I'm a good representation of normal people, the talk gave me time to think about the challenges the IoT brings in a world of federated, highly distributed computing, and where regulators might need to step in. I lay out seven of those ideas below. I would also love to hear from more people about their thoughts and ideas, as I'd like to dedicate more coverage to these issues in the coming months.

Congress keeps trying to act on IoT security.

1. Define consumer rights for software disguised as hardware: Connected products live in a grey area between hardware and software. This creates conflicts and uncertainty for consumers when companies pull the plug on a cloud-based service, rendering a physical device useless, or when companies change their software terms and conditions, altering how a device operates. The most recent example was Peloton changing the software on its Tread+ treadmill to prevent owners from using it without a paid subscription. Peloton has since backed off that idea.

This lack of clarity around the rights associated with software wrapped in a hardware shell is also behind the “right-to-repair” regulations, and behind the lawsuits filed by consumers when companies pull digital content. The confusion mostly hurts consumers, and so as it becomes more difficult to buy unconnected products, Congress needs to codify basic rules of ownership to protect consumers. I’d also welcome the FTC getting more involved in setting rules as opposed to waiting for egregious acts by companies and then fining them.

2. Rethink the Fourth Amendment for the digital era: In the U.S., the Fourth Amendment protects the rights of citizens “to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures.” Centuries of case law have helped define where citizens can expect protection from police search and seizure, but new technology and tools mean we need to lay down clear rules around how police departments access data from public and private devices and when they need a warrant.

We should also provide more assistance to citizens who find themselves pulled into an investigation. In some cases, the only notice they might get that their cell phone location data or some other data about them has been swept up in a warrant or subpoena is an email from the tech provider that gave up the information. From that point on, the burden is on that person to figure out why their information was targeted and what they should do. They often have only a short window to address this.

3. Penalize private entities for leaking data/require rapid disclosure: This has less to do with civil liberties and more to do with the outsized harm that can come to a consumer from a data leak. Companies should face penalties for insecure practices such as using production data in tests, leaving unencrypted personal data in unlocked cloud instances, and more. When data breaches are the result of poor security practices, companies are negligent, and should have to pay for that. On the disclosure front, the sooner someone knows their data or passwords have been compromised, the sooner they can fix it. Right now, the administration is discussing a rule that forces companies to tell the public they have been breached within 24 hours. That feels pretty fast; I think a 3-7 day window would be fine.

4. Build auditing authority into existing government agencies to test outcomes: This principle is to address the claims of bias or a lack of transparency in AI. We are embracing the use of AI as a tool in many aspects of people’s lives, from deciding who gets bail to using facial recognition to make arrests. I’m not against these tools, but I do think that every government agency needs to have an auditing agreement in place to both show how the AI makes decisions and to lay out justifications for the biases that will undoubtedly show up. There’s no such thing as an unbiased algorithm; all algorithms are designed to prioritize some data over other data to achieve a result. Audit committees need to be able to assess those results and see if they meet the current policy goals.

5. Provide GDPR-style rules to help consumers control their data: We still don’t have a good federal law to help consumers control how their data is used. California has the California Consumer Privacy Act, which is a start, but we need a federal law to help consumers opt in to sharing their data, to offer them chances to evaluate the data a provider has about them, and to force the provider to correct or delete that data as needed. We should also create laws that dictate what data can be used and how companies can discriminate against consumers based on their data. Consumer data will be out there, so it’s imperative we figure out how companies can use it.

6. Ensure data can be corrected/expunge childhood data: I mentioned the ability to correct data as part of a federal data law, but I think it's important enough that we should pull it out on its own, especially because having inaccurate data on a person could materially affect their life. Think about all the poor people who are erroneously included on the federal government's No Fly List. The risks of a company sharing incorrect data expand as we use AI and data so computers can make decisions about people. I also think that kids should have the opportunity to eliminate their data when they turn 18, giving them a clean slate. This is easier said than done, however, because of that pesky digital ownership question. If my kid wanted to eliminate all the photos of her from the web, she'd have to ask me to dump my photos of her that are in Google's cloud or hunt down friends who have shared pictures of her on Facebook. Whose rights matter most in that situation?

I think Google’s latest policy decision around kids, which lets anyone under the age of 18 (or their parent or guardian) request the removal of their images from Google image search results, is a good start. Ironically, this is a private company taking action on issues relating to kids on the internet because other countries have instituted new regulations. Giving kids the chance to be kids without having it haunt the rest of their lives is important, especially given that their images are increasingly caught by cameras in private and in public.

7. Rethink current forms of identity and create layers of identity: This is a tough one, but I think it’s worth a big discussion. We’re in the middle of granting access to our faces, our fingerprints, and even our palm prints to private companies as a way to authenticate ourselves to computers. But these forms of ID can be stolen and potentially misused. There’s also no clear guidance for consumers on when to use a strong password and when to use a fingerprint, for example. Given the potential for misuse and the confusion about the potential for harm, I think the government should be educating consumers and perhaps implementing rules around when a password is sufficient and when you should be asked for, say, an iris scan. It should also put in place rules around how companies store the most sensitive of these identifiers, to force those that gather this information to treat it like it matters.

Stacey Higginbotham
