Recently, I received a question from a teacher asking what to say to kids about smart home technology, specifically Alexa and Google. Amidst news reports detailing how contractors are listening in on personal devices, the rules around protecting the privacy of children under the age of 13, and the discovery of inappropriate apps that can open up a channel between your home and a hacker, it felt especially relevant. His concern was that his students have these products in their homes but may not be aware of privacy and security risks they can introduce.
As I pondered the question I realized that I, as a parent of a 13-year-old kid myself, had some advice to offer about how to talk to kids about connected devices. And to be clear, I do think we need to have these conversations with our kids. Just like we talk to them about finances, media literacy, and the world around them, we should be talking to them about their security and privacy in a tech-driven environment.
And yet, it’s still a tough question to answer because so much depends on the family, the family’s knowledge of and ability to secure their home network, and each individual’s tolerance for risk.
I think it’s important to tell kids exactly how connected devices work and what — among the various bits of information those devices gather — gets saved. Ideally, you would show them by opening up camera feeds and apps that reveal exactly what you, as the owner of the device, can see. You should also direct them to the privacy panels that both Google and Amazon provide. Google’s privacy panel, where they can opt out of sending their utterances to a contract worker, is here. Amazon’s is here. With both Google and Amazon, a user can choose to have their data deleted every three months or every 18 months. You should probably instruct the child to do this; I have mine set to delete every three months.
Both Amazon and Google now allow you to tell their respective digital assistants to “delete my data” verbally. You can also show kids that their caregivers can see their Alexa and Google requests. I showed my daughter the list of songs she had asked Alexa to play over the prior few days, for example, and now she has a sense of exactly what data is shared and what it looks like.
On the privacy action front, I think talking to kids about features like camera covers and turning the microphone off matters. They should also have some idea of what information they should and shouldn’t share with connected devices. Just as they shouldn’t share their address or real name online, maybe they shouldn’t share those with an Alexa Skill, including when they’re playing games on such devices.
I also think it’s important to teach kids how to think about these issues, especially as they get older. For example, older kids should learn to ask how the data that companies like Google and Amazon collect from IoT devices fits within those companies’ larger business objectives. They may not grasp all the subtleties, but it’s an important question, and one that kids need to ask and keep asking, especially as these companies become larger and more opaque. Learning to ask that kind of question now will lead to other questions later, such as what role venture capital, public markets, private equity, and debt play, along with other issues they should eventually understand. And to really get them to understand informed consent around data collection and privacy, make clear how companies abuse it by teaching them to avoid downloading apps that suck up their data but offer no real return.
Internet security, however, is a tough one, because it can be hard to convey that computer security is an ever-moving target and that hacks will happen. Depending on their age, that information might scare them and make them feel unsafe. It’s probably better to help them learn how to evaluate risk, even while knowing that kids are terrible judges of risk because their brains are not fully developed. To give them a basic risk framework, I encourage kids to think about what the target is and how difficult it would be to hack it.
It also helps to explain that not all hacks are equal; some are much more difficult to execute than others. Few can be carried out without physical access to the compromised device, for example, which means home devices are generally safe — unless one of the kid’s friends wants to prank them. This is also a good place to explain basic router and Wi-Fi security, as well as the benefit of regular software updates. The most dangerous hacks are remotely executable ones, because they let an attacker take over and control a device without ever touching it. That said, they are also rare, and it’s unlikely one would hit a popular home device without a patch being issued quickly.
And here’s where the value of the target — be it the device or the device’s user — comes into play. Help kids understand what makes a good target (you’ll know they understand what you mean when they suggest hacking the school’s computers) and then have them evaluate themselves and their own devices. Ask them questions such as: Are they or members of their family a good target? Would a hacker like access to a connected Barbie doll? A door lock? Why or why not?
A lot of this advice goes beyond the Echo/Google concern and applies to connected devices more broadly. But understanding what any device they bring into their house can do, who has access to it, what those people can do with that access, and whether the device is worth the potential risk is something kids (and adults!) should be able to do. Add to that some reliable home network security, a basic understanding of hacks, and some rudimentary knowledge of how privacy and business models intersect, and you have a pretty sophisticated little participant in today’s tech-heavy world.