A little less than half of the 190,000 people surveyed around the world by Mozilla said that the loss of privacy was their biggest concern about living in a connected future. The 45% of people who are worried are presumably adults, but a new paper asks “what about the children?”
Two UNICEF researchers drilled into the privacy implications for children as we connect more things and apply data analytics to the information those things collect. The resulting paper is a summation of everything we fear about data collection, only applied to children.
The paper raises multiple points about the need to safeguard children’s interests as we connect more devices to the internet, create digital footprints and hand over more and more of our lives to be divvied up as bits of data to be bought and sold. It also argues that the protections we have in place, such as the Children’s Online Privacy Protection Act (COPPA) in the U.S. or the EU’s General Data Protection Regulation (GDPR), are geographically fragmented and not up to the task.
Some of the concerns are ones that parents have been thinking about for a while. For example, children can’t give consent to the use of their photos and personal information when their parents post them on social media. The same goes for schools, camps and friends’ parents who also gather and post data on children. Parents also have a role to play in signing kids up for services: they should read the terms of service and consent to them on behalf of their offspring.
But that’s where the role of parents comes to an ignoble end, because many adults can’t grasp the implications of data mining, from who might get their hands on children’s data to how that data can be used to make sweeping (and incorrect) generalizations about a population. They also may not understand the risks of their children growing up digital; with a larger digital footprint built over more of their lives, children are at greater risk of deanonymization because there is more data to work with. From the paper:
While these negative experiences are not limited to children, this generation will be the first to experience these issues throughout their life cycle, and particularly at early life stages and critical junctions in their personal development and public life. Furthermore, ensuring privacy – even with appropriate anonymization in longitudinal data – is also extremely difficult, given the fact that such data will have multiple transactions per individual; hence indirect identifiers will be greater than eight – the recommended maximum to prevent reidentification (El Emam, 2016).
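The reidentification risk the paper describes comes from combining indirect (quasi-) identifiers: no single field names a person, but each added field shrinks the crowd they can hide in. Here is a toy sketch of that effect in Python; the records and fields are invented for illustration and are not from the paper.

```python
from collections import Counter

# Toy records: each tuple is one person's quasi-identifiers
# (zip code, birth year, gender). No field alone is identifying,
# but combinations of fields can single someone out.
records = [
    ("10001", 2010, "F"),
    ("10001", 2010, "F"),
    ("10001", 2011, "M"),
    ("10002", 2010, "F"),
    ("10002", 2011, "M"),
    ("10002", 2011, "M"),
]

def smallest_group(records, fields):
    """Size of the smallest anonymity set when records are grouped
    by the given quasi-identifier columns (k in k-anonymity)."""
    groups = Counter(tuple(r[i] for i in fields) for r in records)
    return min(groups.values())

# Grouped by zip code alone, everyone hides among at least 3 people.
print(smallest_group(records, [0]))        # 3
# Add birth year and gender, and one record becomes unique.
print(smallest_group(records, [0, 1, 2]))  # 1
```

The longer a child's data trail, the more such columns an attacker has to combine, which is why the paper treats a lifetime of collection as a qualitatively different risk.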
So children could be targets of data collection and mining their entire lives without ever giving their consent. Even if their parents consent on their behalf, it’s a crapshoot as to whether the parents understand what they are giving up.
To solve these challenges, the paper discusses familiar remedies: the right to be forgotten, which is now part of the GDPR; a framework under which companies that offer services to kids let them revoke consent as they age; and a recognition that children are not kids on the day before their 13th or 18th birthday and adults the day of. Instead, consent throughout childhood should probably evolve on a continuum that respects the maturity of the kid.
The final point the paper brought up, which is worth considering by all of us, is the looming prevalence of passive data collection. Devices such as the Nest thermostat, which tracks when we’re home, or our phones, which create a digital trail of where we’ve been without any action on our part, are just the beginning. More devices will track us, and more sophisticated algorithms will give them greater insight into our lives.
For example, a company called Klue makes software for fitness trackers that can analyze our hand movements to tell whether we’re drinking a glass of water or a glass of wine simply by the way we hold it. That’s a convenient way to track food intake, but it’s also a window into all of our hand motions should the company decide to tweak its algorithms.
The risk in this passive data collection is that none of us will be able to consent to it because we won’t realize what’s being collected and how. It’s a brave new world and our kids will be the first ones to grow up fully immersed in it. And as this paper explains, that’s a scary thought.