Social giants slammed for failing kids’ privacy
How can brands avoid the same mistakes?
Posted: August 28, 2024
Parents – and regulators – worry about children’s privacy, especially their online privacy. Older laws, such as the United States Children’s Online Privacy Protection Act (COPPA), focus on preventing the collection of children’s personal information online without parental authorization. The integration of social media into the lives of today’s youth, however, creates a newer, more insidious set of concerns.
In a recent Pew study, almost 90% of Americans reported that they are concerned about social media platforms collecting children’s personal information. This is not just the overwrought worry of helicopter parents: research points to “adverse consequences” of social media on children, such as body-image and weight concerns, depression, addiction, anxiety, reduced physical activity, and other issues. Layered on top are risks related to grooming by sexual predators.
Regulators and public service organizations are responding. YouTube is full of videos, some of which are produced by regulators like the Office of the Privacy Commissioner of Canada, raising awareness about children’s online privacy.
Regions, countries, states, and provinces have passed legislation specifically aimed at protecting children’s privacy, especially their online privacy. Even general-purpose laws around the world, such as state privacy laws in the US, frequently set aside children’s privacy as a targeted topic. Regulators are considering laws to address addictive design patterns, for example. Some of these laws are raising the age definition of “child” from 13 to 16 or even 18. There is even some debate, such as the heated and very public discussion in Australia, about a minimum age for social media accounts.
The social giants are in hot water
Regulators and the media have given social media giants, such as Facebook and TikTok, a failing grade for children’s privacy.
Recently, the United Kingdom’s Information Commissioner’s Office (ICO) put social media companies on notice, saying, “We are calling on 11 social media and video sharing platforms to improve their children’s privacy practices. Where platforms do not comply with the law, they will face enforcement action.”
In the same news release, Emily Keaney (the ICO’s Deputy Commissioner) said, “Online services and platforms have a duty of care to children. Poorly designed products and services can leave children at risk of serious harm from abuse, bullying and even loss of control of their personal information.”
Similarly, TikTok faces Federal Trade Commission (FTC) action in the United States over children’s privacy concerns under COPPA. Specific FTC accusations include inadequate safeguards to prevent signups by children under the age of 13 without parental consent, a convoluted data deletion process, and failure to follow through on parental requests for deletion of their children’s personal data.
What can brands learn from their mistakes?
So how have large social media platforms failed children, and what are the pros and cons of potential solutions? Though the risks of social media to children are multifaceted, and workable solutions are complex and not without risks themselves, there are some patterns to consider.
- Transparency and Consent
- Parental Controls
- Age Verification
Transparency and Consent
As all privacy professionals know, transparency and consent are table stakes in the privacy game. Many, if not most, enforcement actions related to children and online privacy contain a transparency and consent component. For example, among other charges, the FTC charged Epic Games with collecting children’s personal information without verifiable parental consent.
The challenge with children and transparency/consent lies in the fact that a child cannot give consent on their own behalf, and in some cases cannot be expected to read and understand a complex notice. Rather, a parent or guardian must be able to see the notice(s) and give consent on the child’s behalf.
In the online world, getting hold of a valid adult – and truly knowing that the individual in question is an adult and the parent or guardian – is difficult. The push of needing to give clear notice and obtain valid consent from a responsible adult, against the pull of user-experience drop-off whenever the online process pauses, is a challenging tension to manage.
Solutions: Careful documentation of the consent transaction, including who was involved and why the organization is confident that the consenting individual is the relevant responsible adult, is critical to compliance and online ethics. A consent management platform can help, as can careful consideration of clear and conspicuous notices and the required documentation.
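To make the documentation requirement concrete, here is a minimal sketch of what a consent-transaction record might capture. All field names (such as `verification_method` and `notice_version`) are hypothetical illustrations, not any platform’s or consent management vendor’s real schema.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
import uuid


@dataclass(frozen=True)
class ParentalConsentRecord:
    """One auditable record of a parental-consent transaction (illustrative fields)."""
    child_account_id: str
    consenting_adult_id: str       # the "who" of the transaction
    relationship_asserted: str     # e.g. "parent", "legal guardian"
    verification_method: str       # why we believe this is a responsible adult
    notice_version: str            # which privacy notice the adult was shown
    purposes: tuple                # what processing the consent covers
    granted_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )
    record_id: str = field(default_factory=lambda: str(uuid.uuid4()))


# Example transaction: a parent consents during account creation.
record = ParentalConsentRecord(
    child_account_id="child-123",
    consenting_adult_id="adult-456",
    relationship_asserted="parent",
    verification_method="credit-card microcharge",
    notice_version="2024-08-v3",
    purposes=("account_creation", "service_emails"),
)
```

The record is immutable (`frozen=True`) and timestamped, so each consent event can be audited later without risk of silent modification.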
Parental Controls
Some proponents of online privacy have suggested that the responsibility rests on parents, and that technology can help. There are some technologies and services that can limit browser behavior according to parental wishes. Some social media platforms have also implemented options and information for worried parents. Snapchat, Instagram, Facebook, and other social media platforms have parent-centered functionality, resources, and guides in response to public and regulatory criticisms.
The problem with this solution is twofold. First, perfect technology for this issue does not exist. Second, for parental controls to work, parents must be involved and technically savvy enough to make good decisions. Children without involved and savvy parents are not protected, which is an unacceptable outcome.
Solutions: As the Instagram platform found, creating a separate privacy-sensitive version for children does not work, as there are few foolproof ways to prevent children from misrepresenting their age. If the main platform does not have sound privacy practices, children will be impacted. The solution is to build privacy, including children’s privacy, into the mainstream platform.
Age Verification
Age verification has long been the standard for adult-oriented sites. The common experience has been to present a question to the user (“Are you over 18?”) or to ask for a date of birth. Of course, the child may claim to be any age at all, and if the answer suggests an appropriate age, the child can enter the site. Later techniques required a credit card submission to verify age, which brings with it security concerns, or uploading driver’s license or other ID documentation, which brings privacy and data theft concerns.
Now, however, there are dedicated age verification technologies. These technologies have privacy implications of their own, which regulators are beginning to evaluate. The French data protection authority, the CNIL, recently published its study into such age verification technologies.
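One pattern regulators favor is proportionality: stronger (and more privacy-invasive) assurance only where the risk justifies it. The sketch below illustrates that tiering idea; the tier names and methods are assumptions for illustration, not a prescribed standard.

```python
def required_assurance(activity_risk: str) -> str:
    """Map an activity's risk level to a proportionate age-assurance method.

    Illustrative tiers only: a real program would be driven by legal review
    and the applicable regulator's guidance (e.g. the CNIL's analysis).
    """
    tiers = {
        "low": "self-declared date of birth",          # e.g. browsing public content
        "medium": "third-party age estimation",        # e.g. creating an account
        "high": "verified ID or payment-card check",   # e.g. sensitive data, targeted ads
    }
    if activity_risk not in tiers:
        raise ValueError(f"unknown risk level: {activity_risk!r}")
    return tiers[activity_risk]
```

The point of the tiering is that collecting an ID document for a low-risk activity would itself create a disproportionate privacy risk.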
Final thoughts
Understanding that no single solution can address all social media platform privacy concerns, especially those related to children, the key is to consider privacy in a holistic, end-to-end experience way. This means:
- Carefully consider notice and notice placement.
- Identify best-in-class solutions to getting (and tracking) valid adult consents.
- Create privacy-sensitive default settings.
- Consider age-verification techniques that provide some assurance, especially for higher-risk activities like sensitive data collection, targeted advertising, and information sharing.