Privacy Law
Back to School Review: Big Tech Anticipates Congressional Moves on Teen Privacy
By Cody Venzke
August capped off a summer of activity by Congress and major technology companies to reform online privacy for individuals under 18, especially teenagers, who currently lack the protections guaranteed to children under 13. As members of Congress dug into the issue more deeply, major tech companies with large userbases of minors rolled out new protections for children and teenagers.
Congressional Action
From late spring through the summer, members of Congress pressured technology companies to bolster privacy protections for teenagers and children. The summer kicked off with a hearing by the Senate Subcommittee on Consumer Protection, Product Safety, and Data Security, titled “Protecting Kids Online: Internet Privacy and Manipulative Marketing.” The hearing included testimony from Baroness Beeban Kidron, a cross-bench peer in the United Kingdom’s House of Lords. Baroness Kidron testified about the United Kingdom’s then-forthcoming Age-Appropriate Design Code, a set of “practical measures and safeguards to ensure” compliance with the UK’s Data Protection Act 2018. The Code’s fifteen principles require “information society services likely to be accessed by children” under 18 to design their services and data practices in the “best interests of the child” and to take steps to mitigate risks to children. The Code went into effect on September 2.
Members of Congress took notice of the Code’s potentially far-reaching effects. Sen. Edward Markey and Reps. Kathy Castor and Lori Trahan wrote letters to major tech and gaming companies, including Amazon, Facebook, Google, Microsoft, Snapchat, TikTok, Disney, Activision Blizzard, Epic Games, Niantic, and Nintendo, urging them to extend the Code’s protections domestically. In their letter, Sen. Markey and Reps. Castor and Trahan specified several risks to children that necessitated the expansion, including increased screen time, a lack of transparency, collection and sharing of children’s “sensitive information,” “social engineering,” exposure to “cybercriminals,” “nudging,” and insufficient parental controls.
Those same members of Congress also introduced legislation based on many of the Codeâs principles:
- Sen. Markey, along with Sen. Bill Cassidy, introduced the Children and Teens Online Privacy Protection Act, S. 1628, which would extend the protections of the Children’s Online Privacy Protection Act (COPPA) from children under 13 to teenagers under 16, expand the scope of “operators” subject to COPPA, and ban targeted advertising directed toward children, among other things.
- Similarly, Rep. Castor introduced the Protecting the Information of our Vulnerable Children and Youth Act, H.R. 4801, which would extend COPPA’s protections to teenagers under 18, ban targeted advertising on services “likely to be accessed by children or teenagers,” create a private right of action under COPPA, and incorporate the Code’s “best interests of the child” standard.
- Rep. Trahan published a draft bill that would prohibit targeted advertising in services used for “K-12 purposes” and bar the use of data collected by those services for targeted advertising.
Tech’s Response
Major tech companies have taken notice of the legislative landscape. Over the summer, Google, YouTube, Instagram, and TikTok announced overhauls of their protections for children and teenagers.
Google announced that it would begin setting “SafeSearch” as the default for users ages 13-18 and that users under 18 could begin to flag images of themselves appearing in search results for removal. The company also announced that it would no longer collect location history for users ages 13-18. Google similarly announced new controls in Google Workspace for Education that let administrators specify age limits for students accessing Google services outside the core educational services. The new Education features will require administrators to indicate which student users are 18 or older.
Similarly, Google-owned YouTube released new features aimed at protecting children and teenagers. Videos uploaded by users ages 13-17 are now set to private by default, with the option to make them public. YouTube has also rolled out digital wellbeing features, including turning off autoplay and adding reminders to take breaks or go to sleep.
YouTube competitor Instagram also updated its privacy protections for children and teenagers, with a focus on creating safer interactions between users. The Facebook-owned social network announced it would begin restricting direct messages (DMs) “between teens and adults they don’t follow” and providing “safety notices in DMs” to notify “young people when an adult . . . has been exhibiting potentially suspicious behavior,” such as “sending a large amount of friend or message requests to people under 18.” Instagram will also make it more difficult for adults “exhibiting potentially suspicious behavior to interact with teens” by excluding teenagers from those adults’ search results and suggested users. Each of these features will use “new artificial intelligence and machine learning technology” not only to identify users’ ages but also to flag potentially inappropriate interactions. Neither Instagram nor Facebook has detailed this technology.
Finally, TikTok released changes to teens’ default privacy settings, building on updates released in January 2021. In January, the company changed the default privacy settings for all registered accounts of users ages 13-15: it made those accounts private by default, disabled direct messaging, permitted comments only from those users’ friends, and disabled the option for others to download their videos. In August, TikTok expanded those protections to users ages 16-17, disabling direct messaging by default and prompting users to confirm the privacy and visibility settings for each post. The company has also disabled push notifications late at night: after 9 pm for users ages 13-15 and after 10 pm for users ages 16-17.
Conclusion
Although the changes implemented by technology companies address many of the issues raised by the Age-Appropriate Design Code and U.S. legislators, several issues, such as targeted advertising and the scope of businesses subject to existing law, remain unresolved. Members of Congress are likely to remain active in this area, perhaps treating legislation on minors’ privacy as a stopgap in the absence of a general federal privacy law.