Privacy Law

Back to School Review: Big Tech Anticipates Congressional Moves on Teen Privacy

By Cody Venzke

August capped off a summer of activity by Congress and major technology companies to reform online privacy for individuals under 18, especially teenagers, who currently lack the protections guaranteed to children under 13. As members of Congress dug more deeply into the issue, major tech companies with large user bases of minors rolled out new protections for children and teenagers.

Congressional Action

From late spring through the summer, members of Congress pressured technology companies to bolster privacy protections for teenagers and children. The summer kicked off with a hearing by the Senate Subcommittee on Consumer Protection, Product Safety, and Data Security, titled “Protecting Kids Online: Internet Privacy and Manipulative Marketing.” The hearing included testimony from Baroness Beeban Kidron, a cross-bench peer in the United Kingdom’s House of Lords. Baroness Kidron testified about the United Kingdom’s then-forthcoming Age-Appropriate Design Code, a set of “practical measures and safeguards to ensure” compliance with the UK’s Data Protection Act 2018. The Code’s fifteen principles require “information society services likely to be accessed by children” under 18 to design their services and data practices in the “best interests of the child” and to take steps to mitigate risks to children. The Code went into effect on September 2.

Members of Congress took notice of the Code’s potentially far-reaching effects. Sen. Edward Markey and Reps. Kathy Castor and Lori Trahan wrote letters to major tech and gaming companies, including Amazon, Facebook, Google, Microsoft, Snapchat, TikTok, Disney, Activision Blizzard, Epic Games, Niantic, and Nintendo, urging them to extend the Code’s protections domestically. In their letter, Sen. Markey and Reps. Castor and Trahan specified several risks to children that necessitated the expansion, including increased screen time, a lack of transparency, collection and sharing of children’s “sensitive information,” “social engineering,” exposure to “cybercriminals,” “nudging,” and insufficient parental controls.

Those same members of Congress also introduced legislation based on many of the Code’s principles:  

  • Sen. Markey, along with Sen. Bill Cassidy, introduced the Children and Teens Online Privacy Protection Act, S. 1628, which would extend the protections of the Children’s Online Privacy Protection Act (COPPA) from children under 13 to teenagers under 16, expand the scope of “operators” subject to COPPA, and ban targeted advertising directed toward children, among other things.
  • Similarly, Rep. Castor introduced the Protecting the Information of our Vulnerable Children and Youth Act, H.R. 4801, which would extend COPPA’s protections to teenagers under 18, ban targeted advertising on services “likely to be accessed by children or teenagers,” create a private right of action under COPPA, and incorporate the Code’s “best interests of the child” standard.
  • Rep. Trahan published a draft bill that would prohibit targeted advertising in services used for “K-12 purposes” and prohibit the use of data collected by those services for targeted advertising.

Tech’s Response

Major tech companies have taken notice of the legislative landscape. Over the summer, Google, YouTube, Instagram, and TikTok announced overhauls of their protections for children and teenagers.

Google announced that it would begin setting “SafeSearch” as the default for users ages 13-18 and that users under 18 could begin to flag images of themselves appearing in search results for removal. The company also announced that it would no longer collect location history for users ages 13-18. Google Workspace for Education similarly announced new controls that allow administrators to specify age limits for students accessing Google services outside the core educational services. The new Education features will require administrators to indicate which student users are 18 or older.

Similarly, Google-owned YouTube released new features aimed at protecting children and teenagers. Videos uploaded by users ages 13-17 are now set to private by default, with the option to make the video public. YouTube has also rolled out digital wellbeing features, including disabling autoplay and implementing reminders to take breaks or to go to sleep.

YouTube competitor Instagram also updated its privacy protections for children and teenagers, with a focus on creating safer interactions between users. The Facebook-owned social network announced it would begin restricting direct messages (DMs) “between teens and adults they don’t follow” and providing “safety notices in DMs” to notify “young people when an adult . . . has been exhibiting potentially suspicious behavior” such as “sending a large amount of friend or message requests to people under 18.” Instagram will also make it more difficult for adults “exhibiting potentially suspicious behavior to interact with teens” by restricting teenagers from appearing in those adults’ search results or suggested users. Each of these features will use “new artificial intelligence and machine learning technology” not only to identify users’ ages but also to flag potentially inappropriate interactions. Neither Instagram nor Facebook has detailed this technology.

Finally, TikTok released changes to teens’ default privacy settings, building on updates released in January 2021. In January, the company changed the default privacy settings for all registered accounts of users ages 13-15: it made those accounts private by default, disabled direct messaging, permitted comments only from those users’ friends, and disabled the option for others to download their videos. In August, TikTok expanded those protections to users ages 16-17, disabling direct messaging by default and prompting users to confirm the privacy and visibility settings for each post. The company has also disabled push notifications late at night – after 9 pm for users ages 13-15 and after 10 pm for users ages 16-17.

Conclusion

Although the changes implemented by technology companies address many of the issues raised by the Age-Appropriate Design Code and U.S. legislators, several issues – such as targeted advertising and the scope of businesses subject to existing law – remain unresolved. Members of Congress may continue to exhibit interest in this area, perhaps as a stopgap in the absence of a general federal privacy law.

