Privacy Law
Children Are the Future – of Online Privacy Legislation
By Kewa Jiang
While the fate of the American Data Privacy and Protection Act remains in legislative limbo, federal agencies, states, and Congress have signaled their commitment to greater online privacy protections for children. On August 30, 2022, California passed the California Age-Appropriate Design Code Act (“CA AADCA”), which Governor Gavin Newsom signed into law on September 15, 2022. The CA AADCA is modeled after the UK’s Age-Appropriate Design Code.
Likewise, Congress is still pushing ahead with the Kids Online Safety Act (“KOSA”) and the Children and Teens’ Online Privacy Protection Act, which some have dubbed “COPPA 2.0.” During the July 27, 2022, Senate mark-up hearing for KOSA and COPPA 2.0, all proposed amendments to both bills passed with little opposition or critique. Senators stressed the bipartisan nature of the bills and their support for protecting children and teens online. While some senators voiced concerns about the potential overreach of the Federal Trade Commission’s rulemaking powers under both proposed bills, they nonetheless supported moving the bills forward.
Below is an analysis of several key provisions of KOSA and COPPA 2.0, as well as how some of their provisions compare with the CA AADCA.
Kids Online Safety Act
Definition of Covered Platform
Under KOSA, a covered platform is an application or electronic service “that connects to the internet and that is used, or is reasonably likely to be used, by a minor.” Similarly, the CA AADCA defines covered entities as those that are “likely to be accessed by a child.” Both definitions have been criticized as overly broad. Some privacy advocates worry that, in practice, the definitions would mean nearly all websites, even ones not intended for minors, would need to comply with the CA AADCA and KOSA. But advocates for the bills note that the goal is to protect minors on platforms they actually frequent, not just on platforms they are expected to frequent.
Prevention of Harm and Safeguard of Minors
KOSA imposes on covered platforms the duty to prevent harm to minors and to “not facilitate the advertising of products or services to minors that are illegal to sell to minors based on applicable State or Federal law.” This affirmative requirement raises concerns about censorship of online content for all children, regardless of age-appropriateness, and about a potential chilling effect on digital service providers as their liability increases.
There is also concern that such a duty may be used to prevent teens from accessing important online resources, such as information about the LGBTQ+ community, sexual health, or any other topic that may be considered “harmful” to minors in some states. For example, KOSA requires covered platforms to prevent material that shows “sexual exploitation, including enticement, grooming, sex trafficking, and sexual abuse of minors and trafficking of online child sexual abuse material.” In some states, accusations of “grooming” or “sexual exploitation” of minors are increasingly leveled when teachers or online resources provide LGBTQ+ information. These accusations may move beyond online smear campaigns and trolling and lead to actual censorship of accurate digital information for teens.
Similarly, in the wake of the Dobbs decision, there is concern that states may criminalize seeking an abortion or searching for information about an abortion. Information about abortion may become classified as “promotion of self-harm [or] . . . other matters that pose a risk to physical and mental health of a minor.” The lack of accurate health information will have real-world negative consequences for those seeking an abortion.
Prevention of Addiction-Like Behavior
Under KOSA, covered platforms have an affirmative duty to prevent and mitigate “patterns of use that indicate or encourage addiction-like behaviors.” A similar social media addiction prevention bill, the Social Media Platform Duty to Children Act (AB 2408), was introduced in California but failed to pass out of committee. While there is growing interest in regulating the addictive qualities of online platforms, it remains to be seen how such legislation can be practically carried out. Compliance may require changes to push notifications, default settings, and the use of dark patterns, but nothing prevents minors from simply going online.
Age Verification and Transparency Report
KOSA would require “an accounting of the number of individuals using the covered platform reasonably believed to be minors in the United States, disaggregated by the age ranges of 0–5, 6–9, 10–12, and 13–16.” Similarly, the CA AADCA will require covered platforms to “establish the age of consumers with a reasonable level of certainty.” Age verification requirements would likely generate more data collection, not less. For instance, platforms that did not ask for users’ ages before the legislation may now need to do so in order to be compliant.
Beyond additional data collection, questions remain about how platforms will practically verify users’ ages. Meta has proposed several age verification methods, such as video selfies and social vouching. A video selfie would involve the user uploading a video of themselves, which would then be shared with Meta’s partner Yoti for age verification; Meta and Yoti would then delete the video. Meta claims user-provided videos would be used only for age verification purposes. However, the use of video or photo selfies for verification may be complicated by, and run afoul of, state biometric privacy laws. Social vouching would require a potential user to select three other registered users, each at least 18 years old, to confirm the potential user’s age. The sponsoring users must be mutual followers with the potential user and must not be vouching for another potential user at the time.
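Purely as an illustration of the vouching criteria described above, and not as Meta’s actual implementation, the eligibility rules could be sketched roughly as follows in Python; the Voucher fields and helper names are hypothetical and exist only to restate those rules in code form.

    from dataclasses import dataclass

    @dataclass
    class Voucher:
        """Hypothetical stand-in for a registered user asked to vouch for a potential user."""
        age: int                   # sponsor's age on the platform
        is_mutual_follower: bool   # sponsor and potential user follow each other
        currently_vouching: bool   # sponsor is already vouching for another potential user

    def is_eligible_sponsor(v: Voucher) -> bool:
        # As described: at least 18 years old, a mutual follower,
        # and not vouching for another potential user at the time.
        return v.age >= 18 and v.is_mutual_follower and not v.currently_vouching

    def vouching_satisfied(sponsors: list[Voucher]) -> bool:
        # Social vouching as described requires three eligible sponsors
        # to confirm the potential user's age.
        return len(sponsors) == 3 and all(is_eligible_sponsor(v) for v in sponsors)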
Children and Teens’ Online Privacy Protection Act
Extends Covered Age from 13 to 16 Years Old
COPPA 2.0 extends the covered age from 13 to 16 years old, closing a gap in protection for older teens. This extension is in line with recent children’s online privacy legislation. Legislators note that older teenagers remain as vulnerable to the influence of social media, online content, and data collection as younger children. However, some argue that regulation should be more nuanced and tailored to different age groups, given their differing levels of maturity and developmental needs.
Constructive Knowledge Rather than Actual Knowledge
COPPA 2.0 replaces the prior requirement of “actual knowledge” of a user’s age with a requirement of “constructive knowledge” of a user’s age. Similarly, the CA AADCA requires that a covered entity that does not have actual knowledge of a user’s age “shall not use any personal information for any reason other than the reason or reasons for which that personal information was collected.” Many critics note that under the prior version of COPPA, online service providers evaded the actual knowledge requirement by simply not verifying the age of any users on general-audience websites. In contrast, a covered entity may now be held liable if it knew or should have known that children 16 years old or younger use its website or online services. The increased liability for mishandling a possible child user’s data may lead service providers to implement age verification across the board, which in turn leads to the increased data collection discussed above.
Inclusion of Mobile Applications as a Covered Entity
Under COPPA 2.0, covered entities are “operators” if they are “a provider of a website, online application, online services, mobile application, or connected device, for commercial purposes that collects or maintains (directly or indirectly) personal information about users.” This differs greatly from the prior version of COPPA, which defined covered entities as businesses that provided online services “directed to children.” The explicit inclusion of mobile applications in the definition of covered entity is notable, although it may prove difficult to enforce COPPA 2.0 against general-audience mobile applications used by both teenagers and adults.
Looking Ahead
Supporters of the current children’s online privacy legislation note the need to protect young children and teens from the detrimental effects of social media and to rein in big tech companies. Critics, on the other hand, are concerned about the legislation’s potential chilling effect on online service providers, its impact on access to information, and the increased data collection that age verification requirements would bring. For now, the United States may get a preview of the CA AADCA’s impact on the privacy landscape by examining the effect of the UK’s Age-Appropriate Design Code, which became fully enforceable in September 2021. In the meantime, it remains to be seen whether KOSA and COPPA 2.0 will pass Congress.