
Privacy Law Review – What You Need to Know (June 2021)


CLA’s Privacy Law Section summarizes important developments in California privacy and beyond.

Acting Chair’s Message | Sheri Porath Rockwell

Welcome to another jam-packed newsletter!  Many thanks to the Publications Committee (chaired by Jen Oliver) for providing us all with such wonderful updates about the ever-changing world of data privacy and cybersecurity.

I am excited to announce the Privacy Law Section’s Inaugural General Membership Meeting on Tuesday, June 29th from noon to 1pm.  We invite all section members to attend to learn more about how to get involved in the Section—because we are so new, there are many interesting leadership opportunities available!  We will also be offering a 30-minute program, Breach Response Speed Chess: an overview of insurance coverage, legal counsel, and reputation management in privacy incident response.  Self-study MCLE credit will be available.  Register here for free!

We also have a new LinkedIn page!  Please follow us here.  Our old LinkedIn page was a “group” page that limited the ability to share and required permission to join.  Our new page will make it easier for you to stay current with us and to share the content we post with your network.

Finally, I would be remiss if I did not mention the exciting news about the state’s new privacy protection agency.  The first meeting of the California Privacy Protection Agency will be held on Monday, June 14th beginning at 9 am.  Find out how to participate, with links to Zoom, on our LinkedIn page.

Hope to see you on June 29th at our Membership Meeting!  

Sheri

Legislation Committee Newsletter | Mallory Jensen

The Privacy Section’s Legislation Committee has been active in setting up the committee’s plans and goals, hosting speakers for a breakout session during the annual Legislative Day, and reviewing proposed privacy legislation. In the coming months, we will spearhead the Privacy Section’s efforts on submitting comments on proposed CPRA regulations. If you’re interested in getting involved in the committee, please email co-chairs Elaine Harwell at elaine.harwell@procopio.com or Mallory Jensen at mallory.jensen@gmail.com.

Here are summaries of a few of the proposed bills we are reviewing. Watch this space, as we will have additional summaries and updates in coming months!

  • AB 335 (California Consumer Privacy Act of 2018: vessel information; proposed by Assemblymember Boerner Horvath): The CCPA grants a consumer the right to direct a business not to sell personal information about the consumer to third parties (a right to “opt out”).  This bill would add a new exemption from the right to opt out: vessel (i.e., boat or other watercraft) information or ownership information retained or shared between a vessel dealer and the vessel’s manufacturer, if the information is shared to effectuate or in anticipation of effectuating a vessel repair covered by a vessel warranty or a recall. The CCPA already has a similar exemption for certain vehicle information and vehicle ownership information retained or shared between a new motor vehicle dealer and the vehicle’s manufacturer.   
  • AB 1391 (California Consumer Privacy Act of 2018: compromised data; proposed by Assemblymember Ed Chau): The CCPA authorizes a private right of action for any consumer whose nonencrypted and nonredacted personal information is subject to a breach as a result of a business’ violation of the duty to implement and maintain reasonable security procedures and practices. This bill would make it unlawful for a person to sell, purchase, or utilize data, as defined, that the person knows or reasonably should know is compromised data. The bill would define “compromised data” to mean data that has been obtained or accessed pursuant to the commission of a crime.
  • AB 814 (Personal information: contact tracing; proposed by Assemblymember Marc Levine): AB 814 defines “contact tracing” as identifying and monitoring individuals, through data collection and analysis, who may have had contact with an infectious person as a means of controlling the spread of a communicable disease (note that it is not limited to COVID-19). It very broadly defines “data” to mean measurements, transactions, determinations, locations, or other information, whether or not that information can be associated with a specific natural person. The bill prohibits data collected, received, or prepared for purposes of contact tracing from being used, maintained, or disclosed for any purpose other than facilitating contact tracing efforts. In addition, all data collected, received, or prepared for purposes of contact tracing would have to be deleted within 60 days. There is an exception for data in the possession of a local or state health department, but no other exclusions from the bill’s applicability. Furthermore, the bill prohibits an officer, deputy, employee, or agent of a law enforcement agency from engaging in contact tracing. AB 814 would permit a person to bring a civil action to obtain injunctive relief for any violation of this new law, as well as attorney’s fees, but the bill does not specify which authority would have enforcement powers.
  • AB 13 (Public contracts: automated decisionmaking; proposed by Assemblymember Ed Chau): This bill would enact the Automated Decision Systems Accountability Act of 2021 and would codify the intent of the Legislature that state agencies must use an acquisition method that minimizes the risk of adverse and discriminatory impacts resulting from the design and application of automated decision systems (meaning, here, a computational process, including one derived from machine learning, statistical modeling, data analytics, or artificial intelligence, that issues a score, classification, recommendation, or other simplified output that is used to support or replace human decisionmaking and materially impacts natural persons; see the short sketch after this list). The bill would apply to ADS with a “high risk application.” In sum, the bill aims to end algorithmic bias. To do so, among other things, the bill requires contractors submitting bids to state agencies for a good or service that includes ADS for a high-risk application to include very specific ADS-related information in the bid. In addition, state agencies that award contracts that use ADS must, within 10 days of awarding the contract, submit to the Department of Technology a high-risk ADS accountability report describing the information provided in the bid.
  • AB 1262 (Information privacy: other connected device with a voice recognition feature; Assemblymember Jordan Cunningham): This bill would include smart speaker devices within the scope of existing law that prohibits a person or entity from providing the operation of a voice recognition (note: not speech recognition) feature of a connected television within the state without prominently informing the specified user of the connected television during the initial setup or installation. The bill would prohibit any actual recordings or transcriptions collected or retained through the operation of a voice recognition feature by the manufacturer of a connected television or smart speaker device, that qualify as personal information or that are not deidentified, from being used for any advertising purpose, or being shared with, or sold to, a third party, unless the user has provided affirmative written consent. The bill would also prohibit the manufacturer from retaining the recordings or transcripts unless the user opts in. The bill provides certain exemptions. Note that the bill does not cover smart speakers installed by third parties such as hotels, and specifically states that recordings shared with third parties may be used for the purpose of improving the device.
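
To make AB 13’s definition concrete, here is a minimal, hypothetical sketch (the eligibility_score function is invented for illustration and is not drawn from the bill) of a computational process that issues a score used to support a human decision:

    # Hypothetical ADS sketch: a rule-based score a state agency might
    # use to support a human decision (illustrative names only).
    def eligibility_score(income: float, household_size: int) -> float:
        """Return a 0-100 score used to prioritize benefit applications."""
        base = 100.0 if income <= 25_000 * household_size else 40.0
        return min(100.0, base + 5.0 * household_size)

    # The simplified output (a score) supports or replaces a human
    # decision and materially impacts a natural person, so a system
    # like this could fall within the bill's ADS definition.
    print(eligibility_score(income=30_000, household_size=2))
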
The EU Proposes an Act to Regulate AI | Stephen S. Wu

For better or for worse, developments in the law of artificial intelligence (AI) have intertwined with developments in privacy law over the past five years.  First, AI technologies and the data practices used to create AI systems raise profound privacy issues.  Second, the European Union’s (EU) General Data Protection Regulation (GDPR), a privacy and security law, addresses individual rights in connection with automated data processing.  The EU did not have to include automated data processing rights within a privacy law, but now that the GDPR includes these rights, practitioners analyzing the GDPR must understand its compliance implications for AI.  Third, many privacy practitioners are interested in AI, so working on AI is a natural extension of their work on data privacy.

On April 21, 2021, the European Commission (EC) proposed a new Artificial Intelligence Act (the “AI Act”).[1]  Through the AI Act, the EC sought to promote the field of artificial intelligence in the EU while protecting individuals’ fundamental rights.  The AI Act combines elements of privacy protection, data security, product liability, anti-discrimination, national security, industrial policy, and youth protection in one regulation.  First, the AI Act would prohibit certain AI practices that threaten fundamental rights.  Specifically, the AI Act would outlaw uses of AI involving subliminal techniques; the exploitation of individuals based on age or physical or mental disability in a manner likely to cause harm; social scoring systems; and law enforcement’s use of real-time biometric identification systems such as facial recognition, with some exceptions such as for preventing life-threatening emergencies and terrorism.

Second, the AI Act distinguishes certain kinds of high-risk AI systems from other systems entailing lower risks.  High-risk AI systems are those that pose a risk to health, safety, or fundamental rights.  Under the AI Act, high-risk applications must have safeguards including a risk management process, pre-market testing and conformity assessment, transparency, human oversight, accuracy/robustness/cybersecurity, and quality management.

Third, for lower risk AI systems, the AI Act would require transparency practices.  Individuals must be informed that they are interacting with AI systems.  Technologies like Google Duplex, which simulates a human interacting with a service provider, would have to identify themselves as AI systems.

Fourth, the AI Act would create a new supervisory authority, the European Artificial Intelligence Board.  Similar to the European Data Protection Board, the AI Board would serve as a source of guidance and would coordinate among national supervisory authorities.  The Board’s goal would be to facilitate smooth implementation and consistent application of the AI Act.  Like the GDPR, the AI Act would have extraterritorial effect, reaching some US companies with multinational operations.  The AI Act would cover providers marketing or putting into service AI systems in the EU, regardless of the provider’s location.  It would also cover users of AI systems located in the EU.  Finally, it would cover providers and users of AI systems outside the EU where the output produced by the system is used in the EU.  Given the possibility of sweeping in vendors, customers, and users outside the EU, US companies should begin to familiarize themselves with the proposed AI Act and should be prepared to include its requirements in their compliance programs should it become law.


[1] European Commission, Proposal for a Regulation of the European Parliament and of the Council Laying Down Harmonised Rules on Artificial Intelligence (Artificial Intelligence Act) and Amending Certain Union Legislative Acts, https://digital-strategy.ec.europa.eu/en/library/proposal-regulation-laying-down-harmonised-rules-artificial-intelligence.

Class Settlement Could Reshape Children’s App Ecosystem | Cody Venzke

The parties in three related class actions alleging the improper collection of children’s personal information by gaming applications have reached settlement agreements that may have far-reaching consequences for children’s gaming, advertising, and targeted advertising. The class settlements with more than a dozen defendants were approved by the district court on April 12, 2021, and include prohibitions on targeted advertising in apps directed toward children.

The named plaintiffs in the class actions were parents who downloaded games developed by defendant game developers (“Developer Defendants”), whose mobile game applications included software development kits (“SDKs”) created by defendant SDK developers (“SDK Defendants”). An SDK is a pre-made package of code provided by one developer for others to incorporate into their own applications to provide specific functions or services. The Developer Defendants included major developers such as Disney and Viacom CBS.
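
As a simplified illustration of the mechanics at issue, the sketch below (with hypothetical names such as AdAnalyticsSDK and log_event, not the defendants’ actual code) shows how a game that bundles a third-party SDK can transmit device data as a side effect of ordinary gameplay:

    # Hypothetical sketch of an app bundling a third-party SDK; the
    # payload shown is what would leave the device in a real SDK.
    import json

    class AdAnalyticsSDK:
        """Stand-in for an SDK vendor's analytics/advertising library."""

        def __init__(self, api_key: str):
            self.api_key = api_key

        def log_event(self, event: str, device_id: str) -> None:
            # A real SDK would transmit this payload to the vendor's
            # servers; here we only print it for illustration.
            payload = {"api_key": self.api_key, "event": event,
                       "device_id": device_id}
            print("sent to SDK vendor:", json.dumps(payload))

    # The game developer's code calls the SDK during normal play, so
    # the data collection is invisible to the player.
    sdk = AdAnalyticsSDK(api_key="demo-key")
    sdk.log_event("level_complete", device_id="ad-id-1234")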

According to plaintiffs’ amended complaint, as children played mobile games created by the Developer Defendants, their data was “exfiltrated” by the SDK Defendants’ SDKs. That data was used to serve targeted ads in the games, to track ad attribution, and “to target users with specific in-App cues or out-of-App ads” to increase children’s use of the games. Plaintiffs brought state law claims for intrusion upon seclusion and deceptive acts or practices.

The parties reached a series of preliminary settlements and sought certification of settlement classes under Federal Rule of Civil Procedure 23(b)(2). The preliminary settlements provided for injunctions against collecting and using children’s information. In particular, the settlements:

  • prohibit using data previously collected on children’s online activity for targeted advertising;
  • prohibit using data collected from the gaming applications for targeted advertising “in the same app, across other apps, or elsewhere on the internet” in the future;
  • require the Developer Defendants to use age gates on their gaming applications that do not “prompt” a user to enter an age over 13 (a neutral age screen; see the sketch after this list);
  • prohibit the SDK Defendants from targeting advertising to children and limit their advertising services to “contextual advertising” in “any app where the user identifies as a child under thirteen”; and,
  • require the SDK Defendants to develop an enrollment process for application developers to screen for child-directed content.
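
For context on the age-gate requirement, a neutral age gate asks for the user’s age without steering the answer, for example by not pre-filling or suggesting an age of 13 or over. Here is a minimal, hypothetical sketch (not taken from the settlement terms):

    # Hypothetical sketch of a neutral age gate: the prompt neither
    # suggests nor pre-fills an age of 13 or over.
    def age_gate() -> bool:
        """Return True if the user may receive the non-child experience."""
        try:
            age = int(input("Please enter your age: "))  # no default shown
        except ValueError:
            return False  # treat unparseable input as a child user
        return age >= 13

    if age_gate():
        print("standard experience")
    else:
        print("child-directed experience: contextual ads only")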

The preliminary settlement released none of the class members’ monetary claims. The parties anticipated the relief would affect not just the six gaming applications at issue in the actions but also “thousands of apps embedded with the SDK Defendants’ technology, and dozens of Disney apps.” The district court determined that the proposed classes satisfied the numerosity, commonality, typicality, and adequacy requirements of Rule 23(a) and the proposed injunctive relief would be generally applicable to the classes, as required by Rule 23(b)(2).

After a fairness hearing under Rule 23(e), the court granted the settlement final approval on April 12, 2021. The related cases are McDonald v. Kiloo, No. 17-cv-4344 (N.D. Cal.), Rushing v. Disney, No. 17-cv-4419 (N.D. Cal.), and Rushing v. Viacom CBS, No. 17-cv-4492 (N.D. Cal.).

Under New Leadership, Children’s Privacy Rises as a Priority at the FTC | Cody Venzke

Not long after being designated for her new post as Acting Chair of the Federal Trade Commission, Commissioner Rebecca Slaughter has signaled that she intends to prioritize children’s privacy among the Commission’s many enforcement duties. In remarks at the Future of Privacy Forum, Slaughter reiterated that the Commission has ordered social media and video streaming companies to detail their privacy practices for their education technology services under Section 6(b) of the Federal Trade Commission Act. She also noted that the Commission continues to review its rules under the Children’s Online Privacy Protection Act (COPPA), but stated, “We don’t need to complete our rulemaking to say that COPPA absolutely applies to ed-tech, and companies collecting information from children need to abide by it.”

Advocacy organizations have also pushed the Commission to address alleged violations of students’ and children’s privacy. In a February 19, 2021 complaint filed with the Commission, the Campaign for a Commercial-Free Childhood was joined by other organizations in requesting that the Commission investigate the math-learning platform Prodigy for deceptive practices under Section 5 of the FTC Act. The complaint alleges that Prodigy falsely stated that its platform “is and always will be free” while it also advertised premium memberships with additional features directly to children. The Campaign for a Commercial-Free Childhood and the Center for Digital Democracy also filed a complaint with the Commission on March 31, 2021, alleging that the Google Play app store falsely stated that certain applications were “teacher approved” despite recent studies suggesting that many applications on Google Play may violate COPPA. The Campaign for a Commercial-Free Childhood’s complaint regarding Prodigy was supported by Sen. Edward Markey and Rep. Kathy Castor, who urged the Commission to launch an investigation.

In addition to her renewed emphasis on children’s privacy, Acting Chair Slaughter has also announced the creation of a new rulemaking group in the Commission’s Office of the General Counsel. According to Slaughter, the FTC’s existing rulemaking authority “has gotten a bad reputation for being too hard to use.” That rulemaking authority arises under the Magnuson-Moss Warranty-FTC Improvements Act and requires the Commission to clear additional procedural hurdles such as additional notices, staff reports, and a hearing. In her announcement, Slaughter stated, “a strategic and harmonized approach to rulemaking” is essential “to deliver[ing] effective deterrence for the novel harms of the digital economy.”

These changes come as the administration of President Joe Biden seeks to appoint Lina Khan, a professor at Columbia Law School, to the Commission. Khan, a former counsel to the U.S. House Judiciary Committee’s Subcommittee on Antitrust, Commercial, and Administrative Law, is best known for her writings on antitrust and competition in online markets. She has also supported the Commission’s authority to conduct rulemaking under Section 5 and critiqued the emerging concept of “information fiduciaries” in privacy law. Khan’s appointment remains subject to Senate approval, where it has received support from Democrats, including Sen. Amy Klobuchar, but some Republicans, including Sen. Mike Lee, have questioned whether her experience and views are an appropriate fit for the post.

Data Breach Suit against Chili’s Gets Class Certification in Florida | Jennifer Oliver

Plaintiffs representing two classes of Chili’s customers, who claim the restaurant chain’s negligence led to a 2018 data breach that compromised their credit card information, have won their bid for class certification in the Middle District of Florida.

U.S. District Judge Timothy Corrigan certified two Rule 23(b)(3) classes of consumers, one nationwide and one comprising California customers only.  The classes are defined as customers who made credit or debit card purchases at affected Chili’s locations in March and April 2018, had their data accessed by cybercriminals, and incurred reasonable expenses or time spent mitigating the consequences of the breach.

The suit was filed after the company announced that a cyberattack on its point-of-sale systems had affected some of its 1,600 restaurants.

Last July the judge dismissed claims under Florida’s Deceptive and Unfair Trade Practices Act and for injunctive relief under California’s unfair competition law, finding that the plaintiffs’ alleged future harm was too speculative to give rise to those claims. The two remaining claims rest on allegations of current damage, including unauthorized charges on credit cards and the lost ability to accrue cash back or rewards points.

The court first found that the plaintiffs had standing to sue because they had alleged at least some actual damages, including the sale of their personal information on the dark web.

The court next examined two threshold requirements:  class definition and whether the plaintiffs are members of the class.  While plaintiffs argued that class members could be identified through defendant’s records, the court found that the proposed class definition may be overbroad and include uninjured class members.  To remedy this, the court modified the proposed definitions sua sponte to clarify that class members’ data must have been “accessed by cybercriminals” and class members must have “incurred reasonable expenses or time spent in mitigation of the consequences of the Data Breach.”  The court reasoned that those edits remedied “later predominance issues regarding standing and the inclusion of uninjured individuals because now individuals are not in the class unless they have had their data ‘misused.’” The court did not address how these “self-identification” procedures would impact the predominance inquiry.

Because this is one of only a few data breach cases ever to reach the class certification stage, we will continue to follow it and update CLA members accordingly.

