Privacy Law

Privacy Law Review – What You Need to Know (January 2022)


CLA’s Privacy Law Section summarizes important developments in California privacy and beyond. 

Message from the Chair

By Sheri Porath Rockwell, Sidley Austin – Chair, CLA Privacy Law Section

On behalf of the CLA Privacy Law Section, I write to wish all of our members a Happy New Year. We all know that 2022 will be a very busy year in California privacy, with CPRA regulations expected soon and preparations underway for January 1, 2023 compliance obligations.

The Privacy Law Section is working to provide our members with opportunities to have a front row seat and meaningful input into CPRA regulations. We are excited to announce the February 18th launch of our CPRA Law + Tech Series, a five-part free webinar series we are presenting with the Future of Privacy Forum, a highly regarded policy think tank based in Washington, D.C. The series will provide lawyers with a deeper understanding of various technologies that are the subject of the CPRA (and other state data privacy laws): ad tech, global privacy opt-outs, automated decision-making, and dark patterns. We are kicking off the series on Friday, February 18th, from noon to 1:15 pm Pacific, and will be joined by California Privacy Protection Agency Chair Jennifer Urban. The CPRA Law + Tech Series webinars will take place each Friday at noon PT. Register here for our February 18th event.

In addition, our CPRA Rulemaking Committee, a group of approximately 30 Privacy Law Section members, meets regularly and has formed four subcommittees that are preparing to draft comments to be included in the Privacy Law Section’s submission to the California Privacy Protection Agency as part of the CPRA rulemaking process.

In addition to our CPRA work, several new committees have formed and welcome volunteers: our Cybersecurity Committee and our Ad Tech Committee. If you are interested in joining, please email us at privacy@calawyers.org. We welcome all.

Stay tuned for more events in the works on health privacy law, financial privacy law and a membership-wide networking meeting in March.  And, of course, we will keep you up to date on all new developments with CPRA rulemaking. 

Don’t forget to spread the word about our section and get involved! It is a great way to develop leadership and make a name for yourself in California privacy. Reach us at privacy@calawyers.org.

Update on the CPPA’s Preliminary Rulemaking Activities

By Andrew Scott, CIPP/US/E & CIPM

In September 2021, the California Privacy Protection Agency (“CPPA” or “the Agency”) solicited preliminary written comments from the public.

This invitation for preliminary comments asked about “new and undecided issues” not addressed by the existing California Consumer Privacy Act (CCPA) regulations, including cybersecurity audits, automated decision-making, the Agency’s audit authority, consumer rights, opt-out preference signals, definitions (“precise geolocation” and “dark patterns”), and limiting the use of sensitive personal information.

In December 2021, the CPPA made those written comments available. There were seventy submissions, totaling more than 900 pages, from a variety of sources, including individuals, trade associations, consumer rights groups, and more. The submitted comments are available on the Agency’s website.

During this time, the Agency’s rulemaking subcommittees will consider the comments. (For more details on the breakdown of the committees and the Agency’s proposed plans, please review the board’s materials from September and November, including the November minutes.)

Obstacles and Solutions

The Agency has also identified its main challenges: resources for rulemaking, resources for informational hearings, the complexity of the topics involved, and the potential need for a Standardized Regulatory Impact Assessment (SRIA). In seeking solutions to these problems, the Agency may consider staggering its rulemaking, opting for emergency rulemaking, and delaying enforcement.

Next Steps

Per Section 1798.185(d) of the CCPA (as amended by the California Privacy Rights Act), the “timeline” for adopting final regulations required by the CPRA is July 1, 2022. Given the truncated timeline for conducting its rulemaking, the Agency is expected to publish its initial set of proposed regulations soon.

The board scheduled its next meeting for Thursday, February 17, 2022, at 9:30 a.m. The agenda includes an “Executive Director’s Update,” which will include a “Rulemaking Process Update: Informational Hearings and Timeline.” The meeting will be available via Zoom videoconference.

To keep up to date, you may subscribe to the Agency’s email lists here.

AG Reminds of Business Obligations

By Brandon M. Jasso, CIPP/US, CIPP/E

On January 28, 2022, California Attorney General Rob Bonta (“AG Bonta”) announced that his office would be investigating loyalty programs in California for compliance with the California Consumer Privacy Act (“CCPA”) (see here). AG Bonta’s office indicated that it sent letters to major corporations in a variety of sectors that offer financial incentive programs, requiring that they come into compliance with the CCPA within thirty days.

AG Bonta stated:

“In the digital age, it’s easy to forget that our data isn’t only collected when we go online. It’s collected when we enter our phone number for a discount at the supermarket; when we use rewards for a free coffee at our local coffee shop; and when we earn points to purchase items at our favorite clothing store.”

AG Bonta further stated:

“We may not always realize it, but these brick and mortar stores are collecting our data – and they’re finding new ways to profit from it. On Data Privacy Day, we’re issuing notices to business that operate loyalty programs and use personal information in violation of California’s data privacy law. I urge all businesses in California to take note and be transparent about how you’re using your customer’s data. My office continues to fight to protect consumer privacy, and we will enforce the law.”

Under the CCPA, businesses have obligations when offering “financial incentives, including payments to consumers as compensation, for the collection of personal information, the sale of personal information, or the deletion of personal information.” Cal. Civ. Code § 1798.125(b)(1). Businesses may enter a consumer into a financial incentive program “only if the consumer gives the business prior opt-in consent pursuant to Section 1798.130 that clearly describes the material terms of the financial incentive program, which may be revoked by the consumer at any time.” Cal. Civ. Code § 1798.125(b)(3) (emphasis added).

The California Department of Justice (“DOJ”) began enforcing the CCPA on July 1, 2020, and provides examples of enforcement actions taken on its website (see here). As seen on the DOJ’s website, the DOJ has already provided notices in the past regarding a grocery chain’s loyalty program and the required postings under the CCPA. The DOJ’s prior actions show that financial incentive programs have been a concern, and AG Bonta’s statement shows they will continue to be a concern going forward.

Therefore, it is important that corporations that offer incentive programs pay attention to their obligations under the CCPA (and its amendments under the California Privacy Rights Act (“CPRA”)). The protection of individual privacy rights will continue to be a priority in California and around the country as more privacy and data laws are proposed and implemented based on the CCPA and CPRA.

7th Circuit to Address Whether Each Transmission of Biometric Data Is a BIPA Violation

By Jennifer M. Oliver

On December 20, 2021, the Seventh Circuit U.S. Court of Appeals certified the following question to the Illinois Supreme Court related to interpretation of the Illinois Biometric Information Privacy Act (BIPA):

“Do section 15(b) and 15(d) claims accrue each time a private entity scans a person’s biometric identifier and each time a private entity transmits such a scan to a third party, respectively, or only upon the first scan and first transmission?”

No. 20-3202, 2021 U.S. App. LEXIS 37593 (7th Cir. Dec. 20, 2021).

Just five days earlier, an Illinois appellate court ruled that, yes, claims under sections 15(a) and (b) accrue with each capture and use of a plaintiff’s biometric information. Watson v. Legacy Healthcare Financial Services, LLC, 2021 IL App (1st) 210279 (Dec. 15, 2021).

The underlying case presenting the certified question, Cothron v. White Castle Systems, Inc., was brought by an employee of the White Castle hamburger chain, which requires fingerprint scans for employees to access computer systems. The plaintiff charged that sharing her fingerprints with a third-party vendor violated the law.

A finding that statutory damages accrue with each collection poses the possibility of tremendous damages in large class action cases, since BIPA provides for statutory damages of $1,000 per negligent violation or $5,000 per intentional or reckless violation.
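To make the stakes concrete, here is a back-of-the-envelope sketch, with entirely hypothetical numbers, of how per-scan accrual changes a defendant’s exposure compared with per-person accrual:

```typescript
// Hypothetical illustration only -- none of these figures come from
// Cothron or any real case.
const classMembers = 1_000;
const scansPerMemberPerDay = 4; // e.g., clocking in/out and breaks
const workDaysPerYear = 250;
const negligentDamages = 1_000; // BIPA's $1,000 per negligent violation

// If only the first scan is a violation: one claim per class member.
const perPersonExposure = classMembers * negligentDamages;
// = $1,000,000

// If every scan is a separate violation: one claim per scan.
const perScanExposure =
  classMembers * scansPerMemberPerDay * workDaysPerYear * negligentDamages;
// = $1,000,000,000 per year of scanning
```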

Importantly, however, the parties disagree on whether BIPA damages are mandatory or discretionary. Should the court determine that only the first scan starts the statute of limitations clock ticking, anyone bringing a claim more than five years after the first collection could be considered time-barred, even if their private biometric data continued to be transmitted within the last five years.

On one hand, the Illinois Supreme Court may determine that damage awards are at the discretion of a court and are not mandatory under the law. On the other hand, it could rule that every scan or transmission restarts the statute of limitations clock, but that a claimant may only collect damages once. The court might also determine that the clock starts to run when a claimant first learns of an alleged violation, which has precedent in litigation involving latent diseases caused by products, where individuals cannot know they were harmed until they develop a signature disease, i.e., one connected to a specific product.

The ruling in this case is especially interesting as the COVID-19 pandemic has led to skyrocketing adoption of remote access tools that can collect biometric data for learning, court appearances, and work-from-home arrangements, and a corresponding uptick in BIPA lawsuits.

Representatives Seek Information on COPPA’s “Safe Harbor” Programs in Investigation of Children’s Privacy and Wellbeing Online

By Cody Venzke

This month, the focus on children’s privacy and digital wellbeing continued to gain steam. On January 10, 2022, Reps. Kathy Castor (D-FL-14) and Jan Schakowsky (D-IL-09) sent a letter to six organizations administering Safe Harbor programs under the Children’s Online Privacy Protection Act (COPPA).  Noting that children’s online privacy is critical as “children are increasingly required to use online resources for educational, informational, and other essential purposes,” the letter seeks to understand whether the Safe Harbor programs are meeting their goals under COPPA.

COPPA permits “representatives of the marketing or online industries” to establish self-regulatory guidelines as an alternative mechanism for compliance with COPPA’s requirements. 15 U.S.C. § 6503. Organizations that establish guidelines, known as “Safe Harbor programs,” must seek approval from the Federal Trade Commission (FTC) and ensure that online services that participate in their programs provide “substantially the same or greater protections for children” as those under COPPA. 16 C.F.R. § 312.11(b)(1).

Consequently, operators of websites directed toward children that voluntarily comply with Safe Harbor Guidelines “will be deemed to be in compliance” with COPPA’s substantive requirements. Id. § 312.11(g). The Safe Harbor guidelines must also include an “effective, mandatory mechanism for the independent assessment” of compliance with the guidelines and a discipline mechanism for “non-compliance.” Id. § 312.11(b)(2), (3).

In their letter, Reps. Castor and Schakowsky stated, “Unfortunately, there are signs that COPPA Safe Harbor organizations are not adequately doing their job.” Citing recent comments by former FTC Commissioner (and current Director of the Consumer Financial Protection Bureau) Rohit Chopra, they called for increased oversight of Safe Harbor programs. The Representatives sought information on how many website operators are enrolled in each Safe Harbor program, how each program guarantees protection for children’s privacy, and how they effectively audit and enforce their respective programs. The letter was addressed to the six remaining Safe Harbor programs: the Children’s Advertising Review Unit (CARU), Entertainment Software Rating Board (ESRB), iKeepSafe, kidSAFE, Privacy Vaults Online, Inc. (d/b/a PRIVO), and TRUSTe.

Reps. Castor and Schakowsky’s letter comes amid increased concern about children’s privacy and wellbeing online, COPPA, and the Safe Harbor programs. In August, the FTC removed Aristotle International, Inc., from its list of approved Safe Harbor programs, although it did not explain the removal. The FTC has also issued eight compulsory process orders, which will permit FTC staff to subpoena documents and testimony of companies. Those orders include one for “unfair, deceptive, anticompetitive, collusive, coercive, predatory, exploitative, or exclusionary acts or practices” related to goods or services marketed to children.

Similarly, in Congress, attention to children’s privacy and wellbeing online grew following leaks showing that Instagram and Facebook were aware of harm that their platforms cause some children related to body image and self-esteem. The Senate Committee on Commerce, Science, and Transportation has held five hearings since May on “protecting kids online,” including witness testimony focused on “fixing” the Safe Harbor programs.

The Top 5 Biggest Adtech Shakeups in January

By McKenzie Thomsen

It was a wild 2021 for adtech privacy, and judging by the number of newsworthy pieces this past month, 2022 will be turbulent. Here are the top 5 biggest shake-ups in adtech privacy of January 2022.

  1. FLoC Killed Cookies, but Privacy Advocates Killed FLoC. Now Introducing… ‘Topics.’

The saga of Google’s third-party cookie continues, with deprecation now scheduled for the end of 2023. No, Google hasn’t pushed back the date of deprecation again, and the word on the street is that they won’t. But the cookie replacement has changed. Google originally announced it was replacing cookies with the Federated Learning of Cohorts (FLoC), which would gather a user’s browsing data and put users into categories (cohorts, if you will) based on interests. For example, if a user visited travelocity.com and looked at traveling to Lima, Peru, then Google would put that user into a cohort of people interested in traveling to Lima. But privacy advocates said FLoC wasn’t privacy preserving enough, that it didn’t actually stop users from being tracked. Instead, FLoC made it so that only Google could track a user, which isn’t privacy preserving; it’s just disintermediation in the name of privacy.

Well, the backlash worked. The newly introduced ‘Topics’ operates on a similar principle to FLoC, in that it tracks user interests based on their web browsing. The difference is that the interests disclosed are broader, thus giving out less information to advertisers. Essentially, the Topics method of targeted online advertising gives marketers less-granular information about web users.
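For the technically curious, here is a minimal sketch of how a page might query the proposed Topics API, based on Google’s public explainer. The API was brand new at the time of writing, so the method name and return shape shown here are assumptions that may change before it ships.

```typescript
// Sketch based on the Topics explainer (subject to change): the
// browser, not the ad network, infers a few coarse interests from
// recent browsing and exposes only those broad taxonomy labels.
interface BrowsingTopic {
  topic: number; // ID into a public, human-curated taxonomy
  taxonomyVersion?: string;
}

async function fetchAdInterests(): Promise<BrowsingTopic[]> {
  // Cast to `any` because the API is not yet in standard DOM typings.
  const topics: BrowsingTopic[] = await (document as any).browsingTopics();
  // A typical result might look like [{ topic: 123 /* "Travel" */ }];
  // no per-site browsing history is exposed to the caller.
  return topics;
}
```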

  2. CNIL Fines Google and Facebook for Dark Patterns

CNIL has fined Google 150 million euros and Facebook 60 million euros for letting users opt in to cookies easily but not letting them opt out easily. Google (including YouTube) and Facebook made it harder to reject cookies than to accept them, which is what’s been deemed a “dark pattern” (a hot topic of 2021). According to the CNIL, it took several extra clicks to reject cookies. Along with paying the fines, the companies have to fix the errors and make it just as simple to reject cookies as it is to accept them. If they don’t, the companies must pay 100,000 euros a day.

  3. Legislate Away Surveillance Advertising

Representatives Eshoo (D-CA) and Schakowsky (D-IL) and Senator Booker (D-NJ) have introduced a bill to ban ‘surveillance advertising.’ The term was coined by Harvard Professor Shoshana Zuboff, gained popularity with her book, The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power, and has become a common term for using personal data for targeted advertising. While the bill bans targeted advertising, it explicitly states that contextual advertising (where ads are selected based on the context of what the user is looking at on the page where the ad is served) is allowed. This bill should serve as a wake-up call to adtech: the current forms of self-regulation aren’t working well enough. If you want legislators off your back, step up your self-regulatory game.[1]

  4. If Legislation Won’t Do It, How About the FTC?

In January, the FTC sought public comment on a petition filed by Accountable Tech that asks the FTC to use its rulemaking authority to prohibit ‘surveillance advertising.’ Did I mention that privacy advocates want more regulation of adtech? The petition states that surveillance advertising is “inherently an unfair method of competition” as it relies on and reinforces “monopoly power.” While the FTC typically regulates unfair methods of competition through enforcement actions, it has the power to promulgate legislative rules as well.[2]

  5. Data Clean Rooms Are Hot.

Finally, let’s talk about data clean rooms. They’re the next big thing in adtech. Because Google is joining the likes of Apple and Mozilla in getting rid of third-party cookies, adtech is reeling, trying to figure out how to measure returns on advertising spend.

Let me give a quick primer. Cookies can do more than track a user’s interests based on their browsing activity. Cookies can also let advertisers know which ads were effective (the measurement of ad effectiveness is called “attribution”). This issue isn’t unique to web browsing without cookies; it’s the same issue advertisers face with Apple’s AppTrackingTransparency (ATT). Without cookies (or the IDFA on iPhones), advertisers don’t know which advertisements were effective, and so they don’t know how to spend their money on future campaigns.
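As a rough, hypothetical sketch of what cookie-based attribution looks like (real ad tags are far more elaborate): the ad click drops an identifier, and the later conversion reads it back so the advertiser can credit the campaign. Remove the cookie, and the join that ties sale to ad disappears.

```typescript
// Hypothetical cookie-based attribution, reduced to its skeleton.
// In practice the ad network sets this cookie from its own domain,
// which is exactly the third-party cookie being deprecated.
function recordAdClick(campaignId: string): void {
  // Remember which campaign brought the user here (30-day window).
  document.cookie =
    `ad_click=${encodeURIComponent(campaignId)}; max-age=2592000; path=/`;
}

function attributeConversion(): string | null {
  // At purchase time, look up the campaign that earns the credit.
  const match = document.cookie.match(/(?:^|; )ad_click=([^;]*)/);
  return match ? decodeURIComponent(match[1]) : null; // null = unattributable
}
```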

So where do data clean rooms come in? Data clean rooms allow advertisers and brands to match user-level data without actually sharing personal information (with the correct privacy controls, that is). And you can add all your favorite buzzwords to data clean rooms too: ‘differential privacy,’ ‘homomorphic encryption,’ whatever. The data clean room can employ all kinds of privacy engineering; it’s just up to the owner of the clean room to do so. Big companies, sometimes referred to as “content fortresses,” already have data clean rooms where you can upload your first-party data and analyze your ad campaign. What’s new is ‘cross-network’ attribution, which is the use of a data clean room not by a single content fortress but by multiple companies. There is a debate on Twitter about whether this use by multiple companies constitutes cross-site/cross-context tracking of users, even with all the privacy engineering buzzword bingo. We’ll see what happens. Maybe adtech will step up its game and self-regulate data clean rooms to avoid items 3 and 4 above. Time will tell. Either way, data clean rooms are hot.
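To give a flavor of the core idea, here is a simplified, hypothetical sketch of the matching step: two parties compare salted hashes of an identifier so that neither sees the other’s raw customer list, and only an aggregate count leaves the room. Real clean rooms layer much more on top (aggregation thresholds, differential privacy, query auditing), and hashing alone is not a privacy guarantee.

```typescript
import { createHash } from "node:crypto";

// Hash an identifier with a salt both parties share, so raw emails
// never cross the boundary. (Illustrative only: salted hashes of
// low-entropy identifiers remain guessable without further controls.)
function hashId(email: string, sharedSalt: string): string {
  return createHash("sha256")
    .update(sharedSalt + email.trim().toLowerCase())
    .digest("hex");
}

// Return only the size of the overlap between the two audiences --
// the aggregate result, not the matched rows themselves.
function matchedAudienceSize(
  advertiserEmails: string[],
  publisherEmails: string[],
  sharedSalt: string,
): number {
  const publisherHashes = new Set(
    publisherEmails.map((e) => hashId(e, sharedSalt)),
  );
  return advertiserEmails.filter((e) =>
    publisherHashes.has(hashId(e, sharedSalt)),
  ).length;
}
```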


[2] https://www.huntonprivacyblog.com/2022/01/06/ftc-seeks-comments-on-accountable-techs-petition-for-rulemaking-to-prohibit-surveillance-advertising/

Inside the CNIL Fines of Google and Facebook for Cookie Violations

By Hina Moheyuddin

Internet cookies are not so unlike our favorite post-meal indulgence. Like baked cookies, HTTP cookies are accessible to many and come in many forms. They are relatively easy to accept when offered and difficult to reject. We all know what happens when you give a mouse a cookie, but what about when you give a tech giant a cookie? It’ll probably ask for your consent… sort of.

HTTP cookies are small packets of textual information about you and your preferences stored on your computer. They were developed to enhance user experience by saving information about you, such as the time a website was visited, items searched for and saved in a shopping cart, links clicked, and so on. As the internet became more demanding and ubiquitous, so did the usage of cookies. Today, cookies are saved in a different way. Now, when you visit a website, the website generates a unique user ID which is exclusively recognizable by that server. This ID is then linked to the company’s database, where all your information is stored and fetched from.
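A minimal, framework-free sketch of that pattern, with hypothetical names: the cookie the browser stores is just a lookup key, and the actual profile lives in the server’s database.

```typescript
import { randomUUID } from "node:crypto";

// The server's "database": profile data keyed by the cookie's ID.
const profiles = new Map<string, { lastVisit: Date; cart: string[] }>();

function handleRequest(cookieHeader: string | undefined): {
  setCookie?: string; // header to send if this is a new visitor
  profile: { lastVisit: Date; cart: string[] };
} {
  const id = cookieHeader?.match(/(?:^|; )uid=([^;]*)/)?.[1];
  if (id && profiles.has(id)) {
    // Returning visitor: the tiny cookie fetches the stored profile.
    const profile = profiles.get(id)!;
    profile.lastVisit = new Date();
    return { profile };
  }
  // First visit: mint a unique ID and ask the browser to remember it.
  const newId = randomUUID();
  const profile = { lastVisit: new Date(), cart: [] as string[] };
  profiles.set(newId, profile);
  return { setCookie: `uid=${newId}; Max-Age=31536000; Path=/`, profile };
}
```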

A fundamental piece of European Union (EU) privacy rules requires prior consent for the use of cookies, and French users have been complaining that tech giants’ cookie-consent popups are unlawful. France’s National Commission on Informatics and Liberty (“CNIL”) agrees. Recently, the CNIL fined Google 150 million euros and Facebook 60 million euros, finding that the websites do not allow visitors the option to refuse cookies “with the same degree of simplicity provided to accept their use.” Both websites allow visitors to accept their entire set of cookies in one click. Rejecting the cookies, though, is a manual, discouraging process that requires visitors to disable them one by one. The CNIL sources its authority to enforce this action from Article 5(3) of the ePrivacy Directive, which stipulates that storing information, or accessing information already stored, requires obtaining prior consent from the user.

Consent online is regularly solicited through the form of a pop-up. Arguably, the “cookie consent” interface design elements used by large tech companies release whiffs of dark pattern use.

Dark patterns are subtle software tricks used to hoodwink users into doing things they don’t mean to do on websites or apps. On March 15, 2021, California became the first state to expressly establish that consent obtained with dark patterns does not legally constitute consent. The adoption of the California Privacy Rights Act (“CPRA”) added new terms and definitions to the California Consumer Privacy Act (“CCPA”). The revised definition of “consent” in the CCPA specifies that “agreement obtained through the use of dark patterns does not constitute consent.” (Cal. Civ. Code § 1798.185(a)(20)(C)(iii)). Public comments to the amended CCPA described dark patterns as “deliberate attempts to subvert or impair a consumer’s choice to opt out.”

The most important ingredient in cookies is flour: it provides structure, and without it there would be no cookie. It appears that consent is considered the flour for HTTP cookies, but I am not convinced. Informed consent is integral in contract law and many other legal relationships, but I don’t think it will solve all online privacy issues. Finding a resolution that balances technological convenience and data privacy is difficult. What counts as legally acceptable consent differs depending on the surrounding circumstances, and how informed consent fits our needs within an arguably elective domain is up for debate. I wonder whether exploring other disciplines can enhance or guide our approach to finding a resolution when it comes to internet cookies. Should privacy professionals explore brain science and cognitive psychology to understand how reading screens differs from reading paper, in an effort to better appreciate how online users make decisions? Perhaps what is needed is not more regulation but a different approach to designing regulations for a virtual environment.

