Privacy Law



CLA’s Privacy Law Section summarizes important developments in California privacy and beyond. 

Privacy Talks: Interviews with California Privacy Leaders

By Andy Serwin and Smita Rajmohan

This month, the Privacy Law Section is proud to introduce a new feature: Privacy Talks: Interviews with California Privacy Leaders.

In this first installment, we interview two members of our Executive Committee: Andy Serwin and Smita Rajmohan.

Andy is Partner and Co-Chair of the Global Data Protection, Privacy and Security Practice at DLA Piper.  Andy is also a Board Member for the National Cyber-Forensics and Training Alliance.  Here, you can watch our interview with Andy.

Smita is Senior Product Counsel at Autodesk.  Smita is also a Board Member for the California Lawyers Association’s Privacy Law Section and the Board Chair for the IAPP’s Silicon Valley/SF Bay Area Knowledgenet Chapter.  Here, you can watch our interview with Smita.

Timeline Extended for Finalizing CPRA Regulations

By Andrew Scott, CIPP/US/E & CIPM

On February 17, 2022, the California Privacy Protection Agency (CPPA) held a public meeting.

At that meeting, the CPPA Board (“the Board”) received an orientation training from the Department of Consumer Affairs (DCA) on board member roles and the rulemaking process. Importantly, the Board received an update from Ashkan Soltani, the Executive Director of the CPPA, who believes the California Privacy Rights Act (CPRA) rulemaking activities will not be finished until Q3/Q4 of 2022.

It was explained at the meeting that the Executive Director’s main role is to implement the policy the Board sets, to know what is important to the Board, and to oversee the Board’s day-to-day operations. In carrying out other duties and responsibilities, the Executive Director files disciplinary actions against licensees, signs charging documents, manages the administrative and fiscal functions of the Board, and represents the Board to the legislature, the media, and the public. The Board members and the Executive Director work together in drafting agendas and strategic plans.

As the DCA explained, in the Executive Director the Board has essentially hired a driver who makes sure the car is gassed up and ready to go. Mr. Soltani, however, described the process as more like “building the car while driving it.” Whether driving or building the car, Mr. Soltani provided a comprehensive update on how the car would get where it needs to go.

Organizational Update
Mr. Soltani said he is building up the agency’s staffing, including its HR, IT, contracting, and legal functions, among others. He also stated he is developing organizational policies for staff.

Budget Update
Mr. Soltani said that $10 million is allocated under the statute, and he will use this budget, in part, to fill thirty-four positions and to pursue public awareness and education.

The Board is currently in the fourth and “Final Phase” of the rulemaking process, which is when the Board responds to comments and decides whether to modify the text.  

Mr. Soltani proposed a nearly year-long plan for the California Privacy Rights Act (CPRA) rulemaking, which would exceed the original rulemaking deadline of July 1, 2022. This delayed timeline reflects Mr. Soltani’s priorities: staffing and substantial preliminary information gathering. With 900 pages of public comments submitted, Mr. Soltani stated he wants to make sure the Board is informed of the diverse viewpoints, which requires substantial information gathering. Board members Jennifer Urban and Vinhcent Le supported Mr. Soltani’s timeline, noting that informational hearings are very important.

Accordingly, informational hearings will be held in March 2022, at which the Board will invite academics and experts to speak and comment on topics of interest to the Board. In April 2022, a second set of hearings will be open to the public, allowing stakeholders to provide insight into the issues that matter to them.

When the Board opened the floor for public comments, however, it was asked whether the CPRA’s January 1, 2023, enforcement date would be pushed back due to the new rulemaking timeline. The Board said it would take this concern under advisement.

New Privacy Bills in California

By Alexander Diaz

Will CPRA Rights Apply to Employee and B2B Data?

When California voters approved CPRA in November 2020, businesses braced themselves for additional requirements around consumer rights and vendor management. However, they found some relief in the provision that extended CCPA’s exemptions for employee and business-to-business (“B2B”) data another two years, to Jan. 1, 2023. As that date approaches, it remains controversial whether consumer rights to limit sale/sharing, access, deletion, and correction should apply to data collected in the course of employment and B2B transactions.

On February 18, the California Assembly introduced two bills favoring businesses that are anxious about how consumer privacy rights might affect their HR files and (business) customer profiles. AB 2891 would extend the exemptions another three years, to January 1, 2026, while AB 2871 would extend them indefinitely. Extending these exemptions under either bill would bring California’s privacy regime closer in line with the newer statutes in Colorado and Virginia.

What Requirements Will Apply to Biometric Data?

On February 17, AB 1189 was introduced to provide specific requirements around businesses’ use and retention of biometric data. The bill would limit a business’s disclosure of biometric data to a small number of circumstances, including (1) at the direction or authorization of the data subject and (2) by order of law. The bill would also require a business to publish a retention schedule and guidelines for the permanent destruction of any biometric information it collects. The bill provides for $100 to $1,000 in damages per violation per day, or actual damages if greater, and also authorizes punitive damages, fees and costs, and equitable relief.

Illinois Supreme Court Holds Workers May Sue Employers Under BIPA

By Jennifer Oliver

This month, the Illinois Supreme Court held that the exclusivity provisions of the state’s Workers’ Compensation Act do not bar claims for statutory damages under the Illinois Biometric Information Privacy Act (BIPA) where the employer is alleged to have violated an employee’s privacy rights provided by the act.

The ruling in McDonald v. Symphony Bronzeville Park (2022 IL 126511) was handed down in a proposed class action brought by a former employee, Marquita McDonald, against Symphony Bronzeville Park, LLC. McDonald alleged that the company’s collection, storage, and use of her fingerprints as part of a timekeeping system violated BIPA. She maintained that she was never provided with, and never signed, a release consenting to the storage of her information, and that she was not informed of why the company needed to store the data or for how long.

Symphony Bronzeville argued that the case was barred by the exclusive remedy provisions of the Workers’ Compensation Act, saying it was the only remedy available to workers who suffer accidental injuries in the workplace. Employees do not have a common-law or statutory right to recover civil damages from their employer for such injuries, the company argued.

The state Supreme Court disagreed with the employer, holding that the suit alleges injuries that do not categorically fit within the purview of the Workers’ Compensation Act, and that it is therefore not barred by that act’s exclusive remedy provisions. McDonald will now continue her pursuit of damages on behalf of the proposed class. BIPA provides for damages of $1,000 per negligent violation.

This is not the only important case interpreting the reach of BIPA currently pending before the state’s high court. As we discussed in the California Lawyers Association Privacy Law Section’s January update, in Cothron v. White Castle the court is considering whether each transmission of biometric data is a separate BIPA violation. That question remains pending.

Colorado and Virginia Privacy Law Updates

By Brandon M. Jasso, CIPP/US, CIPP/E

States are continuing either to clarify and amend their existing privacy and data laws or to begin their privacy rulemaking processes. Specifically, Colorado has begun its rulemaking process, while Virginia has begun clarifying its existing law. Below are recent updates on the Colorado and Virginia laws, respectively.

Colorado Update:

On July 8, 2021, Colorado officially enacted the Colorado Privacy Act (“CPA”) (see CPA text here). Colorado became the third state to enact a comprehensive privacy law in the United States, following the passage of privacy and data laws in California (the California Consumer Privacy Act, as amended by the California Privacy Rights Act) and Virginia (the Consumer Data Protection Act).

            In January 2022, the Colorado Attorney General Phil Weiser (“AG Weiser”) announced that his department will begin its rulemaking process as authorized under the CPA.  AG Weiser stated that “Chief Deputy Natalie Hanlon Leh leads an Impact Team, which brings together those charged with enforcing this law, along with lawyers who advise our state agency clients on data privacy and data security matters and technologists who protect data that we collect here at the Department of Law.”

AG Weiser acknowledged that the CPA was meant for consumers and indicated that the following considerations are necessary to the rulemaking process:

  • “[C]onsumers have a right to know what information companies collect about them and how that information will be used, enabling them to reject the sale and use of their private data by third parties”
  • That the process of “consumer notice and approval or rejection of data sharing needs to be conducted fairly, free from what some have called ‘dark patterns,’ which can unfairly mislead consumers on this issue” (see here for more information on dark patterns).
  • Lastly, that the CPA’s vision for “company auditing and data protection assessment” is another area that the Attorney General’s Office (“AG’s Office”) may want to address.

Additionally, AG Weiser stated that the rulemaking process will proceed in two steps:

  • First, the AG’s Office wants to hear from consumers, businesses, and other people who will be affected by the CPA. The office “will post a series of topics for informal input on our website and solicit responses in writing and at scheduled events.”
  • Next, the AG’s Office “will post a formal Notice of Proposed Rulemaking, which will include a proposed set of model rules. This will kick off a process of collecting verbal and written comments about the proposed rules and how they would operate from a range of stakeholders and other interested persons across Colorado.”

It is good to see AG Weiser starting the rulemaking process early, as the CPA becomes effective July 1, 2023, and the formal rulemaking process can take a long time. Consumers should have clarity on how they can enforce their rights, and businesses and stakeholders should have the opportunity to become compliant well before the enforcement date.

Virginia Update:

The Virginia Consumer Data Protection Act (“CDPA”) was signed into law on March 2, 2021 (see CDPA text here) and becomes effective January 1, 2023. Just as the California Privacy Rights Act amended the California Consumer Privacy Act, Virginia lawmakers have amended the CDPA: HB 381 amends section 59.1-577 by adding a new exemption to the right to delete. The new section, 59.1-577(B)(5), states:

5. A controller that has obtained personal data about a consumer from a source other than the consumer shall be deemed in compliance with a consumer’s request to delete such data pursuant to subdivision A 3 by either (i) retaining a record of the deletion request and the minimum data necessary for the purpose of ensuring the consumer’s personal data remains deleted from the business’s records and not using such retained data for any other purpose pursuant to the provisions of this chapter or (ii) opting the consumer out of the processing of such personal data for any purpose except for those exempted pursuant to the provisions of this chapter.

This new section should help businesses’ CDPA compliance by providing another method to prove that they are affirmatively responding to consumers.

Virginia lawmakers also passed SB 534 (see text here), which amends the CDPA’s definition of “Nonprofit organization” to include political organizations tax exempt under Internal Revenue Code section 501(c)(4). The act defines “Political organization” as:

[A]s a party, committee, association, fund, or other organization, whether or not incorporated, organized and operated primarily for the purpose of influencing or attempting to influence the selection, nomination, election, or appointment of any individual to any federal, state, or local public office or office in a political organization or the election of a presidential/vice-presidential elector, whether or not such individual or elector is selected, nominated, elected, or appointed.

Lastly, SB 534 repeals the “Consumer Privacy Fund” that was to be established under CDPA section 59.1-585. Now, when the Virginia Attorney General initiates an action and obtains a judgment against a controller or processor, “[a]ll civil penalties, expenses, and attorney fees collected pursuant to this chapter shall be paid into the state treasury and credited to the Regulatory, Consumer Advocacy, Litigation, and Enforcement Revolving Trust Fund.”

IRS Cancels Facial Recognition Contract Due to Privacy and Equity Concerns, Prompting Congressional Interest

By Cody Venzke

After pushback from privacy advocates and civil society, the Internal Revenue Service and its private contractor,, have backed away from a planned identity verification system that would have required taxpayers to use facial recognition technology (FRT) to access certain services. Under the plan, users would have had to provide a photo of an identity document, such as a driver’s license or passport, to be compared with a selfie taken on a smartphone or computer webcam. would then have employed facial matching and recognition technologies to confirm the user’s identity and to determine whether the user’s image was associated with other identities, a potential sign of fraud. The IRS announced the “transition” away from’s facial recognition technology on February 7, 2022, with announcing a “new option to verify identity without using automated facial recognition” for public sector customers the following day.

The IRS had originally implemented the system to combat fraud, and although’s identity verification would have been required to access an IRS account or to submit banking information for the Child Tax Credit, it would not have been required to file taxes.

Privacy advocates and members of civil society opposed the plan because of equity and privacy concerns regarding the use of FRT. FRT has been demonstrated to be disproportionately inaccurate for people of color, especially people with darker complexions, potentially preventing them from accessing IRS services.’s identity verification has been unable to verify applicants for a significant number of state benefit programs; the California State Auditor estimated “that among the estimated number of legitimate claimants [for California unemployment benefits] who attempted to validate their identities, about 20 percent — just under 144,000 — were unsuccessful in validating their identity.” Criminal indictments have also indicated that individuals have been able to circumvent’s systems, with one man filing at least 78 fraudulent unemployment claims using slightly altered photos and fake IDs.

In addition to FRT biometrics, would also have collected geolocation data from users’ mobile networks, according to the IRS’s privacy assessment of the system.

Members of Congress also responded to the IRS’s plan. In a February 7 letter, Sen. Ron Wyden (D-OR) urged the IRS to end its use of facial recognition technology. Although he lauded the IRS’s “best of intentions — to prevent criminals from accessing Americans’ tax records, using them to commit identity theft, and make off with other people’s tax refunds,” he criticized “forc[ing] Americans to submit to scans using facial recognition technology as a condition of interacting with the government online, including to access essential government programs.” He further criticized the “alarming” trend of governmental agencies “outsourc[ing] their core technology infrastructure to the private sector.”

In his letter, Sen. Wyden urged the IRS to instead utilize the federal government’s single sign-on service,, operated by the General Services Administration. In a separate letter on February 15, Sen. Wyden was joined by Sens. Sherrod Brown (D-OH) and Elizabeth Warren (D-MA) in urging the U.S. Department of Labor to help state unemployment programs transition away from private contractors and instead rely on for identity verification services. Reps. Ted Lieu (D-CA), Anna Eshoo (D-CA), Pramila Jayapal (D-WA), and Yvette Clarke (D-NY) also urged the IRS to halt its implementation of FRT, and Rep. Carolyn Maloney (D-NY) sought information on the data of the seven million people who signed up for’s IRS identity verification service before it was shuttered.

The Congressional response to federal uses of FRT has included several bills. The Ban IRS Biometrics Act, S.3599, introduced by Sen. Rick Scott (R-FL), would prohibit the IRS from conditioning a taxpayer’s “receiving any service” on the provision of biometric information. Similarly, the bicameral Facial Recognition and Biometric Technology Moratorium Act, S.2052 and H.R.3907, would ban federal uses of FRT and other “biometric surveillance” unless expressly authorized by Congress. The Moratorium Act’s introduction in June 2021 followed a Government Accountability Office report on federal law enforcement’s use of FRT. The GAO report found that of 42 surveyed agencies, 20 used FRT and six used it to identify protestors. Of those 20, 17 used “another entity’s system,” including private systems, and an additional 13 did not have the data to answer whether they used FRT.

AdTech Privacy Update – Things Have Not Settled Down Thanks to Google and the IAB

By McKenzie Thomsen, CIPP/US

We’ve had two big shake-ups in AdTech privacy since our last update: both Google Analytics and IAB EU’s TCF have been found in violation of the European Union’s (EU) General Data Protection Regulation (GDPR). We’ll discuss them in turn.

Google Analytics got ‘Schrems’ed

First, let’s talk about Google Analytics. First Austria, and then France, ruled that transferring Google Analytics data from the EU to the United States violates the GDPR. Both cases were brought by Max Schrems (of Schrems and Schrems II) through None of Your Business (NOYB), and the issues decided were reminiscent of the Schrems rulings. In both the Austrian and French cases, NOYB complained that the transfer of information obtained via Google Analytics to Google in the U.S., a transfer relying on standard contractual clauses (SCCs), violated the GDPR because Google is an ‘electronic communications service provider’ and therefore subject to Section 702 of FISA, which can allow the U.S. government to order such providers (aka Google) to hand over the personal data of EU citizens. Both courts agreed, finding that the additional safeguards taken by Google were insufficient to prevent the U.S. government from accessing EU citizens’ personal data (e.g., Austria held that encryption at rest is insufficient).

So what does this mean for Google Analytics?

So far, these rulings are limited to Austria and France, but the word on the street is that the rest of the EU member states will domino into the same holding. Google has since published some measures companies can take to control Google Analytics data (Some facts about Google Analytics data privacy, and Take control of how data is used in Google Analytics). But take caution: these articles don’t have Austria’s or France’s stamp of approval.

And what does this mean for AdTech in general?

The logic of the rulings is broad. They could be applied not just to other companies’ use of cookies, but to any data transfer through any digital means (for example, sending data via HTTP, or even browser/device fingerprinting). Take this as a reminder to understand and reevaluate your company’s data flows.


Next up is the Interactive Advertising Bureau (IAB) EU’s Transparency and Consent Framework (TCF). The Belgian DPA ruled that the framework violates the GDPR, fining IAB EU €250,000 and giving it two months to come up with a plan to fix the issues. This is a strange ruling, and I’ll explain why, but first you need to know more about the TCF mechanism.

The TCF is simply a framework: a way for consent management platforms (CMPs), which implement consent mechanisms such as cookie banners, to receive consent from users (or not) and to tell the company operating the website that consent was given (or not). The TCF consent string, as it’s called, is then sent through the AdTech industry, to publishers, for example, who can show ads on their webpages to users who gave consent. Put plainly, the TCF is a brainchild of the IAB that the AdTech industry uses to communicate consent. (The TCF also communicates other legal bases, such as legitimate interest, which is a whole can of worms; for the sake of explanation, I’ll refer only to consent.)
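To make the mechanism concrete, here is a toy sketch of how a TC string is read. Per the published TCF v2 string format, the string is base64url-encoded bit fields, and the first six bits of the dot-separated core segment hold the version number; everything else below (the sample strings, the helper name) is illustrative, not IAB’s reference decoder.

```python
import string

# Base64url alphabet used to encode TCF consent strings
B64 = string.ascii_uppercase + string.ascii_lowercase + string.digits + "-_"

def tc_string_version(tc_string: str) -> int:
    """Read the 6-bit version field at the start of a TC string's core segment."""
    core = tc_string.split(".")[0]   # segments of a TC string are dot-separated
    return B64.index(core[0])        # first character encodes bits 0-5: the version

# Hypothetical (not real) strings: a v2 string starts with 'C', a v1 string with 'B'
print(tc_string_version("CAAAAAAAAAAAAAAAA.QAAA"))  # -> 2
print(tc_string_version("BAAAAAAA"))                # -> 1
```

A real CMP library would go on to decode the remaining bit fields (creation date, vendor consents, and so on), but the version check above is enough to show that the “consent string” is just a compact, machine-readable record passed along the ad-serving chain.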

What’s odd is that Belgium ruled that IAB EU is a data controller. This has been upsetting to the AdTech industry because it’s IAB EU’s opinion that all they did was create a framework and that they don’t control how it is implemented by companies. But Belgium disagrees. Belgium, in its ruling, stated that IAB EU is responsible for conducting strict CMP audits and guaranteeing that TCF consent strings are used correctly.

Two months is a short deadline to overhaul an entire framework, and the way I see it, this could go one of two ways. Either IAB EU’s plan gets the GDPR kiss of approval, making the TCF the single framework in all of the EU known to be GDPR compliant, which would be a HUGE boon to the industry, or IAB EU can’t overhaul the framework and programmatic advertising becomes the wild wild west, a scary thought for privacy advocates. IAB EU has since announced it is appealing the ruling.

Cybersecurity Insight in the Ukraine Conflict

By M. Scott Koller

On February 24, 2022, Russia launched a large-scale military incursion into Ukraine. By all accounts, the Russian offensive attacked on multiple fronts, including against Ukraine’s computer networks and communication systems. The cyber attacks began before the first tank crossed the border, with Ukrainian networks subjected to multiple targeted attacks involving hacking, DDoS, and malware that specifically targeted Ukrainian systems and wiped data.

This isn’t the first time Russia has engaged in this type of cyber warfare, nor is it likely to be the last. Many will remember the widespread power outages in 2015, when Russian hackers breached the Ukrainian power grid, or the 2017 NotPetya malware, which was intended to target Ukraine’s networks but quickly spread out of control, causing billions of dollars in damage across the globe.

This time, Ukraine was better prepared and has, for the most part, resisted the Russian cyber onslaught.  However, one of the more interesting components of this conflict has been the international response, and how a regional military dispute between countries has spread internationally.  Russia’s military action in Ukraine was immediately condemned internationally, and the United States and most of Europe imposed severe economic sanctions. 

In response to the sanctions and United States opposition to the war, Russia-aligned hackers began deliberately targeting U.S.-based organizations, with some groups promising retaliatory measures for interfering. The Cybersecurity and Infrastructure Security Agency (CISA) warned state and local governments, as well as aviation and energy sector networks, of the increased risk of attacks from Russia. One well-known ransomware group, Conti, publicly announced its intention to “use our full capacity to deliver retaliatory measures,” and there have been reports of prior ransomware victims having their data posted to the dark web, even after paying the ransom, as retribution for the U.S. government’s opposition to the war.

Interestingly, Russia has been on the receiving end of increased cyber attacks as well. Shortly after the Russian operation began, the Ukrainian government called on volunteers to form a Cyber Army to help protect critical infrastructure and spy on Russian troops. More recently, Ukrainian Vice Prime Minister Mykhailo Fedorov tweeted a link to a Telegram channel calling for hackers and tech specialists to join the “cyber front,” which today has more than 250,000 members. These efforts appear to have yielded results, with several Russian websites and state online portals taken offline by the Ukrainian cyber police force. A Twitter post from an account that purports to be associated with the international hacker group Anonymous claimed credit for disabling websites belonging to the Russian oil giant Gazprom, the Russian news agency RT, and several other Russian and Belarusian government agencies, including the Kremlin’s official site.

Even Russia’s allies are being targeted: earlier this week, it was announced that hackers in Belarus attacked their country’s train system in order to hamper Russia’s ability to move troops to Ukraine. Despite the military action being isolated geographically, the cyber front operations transcend borders, with various hacker groups taking sides. One group in particular seems conflicted, with both pro-Russia and pro-Ukraine members. After announcing its “full support” of the Russian government, the Conti group may be having second thoughts about taking sides. On February 27th, an anonymous individual associated with the Conti group leaked a huge cache of internal data, including chat logs, bitcoin payment addresses, and detailed information regarding the group’s technical infrastructure, logistical operations, and attack methodologies. The leaker made their position clear by including a message with the data stating “Glory to Ukraine,” suggesting that the group’s earlier decision to support Russia was not unanimous.

This is a significant blow to Conti’s operation and may jeopardize the group’s long-term viability. The chat logs and bitcoin wallet addresses will help law enforcement track the flow of money, while the source code leak may allow security researchers to reverse engineer the encryption used in ransomware attacks. Of particular importance have been the group’s internal communications, which suggest close ties between Conti’s upper management and Russian intelligence agencies. Back in October 2020, the U.S. government started adding ransomware groups to its list of sanctioned entities, which effectively prohibits U.S. victims from paying a ransom to those groups (even if they wanted to). If the connection between Conti and the Russian government is true, it may motivate the U.S. government to add the Conti group to the sanctions list, which can be a death sentence for a group that operates using the affiliate/ransomware-as-a-service model.

Following the leak of Conti’s internal data, a competing ransomware group, Lockbit, announced its intention to remain neutral, citing the fact that it has members around the globe and a desire not to follow in Conti’s footsteps.

Key Takeaways

Given the U.S. government’s support of Ukraine, all organizations should be on high alert for possible cyber attacks, especially those in the finance, energy, aviation, military, or supply chain industries. Understand that the attacker’s goal may be embarrassment or maximum disruption rather than financial gain. The Log4j exploit remains the exploit of choice for many attacker groups, so take steps to ensure all affected systems are fully patched. CISA provides access to a variety of free tools on its website.

Carefully consider the ramifications of any public statements.  Given the cyber “firepower” on both sides of the conflict, provocative language can draw attention and potentially make your organization a target for future attacks. 

If your organization is hit with ransomware, work with an experienced data breach coach to ensure your organization does not violate U.S. sanctions. Be especially careful when dealing with Russia-aligned attacker groups, Conti in particular. While Conti is not currently on the U.S. sanctions list, there is a good chance it will be added in the near future due to its close ties with the Russian government. Some ransom negotiators are already reluctant to fund anything remotely tied to the region because of this risk.

Finally, take a close look at the provisions of your cyber liability insurance policy to determine whether acts of war are excluded from coverage. In 2017, Russia targeted Ukraine with its NotPetya virus, which quickly spread globally, shutting down computer systems for hundreds of companies worldwide. One of the entities affected was the pharmaceutical giant Merck, which incurred billions of dollars in damages. Merck’s cyber insurance company denied its claim, arguing that the losses fell under the “War or Hostile Acts” exclusion. Although a court recently ruled in favor of Merck, awarding over $1.4 billion in that case, in the four years since that event insurance companies have been tightening policy language to explicitly exclude nation-state actors. It remains to be seen how those policies will respond to the current conflict, whose global scale carries significant risk of spillover cyberattacks impacting supply chains and commerce, just like the 2017 NotPetya attack.

The current conflict provides a sneak preview of the future of modern warfare. Whatever tools, techniques, and exploits are developed during this conflict are likely to be the same ones used against the U.S. by threat actors in the future.

More About the Bagley-Keene Open Meetings Act

By Hina Moheyuddin

The California Privacy Rights Act (“CPRA”) conferred upon the California Privacy Protection Agency (“CPPA”) full administrative power, authority, and jurisdiction to implement and enforce the California Consumer Privacy Act. The CPPA’s Board, however, does not have carte blanche to call meetings or operate however it wants; as a “state body,” the Board is subject to the Bagley-Keene Open Meeting Act (“Bagley-Keene”).

Bagley-Keene defines a “state body” as a state board, commission, or similar multimember body that is created by statute or required by law to conduct official meetings, unless specifically excluded by statute. (§ 11121.) A “meeting” includes “any congregation of members of a state body at the same time and place to hear, discuss, or deliberate upon any item that is within the subject matter jurisdiction of the state body to which it pertains.” (§ 11122.5(b)(1).)

As a “state body,” the CPPA has, essentially, three duties under the Act:

  1. To give timely and sufficient public notice of meetings to be held;
  2. To provide the public with opportunities to directly address the State board on each agenda item before or during the discussion or consideration of an item; and
  3. To conduct such meetings in open session except where a closed session is specifically authorized.

As is evident from the very public CPPA board meetings, Bagley-Keene has roots in transparency.  Bagley-Keene mandates open meetings for California State agencies, boards, and commissions (§ 11120), re-emphasizing the provision of the California Constitution which states that “the people have the right of access to information concerning the conduct of the people’s business, and therefore, the meetings of public bodies and the writings of public officials and agencies shall be open to public scrutiny.”

The provisions on notice apply to both open and closed meetings.  Under Bagley-Keene, notice shall be provided to those individuals who have requested it by email, mail, or both, and shall also be made available on the internet, at least ten days in advance of the meeting. (§ 11125(a)). Among other things, the notice shall include the time and place of the meeting, along with the name, telephone number, and address of any person who can provide further information prior to the meeting. (Id.)

The notice of each board meeting must include an agenda of the meeting. The agenda must be prepared with enough information, and in alternative appropriate formats, to allow interested lay persons to decide whether to attend the meeting or participate in a particular agenda item. (§ 11125(f)). The agenda must include all items to be discussed or acted upon and must be prepared at least ten days prior to the meeting; unless otherwise provided, no items may be added after the notice is given. (§ 11125(b)).

Second, the legislative intent behind the Act was to ensure that actions by State bodies be taken openly and that their deliberations be conducted openly. (§ 11120). Hence, covered bodies are forbidden from imposing any conditions on attendance at a meeting. (§ 11124). Meeting locations must be public, accessible to the disabled, and nondiscriminatory on the basis of race, religion, national origin, etc. (§ 11131).

The Act permits public comments at board meetings, with specified exceptions. Section 11125.7 requires State agencies to provide an opportunity for members of the public to directly address the State body on each agenda item before or during the discussion or consideration of the item. This opportunity for comment need not be made available if:

  1. The agenda item was previously considered at a public meeting by a committee composed exclusively of board members, where members of the public were provided an opportunity to address the item; or
  2. The agenda item is one that may properly be considered in closed session, which would include deliberation and action on disciplinary proceedings under the Administrative Procedure Act. (§ 11125.7).

The Act specifically provides that a State body may not prohibit public criticism of its policies, programs, or services, or of the acts or omissions of the agency. (§ 11125.7(d)). Additionally, any individual attending an open and public meeting has the right to record the proceeding and access records of the body. (§§ 11124.1(a), 11125.1).

Finally, the Act states that “[a]ll meetings of a state board shall be open and public and all persons shall be permitted to attend any meeting of a state board except as otherwise provided.” (§ 11123). Section 11126 sets forth the specific items of business that may be transacted in a closed session, and only those enumerated items of business may be conducted in a closed session. These include, but are not limited to, personnel matters (§ 11126(a)(4)); matters affecting individual privacy (§ 11126(c)(2)); administrative disciplinary matters (§ 11126(c)(3)); pending litigation (§ 11126(e)); and closed sessions otherwise specifically authorized by statute. (§ 11132).

Should the CPRA Align Its Definition of Dark Patterns With the FTC?

By Rory Sweeney

On September 22, 2021, the California Privacy Protection Agency (CPPA) sent out an invitation for preliminary comments on proposed rulemaking under the California Privacy Rights Act of 2020 (CPRA), seeking input from stakeholders in developing regulations.

Section 8 of the invitation states that the California Consumer Privacy Act (CCPA) and CPRA provide for various regulations to create or update definitions of important terms and categories of information or activities covered by the statute.  Subsection J asked for comments on what regulations, if any, should be adopted to further define “dark patterns.”

What is a Dark Pattern?

Under CPRA, a “dark pattern” is defined as “a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision‐making, or choice.” Cal. Civ. Code § 1798.140(l).  In other words, dark patterns are bad design practices: essentially, tricks that websites and mobile apps employ to make you consent to things you otherwise would not.
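To make the concept concrete, here is a minimal, hypothetical sketch of the same consent prompt rendered two ways. The function, markup, and wording are invented for illustration and are not drawn from any cited statute or enforcement action; they simply show how a default-on checkbox with buried terms could be said to subvert user choice, while the alternative requires an affirmative, informed act.

```python
def render_consent(pre_checked: bool, terms_visible: bool) -> str:
    """Return a simplified HTML fragment for a consent prompt.

    A pre-checked box with the material terms hidden behind a link is the
    classic dark pattern; the alternative surfaces the terms up front and
    requires an affirmative act from the user.
    """
    checked = " checked" if pre_checked else ""
    terms = (
        "<p>We share your data with advertising partners.</p>"
        if terms_visible
        else '<a href="/terms">Terms apply</a>'  # material terms buried
    )
    return f'<label><input type="checkbox"{checked}> I agree</label>{terms}'


# Dark pattern: consent is the default and the cost is out of sight.
dark_pattern = render_consent(pre_checked=True, terms_visible=False)

# Cleaner design: the user must act, with the material terms in view.
compliant = render_consent(pre_checked=False, terms_visible=True)
```

The difference between the two outputs is exactly the kind of design choice the statutory definition targets: not the checkbox itself, but whether the interface’s defaults and disclosures impair the user’s decision-making.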

Stakeholders who submitted public comments during the CPPA’s preliminary rulemaking offered a wide range of perspectives on the proposed definition. Some comments indicated the definition is overly broad, others indicated it is too narrow, and still others indicated the current definition strikes a reasonable balance between consumer protection and promoting commerce.

Personally, I believe the definition is a repackaging of the well-established concept of unfair or deceptive acts or practices (UDAP)[1].  Accordingly, I believe the definition should be modified to align with the Federal Trade Commission’s (FTC) extensive guidance on the subject.

The FTC’s recent efforts at defining dark patterns refer to classic forms of unfairness and deception.  At the end of 2021, the agency issued a policy statement addressing “negative option marketing” as a form of dark pattern. The statement consisted primarily of references to cases dating back as early as 2003 and focused on reiterating the following guidelines around disclosures, consent, and cancellation practices:

  • Disclose clearly and conspicuously all material terms of the product or service…costs, deadlines, and how to cancel when the consumer first sees the offer.
  • Obtain the consumer’s express informed consent before charging them for a product or service.
  • Provide easy and simple cancellation to the consumer. Cancellation mechanisms should be as easy to accomplish as signing up for a product or service.

It can be said that a user interface not adhering to these guidelines is “designed with the substantial effect of subverting or impairing user autonomy, decision‐making, or choice,” i.e., a dark pattern. This suggests that the FTC’s existing guidelines can be used to clarify the meaning of dark patterns.

The agency also held a workshop and gave a few more examples of conduct considered to be dark patterns, such as automatic renewal subscriptions and free-to-pay conversions. None of these examples were novel or suggested any deviation from classic acts of unfairness or deception.

Additionally, a recent enforcement action brought by the District of Columbia’s Attorney General alleged that Google engaged in dark patterns by “employing user interfaces that make it difficult for consumers to deny Google access to and use of their location information, including making location-related user controls difficult to find and repeatedly prompting users who previously declined or disabled location-related controls to enable those controls.” Despite the complaint’s use of the term “dark patterns,” its allegations fit neatly under the above-referenced principle of providing “easy and simple cancellation mechanisms” to ensure a business practice is not found to be unfair or deceptive.


Enforcement of the California Privacy Rights Act (CPRA) commences on July 1, 2023. Additionally, on February 17, 2022, the California Privacy Protection Agency (CPPA) announced that publication of the CPRA’s regulations would be pushed out to some point in the third or fourth quarter of 2022. These events leave little time for businesses to stand up or adjust compliance programs for novel, substantial areas of the law such as financial incentive disclosures, automated decision-making, and dark patterns.

In light of this truncated time period, the CPPA probably should not look to reinvent the wheel.  Dark patterns have been policed by the FTC and state Attorneys General for decades as unfair or deceptive acts.  The CPPA should take advantage of the existing principles and precedent used in that enforcement to help inform the CPRA’s definition of dark patterns.  Adopting such principles would help create unity and consistency in dark pattern enforcement.

Dark Patterns Webinar

Want to learn more about dark patterns?  The California Lawyers Association Privacy Law Section and the Future of Privacy Forum are co-hosting a great webinar on dark patterns:  “Dark Patterns” and Manipulative Design:  Understanding Data, Decision-making, and Design.  The webinar is taking place on March 11, 2022, from 12:00-1:15 p.m. PT. Register Here.
