Privacy Law


Written by Daniel Goldberg and Bram Schumer*

California continues to lead the nation with new laws, regulations, enforcement actions, and court decisions relating to privacy compliance. These efforts have profoundly impacted the companies involved in the ad tech ecosystem. This article provides a high-level overview of the ad tech ecosystem, outlines some of the major California privacy developments in 2023 impacting the ad tech ecosystem, and concludes with practical steps companies in the ad tech ecosystem can take to reduce risk.


The Ad Tech Ecosystem

Ad tech (short for advertising technology) refers to the technologies used to buy, sell, and manage digital advertising. The ad tech ecosystem comprises advertisers (companies that buy ads), publishers (companies that sell ad inventory), agencies (companies that help manage buying and selling for advertisers and publishers), ad networks (companies that sell ad inventory from many publishers), technology providers (companies that offer the tools to facilitate this process), and other related parties. Most companies have some relationship with the ad tech ecosystem, often in connection with their monetization models. According to a study[1] by Allied Market Research that was reported in Forbes[2], the ad tech ecosystem was valued at $748.2 billion in 2021, and could reach $2.9 trillion by 2031.

Ad tech relies heavily on the use of data. For example, to deliver an ad, a publisher must collect some data about the device where the ad is delivered. This data is collected through invisible tracking technologies, such as cookies and pixels, embedded within the publisher’s website. In addition to delivering ads, these tracking technologies, often licensed from third-party providers, can collect data for purposes such as research and analysis, attribution and measurement, and targeted advertising shown to the device. Advertisers can also place tracking technologies on their own websites and within their ads. In the app environment, most companies use SDKs (short for Software Development Kits) instead of cookies and pixels to provide various functionalities offered by the third-party providers. Some of the most well-known third-party providers include Meta and Google.

Another aspect of ad tech involves data matching. To improve campaign performance and deliver targeted advertising, an advertiser or publisher may upload its first-party data to a technology provider (sometimes called a clean room) to match against third-party data. The uploaded data may be in the form of an email address or device identifier that is hashed prior to sending.
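As a rough illustration of the hashing step, consider the sketch below in Python. The normalization convention shown (trimming and lowercasing) is an assumption for illustration; in practice, each matching provider specifies its own normalization rules.

```python
import hashlib

def hash_email(email: str) -> str:
    """Normalize an email address and return its SHA-256 hex digest."""
    # Trimming whitespace and lowercasing before hashing is a common
    # convention so that both parties to a match produce identical digests.
    normalized = email.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()
```

Because the same input always produces the same digest, hashed identifiers remain linkable across datasets, which is precisely why regulators treat them as personal information rather than anonymized data.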

Takeaway: As a result of its heavy reliance on data, the ad tech ecosystem has become associated with privacy concerns. Over the past decade, California lawmakers and regulators have taken the position that most of the data processed through the ad tech ecosystem, even when hashed, is personal information subject to privacy law, and have taken measures to regulate such processing.


Major California Privacy Developments in 2023

Below are some of the major California developments in 2023 impacting the ad tech ecosystem:

Do Not Sell or Share Rights under CPRA

In January 2023, the California Privacy Rights Act (“CPRA”) took effect. One major aspect of the CPRA is that consumers have the right to opt out of the “sale” or “sharing” of their personal information. Under CPRA, a “sale” is broadly defined to include a disclosure of personal information to a third party for something of value, and a “share” is broadly defined to include a disclosure of personal information to a third party for cross-context behavioral advertising (i.e., targeted advertising). CPRA also requires companies to process opt-out preference signals, such as Global Privacy Control[3] (“GPC”).

Takeaway: Ad tech inherently involves activities that constitute sales or shares under CPRA. Companies that use tracking technologies or engage in data matching activities could be found to be selling or sharing personal information, and need to comply with the obligations relating to sales and shares.

Contractual Obligations under CPRA

As part of the CPRA, California was required to issue implementing regulations. In March 2023, California finalized its CPRA regulations[4] and filed them with the Secretary of State. The CPRA regs add robust obligations around sales and shares, including specific language required in contracts with third parties. The CPRA regs also specify that a service provider cannot contract to provide targeted advertising services. This effectively means that companies in the ad tech ecosystem may not be able to position themselves as service providers, and instead should include specific language in their contracts regarding their obligations as third parties.

Takeaway: Notably, the CPRA regs were set to take effect in July 2023, but the Sacramento County Superior Court issued a decision[5] delaying their enforcement until March 2024. March 2024 is quickly approaching, and companies in the ad tech ecosystem should be ready for compliance well before then.

Sensitive Data Rights under CPRA

Another aspect of the CPRA is that consumers have rights around their sensitive personal information.

Under CPRA, sensitive personal information includes precise geolocation, racial or ethnic origin, religious or philosophical beliefs, health data, sex life, and more. Companies collecting sensitive personal information may only use that information for permissible purposes (such as preventing security incidents, resisting fraudulent activities, ensuring the physical safety of others, and maintaining product safety or quality). Where a company uses sensitive personal information for non-permissible purposes, it must provide consumers with a right to limit the use or disclosure of their sensitive personal information to the permissible purposes. The CPRA regs specify further obligations around implementation of this right.

Takeaway: Ad tech often involves the collection of sensitive personal information. For example, a ride share app may request precise geolocation for the purpose of locating a ride. If the ride share app includes an advertising SDK embedded within the app, that SDK may also receive the precise geolocation, and use that data for advertising purposes (which would be considered a secondary purpose). Under CPRA, if a consumer limits the use or disclosure of their sensitive personal information, the app developer likely would be prohibited from sharing the precise geolocation with the advertising SDK.

Reasonable Expectation Test under CPRA

Although the CPRA establishes an opt-out regime, it also specifies that companies must obtain opt-in consent for any data practices that are not consistent with a consumer’s “reasonable expectation.” What constitutes consumer reasonable expectation is a question of fact. Under the CPRA regs, to determine reasonable expectation, a company must evaluate the relationship between consumers and the company, the type, nature, and amount of personal information collected, the source of the personal information and the method for collecting it, the specificity of disclosures made by the company about the practice, and the degree to which third party involvement is disclosed to consumers.

Takeaway: This multi-factor test could establish a de facto opt-in regime for certain parts of the ad tech ecosystem. For example, in the ride share example above, California regulators could determine that collecting precise geolocation data for advertising purposes always fails the reasonable expectation test, and thus requires opt-in consent—a position consistent with many other frameworks found in US privacy law, including from the FTC.

Protecting Children under California Privacy Law

Protecting children’s personal information in the context of targeted advertising has become a top priority for lawmakers and regulators at every level, and California is no exception. The main US privacy law that regulates children’s personal information is the Children’s Online Privacy Protection Act (“COPPA”). Now over two decades old, COPPA requires websites and online services to obtain verifiable parental consent before collecting personal information (including device identifiers) from children under 13 unless an exception applies. CPRA added obligations that companies obtain opt-in consent for sales or shares of personal information of consumers aged 13 to 15. California regulators can bring an action under state consumer protection laws for alleged violations of COPPA and CPRA, and there have been indications that California regulators have issued warning letters and met privately with companies in the ad tech ecosystem relating to their use of children’s personal information.

California also has been working toward interpreting obligations under its Age-Appropriate Design Code law (“AADC”), which lawmakers passed in September 2022. The California AADC, modeled after the United Kingdom’s AADC, aims to protect the privacy and data of children under 18 when they use online services, products, or features that may affect their mental, physical, and emotional health. Notably, targeted advertising is considered inherently detrimental to the health or wellbeing of children, and is arguably entirely prohibited (even with parental consent) under the law. The California AADC was set to take effect in July 2024, but was recently stayed[6] by a California court on First Amendment grounds.

Takeaway: Ad tech often involves the collection of personal information from children and minors, implicating these laws.

New Data Broker Obligations

In October 2023, California passed the California “Delete Act,” which introduces new requirements for “data brokers”. Under the law, a data broker is defined as a company “that knowingly collects and sells to third parties the personal information of a consumer with whom the business does not have a direct relationship.” Like the national “Do Not Call” registry, the Delete Act will create a centralized mechanism where consumers can submit a single delete request that all registered data brokers in California must honor. If a registered data broker denies a deletion request subject to an exception, the data broker must treat the request as an opt-out of sales or shares. This one-step registry must be created by the California Privacy Protection Agency (“CPPA”) by January 2026, and honored by data brokers starting August 2026.

Takeaway: Many companies in the ad tech ecosystem qualify as data brokers and will need to comply with these obligations.

Litigation Over Tracking Technologies

In the past year, there has been a significant increase in class action litigation relating to tracking technologies based on alleged violations of the California Invasion of Privacy Act (“CIPA”) and federal wiretapping and video privacy laws. Plaintiffs claim that companies shared their personal information with third parties through tracking technologies without their consent, thereby violating CIPA. Many of these actions involve “session replay” technologies, chatbot technologies, and the popular Meta (formerly Facebook) pixel.

Takeaway: Tracking technology litigation has exploded in the past year, in part due to a decision by the 9th Circuit Court of Appeals reversing[7] the District Court’s dismissal of a tracking technology case.[8]

As a result, many companies have chosen to settle these claims out of court rather than risk a potential adverse ruling. Of the pending actions, many are still in the motion to dismiss phase, or with plaintiffs given leave to amend their complaints. Given that the plaintiffs’ bar’s interpretation of opt-in consent under CIPA seemingly contradicts the opt-out framework under the CPRA, this area of litigation may be short-lived.


Practical Steps to Reduce Risk

As California’s statutory, regulatory, and litigation landscape continues to develop, companies (particularly advertisers, publishers, and other entities within the ad tech ecosystem) should consider the following steps to reduce risk and demonstrate good faith efforts in the eyes of regulators:

Update Privacy Policies

All companies, in and out of the ad tech space, should review and revise their privacy policies to accurately and comprehensively communicate their practices regarding the collection and handling of personal information, how it is used, and how consumers can exercise their rights. Among these rights, the right to opt out of sales and shares is of particular concern for regulators.

Honor Opt-Out Rights

Companies that use advertising tracking technologies within their websites or apps should consider themselves sellers/sharers of personal information, and place a link in the footer of their websites that reads either “Do not sell or share my personal information” or “Your Privacy Choices.”[9] This link should allow consumers to turn off or limit disclosure of their personal information collected through advertising tracking technologies. In addition, if a company builds any internal lists (or “audiences”) of its consumers, such as for data matching purposes, the link should also direct consumers to a short form where they can enter their contact information to be excluded from audience lists moving forward. Companies must also configure their websites to listen for and process GPC signals.
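Processing GPC signals can also be handled server-side: under the GPC specification, a participating browser sends the request header `Sec-GPC: 1` (and exposes the signal client-side as `navigator.globalPrivacyControl`). The sketch below is a minimal illustration of detecting the header, not a complete compliance implementation.

```python
def gpc_opt_out_requested(headers: dict[str, str]) -> bool:
    """Return True if the request carries a Global Privacy Control signal.

    Per the GPC specification, the signal is the header `Sec-GPC: 1`.
    HTTP header names are case-insensitive, so normalize before checking.
    """
    normalized = {name.lower(): value.strip() for name, value in headers.items()}
    return normalized.get("sec-gpc") == "1"
```

When this returns True, the site should treat the visit as an opt-out of sales and shares for that browser, just as if the consumer had clicked the footer link.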

In many cases, companies will need to engage a privacy vendor to assess the use of tracking technologies, categorize those technologies to determine which ones are used for marketing and advertising, and develop backend functionality to effect consumers’ opt out requests and honor GPC signals.

Address Obligations for Sensitive Personal Information

Companies should always understand what sensitive personal information they collect and how it is used. Companies that process information from or concerning children, consumer health, precise geolocation, or other high risk data sets should be particularly diligent in their analysis.

As a best practice, companies should collect sensitive personal information only when necessary, and when they understand the purpose(s) for its collection. Privacy policies must disclose all sensitive categories of personal information that are collected, and the corresponding purpose(s). If a company uses sensitive personal information for any secondary purpose(s) (i.e., purposes that are not “permissible” under the CPRA), it must provide consumers with a notice of their “right to limit” the use of their sensitive personal information, and explain how to make that choice. To do this, as noted above, companies must place a link in the website footer that reads “Limit the use of my sensitive personal information” or use the “Your Privacy Choices” link. The latter is an omnibus solution for consumers to exercise both their right to opt out of sales/shares and their right to limit. The link should direct consumers to a mechanism where they can exercise this right.
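The right-to-limit logic described above can be sketched as a simple gate. The purpose labels below are hypothetical illustrations (not CPRA-defined codes); a real implementation would map them to the company’s actual processing activities.

```python
# Hypothetical purpose labels roughly tracking the CPRA's permissible purposes.
PERMISSIBLE_PURPOSES = {
    "security",          # preventing security incidents
    "fraud_prevention",  # resisting fraudulent activity
    "physical_safety",   # ensuring the physical safety of others
    "quality_assurance", # maintaining product safety or quality
}

def may_use_sensitive_data(purpose: str, right_to_limit_exercised: bool) -> bool:
    """Gate each use of sensitive personal information on the consumer's choice.

    Once a consumer exercises the right to limit, only permissible
    purposes remain available; secondary uses (e.g., advertising) must stop.
    """
    if purpose in PERMISSIBLE_PURPOSES:
        return True
    return not right_to_limit_exercised
```

In the ride share example above, a check like this sitting between the app and an advertising SDK would block precise geolocation from flowing to the SDK once the consumer limits use.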

Privacy vendors and counsel can help audit and categorize uses of sensitive personal information and develop mechanisms to comply with the right to limit.

Conduct Due Diligence for Vendors

Companies should conduct due diligence around their use of vendors, including tracking technology and clean room providers. Due diligence includes ensuring vendor contracts contain appropriate terms and restrictions around data use, reviewing code and platform functions and configurations for vendor technology, and considering vendor reputation. To the extent possible, companies should also understand data flows through vendor technology.

In some instances, use of specific types of technology may pose unreasonable risk to a company. For example, as noted above, the use of session replay technology and interactive website chatbots has led to significant litigation in the past year. Companies may consider discontinuing their use of these technologies until the litigation landscape in California becomes more clear.

Evaluate Data Broker Requirements

Many companies in the ad tech space can be classified as data brokers under existing California law, and are already subject to a number of obligations. The Delete Act builds upon these obligations. In order to comply today, and prepare for the Delete Act, companies should evaluate their obligations under data broker law, including the registration and annual fee requirements.

While data brokers–like all companies subject to CPRA–already are required to honor a consumer’s right to delete personal information, data brokers should start considering how they will address new obligations under the Delete Act, including deletion requests via the state’s forthcoming centralized deletion portal as well as new reporting requirements. The practical impact of the deletion portal may be that data brokers treat requests as opt-outs, effectively creating a centralized opt-out mechanism.

Conduct Data Protection Impact Assessments

In ad tech, conducting data protection impact assessments (“DPIAs”) has emerged as a crucial step toward responsible data handling. DPIAs serve as a systematic evaluation of the potential risks and impacts associated with the processing of personal information, especially when the data is sensitive. By undertaking these assessments, companies can identify and mitigate risks before they escalate, ensuring that both their operations and data practices align with regulatory expectations, new statutes such as the AADC, and best practices. Moreover, regularly conducting DPIAs signals to stakeholders and consumers that the company is proactive and committed to safeguarding personal information. DPIAs can also help address the reasonable expectation test under CPRA.

Companies should complete DPIAs for every new processing operation involving personal information that presents a potential heightened risk to the consumer, which includes targeted advertising. As the requirements and processing activities that merit DPIAs take shape, companies should engage counsel to assist with their drafting.

Develop a Data Governance Framework

A “data governance framework” is a structured approach to managing and ensuring the accuracy, consistency, usability, security, and availability of a company’s data assets. It consists of policies and procedures developed by the company, and should take into account all the suggestions in this article, and more. With the evolving landscape of privacy laws and increased regulatory scrutiny, implementing a framework has become imperative to demonstrate compliance with the law and to help stakeholders understand and address their obligations within the company.


Daniel M. Goldberg is Chair of the Privacy & Data Security Group and Chair of the Advertising Technology Group at Frankfurt Kurnit. Based in California, he is consistently recognized as one of the nation’s leading data lawyers and voices on California privacy law. He routinely advises on matters involving advertising technology and artificial intelligence (AI), and is known for his ability to translate complex technical and legal concepts into actionable items. Please see his full bio at

Bram Schumer is an associate in the Privacy & Data Security Group at Frankfurt Kurnit. He helps clients comply with the array of federal and state privacy laws and platform obligations, including relating to use of advertising technology. He negotiates complex data-driven deals, such as those involving clean rooms and media buys. Please see his full bio at

  2. forbestechcouncil/2023/08/29/adtech-market-is-boominghow-to-benefit-from-this-growth/?sh=54157f4b3bd4
  4. text.pdf
  5. 29_4745C8U6094V3K3O%2FCU_34-2023-80004106-CU-WM-GDS_10a66e19-7726-4167-bfca-5c1591881c5f8.pdf
  8. Ultimately, the District Court dismissed Javier’s case with prejudice, after many rounds of motions and amended complaints. Javier v. Assurance IQ, LLC, No. 20-CV-02860-CRB, 2023 WL 3933070 (N.D. Cal. June 9, 2023).
  9. The blue symbol is required under the CPRA regs to appear next to the “Your Privacy Choices” link in the website footer.
