Privacy Law

Key Takeaways from the 2026 California Lawyers Association Privacy Summit

By: Julie Oberweis

Disclosure: AI was used to assist with the drafting of this article.

The California Lawyers Association (CLA) Privacy Law Section hosted its fourth annual Privacy Summit on February 19–20, 2026, at the UCLA Luskin Conference Center in Los Angeles, drawing approximately 250 privacy professionals for two days of discussion on privacy enforcement, artificial intelligence, litigation developments, and operational compliance.

The summit brought together regulators, in-house counsel, outside practitioners, technologists, and academics. The prevailing sense among them was that data protection has entered a new phase. The debates over emerging statutes and regulatory frameworks that once dominated these gatherings have given way to a more pressing question: whether privacy programs work in practice, not just on paper.

Summit Awards and Recognitions

California Privacy Lawyer of the Year: Liz Travis Allen

During the Summit, the Privacy Law Section recognized Liz Travis Allen as the 2026 California Privacy Lawyer of the Year with her husband and daughter in attendance to cheer her on. Allen, who serves at the California Privacy Protection Agency, was honored for her work implementing the Delete Act. This included developing the Delete Request and Opt-Out Platform (DROP), a centralized system that allows California consumers to submit a single request directing registered data brokers to delete their personal information. The DROP system represents one of the most ambitious operational privacy rights infrastructures yet implemented in the United States. The initiative reflects California’s continued leadership in developing practical mechanisms to operationalize consumer privacy rights.

Inaugural Law Student Writing Competition

The Summit also celebrated the winner of the Privacy Law Section’s Inaugural Law Student Writing Competition, which invited California law students to submit essays addressing emerging issues in AI and privacy law.

The winning entry was authored by Derek Song, a third-year student at UCLA School of Law, for his essay titled “Fault Without Proof: Comparative Responsibility for Statutory Privacy Violations.” The essay explores how courts might adapt comparative fault principles from products liability law to privacy violations involving multiple actors. As modern AI systems increasingly distribute decision-making across vendors, deployers, and other entities in complex technical supply chains, the essay argues that privacy liability frameworks should evolve to reflect shared responsibility among actors who design, control, and profit from data processing activities.

Panel Highlights

State Regulators Signal a More Coordinated and Mature Enforcement Era

The Summit has built a reputation for drawing regulators into the room and giving participants something relatively rare: a direct line to the people doing the watching and a candid sense of what concerns them most.

Across two panels featuring regulators from California, Colorado, Oregon, Maryland, and New Jersey, a clear message emerged: state privacy enforcement is moving from experimentation to coordination and execution.

Regulators repeatedly pushed back on the common characterization of U.S. privacy law as a chaotic “patchwork.” In practice, they said, the states communicate frequently and increasingly coordinate both policy and enforcement through mechanisms such as the Consortium of Privacy Regulators, a bipartisan collaboration among jurisdictions with comprehensive privacy laws. While statutory nuances remain, the baseline expectations for companies—transparency, honoring consumer rights, and responsible data use—are largely aligned. Assistant Attorney General Kashif Chand of New Jersey even lightheartedly offered the “Kash Ick Factor” as a test: if a data policy or process just feels wrong, it probably is.

A major area of focus across these panels, and for the FTC as well, is children’s and teen privacy. Legislatures and regulators are expanding protections and tightening expectations around age-related safeguards.

Regulators also signaled a shift in enforcement posture. Cure periods have expired or are expiring. That transition reflects a broader view among regulators that companies have had sufficient time to understand the rules. Going forward, enforcement decisions will increasingly hinge on the seriousness of violations and whether organizations have made good-faith efforts to build compliant privacy programs. They suggested that organizations will need to “show their work” going forward.

A separate panel on consumer health data enforcement underscored this shift, discussing recent investigations and the proposed Healthline settlement, which signals a broader interpretation of sensitive personal information and heightened expectations around meaningful consumer consent for health-related data sharing.

CalPrivacy’s Tom Kemp: Building the Infrastructure of Privacy Enforcement

In a fireside chat, Tom Kemp of the California Privacy Protection Agency (CPPA), newly nicknamed CalPrivacy, emphasized that California is investing heavily in the operational infrastructure needed to support long-term privacy enforcement.

Key initiatives include the development of the DROP platform and support for implementing universal opt-out preference signals—capabilities that browser vendors will be required to support by 2027. The agency is also expanding its audit division, which will oversee cybersecurity audits and certification processes, signaling a more proactive and systemic approach to oversight.

Taken together, the message from regulators was clear: the next phase of U.S. privacy enforcement will be defined less by new laws and more by coordination, operational capability, and active oversight.

Broader Themes from the 2026 Privacy Summit

Beyond the regulator discussions, the rest of the conference offered a broader look at where privacy law and practice are heading. Across panels on AI governance, ad tech, litigation, cybersecurity, and emerging legislation, a consistent message emerged: privacy compliance is continuing to mature. Former regulators pushed back on the earlier regulator panels’ suggestion that privacy laws are not a “patchwork,” arguing that the inconsistencies are real and create genuine challenges. Either way, organizations must push past interpretation to build governance structures capable of managing increasingly complex technological and regulatory risks.

Several themes surfaced repeatedly throughout the sessions.

Privacy Governance Is an Operational Discipline

One of the clearest signals from the conference was that privacy compliance is no longer simply a matter of policy drafting or legal interpretation. Increasingly, it requires operational systems capable of managing risk across products, data flows, and technologies.

This theme ran through the AI governance panel, which explored the rapidly evolving regulatory landscape for artificial intelligence. Speakers noted that organizations are facing a growing mix of obligations, from California’s forthcoming automated decision‑making rules to the EU AI Act and a wave of state‑level AI proposals. Trying to track each new requirement individually, several panelists suggested, is likely to prove unsustainable.

Instead, companies are being pushed toward governance frameworks that can adapt as regulation evolves. Structured risk assessments, internal review processes, and clear documentation of how systems are designed and deployed are quickly becoming baseline expectations.

Advertising Data Practices Face Growing Pressure

The ad tech panel highlighted how the ecosystem is facing pressure from multiple directions at once. State privacy laws continue to expand consumer rights around data sharing and targeted advertising. Children’s data is and will be particularly targeted for new regulation and enforcement. A dedicated session on children’s privacy compliance explored how the updated 2025 COPPA Rule and a growing wave of state age-assurance laws are raising the bar for data security, third-party disclosures, and data retention. At the same time, developers are being pushed toward “actual knowledge” of users’ ages—creating an increasingly complex compliance landscape and highlighting the tension between age-verification requirements and broader data-minimization principles.

Generally, regulators are paying closer attention to how advertising data moves through complex networks of partners and vendors. At the same time, plaintiffs’ lawyers have been targeting the same practices through litigation. The privacy litigation update panel examined the continued surge of lawsuits under California’s Invasion of Privacy Act (CIPA), many of which challenge the use of common website technologies such as session‑replay tools, chatbots, and analytics platforms.

New legislation is adding another layer of complexity. The California Delete Act session explored how the law’s centralized deletion mechanism (DROP) could reshape compliance obligations for data brokers and the broader advertising ecosystem.

Data Minimization Is Emerging as a Core Enforcement Principle

The panel on the “necessary and proportionate” standard examined how regulators are beginning to interpret this language in a more substantive way. Recent enforcement activity suggests regulators may increasingly scrutinize whether companies truly need the data they collect in order to provide a particular service. Honda was mentioned several times as a leading example, having required users to provide their VIN just to opt out of data sharing.

Several panelists noted that this concept closely resembles the purpose‑limitation principle embedded in the GDPR, though with an additional focus on consumer expectations. Regulators may increasingly ask whether a company’s data practices align with what an ordinary user would reasonably anticipate when interacting with a service.

Privacy, Security, and National Security Are Converging

Cybersecurity discussions added another dimension to the conference’s broader themes. As digital threats grow more sophisticated, the boundaries between privacy compliance and security risk management are becoming harder to separate.

The panel on data security strategies for 2026 examined how emerging technologies—including AI‑enabled attacks and automated threat tools—are changing the threat landscape. For privacy professionals, the implication is that technical literacy is increasingly important.

The session on the Department of Justice’s Bulk Sensitive Data Rule also explored how national security concerns are increasingly shaping privacy compliance, particularly for organizations that rely on global vendors, cloud infrastructure, or international workforces.

Data Protection Is Entering Its Next Phase

Taken together, the conference conversations reflected a field that is evolving quickly. The early years of privacy regulation were largely defined by the passage of new laws and the creation of consumer rights frameworks. The next phase appears likely to focus on implementation—a shift from drafting the blueprint to turning on the lights and seeing whether the building actually works.

For organizations, this means building privacy programs that operate not just on paper but in practice—supported by governance structures, technical understanding, and clear accountability, and ready to “show their work” through testing.
