Privacy Law
AI and Privacy: A Guide to California’s Recently Passed Legislation
By: Afshan Bhatia, Anokhy Desai, Kewa Jiang, and Hina Moheyuddin
In the corridors of the Capitol, California legislators are reshaping the boundaries of privacy and artificial intelligence. This article reviews 11 key bills that confront a wide spectrum of contemporary challenges, from workplace surveillance to automated decision systems that arbitrate human opportunity. It is critical for businesses, developers, consumers, and privacy professionals alike to understand these bills because they signal how California is setting nationwide standards for transparency, accountability, and ethical technology use.
The California legislative session concluded on September 12, 2025, marking the end of the first year of a two-year legislative cycle. The bills that passed both the Senate and the Assembly will now be considered by Governor Gavin Newsom, who has until October 12, 2025, to sign them, veto them, or take no action. The legislature will reconvene on January 5, 2026.
AB 322: Precise Geolocation Data (Ward)
By: Afshan Bhatia
Status: As of August 29, 2025, held under submission.
AB 322 would amend the California Consumer Privacy Act (CCPA) to prohibit the sale, trade, or lease of precise geolocation data. The bill advanced through initial committee hearings but was held under submission in the Senate Appropriations Committee in August 2025, effectively stalling the measure.
AB 322, authored by Assemblymember Chris Ward, would expand the CCPA by imposing strict new limitations on how businesses collect and handle precise geolocation information. The bill mandates that when such data is collected, the business must prominently notify the consumer, detailing what type of precise location data is gathered, how it is processed, for which goods or services it is used, and to whom it may be disclosed. It also limits retention: businesses may retain the data only as long as necessary to provide the requested services or, at most, one year after the consumer’s last intentional interaction, whichever is earlier.
AB 566: California Opt-Out Act (Lowenthal)
By: Afshan Bhatia
Status: As of September 12, passed both the Senate and Assembly. Awaiting the Governor’s signature.
AB 566, authored by Assemblymember Josh Lowenthal, seeks to strengthen enforcement of consumer privacy rights under the California Consumer Privacy Act (CCPA) by requiring web browsers to include a built-in setting that allows users to send a universal opt-out preference signal. Businesses that receive the signal are required to honor it, preventing the sale or sharing of the consumer’s personal information without consent.
AB 566 requires data brokers to honor deletion requests for all personal information they have collected, regardless of whether it was obtained directly or indirectly. The bill further extends this requirement to future collections, mandating that data brokers continue to delete information about the same consumer unless a statutory exception applies. By tightening these obligations, AB 566 aims to ensure that Californians have stronger and more enforceable control over their personal data, advancing the state’s leadership in privacy protection.
If signed by the Governor, AB 566 would take effect beginning January 1, 2027.
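AB 566 does not itself prescribe a technical format for the signal; in practice, the most widely deployed opt-out preference signal is the Global Privacy Control (GPC), which participating browsers transmit as a Sec-GPC: 1 request header. The TypeScript sketch below illustrates, under that assumption, how a receiving business might detect the header; the consumer identifier and the opt-out store are hypothetical details added for illustration.

```typescript
// Minimal sketch, assuming the Global Privacy Control "Sec-GPC: 1" header as the
// opt-out preference signal; not a statement of what AB 566 itself requires.
import express from "express";

const app = express();

// Hypothetical record of consumers who have opted out of sale/sharing.
const optedOut = new Set<string>();

app.use((req, _res, next) => {
  const gpc = req.header("Sec-GPC");              // "1" when the signal is enabled
  const consumerId = req.header("X-Consumer-Id"); // hypothetical identifier
  if (gpc === "1" && consumerId) {
    // Treat the signal as a valid request to opt out of the sale or sharing
    // of this consumer's personal information.
    optedOut.add(consumerId);
  }
  next();
});

app.listen(3000);
```

On the client side, browsers that support GPC also expose the signal to scripts as navigator.globalPrivacyControl, which sites can check before loading sale- or share-related trackers.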
AB 1018: Automated Decisions Safety Act (Bauer-Kahan)
By: Kewa Jiang
Status: As of September 12, ordered to inactive file but proceeds as a two-year bill.
AB 1018 seeks to regulate how covered automated decision systems (ADS) are developed and deployed to make, or to facilitate making, consequential decisions about a natural person. “Consequential decisions” include, among others, decisions about employment, housing, access to public utilities, and legal services. The bill requires developers to conduct an impact assessment of a covered ADS, to contract with an independent third-party auditor to assess compliance beginning January 1, 2030, and to disclose specific information to potential deployers if a covered ADS is sold, licensed, or otherwise transferred. The impact assessment must include information about developer-approved uses, the accuracy and reliability of the ADS, and whether disparate treatment or impact has occurred or is likely to occur. The bill requires deployers to retain relevant information about the ADS for as long as it is in use plus an additional five years, to designate at least one employee to oversee compliance, and to provide plain-language disclosures to individuals before and after the use of a covered ADS (e.g., the right to object to or correct decisions, whether a human was in the loop, and how to appeal an outcome).
AB 1018 is similar to AB 2930, which Assemblymember Rebecca Bauer-Kahan also introduced during the 2024 legislative session but which ultimately failed to pass. AB 1018 differs in that it applies more broadly than AB 2930, which covered only the use of ADS in the employment context. Generally, both bills align with the California Civil Rights Council’s newly approved regulations to prevent employment discrimination through the use of artificial intelligence and related tools.
AB 1043: Digital Age Assurance Act (Wicks)
By: Kewa Jiang
Status: As of September 12, passed both the Senate and Assembly. Awaiting the Governor’s signature.
AB 1043 aims to protect children on the internet by requiring operating system providers and app developers to verify the age of users. The bill’s author, Assemblymember Buffy Wicks, believes AB 1043 “avoids constitutional concerns by focusing strictly on age assurance, not content moderation.” Assemblymember Wicks also authored the California Age-Appropriate Design Code Act, which was challenged on constitutional grounds by NetChoice, a trade association of online and eCommerce businesses. The lawsuit is currently pending before the Ninth Circuit Court of Appeals.
The bill requires operating system providers to display an accessible interface during account setup that allows the account holder, an individual at least 18 years old or the parent or legal guardian of a user under 18, to indicate the age of the user. The bill defines a “user” as a child under the age of 18 who is the primary user of the device. App developers are similarly required to verify the age of a user by requesting a signal, defined as age bracket data (e.g., under 13 or between 13 and 16), from an operating system provider or a covered app store. The developer must comply with applicable laws based on the signal or, if the developer has clear and convincing information that a user’s age is different from the provided signal, must use that information as the primary indicator of the user’s age.
To minimize data sharing, operating system providers may share only the minimum amount of information necessary with app developers and must not share signal information with a third party for purposes other than age assurance. Likewise, an app developer is prohibited from requesting more information from an operating system provider or app store than is necessary for age assurance and from sharing the signal information with a third party for other purposes. The Attorney General enforces these provisions, and violations can result in monetary penalties.
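As an illustration of the signal-and-override logic described above, the following TypeScript sketch models how a developer might reconcile the age-bracket signal received from an operating system provider or app store with its own information. The type names, bracket values, and function are hypothetical; AB 1043 does not prescribe an API.

```typescript
// Illustrative sketch only; bracket values and names are placeholders, not the
// bill's definitions.
type AgeBracket = "under13" | "13to16" | "over16";

interface AgeSignal {
  bracket: AgeBracket;                 // age bracket data shared for age assurance
  source: "osProvider" | "appStore";
}

// Roughly AB 1043's rule: rely on the provided signal unless the developer has
// clear and convincing information that the user's age differs, in which case
// that information becomes the primary indicator.
function resolvePrimaryAgeIndicator(
  signal: AgeSignal,
  clearAndConvincingBracket?: AgeBracket
): AgeBracket {
  return clearAndConvincingBracket ?? signal.bracket;
}

// Example: the OS signals "13to16", but the developer has reliable information
// that the user is under 13.
const bracket = resolvePrimaryAgeIndicator(
  { bracket: "13to16", source: "osProvider" },
  "under13"
);
console.log(bracket); // "under13"
```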
AB 1064: Leading Ethical AI Development (LEAD) for Kids Act (Bauer-Kahan)
By: Hina Moheyuddin
Status: As of September 12, passed both the Senate and Assembly. Awaiting the Governor’s signature.
AB 1064 establishes a framework for regulating artificial intelligence systems used by children, in an effort to protect minors’ safety. The bill’s author, Assemblymember Rebecca Bauer-Kahan, believes children are increasingly exposed to AI technologies that pose risks to their mental health, social development, and safety, which AB 1064 aims to address by prohibiting the most harmful applications.
AB 1064 prohibits operators from making chatbots available to children if the chatbots are foreseeably capable of: encouraging self-harm, suicidal ideation, violence, drug or alcohol use, or disordered eating; providing unsupervised mental-health therapy or discouraging professional help; promoting harm to others, illegal activity, or child sexual abuse material; engaging in sexually explicit interaction; excessively validating user preferences at the expense of safety; or optimizing for manipulative engagement.
To promote regulatory uniformity, the bill draws on a New York law addressing suicide and self-harm. A “companion chatbot” is defined as a generative AI system with a natural language interface that simulates a sustained humanlike relationship by (1) retaining user information and preferences to personalize interactions; (2) asking unsolicited, emotion-based questions beyond direct prompts; and (3) sustaining ongoing dialogue about personal matters. Excluded are customer service bots, research tools, and internal productivity systems.
AB 1064 adopts a risk-tiered classification system for AI products (prohibited, high-risk, moderate, low), mandates consent requirements for using children’s personal data in training, and creates transparency, audit, and risk-assessment obligations.
A Standards Board will finalize regulations by 2028. Violations carry civil penalties of up to $25,000 and are subject to Attorney General enforcement as well as a private right of action.
SB 690: Crimes: Invasion of Privacy (Caballero)
By: Anokhy Desai
Status: Advanced to Assembly Privacy and Consumer Protection Committee as a two-year bill.
Introduced in February, SB 690 passed the California Senate unanimously on June 3, 2025. The bill would amend the California Invasion of Privacy Act (CIPA) to exempt businesses’ use of web tracking technologies like cookies and pixels if they are used for “a commercial business purpose.” The bill defines that term as “the processing of personal information either performed to further a business purpose or subject to a consumer’s opt-out rights.” The “or” language does not, however, exempt businesses from complying with the California Consumer Privacy Act’s requirements on data collection, processing, and retention. Notably for CIPA litigators, the bill limits organizational liability by specifying that neither “pen register” nor “trap and trace device” includes a device or process used for a commercial business purpose.
While SB 690 passed with ease in the state Senate, it met hurdles in the Assembly and is now marked as a two-year bill. The state’s legislative sessions run in two-year cycles, which gives bills that did not pass in the first year a chance to be reconsidered and amended in the second year of the session. This status change, combined with the fact that the bill no longer applies retroactively, pushes any decline in CIPA actions out to 2027 at the earliest, if the bill passes at all.
SB 238: Workplace Surveillance Tools (Smallwood-Cuevas)
By: Hina Moheyuddin
Status: As of July 16, passed the Senate but failed to advance out of the Assembly Privacy and Consumer Protection Committee.
SB 238 increases transparency around workplace surveillance by requiring employers to report annually what surveillance tools they use, along with details about data collection and worker choices.
“Employer” includes both private and public employers, including the state, counties, municipalities, school districts, special districts, and labor contractors. “Worker” means a natural person (or their authorized representative) who is a job applicant, employee, or independent contractor working for or through a business or governmental entity. “Workplace surveillance tool” means any system, application, or device that collects or helps collect worker data (or related communications, activities, biometrics, or behaviors) by means other than a person’s direct observation. This includes, for example, audio and visual surveillance, geolocation, continuous time-tracking tools, biometric tools, and electromagnetic, photoelectronic, or photo-optical means.
SB 238 requires employers to annually notify the California Department of Industrial Relations (DIR) of all the workplace surveillance tools being used. If the employer was already using surveillance tools before January 1, 2026, there is a special deadline (before February 1, 2026) to provide the first notice. Once DIR receives the notice, it must publish it on the department’s website within 30 days.
SB 420: Automated Decision System (Padilla)
By: Hina Moheyuddin
Status: As of June 9, jointly referred to the Assembly Privacy and Consumer Protection and Judiciary Committees, but failed to advance out of committee.
SB 420 aims to regulate “high-risk” automated decision systems (ADS) used by both public and private actors, particularly when decisions made by AI or automation have serious legal or material effects on people’s lives (e.g., employment, housing, health care, credit).
“Automated decision system” means a system that uses machine learning, statistical modeling, data analytics, or AI to issue outputs (such as a score, classification, or recommendation) that assist or replace human discretionary decision-making and materially impact people. “High-risk” covers ADS used in areas such as education, employment, housing, essential utilities, healthcare, and legal rights.
SB 420 requires developers of high-risk ADS to conduct impact assessments before first making a system publicly available on or after January 1, 2026, and again when substantially modifying it. Deployers also have assessment obligations in many cases. Individuals subject to decisions made by a high-risk ADS must be notified that an ADS is being used, what decision it informs, and what kinds of data are involved, and they must be given information about how the ADS works. Where technically feasible, affected individuals should have the opportunity to appeal the decision to a human reviewer. Developers and deployers must maintain governance programs with administrative and technical safeguards to manage foreseeable risks, especially algorithmic discrimination (i.e., risks of unfair or biased outcomes against protected classes).
The Attorney General and the California Civil Rights Department have enforcement authority. Violations may result in civil actions, civil penalties (which vary by the size of the entity), or injunctive relief. However, there is a cure period: entities given notice have 45 days to cure certain violations, and if they cure and provide a written statement under penalty of perjury, they may avoid an enforcement action.
SB 361: Data Brokers: Data Collection and Deletion (Becker)
By: Afshan Bhatia
Status: As of September 12, passed both the Senate and Assembly. Awaits the Governor’s signature.
SB 361, introduced by Senator Josh Becker, expands the Delete Act by increasing data broker transparency requirements. The bill requires data brokers to disclose the sensitive categories of data they collect, such as account logins, government identification numbers (e.g., Social Security or driver’s license numbers), and citizenship and immigration status, when they register with the California Privacy Protection Agency (CPPA). Data brokers are also required to report whether, in the past year, they have sold or shared consumer data with specified parties, including foreign adversaries, federal and state governments, law enforcement, or developers of AI systems.
SB 361 also reinforces consumer privacy by mandating that data brokers access the deletion mechanism established by the CPPA at least once every 45 days to review consumer opt-out and deletion requests.
SB 771: Social Media Platforms Endangering Californians (Stern)
By: Afshan Bhatia
Status: As of September 12, passed both the Senate and Assembly. Awaits the Governor’s signature.
SB 771 aims to hold large social media platforms (defined as those with annual gross revenues exceeding $100 million) liable if their algorithms or other actions aid, abet, or conspire in violations of established civil rights, hate crime, or harassment laws, or if the platforms act as joint tortfeasors in such violations. Under the bill, intentional, knowing, or willful violations can result in civil penalties of up to $1,000,000, while reckless violations can incur penalties of up to $500,000, with penalties potentially doubled if the platform knew or should have known that the victim was a minor.
SB 771 also explicitly provides that a large social media platform’s deployment of an algorithm that relays content is an act separate from the message of the content itself, making the platform independently liable, and that platforms are deemed to have actual knowledge of how their algorithms operate.
If signed by the Governor, the bill becomes operative on January 1, 2027.
SB 435: Sensitive Personal Information (Wahab)
By: Afshan Bhatia
Status: As of July 16, failed to advance out of the Assembly Privacy and Consumer Protection Committee but proceeds as a two-year bill.
SB 435 proposes to strengthen California privacy protections by amending the California Consumer Privacy Act (CCPA) to eliminate the current exemption that excludes “publicly available” sensitive personal information from being treated as sensitive.
Currently, data that appears in government records or widely distributed media, or that was disclosed by an individual without restricting it to a particular audience, is categorized as “publicly available” and thus falls outside the CCPA’s sensitive personal information safeguards.
SB 435 would remove this carve-out, ensuring that information such as Social Security numbers, immigration status, sexual orientation, genetic data, and geolocation retains full protection as sensitive personal information even when publicly accessible.