Privacy Law


CLA’s Privacy Law Section summarizes important developments in California privacy and beyond. 

The UK’s Age Appropriate Design Code

By Brandon M. Jasso, CIPP/US

The Information Commissioner’s Office (“ICO”) is an independent body in the United Kingdom (“UK”) that oversees and upholds information rights (for more information see here). In particular, the ICO oversees the Data Protection Act 2018 (“DPA”), which came into force on May 25, 2018 (for an overview of the DPA see here). Pursuant to section 123 of the DPA, the ICO is required to prepare “a code of practice which contains such guidance . . . appropriate on standards of age-appropriate design of relevant information society services which are likely to be accessed by children,” referred to as the Age Appropriate Design Code (the “Code”).

The Code was issued on August 12, 2020, and took effect on September 2, 2021. To help businesses comply with the Code, the ICO released a comprehensive guide (“Guide”) (a copy of the Guide can be found here (pdf) or here (html)). Please note that all quotes and references below are from the Guide. A thorough reading of the Guide is highly recommended.

The Code’s Focus:

The ICO emphasizes that the purpose of the Code is to create a safe space for children to “learn, explore and play,” and that the Code is “not seeking to protect children from the digital world, but by protecting them within it.” Specifically, “[t]he focus is on providing default settings which ensures that children have the best possible access to online services whilst minimising data collection and use, by default.” Essentially, the Code seeks to provide protections that are in the best interests of children living in the UK by setting out 15 standards that reflect a risk-based approach, as follows:

  1. Best interests of the child: The best interests of the child should be a primary consideration when you design and develop online services likely to be accessed by a child.
  2. Data protection impact assessments: Undertake a DPIA to assess and mitigate risks to the rights and freedoms of children who are likely to access your service, which arise from your data processing. Take into account differing ages, capacities and development needs and ensure that your DPIA builds in compliance with this code.
  3. Age appropriate application: Take a risk-based approach to recognising the age of individual users and ensure you effectively apply the standards in this code to child users. Either establish age with a level of certainty that is appropriate to the risks to the rights and freedoms of children that arise from your data processing, or apply the standards in this code to all your users instead.
  4. Transparency: The privacy information you provide to users, and other published terms, policies and community standards, must be concise, prominent and in clear language suited to the age of the child. Provide additional specific ‘bite-sized’ explanations about how you use personal data at the point that use is activated.
  5. Detrimental use of data: Do not use children’s personal data in ways that have been shown to be detrimental to their wellbeing, or that go against industry codes of practice, other regulatory provisions or Government advice.
  6. Policies and community standards: Uphold your own published terms, policies and community standards (including but not limited to privacy policies, age restriction, behaviour rules and content policies).
  7. Default settings: Settings must be ‘high privacy’ by default (unless you can demonstrate a compelling reason for a different default setting, taking account of the best interests of the child).
  8. Data minimisation: Collect and retain only the minimum amount of personal data you need to provide the elements of your service in which a child is actively and knowingly engaged. Give children separate choices over which elements they wish to activate.
  9. Data sharing: Do not disclose children’s data unless you can demonstrate a compelling reason to do so, taking account of the best interests of the child.
  10. Geolocation: Switch geolocation options off by default (unless you can demonstrate a compelling reason for geolocation to be switched on by default, taking account of the best interests of the child). Provide an obvious sign for children when location tracking is active. Options which make a child’s location visible to others must default back to ‘off’ at the end of each session.
  11. Parental controls: If you provide parental controls, give the child age appropriate information about this. If your online service allows a parent or carer to monitor their child’s online activity or track their location, provide an obvious sign to the child when they are being monitored.
  12. Profiling: Switch options which use profiling ‘off’ by default (unless you can demonstrate a compelling reason for profiling to be on by default, taking account of the best interests of the child). Only allow profiling if you have appropriate measures in place to protect the child from any harmful effects (in particular, being fed content that is detrimental to their health or wellbeing).
  13. Nudge techniques: Do not use nudge techniques to lead or encourage children to provide unnecessary personal data or weaken or turn off their privacy protections.
  14. Connected toys and devices: If you provide a connected toy or device ensure you include effective tools to enable conformance to this code.
  15. Online tools: Provide prominent and accessible tools to help children exercise their data protection rights and report concerns.

How the Code Defines Child:

The Code adopts the definition of a child from the 1989 United Nations Convention on the Rights of the Child (“UNCRC”): all persons under 18 years old. The Code therefore applies broadly, covering the personal data of anyone under 18, a fact organizations must consider when adapting to the Code. The ICO considered the needs and capacities of children at different ages: older children will be able to make more informed decisions for themselves, while younger children will require more guidance and support.

Who the Code Applies to:

The Code applies to information society services (“ISS”), meaning providers of online products or services such as apps, websites, games, community environments, and connected toys or devices “that process personal data and are likely to be accessed by children in the UK.” The Code aligns itself with the UK General Data Protection Regulation (“UK GDPR”), which came into effect on January 1, 2021, in that the “code sets out practical measures and safeguards to ensure processing under the UK GDPR can be considered ‘fair’ in the context of online risks to children,” and will help with compliance under Article 5.

Failure to Comply and How to Use the Code

Failure to comply with the Code will likely make it difficult to prove that “your processing is fair and complies with the GDPR and [Privacy and Electronic Communications Regulations (“PECR”)].” The ICO notes that if a child’s personal data is processed in breach of the GDPR or PECR, it can take action against an ISS. These actions include “assessment notices, warnings, reprimands, enforcement notices and penalty notices (administrative fines).” The fines for serious breaches can be up to “€20 million (£17.5 million when the UK GDPR comes into effect) or 4% of your annual worldwide turnover, whichever is higher.”
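For rough planning purposes, the “whichever is higher” ceiling quoted above can be sketched in a few lines of Python. This is a hypothetical illustration only, using the £17.5 million figure from the Guide; any actual fine is set by the ICO on the facts of the case, not by a formula.

```python
def max_fine_gbp(annual_worldwide_turnover_gbp: float) -> float:
    """Illustrative UK GDPR fine ceiling for serious breaches:
    the statutory cap (17.5 million GBP) or 4% of annual worldwide
    turnover, whichever is higher. Hypothetical helper, not legal advice."""
    STATUTORY_CAP_GBP = 17_500_000
    return max(STATUTORY_CAP_GBP, 0.04 * annual_worldwide_turnover_gbp)

# For a company with 1 billion GBP turnover, 4% (40 million) exceeds the cap,
# so the turnover-based figure sets the ceiling.
print(max_fine_gbp(1_000_000_000))
```

For smaller businesses whose 4% figure falls below £17.5 million, the statutory cap is the operative ceiling.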

The ICO states that an ISS covered by the Code must implement the 15 standards. Conformity with the Code will be assessed against those standards, which must be implemented “to the extent they are relevant to your service.” This means the ICO will assess an ISS in light of the nature and amount of information it processes to determine whether it has complied with the Code.


It is recommended that parties subject to the Code complete a thorough review of their operations to verify compliance and make changes as necessary. The Guide provides a variety of tools in its appendix, including a flowchart to determine whether the Code applies to an ISS, a chart on age and development stages that an ISS can use to determine consent requirements, guidance on lawful bases for processing, and a data protection impact assessment template.

Companies outside of the UK should consider the Code’s practices and standards and potentially adopt them as best practices as more privacy and data laws are adopted around the world. In particular, United States-based practitioners should pay close attention to international privacy laws and regulations—like the Code—affecting children, as states look to what is happening internationally when adopting privacy and data laws to protect their citizens, including children.

FTC Previews Privacy and Cybersecurity Rulemaking Following Moves to Streamline Process

By Cody Venzke

On December 10, 2021, the Federal Trade Commission published an updated statement of its regulatory priorities, including rulemaking to address “abuses stemming from surveillance-based business models.” According to the Commission, the rulemaking may focus on “curbing lax security practices, limiting intrusive surveillance, and ensuring that algorithmic decision-making does not result in unlawful discrimination.”

The updated statement of priorities came as part of the administration-wide update of the Unified Regulatory Agenda and Regulatory Plan, led by the Office of Information and Regulatory Affairs. The Unified Agenda and Regulatory Plan are required by President Bill Clinton’s 1993 Executive Order 12866. Under the Executive Order, the Unified Agenda is a semiannual “agenda of all regulations under development or review” and must include “at minimum . . . a brief summary of the action, the legal authority for the action, any legal deadline for the action,” and agency contact information. While the Unified Agenda is descriptive of current and planned rulemaking, the Regulatory Plan requires agencies to describe their high-level “regulatory objectives and priorities and how they relate to the President’s priorities,” anticipated costs and benefits of regulatory actions, alternatives under consideration, and a statement of need for the regulations.

In its new statement of priorities, the Commission stated that it would also “explore whether rules defining certain ‘unfair methods of competition’ prohibited by section 5 of the FTC Act would promote competition and provide greater clarity to the market.” It cited examples provided in President Joe Biden’s Executive Order 14036, “Promoting Competition in the American Economy,” including non-compete clauses, surveillance, the right to repair, and unfair competition in online marketplaces. The corresponding proposal in the Commission’s updated Unified Agenda provided for an Advance Notice of Proposed Rulemaking (ANPRM) to be issued in February 2022.

An ANPRM is an administrative procedure under the Magnuson-Moss Warranty—FTC Improvements Act, Pub. L. No. 93-637, which prescribes additional procedures for the Commission to promulgate “trade regulation rules.” The Act’s provisions were added as Section 18 of the FTC Act, 15 U.S.C. § 57a, and include procedures such as publication of an ANPRM, public comment on the ANPRM and its submission to Congress, a mandatory oral hearing, a staff report, public comment on a proposed rule, and special rules for judicial review. The Commission has previously noted the burden of these procedures, stating, “Even under the best of circumstances, this would be a lengthy process.”

However, the Commission has recently taken steps to streamline those procedures. In March, then-Acting Chair Rebecca Slaughter established a new working group to “help build [the] Commission’s rulemaking capacity and agenda for unfair or deceptive practices and unfair methods of competition.” Further, in July, the Commission’s then-Democratic majority voted to approve changes to its procedural rules under Magnuson-Moss. The changes provide that the Commission Chair or her designee will preside over hearings, give the Commission more control over who may present at the hearings, and give the Commission an earlier opportunity to designate issues of material fact to be resolved in hearings. In a statement, the Commission described the changes as removing “extraneous and onerous procedures that serve only to delay Commission business”; in turn, Commissioners Christine S. Wilson and Noah Joshua Phillips stated in their dissent that the revisions would “undermine the goals of participation and transparency that Congress sought to advance when it enacted and amended Section 18.” According to the dissenting Commissioners, “These changes will facilitate more rules, but not better ones.”

Grindr Fined 65 Million NOK by Norwegian Data Protection Authority

By Natalie Marcell, CIPP/US

Grindr, headquartered in West Hollywood, California, has been fined 65 million NOK by Norway’s data protection authority, Datatilsynet, per an administrative fine decision issued December 13, 2021, for having disclosed personal data to advertising partners based on what were found to be invalid consents. The data sharing occurred from July 20, 2018 through April 7, 2020 and was limited to data subjects in Norway who were using the free version of the Grindr app. The data shared included advertising ID, IP address, information about the computing environment, self-reported age, gender, and location.

Grindr was found to have violated Articles 6(1) and 9(1) of the European Union’s General Data Protection Regulation (“GDPR”). Grindr’s prior consent mechanism, according to Datatilsynet, obtained invalid consents because they were not freely given, specific, informed and unambiguous. Additionally, Datatilsynet found that sharing personal data on a specific user alongside the Grindr app name or app ID qualified as sharing data concerning a person’s sexual orientation, a special category of data. Datatilsynet found that Grindr had not obtained explicit consent to share data concerning a person’s sexual orientation, in violation of Article 9(1).

Fines Imposed

On January 24, 2021, Datatilsynet issued an advance notification of its intent to impose a fine of 100 million NOK (approximately $11 million USD) against Grindr. After receiving Grindr’s reply, comments from the Norwegian Consumer Council (“NCC”), and further information from Grindr, the fine was ultimately adjusted to 65 million NOK (approximately $7.2 million USD). According to the decision, the reduction of 35 million NOK (approximately $3.8 million USD) was applied on account of Grindr’s revenue and because of the changes that Grindr has made to its consent mechanism.

Grindr’s Consent Mechanism

Grindr argued that its prior consent mechanism was compliant with the GDPR and that it collected a double consent requiring two positive actions.

Grindr’s prior consent mechanism incorporated a process where a notice at the Google Play Store or Apple App Store provided a link to Grindr’s full privacy policy along with information that the paid subscription to the app included no banner ads. When a user downloaded the app, they were presented with Grindr’s terms & conditions, which incorporated a link to Grindr’s full privacy policy. The user had to click “proceed” on the terms & conditions, which prompted a pop-up that they had to accept or cancel. If the terms & conditions were accepted, the user was then presented with Grindr’s full privacy policy. The user had to click “proceed” on the privacy policy, which prompted another pop-up where they had to either accept the privacy policy or cancel. The privacy policy presented was the full text version, and it included: links to “where we share” and “third party advertising companies,” an explanation of data sharing with advertising partners, instructions on how to disable location sharing through device settings, instructions on how to opt out of behavioral advertisements through device settings, and a table listing Grindr’s various purposes for processing user data, which included sharing data with advertising partners to show advertising on Grindr services based on the data provided, and personalized advertising. Since 2017, Grindr had provided users with the full text of its privacy policy, which was available for review via the app.

Datatilsynet’s Key Findings

Supervisory authorities are expected to follow EDPB Guidelines when enforcing the GDPR

Grindr argued that Datatilsynet cannot rely on the European Data Protection Board’s (“EDPB”) Guidelines as binding authority in assessing whether the consents Grindr obtained were valid. Datatilsynet indicated that the EDPB Guidelines were not the basis for its decisions in this case; rather, the Guidelines are used throughout the decision as interpretative aids to ensure consistent application of the GDPR. Notably, Datatilsynet specified that supervisory authorities are expected to follow EDPB Guidelines when enforcing the GDPR.

As to the timing of the EDPB Guidelines, Datatilsynet explained that the Guidelines on consent adopted on May 4, 2020 were a revision of the Article 29 Working Party Guidelines on consent, which had been adopted for the first time on November 28, 2017, revised on April 10, 2018, and endorsed by the EDPB on May 25, 2018. The EDPB Guidelines were revised in 2020 to provide guidance on cookie walls and scrolling, but the rest of the Guidelines remained unchanged from the prior version. As such, the guidance on the notion of consent that was available from July 2018 through April 2020, when Grindr’s prior consent mechanism was in use, was identical to that issued by the EDPB in 2020.

Consents collected by Grindr were invalid under Article 6(1) because they were not freely given, specific, informed, or unambiguous

Grindr had different purposes for processing data. Datatilsynet found that the consents collected were not freely given because Grindr did not allow for separate consents to be given for the separate purposes of processing data. Furthermore, Grindr’s prior consent mechanism bundled the consents to sharing personal data with advertising partners with acceptance of the privacy policy as a whole. This bundling meant that in addition to not being freely given, the consents were also not specific. 

While Grindr emphasized that it had provided data subjects with information specific to each of the purposes of data processing before obtaining their consent, the Datatilsynet explained that this is insufficient if the data subject is not allowed to give separate consent to different processing operations. In terms of providing information to data subjects, Grindr provided users the full text of its privacy policy. Datatilsynet noted that Grindr’s privacy policy in effect from December 31, 2019 listed 25 processing purposes. The privacy policy effective before December 31, 2019 contained 3,793 words, and the privacy policy in effect after December 31, 2019 contained even more. Overall, data subjects were presented with large amounts of information at once and they were asked to accept all of it. While Grindr argued that users were not nudged to consent, Datatilsynet found this practice of presenting a large amount of information at once followed by a request to accept it all essentially nudged data subjects to proceed without actually familiarizing themselves with all of the information that was provided. Ultimately the information was not presented in an easily accessible form to enable data subjects to make an informed decision of whether or not to grant consent.

As to ambiguity, Datatilsynet found that clicking “accept” or “I accept the privacy policy” was not an unambiguous consent because it was unclear to data subjects that pressing either button entailed giving consent to sharing their data with advertising partners for behavioral advertising.

Sharing personal data alongside Grindr’s app name or app ID is equivalent to sharing data concerning a person’s sexual orientation, placing Grindr within the requirements of Article 9

Under Article 9 of the GDPR, in order to lawfully process special categories of data, the controller must fulfil one of the exemptions of Article 9(2) in addition to having valid consent pursuant to Article 6(1). Of relevance in this case were the exemptions of explicit consent and of data subjects manifestly making the personal data public.

Datatilsynet concluded that Article 9 does not require disclosure of the data subject’s specific sexual orientation. Datatilsynet further concluded that information that a data subject is a Grindr user is data “concerning” the data subject’s sexual orientation within the context of Article 9.

Grindr argued that it did not share data concerning a user’s sexual orientation and that the fact that a data subject is a Grindr user does not qualify as data concerning a person’s sexual orientation. In its investigation, Datatilsynet found that OpenX, Grindr’s processor, pulled the description of Grindr’s app from the online store and attached keywords such as “gay”, “bi”, “trans” and “queer” to ad calls. These keywords were not generated or shared by Grindr; they were generated by the OpenX software development kit (SDK). While Datatilsynet agreed that the keywords on different sexual orientations are general and described the app, not a specific data subject, Datatilsynet concluded that the sharing of personal data alongside the app name, app ID or the keywords describing the app qualifies as sharing data concerning a person’s sexual orientation. Datatilsynet reasoned that Grindr is not intended to be used by cis men looking to interact with cis women and vice versa; Grindr explicitly targets data subjects belonging to a sexual minority through its marketing; public perception is that being a Grindr user indicates that the data subject belongs to a sexual minority; and the disclosure of information on a data subject alongside the fact that the data subject is a user of Grindr, or the keywords, strongly indicates to the recipient that the data subject belongs to a sexual minority.

Datatilsynet rejected Grindr’s argument that ad tech companies have devised blinding methods to obfuscate which app an ad call is coming from, and that participants in the ad tech ecosystem likely receive only a “blinded” app ID rather than the corresponding app name, leaving downstream bidders blind to the actual name of the app where the ad is to be served. Controllers cannot rely on the actions of advertising partners or other participants in the ad tech ecosystem to cure their own sharing of data. Regardless, Datatilsynet received a Mnemonic technical report from the NCC which showed that the Grindr app name was shared with Twitter’s MoPub, who further shared this within their network, and the app name was also shared from Grindr to multiple other advertising partners. Moreover, even if the app name or app ID was actually blinded, the recipient could still receive keywords relating to the Grindr app, as evidenced by OpenX appending keywords in ad calls.

Grindr also argued that by being a user of Grindr, the data subject has manifestly made data concerning their sexual orientation public. Datatilsynet disagreed, finding that there is a distinct difference between making information available to a community of peers on the Grindr platform and making the information available to the public.

As Grindr was found to have collected invalid consents under Article 6(1), the sharing of any special categories of data was unlawful irrespective of Article 9. 

Exceeding industry practices does not ensure compliance

Grindr argued that its privacy practices exceeded the industry standard. Grindr pointed out that companies such as Tinder and Match present a link to their privacy policy, not the entire text of the privacy policy; Grindr, by contrast, has displayed its entire privacy policy since 2017. Grindr also stated that although it was industry practice to bundle consent to privacy practices with general terms and conditions, Grindr separated consent to its privacy policy from acceptance of its general terms and conditions.

Ultimately, exceeding industry practices is not sufficient to ensure compliance. Datatilsynet explained that even if Grindr exceeded industry practices Grindr could not demonstrate that data subjects had given informed consent that was freely given, specific, and unambiguous.

Aggravating and mitigating factors relevant to Datatilsynet’s administrative fine

The aggravating factors in the Datatilsynet’s decision included:

  • Duration of the infringement: The GDPR entered into force in Norway on July 20, 2018. From that point in time until Grindr launched its new consent mechanism in the European Economic Area (“EEA”) on April 8, 2020 (21 months), Grindr lacked a valid legal basis for disclosing personal data of free app users to advertising partners.
  • Datatilsynet considered Grindr’s infringements to be intentional: Grindr, through the board members or executives acting on its behalf, was responsible for the prior consent mechanism, and that consent mechanism was not compliant with the GDPR’s requirements. Datatilsynet explained that businesses and the responsible persons acting on their behalf need to examine what legal requirements apply to their field and implement them accordingly. Guidance on the relevant consent requirements from the time in question was available from the Article 29 Working Party Guidelines endorsed by the EDPB.
  • Grindr did not sufficiently take responsibility: According to the Datatilsynet, Grindr lacked control of the data flow and recipients, and had limited or no control over subsequent processing.
  • Categories of personal data affected: The type of data shared included special categories of personal data and GPS location. GPS location is particularly revealing of the life habits of data subjects and can be used to infer sensitive information.
  • Data concerning sexual orientation enhanced Grindr’s responsibility: Grindr collected personal data from thousands of data subjects in Norway and disclosed data concerning their sexual orientation which enhanced Grindr’s responsibility to exercise processing with conscience and due knowledge of the applicable legal requirements.
  • Grindr profited from the infringement: Grindr generated advertisement revenue from sharing personal data of Norwegian users without valid consents.

As to mitigating factors, the Datatilsynet decision indicated:

  • A low number of complaints filed is not a mitigating factor: Datatilsynet specified that the fact that only one data subject filed a complaint with the NCC does not imply a low level of damage suffered by data subjects.
  • The changes Grindr made to remedy deficiencies in its prior consent mechanism are a mitigating factor: In June 2019, Grindr began evaluating internal capabilities and alternatives to its consent mechanism, which led to Grindr contracting with OneTrust for the new consent mechanism launched in the EEA on April 8, 2020. Grindr’s current consent mechanism, according to its March 8, 2021 reply to Datatilsynet, now includes a layered set of choices and a new layered format for presenting its privacy policy. While Datatilsynet did not assess Grindr’s new consent mechanism, it found the implementation of a new mechanism to be a mitigating factor warranting adjustment of its administrative fine from 100 million NOK to 65 million NOK.

Grindr has the option of lodging an appeal against Datatilsynet’s decision within three weeks. It has been reported that Grindr is considering lodging an appeal of the decision.

OpenX Technologies to Pay $2 Million for Violating COPPA and §5 of the FTC Act

By Natalie Marcell, CIPP/US

OpenX will pay the Federal Trade Commission (“FTC”) $2 million for not complying with the Children’s Online Privacy Protection Act (“COPPA”) and for violating §5 of the FTC Act.

OpenX is a programmatic advertising tech company headquartered in Pasadena, California, that operates a real-time bidding platform that monetizes websites and mobile apps by selling ad space. OpenX had reviewed hundreds of apps that identified as being “for toddlers,” “for kids,” “kids games,” or “preschool learning,” as well as apps with age ratings indicating they were aimed at children under the age of 13. OpenX nonetheless miscategorized these apps when they participated in the OpenX ad exchange. As a result, OpenX collected data from those child-directed apps, and when it received ad requests directly or indirectly from them, it transmitted bid requests containing children’s personal information, including location information.

Additionally, OpenX violated the FTC Act by falsely claiming that it was not collecting geolocation data from users who opted out of location tracking. In fact, OpenX continued to collect geolocation data from some Android mobile users after they had opted out of tracking.

The Department of Justice filed the complaint and final order on behalf of the FTC in the U.S. District Court for the Central District of California on December 15, 2021. The final order stipulated to a $7.5 million penalty, but it was capped at $2 million because of OpenX’s inability to pay. Per the settlement terms, OpenX is required to delete all of the ad request data that the company collected in violation of COPPA, implement a comprehensive privacy program to ensure compliance with COPPA, and keep track of apps and websites that have been banned or removed from its exchange.

The FTC’s press release is available here.

OpenX posted a statement on its website calling the collection of children’s information an unintentional error. OpenX also stated that it had inadvertently collected geolocation data from Android users, which it rectified by updating its Android software development kit (SDK). OpenX indicated that it has reviewed and bolstered its data privacy program to ensure COPPA compliance, and that it is engaging another third-party auditor to examine its policies and processes.

The CPRA’s Look Back Provision: Compliance Practices that Need to Begin January 1, 2022

By Natalie Marcell, CIPP/US

The California Privacy Rights Act (CPRA) amends the California Consumer Privacy Act (CCPA). While most provisions of CPRA do not go into effect until January 1, 2023, some of the changes have a 12-month look back provision that affects data collection practices. Businesses covered by the CPRA must have their data tracking compliance programs implemented and operational starting on January 1, 2022, in order to comply with the changes that go into effect January 1, 2023. 

In addition to the look back provision, the CPRA expands personal information to include data collected by businesses about employees, applicants, independent contractors and other work-related roles (“HR data”), as well as business-to-business (“B2B”) data. The CCPA originally exempted HR data and B2B data collected by businesses. This exemption will remain in effect through December 31, 2022. As of January 1, 2023, however, HR data as well as B2B data will be covered by the CCPA, and businesses will need to be prepared to treat this information like other personal information.

With the CPRA’s look back provision requiring that a business’s disclosure of required information cover the 12-month period before receipt of a consumer request, businesses have to track their collection, use and disclosure of personal information with regard to consumer data, HR data and B2B data starting on January 1, 2022.

Businesses/employers covered under the CCPA

There are some changes as to which businesses will be required to comply with the CCPA. Businesses covered under the CCPA will include those that do business in California, operate for profit, determine the purpose and means of data processing, and meet one of the following revenue or information processing thresholds:

  • Businesses with +$25 million in annual gross revenues
  • Businesses that buy, sell, or share the personal information of 100,000 or more consumers or households; or
  • Businesses that derive more than 50% of their revenue from selling or sharing consumers’ personal information.

Businesses that are a parent or subsidiary of an entity that meets any of these requirements, where the two share a common brand, will also be covered under the CCPA.

If a business is covered by the CCPA for consumer data, it is also covered for HR data and B2B data.
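The three thresholds above amount to a simple "any one of" decision rule. The sketch below is purely illustrative; the names are hypothetical, and it omits the for-profit, does-business-in-California, and common-branding conditions discussed above:

```python
from dataclasses import dataclass


@dataclass
class BusinessProfile:
    """Hypothetical inputs for a simplified coverage check (illustrative only)."""
    annual_gross_revenue: float           # USD
    consumers_or_households: int          # whose personal information is bought, sold, or shared
    revenue_share_from_selling_pi: float  # fraction of revenue from selling/sharing PI


def meets_coverage_threshold(p: BusinessProfile) -> bool:
    """Return True if the business meets any one of the three thresholds.

    Simplified sketch: does not model the for-profit, does-business-in-
    California, or common-branding conditions.
    """
    return (
        p.annual_gross_revenue > 25_000_000
        or p.consumers_or_households >= 100_000
        or p.revenue_share_from_selling_pi > 0.50
    )
```

For example, a business with $30 million in revenue meets the first threshold even if it processes no consumer data at all: `meets_coverage_threshold(BusinessProfile(30_000_000, 0, 0.0))` returns `True`.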

CPRA’s look back provision

Under the CPRA, disclosures of required information must cover the 12-month period preceding the business’ receipt of a verifiable consumer request. A request submitted on January 1, 2023, for example, would require a business to respond with a disclosure of personal information collection, use and disclosure covering the period of January 1, 2022 through January 1, 2023.

The CPRA also provides for the adoption of regulations by the California Privacy Protection Agency (“the Agency”) that will allow for requests that cover more than the preceding 12-month period. Under said regulations, businesses would be obligated to provide that information unless doing so proves impossible or would involve a disproportionate effort. Regardless, the CPRA does specify that the right to request required information beyond the 12-month period, and a business’s obligation to provide that information, applies only to personal information collected on or after January 1, 2022.
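The date arithmetic of the 12-month window can be sketched as follows. This is a minimal illustration assuming a simple calendar-year subtraction; the function name is hypothetical, and real compliance tooling would need to reflect the regulations as actually adopted:

```python
from datetime import date


def lookback_window(request_date: date) -> tuple[date, date]:
    """Return the (start, end) of the 12-month disclosure window
    preceding a verifiable consumer request.

    Simplified sketch: subtracts one calendar year, with a fallback
    for a February 29 request date in a leap year.
    """
    try:
        start = request_date.replace(year=request_date.year - 1)
    except ValueError:  # Feb 29 has no counterpart in the prior year
        start = request_date.replace(year=request_date.year - 1, day=28)
    return start, request_date
```

Consistent with the example above, `lookback_window(date(2023, 1, 1))` yields the window from January 1, 2022 through January 1, 2023.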

Obligations subject to the look back provision – business compliance practices that should start January 1, 2022

Given the CPRA’s look back provision, businesses have to track their collection, use and disclosure of personal information as of January 1, 2022 so that they are able to respond to consumer requests that they will begin receiving on and after January 1, 2023 once the CPRA is in effect.

Personal Information

The definition of personal information for the most part remains the same as before – information that identifies, relates to, describes, is reasonably capable of being associated with or can reasonably be linked, directly or indirectly, with a particular consumer or household. 

With the right to access expanding to include employees, applicants, independent contractors and other workforce roles, it is important to keep in mind that personal information includes professional or employment-related information, and inferences drawn from personal information to create a profile reflecting preferences, characteristics, psychological trends, predispositions, behavior, attitudes, intelligence, abilities, and aptitudes.

The CPRA has added sensitive personal information as a new category of personal information.  Sensitive personal information includes:

  • Social security, driver’s license, state identification card, or passport number;
  • Account log-in, financial account, debit card, or credit card number in combination with any required security or access code, password, or credentials allowing access to an account;
  • Precise geolocation;
  • Racial or ethnic origin, religious or philosophical beliefs, or union membership;
  • Contents of mail, email and text messages, unless the business is the intended recipient of the communication;
  • Genetic data;
  • Processing of biometric information for the purpose of uniquely identifying a consumer; and
  • Personal information collected and analyzed concerning a consumer’s health, sex life, or sexual orientation.

Disclosure of Required Information

The required information that a business will have to disclose in response to a verifiable request received on or after January 1, 2023, which must cover the 12-month period preceding the date of the request, includes:

Category & Source of Personal Information

  • Categories of personal information that the business has collected about that requestor.
  • Categories of sources from which the personal information is collected.

Selling and Sharing Personal Information

  • If the business has not sold or shared the requestor’s personal information, then that fact must be disclosed.
  • Categories of personal information that the business has sold or shared about that requestor. 
  • Categories of third parties to whom the requestor’s personal information was sold or shared.  This must be broken down by category of personal information for each category of third party to whom personal information was sold or shared.

Disclosure of Personal Information

  • If the business has not disclosed personal information for a business purpose, then that fact must be disclosed.
  • Categories of personal information that the business has disclosed about the requestor for a business purpose.
  • Categories of third parties to whom the business has disclosed the requestor’s personal information.
  • Categories of third parties to whom the business has disclosed the requestor’s personal information for a business purpose.

Business or Commercial Purpose

  • The business or commercial purpose for collecting, selling or sharing personal information.

Specific Pieces of Personal Information

  • Specific pieces of personal information the business has collected about that requestor.

The Agency is set to promulgate regulations that will define the term “specific pieces of information obtained from the consumer” with the stated goal of maximizing the right to access relevant personal information while minimizing the delivery of information that would not be useful to the consumer, such as system log information and other technical data.  

Eventually also likely to be included: Use of Automated Decision-Making Technology to Process Personal Information

  • Information about the logic involved in automated decision-making processes used by the business.
  • Likely outcome of the automated decision-making process with respect to the requestor.

While use of automated decision-making technology is not listed in the CPRA’s provisions that address disclosure of required information, the CPRA instructs the Agency to promulgate regulations requiring that businesses’ responses to access requests include meaningful information about the logic involved in automated decision-making processes, as well as a description of the likely outcome of the process with respect to the individual making the request for access. Final regulations are supposed to be adopted by July 1, 2022; however, to date the Agency has not commenced the formal rulemaking process. With this said, it remains unknown whether the final regulations will require that disclosures related to automated decision-making cover the 12-month period preceding the access request.

Pending regulations from the California Privacy Protection Agency

The CPRA formed the Agency, which is governed by a five-member Board. The Agency has been granted rulemaking authority to carry out the purposes and provisions of the CCPA. In accordance with the CPRA, the Agency has given notice to the Attorney General that it is prepared to assume its rulemaking authority, so adoption of regulations could happen as early as April 2022.

Of key relevance here, the CPRA instructs the Agency to adopt regulations that shall:

  • establish the standard to govern a business’ determination that providing information beyond the 12-month period in response to a verifiable request is impossible or would involve a disproportionate effort;
  • define the term “specific pieces of information obtained from the consumer”; and
  • require businesses’ responses to access requests to include meaningful information about the business’s use of automated decision-making technology, including the logic involved in decision-making processes and a description of the likely outcome of the process with respect to the consumer.

Per the CPRA, final regulations are supposed to be adopted by July 1, 2022. The Agency, however, is facing challenges in connection with its rulemaking responsibilities tied to limited staffing and the complexity of the issues involved. The Board is considering several options, including emergency rulemaking, delaying CPRA enforcement, hiring temporary staff, and staggering rulemaking.

In September 2021, the Agency called for preliminary public comments on proposed rulemaking. The last Board meeting was held on November 15, 2021, and the next meeting has yet to be announced.

While the CPRA’s look back provision requires covered businesses to track their data collection, use and disclosure practices starting January 1, 2022, one year prior to the CPRA’s effective date, it is essential that businesses remain agile in their compliance practices. As the Agency’s regulations take shape in 2022 and requirements are clarified, businesses will need to be prepared to modify certain aspects of their compliance programs.
