Privacy Law

PRIVACY LAW REVIEW – WHAT YOU NEED TO KNOW (MARCH 2022)

CLA’s Privacy Law Section summarizes important developments in California privacy and beyond. 

Message from the Chair

The Privacy Law Section is moving full steam ahead and continuing to grow.  We hope some of you were able to attend at least one of the CPRA Law + Tech Series sessions we presented with the D.C.-based Future of Privacy Forum.  If you missed them, not a problem!  You can find recordings of all of the sessions on our website, here. More sessions are planned in the months ahead, too.

We are very excited about our April events and hope you will join us: 

On April 6th (1:30 pm to 3 pm PT), in observance of CLA’s Legislative Day, we are hosting a roundtable discussion with high-ranking officials in California who are intimately involved in the development of the state’s privacy legislation. From Governor Newsom’s office, we will be joined by Melissa Immel, Deputy Legislative Secretary and Chief of Legislative Operations, and Julia Spiegel, Deputy Legal Affairs Secretary. They will be joined by Irene Ly, Policy Counsel for Common Sense Media.  The roundtable will be moderated by CLA Privacy Section Past Chair Jeewon Serrato. Sign up here!

On April 26 (Noon to 1 pm PT), we have convened a panel of top privacy practitioners in the state to discuss trends they are seeing in CCPA Notice of Violation letters and provide practical advice on how to comply with the CCPA and engage with the OAG. Our panel includes Jeewon Serrato, Dominique Shelton Leipzig, Alysa Hutnik, and David Zetoony.  This is an incredible opportunity to hear about enforcement trends from those on the front lines. Sign up here.

In addition to our events, we have many opportunities for engagement and leadership. Involvement in the section is a great way to develop leadership skills and make a name for yourself in California privacy.  If you are interested in learning more about our various committees, including our newly formed Ad Tech and Cybersecurity Committees, please reach out to us at privacy@calawyers.org.

Sheri Porath Rockwell, Sidley Austin – Chair, CLA Privacy Law Section 

California’s Senate Tackles Data Broker Oversight

By Alexander Diaz, CIPP/US 

Have you ever given your phone number to a new internet service only to find your phone receiving calls from unusual numbers for days on end?  You can often blame data brokers.

A data broker is a business that knowingly collects and sells to third parties the personal information of a consumer with whom the business does not have a direct relationship.  Data brokers collect many hundreds or thousands of data points about consumers from multiple sources, including: internet browsing history, online purchases, public records, location data, loyalty programs, and subscription information. The data broker will then sell that data to another service, which itself could mold it into a marketable profile (e.g., the consumer plays Gardenscapes, lives in Bakersfield, loves sneaker ads).

In theory, California consumers should have notice (and the opportunity to opt out) before their data is sold on the internet. In practice, however, data brokers can unscrupulously collect data without notice, or acquire it from another app. The only way to know whether a data broker has a California consumer’s data is for that consumer to contact each data broker and exercise a data subject access request (Cal. Civ. Code Sections 1798.100, 1798.110 and 1798.115).  In short, once data brokers have your data, it can be very difficult to track what they do with it or where it goes next.

Enter State Senator Josh Becker (D – San Mateo County). Sen. Becker introduced S.B. 1059 on March 9, 2022, to tackle some of the loopholes and enforcement issues in existing data broker regulation. A 2020 law requires data brokers to register with the Attorney General’s Data Broker Registry, but the earlier law lacks teeth against those who fail to comply. This has resulted in only about 400 registrants instead of the 1,000 that lawmakers expected.

Sen. Becker’s proposal would toughen the data broker law on several fronts. The legislation would:

  • Expand the definition of data brokers to include not only “sellers” of data, but also “sharers” (per CPRA’s definition).
  • Add new requirements to information that must be published in the Data Broker Registry, including:
    • Reports of data breach incidents
    • Disclosures around collection of minors’ information
    • How to assert consumer rights (opt-out, correction, deletion) around information held by data brokers
  • Double penalties to $200 per day for failure to comply with Data Broker Registry requirements
  • Extend authority for enforcement of broker registry laws to the California Privacy Protection Agency, bringing another piece of privacy enforcement under the new agency’s umbrella.

For further reading on S.B. 1059, check out Tom Kemp’s summary on Medium.

California’s Age-Appropriate Design Code Act

By Alyona Eidinger

California lawmakers continue paving the way on the privacy front. The focus is on children’s privacy after the California Legislature reconvened in January 2022. Assemblymembers Buffy Wicks and Jordan Cunningham introduced the California Age-Appropriate Design Code Act (“AADC Act” or “Act”) as Assembly Bill No. 2273 (“AB 2273”) in Sacramento on February 16, 2022.

For the most part, AB 2273 is bipartisan and closely modeled after the United Kingdom’s (“UK”) Age Appropriate Design Code. The UK counterpart took effect on September 2, 2020, and comprises a set of 15 flexible standards of age-appropriate design. Companies that provide online services likely to be accessed by children in the UK must implement these standards, minimizing data collection and use to safeguard children’s privacy.

WHAT CALIFORNIA AADC ACT DOES

The AADC Act applies to businesses that provide a good, service, or product feature likely to be accessed by a child. “Likely to be accessed by a child” is measured under the more-likely-than-not legal standard. The Act also extends the definition of a child to consumers who are under the age of 18 (cf. the Children’s Online Privacy Protection Act of 1998, under which “child” means an individual under the age of 13).

The Act requires businesses to comply with these eight requirements (§ 1798.99.31(a)):

  1. Consider the best interests of children when designing the product, in a manner that prioritizes the privacy, safety, and well-being of children.
  2. Carry out a Data Protection Impact Assessment and maintain appropriate documentation.
  3. Establish the age of consumers with a level of certainty appropriate to the risks that arise from the business’s data management practices, or apply the privacy and data protections afforded to children to all consumers.
  4. Maintain the highest level of privacy for children by default (e.g., disabling profiling).
  5. Provide privacy notices/policies and terms of service concisely, prominently, and using clear language that the child can understand.
  6. Provide an obvious signal if the product is used to monitor the child’s online activity or track their location.
  7. Uphold published terms, policies, and community standards that the business establishes.
  8. Provide prominent, accessible, and responsive tools to help children exercise their privacy rights and report concerns.

Businesses “shall not take” the following actions with the child’s personal information (PI):

  • use in a way that is harmful to the child’s physical or mental health or well-being;
  • collect and retain if it is not necessary to provide the service;
  • use for any reason other than the reason(s) for which that PI was collected;
  • disclose it;
  • collect any precise geolocation information or any sensitive PI by default;
  • use dark patterns or other techniques. (§ 1798.99.31(b))

CALIFORNIA CHILDREN’S DATA PROTECTION TASKFORCE

The California AADC Act establishes and convenes the California Children’s Data Protection Taskforce to evaluate best practices for the Act’s implementation and to help businesses (especially small- and mid-size businesses) comply with the Act. The California Privacy Protection Agency’s Board would be responsible for appointing the members of the taskforce as well as for adopting regulations and publishing guidelines. The latter must be done in consultation with the taskforce.

TIMELINE

If passed, AB 2273 will take effect on July 1, 2024.

California Attorney General’s Opinion on Inferences: Even if Internally Generated, Inferences are Personal Information and Must be Disclosed in Consumer Right to Know Requests

By Natalie Marcell, CIPP/US

On March 10, 2022, the California Attorney General (AG) issued Opinion No. 20-303, its first opinion interpreting the California Consumer Privacy Act (“CCPA”), in which it clarified that inferences internally generated by a covered business are personal information that must be disclosed in response to right to know requests.

The AG explains that the CCPA gives consumers the right to receive all information collected about the consumer, not just information collected from the consumer.  Focusing on Civil Code section 1798.140(o), the Attorney General explains that under the CCPA, “personal information” encompasses a vast array of information, with no distinction made based on whether the information comes from a public or private source.  In a footnote, the AG notes that section 1798.140(o)(2) specifies that personal information “does not include publicly available information,” without further reconciling that carve-out with its view that there is no distinction between sources of personal information.

The AG goes on to express that it does not matter whether the business gathers information from the consumer, from public repositories, from data brokers, or whether the information is inferred through its own proprietary process; once a business has made an inference, the inference becomes personal information. 

The Opinion explains that inferences, including those internally generated, must be disclosed when two conditions exist:

  1. The inference is drawn from any of the information identified in subdivision (o) of Civil Code section 1798.140; and
  2. The inference is used to create a profile about a consumer or to predict a salient consumer characteristic.

Where inferences are not used to create a profile about a consumer, they fall outside the scope of inferences that must be disclosed.

As for the concern that the disclosure of internally generated inferences would expose a business’s trade secrets, the AG noted that this concern was raised throughout the CCPA rulemaking process; however, no concrete examples have been presented of situations where inferences themselves are trade secrets or where the disclosure of inferences would expose a business’s trade secrets.  The AG further clarified that the CCPA requires the disclosure of the individualized products of a business’s secret algorithm, not the algorithm itself.

The AG declined to address whether a particular kind or class of internally generated inferences might be protected from disclosure because it constitutes a trade secret.

The Attorney General emphasized, however, that the text of both the CCPA and the California Privacy Rights Act (“CPRA”) contains language indicating an intent to protect intellectual property, so when a trade secret exists, the CCPA will not require its disclosure to a consumer.  If a business denies a request for information, in whole or in part, because of a conflict with federal or state law or an exception to the CCPA, it must explain the nature of the information and the basis for the denial in order to comply with its obligation to respond to requests in a meaningful and understandable way.  The AG did not provide clarity as to how a business should, in practice, explain the nature of information not being disclosed or how the trade secret exception should be explained to a consumer in a meaningful or understandable way.

The Opinion does not include a substantial discussion about the impact of the CPRA on the issues addressed, nor does it discuss the issue of how covered businesses should address consumers’ right to delete and right to correct with respect to internally generated inferences.  With the CPRA rulemaking activities still in process, it remains to be seen whether guidance will be provided by the California Privacy Protection Agency (“CPPA”).

CPRA Pre-Rulemaking Activities Continue:  CPPA Holds Two Informational Sessions 

On March 29th and 30th, the Board of the California Privacy Protection Agency (CPPA) continued its California Privacy Rights Act pre-rulemaking activities by holding Informational Sessions.  Both sessions were open to the public and held via video conference.

The Board has made the materials from both sessions available.

The first day of Informational Sessions included the following topics:

  1. Personal Information Data Flow Overview: How Is Personal Information Collected, Sold, and Shared? Speaker: Ashkan Soltani, Executive Director, California Privacy Protection Agency
  2. How the California Consumer Privacy Act Interacts with Personal Information Data Flows. Speaker: Lisa Kim, Deputy Attorney General, California Department of Justice (Invited)
  3. Business and Consumer Interactions: “Dark Patterns.” Speakers: Jennifer King, Ph.D., Privacy and Data Policy Fellow, Stanford Institute for Human-Centered Artificial Intelligence, Stanford University; Lior J. Strahilevitz, Sidley Austin Professor of Law, University of Chicago
  4. Business and Consumer Interactions: Communicating Business Practices and Consumer Preferences. Speaker: Lorrie Faith Cranor, D.Sc., Director and Bosch Distinguished Professor in Security and Privacy Technologies, CyLab; FORE Systems Distinguished Professor of Computer Science and of Engineering & Public Policy; Carnegie Mellon University
  5. Business and Consumer Interactions: Opt-Out Preference Signals and the California Consumer Privacy Act. Speaker: Stacey Schesser, Supervising Deputy Attorney General, California Department of Justice (Invited)

The second day of Informational Sessions included the following topics:

  1. Overview of Data Processing and Automated Decision Making: Challenges and Solutions. Speaker: Safiya Noble, Ph.D., Professor and Director of the UCLA Center for Critical Internet Inquiry, University of California, Los Angeles
  2. Data Privacy Impact Assessments: What Should Be Considered. Speaker: Gwendal LeGrand, Head of Activity for Enforcement Support and Coordination, European Data Protection Board
  3. Cybersecurity Audits. Speaker: Chris Hoofnagle, Professor of Law in Residence, School of Law; Faculty Director, Center for Long Term Cybersecurity; University of California, Berkeley
  4. Automated Decision Making: The Goals of Explainability and Transparency. Speaker: Andrew Selbst, Assistant Professor of Law, University of California, Los Angeles
  5. Automated Decision Making: A Comparative Perspective. Speaker: Margot E. Kaminski, Associate Professor of Law, University of Colorado, Boulder

The Utah Consumer Privacy Act

By Weiss Hamid

Continuing the growing trend, Utah has become the fourth state to enact a comprehensive state privacy law, entitled the Utah Consumer Privacy Act (“UCPA”).

Utah’s Senate passed the UCPA unanimously on February 25, 2022, and the Utah House followed with a unanimous vote on March 2. On March 22, Governor Spencer Cox signed the UCPA, officially making it the law of the land. Utah therefore has joined California (California Consumer Privacy Act as amended by the California Privacy Rights Act), Virginia (Consumer Data Protection Act), and Colorado (Colorado Privacy Act) in passing extensive privacy and data laws. The law will take effect December 31, 2023.

Generally, the UCPA bears a closer similarity to the VCDPA and CPA than to the CCPA. One key distinction is that the UCPA offers no private right of action. This mirrors the VCDPA and CPA, in contrast to the CCPA, which offers a private right of action for data breaches involving specific types of personal information.

Other significant components to the UCPA include:

Applicability

The UCPA applies only to controllers or processors that (1) do business in the state (or target Utah residents with products or services); (2) earn at least $25 million in revenue; and (3) either: (a) control or process personal data of 100,000 or more consumers (defined as a Utah resident) in a calendar year; or (b) derive more than 50 percent of gross revenue from selling personal data and control or process data of 25,000 or more consumers.

The “and” is a key distinction between the UCPA and the CCPA: under the CCPA, the $25 million revenue threshold is an independent basis for applicability. The UCPA is therefore much narrower in scope.
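To make the conjunctive structure concrete, here is a minimal sketch in TypeScript of the applicability test as described above. It is illustrative only (not legal advice), and the type and field names are invented for this example.

```typescript
// Illustrative sketch of the UCPA applicability test described above.
// Not legal advice; all names here are hypothetical.
interface BusinessProfile {
  doesBusinessInUtah: boolean;     // or targets Utah residents with products/services
  annualRevenueUSD: number;
  utahConsumersProcessed: number;  // consumers whose data is controlled/processed per calendar year
  dataSaleRevenueShare: number;    // fraction of gross revenue from selling personal data, 0 to 1
}

function ucpaApplies(b: BusinessProfile): boolean {
  // Volume prong: 100,000+ consumers, OR more than 50% of revenue from
  // data sales combined with 25,000+ consumers.
  const meetsVolumeProng =
    b.utahConsumersProcessed >= 100_000 ||
    (b.dataSaleRevenueShare > 0.5 && b.utahConsumersProcessed >= 25_000);

  // Under the UCPA, every prong must be satisfied together; under the
  // CCPA, the $25 million revenue threshold alone can trigger coverage.
  return (
    b.doesBusinessInUtah &&
    b.annualRevenueUSD >= 25_000_000 &&
    meetsVolumeProng
  );
}
```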

Sensitive Data

The UCPA is also distinct from VCDPA and CPA in that it does not require opt-in consent for sensitive data. Instead, the UCPA requires controllers to “present[] the consumer with clear notice and an opportunity to opt out” of sensitive data processing.

Consumer Rights Provided

The UCPA offers consumers the ability to access, obtain in a portable manner, and delete personal information they have specifically provided to the controller/processor. This differs from the CPA and CCPA, which require controllers and processors to provide personal data “concerning” (CPA) or “about” (CCPA) a consumer. Unlike the other three data privacy laws, the UCPA does not provide a right of correction or accuracy.

Enforcement

As indicated above, the UCPA does not provide consumers a private right of action. The Utah Attorney General can recover actual damages for consumers and a penalty of up to $7,500 per violation. Businesses are provided a 30-day notice and right to cure period.           

Data Protection Assessment

The UCPA is silent on any requirement for controllers and/or processors to conduct data protection assessments, which differs from the CCPA, CPA, and VCDPA.

Florida Consumer Privacy Bills ‘Indefinitely Postponed’

By Jennifer M. Oliver

Florida senators and representatives ended their 2022 legislative session on March 11 by withdrawing consideration of consumer data privacy legislation. Any discussion of stronger protections for consumers, and more requirements on businesses, has been dubbed “indefinitely postponed.”

Senate Bill 1864 and House Bill 9 were the latest efforts to enact Florida privacy legislation. They were carried over and altered along the way from the 2021 session, when the House bill passed but the Senate measure did not. Some say a report on the financial impact to companies contributed to the demise of the Senate bill. Changes in the most recent iterations included a consumer private right of action in the House bill, albeit watered down to make it more palatable for businesses. The Senate did not propose such a right, instead saying state enforcers, not citizens, would litigate unfair and deceptive practices.

Businesses opposed the bills, arguing they exposed them to too much litigation without giving them time to correct any violations consumers would allege in demand letters. Industries including telecommunications, finance, insurance, utilities, and real estate also objected to the attendant financial burdens of the proposed laws. But the costs sounded the loudest alarms. Opponents cited one report that compliance would cost Florida companies many billions of dollars to implement and maintain.

The organization Florida TaxWatch, which produced that report, focused primarily on the House bill. The group’s CEO Dominic Calabro was quoted as saying that in addition to the costs and “financially motivated” litigation, smaller businesses would feel pressure to adopt expensive data privacy measures in order to remain competitive, and businesses that would have been covered by the law would not have had enough time to implement the necessary technology. Meanwhile, companies would be at risk of non-compliance and “costly litigation for failing to respond,” Calabro said.

In cheering the death of the bills, Florida TaxWatch posted that despite the bipartisan popularity of giving consumers “more control over [] their online data, including the ability to request businesses delete their personal data or refrain from selling it,” their fiscal study had the desired impact. “[Our] research showed it would have reduced Florida’s gross operating surplus — the total profit of private enterprise sans immediate costs and workers — by 3.9%. That amounts to a $21 billion hit to the state economy.”

The measures were originally announced by a traditionally pro-business Republican governor, and many believe the twice-failed bill still enjoys tenacious backing from consumer privacy advocates and others, so we may be reporting on this one again in the near future.

State Privacy Legislative Sessions Updates

By Brandon M. Jasso, CIPP/US/E

With Congress having failed to pass a comprehensive privacy law, states are taking it upon themselves to introduce privacy bills to protect their citizens. Some of the introduced bills have failed to pass. These bills are still worth reviewing, however, as they are representative of what may pass in each state later and show that certain standards are being sought across the states.

Washington (state) – SB 5062 and HB 1850:

This year, Washington’s legislative session ran from January 10, 2022, through March 10, 2022. (See here).

SB 5062 – Washington Privacy Act (“WPA”)

On January 10, 2022, the WPA, which had also been introduced in 2021, was reintroduced in its previous form by senate resolution. On February 24, 2022, the WPA (referring to the second substitute) was moved to the Rules White Sheet, which is “where bills are sent immediately after being passed out of a standing committee. . . . more or less, a review calendar.” (See here). However, no further progress was made, and the bill was left unpassed when the legislative session ended.

The WPA covered many of the rights seen in other privacy bills, including the right to: (1) access, to confirm whether information is being processed and to know the categories of information; (2) correct inaccurate personal data; (3) delete personal data; (4) portability of personal data; and (5) opt out of targeted advertising, the sale of personal data, or profiling. The WPA applied to “entities that conduct business in Washington or produce products or services that are targeted at” Washington residents and that meet one of the following thresholds:

  • Processes or controls personal data of 100,000 or more consumers in a calendar year; or
  • Derives 25 percent of gross revenue from the sale of personal data and processes or controls personal data of 25,000 or more consumers

HB 1850 – Washington Foundational Data Privacy Act (“WFDPA”)

While the WPA was pending, the Washington House was considering what was originally known as the Washington Foundational Data Privacy Act (“WFDPA”) (see here), a name later removed from the second substitute bill passed by the House Committee on Appropriations. Although similar to the WPA, which the WFDPA incorporated by reference, the WFDPA would have created a consumer data privacy commission tasked with administrative powers, rulemaking, and enforcement authority. Additionally, the WFDPA would have allowed for private rights of action in certain contexts. Ultimately, the WFDPA met the same fate as the WPA when the legislative session ended.

Wisconsin – AB 957 – Wisconsin Consumer Data Protection Act (“WCDPA”)

“The Wisconsin Legislature operates in a biennial (two year) session that lasts from early January of the odd numbered year to early January of the odd numbered year two years later.” (See here). The floor period is when the legislature passes bills. During the February 2022 floor period, on February 23, 2022, the Wisconsin Assembly passed the WCDPA. The WCDPA is very similar to the Virginia Consumer Data Protection Act.

The bill afforded many of the same rights as other privacy laws, including: (1) access, to confirm whether information is being processed and to know the categories of information; (2) correction of inaccurate personal data; (3) deletion of personal data; (4) portability of personal data; and (5) the ability to opt out of targeted advertising, the sale of personal data, or profiling. Additionally, the WCDPA gave the attorney general sole enforcement authority, with penalties of up to $7,500.00 per violation, and provided a thirty-day cure period after notice that did not appear to expire. The bill applied to persons “that conduct business in this state or produce products or services that are targeted to residents of this state” and who satisfy either of the following:

  • “During a calendar year, the person controls or processes personal data of at least 100,000 consumers”; or
  • “The person controls or processes personal data of at least 25,000 consumers and derives over 50 percent of gross revenue from the sale of personal data.”

The thresholds of at least 100,000 consumers and over 50 percent of gross revenue align with the California Privacy Rights Act (amending the CCPA and operative January 1, 2023), although the gross-revenue prong is distinct in that it also requires the concurrent processing of a certain number of consumers’ personal data. Overall, the bill was nearly identical to the Virginia Consumer Data Protection Act. Ultimately, the WCDPA failed on March 10, 2022, when the floor period ended without the bill passing the Wisconsin Senate.

Indiana – SB 358 – Indiana Consumer Data Protection Act (“ICDPA”)

The Indiana legislative session ran from January 4, 2022, through March 8, 2022 (see here for more information on the Indiana legislative process and session). During this session, the Indiana Senate introduced the ICDPA (for more on the ICDPA, see here), which passed the Senate unanimously. The ICDPA was also very similar to the Virginia Consumer Data Protection Act. It contained all of the rights afforded in the Wisconsin and Washington bills, provided the attorney general with exclusive authority to enforce the ICDPA, provided a thirty-day notice and cure period, provided no private right of action, and mirrored the WCDPA’s applicability language. Like the other bills, it failed to pass the Indiana House before the legislative session ended on March 8, 2022.

West Virginia – HB 4454

West Virginia’s “[r]egular sessions of the Legislature begin on the second Wednesday in January of each year and last for 60 consecutive days.” (See here). This year’s session ran from January 12, 2022, through March 12, 2022. (See here). Unlike the other bills introduced, HB 4454 focused on limiting the sale and sharing of personal information, creating a right to opt out of the sale or sharing of personal information, prohibiting discrimination against consumers for exercising the rights afforded, and “providing for methods of limiting sale, sharing, and use of personal information, as well as any use of sensitive personal information.” (See HB 4454 text here).

Essentially, HB 4454 was less comprehensive than the other bills introduced and failed to provide definitions, enforcement provisions, and other standard information seen in most bills. The bill was introduced on January 31, 2022, and sent to the House Judiciary where it appeared to stall until the legislative session expired.

Hey Adtech, Watch Your Step, Legislatures Want To Ban ‘Surveillance Advertising’

By McKenzie Thomsen

There’s recently been a movement against ‘surveillance capitalism,’ a term coined by Shoshana Zuboff in her rather large book “The Age of Surveillance Capitalism.” But what are surveillance capitalism and its sister term ‘surveillance advertising,’ and what do they mean for adtech privacy?

What Is Surveillance Advertising?

‘Surveillance capitalism’ is a “market driven process where the commodity for sale is your personal data, and the capture and production of this data relies on mass surveillance of the internet.” Surveillance capitalism tends to occur on free online services, such as Facebook, that provide a ‘free’ service but, in return, receive large swaths of user data that the company sells or otherwise monetizes. (You remember Cambridge Analytica.) ‘Surveillance advertising’ is essentially surveillance capitalism with the goal of serving you ads. Surveillance advertising is otherwise known as targeted advertising or behavioral advertising. It is the practice of showing individual consumers different advertisements based on inferences about their interests, demographics, and other characteristics drawn from tracking their online activities over time.

Currently, What Can I Do To Stop Surveillance Advertising Targeting Me?

The Digital Advertising Alliance (DAA) and the Network Advertising Initiative (NAI) have long had industry-wide opt outs for consumers. But, they don’t work. I suggest reading this article by The Markup to get a better understanding, but here are the basic flaws of these industry-wide opt outs.

First, they’re based on cookies. Rather than preventing webpages from placing cookies on your device, opting out via the DAA or NAI places a second cookie that says, in effect: don’t use the information you gathered from the prior cookies for targeted advertising. In other words, companies are still collecting your personal information. And cookies are being deprecated in the second half of 2023, so this method won’t even work next year.
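For the technically curious, here is a minimal sketch, in TypeScript, of how such a cookie-based opt-out might look; the cookie name and behavior are hypothetical simplifications of what the DAA/NAI mechanisms actually do.

```typescript
// Browser-side sketch of a cookie-based opt-out. The opt-out does not
// remove or block tracking cookies; it only adds a preference cookie
// that the ad network promises to honor. All names are hypothetical.
function optOutOfNetwork(networkDomain: string): void {
  document.cookie = [
    "OPT_OUT=1",                         // hypothetical preference flag
    `domain=.${networkDomain}`,          // scoped to the ad network's domain
    "path=/",
    `max-age=${60 * 60 * 24 * 365 * 5}`, // five-year lifetime
    "SameSite=None",
    "Secure",
  ].join("; ");
}

// Server-side, the network is expected to check for the flag before
// serving behaviorally targeted ads. Data collection can continue, and
// clearing your cookies erases the opt-out itself.
function shouldServeTargetedAd(cookieHeader: string): boolean {
  return !cookieHeader.split("; ").includes("OPT_OUT=1");
}
```

Note the structural weakness: the opt-out lives in the same cookie jar as the tracking it is meant to restrain, so it disappears whenever cookies are cleared.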

Second, not every company is a member of DAA or NAI and so when you use their opt out, it misses some companies.

Finally, when you use these industry-wide opt outs, your opt out is sent to every company on the DAA’s/NAI’s list. Essentially, it’s being sent to companies that otherwise wouldn’t know about you, thus spreading your information further, which is the opposite of the purpose of these opt outs.

What Laws Are Being Proposed?

In light of the inadequacies of the DAA and the NAI, there is a movement in both the US and the EU to ban surveillance advertising, and it’s gaining momentum. In the EU, the Digital Services Act (DSA) is making its way through the legislative process. The bill stops short of an outright ban on targeted advertising, but it does ban companies like Facebook and Google from using sensitive data for targeted advertising. Moreover, the DSA bans the targeted advertising of minors altogether, no matter the category of personal data.

In the US there are a few attempts to ban surveillance advertising. First, the FTC is being petitioned by Accountable Tech to use its rulemaking authority under § 5 of the FTC Act (specifically, the ‘unfair methods of competition’ prong) to ban surveillance advertising. The FTC is currently seeking comments.

In addition, Senator Booker (D-NJ) has introduced the Banning Surveillance Advertising Act (BSAA), while Representatives Eshoo and Schakowsky have introduced a companion bill in the House. The bill prohibits advertising networks and facilitators from using personal data to target advertisements, with the exception of broad location targeting to a recognized place, such as a municipality.

BSAA also prohibits advertisers from targeting ads based on protected class information, such as race, gender, and religion, and on personal data purchased from data brokers. However, the bill explicitly allows contextual advertising: the placing of ads on web pages based on the content of those pages rather than the personal data of the individual. For example, an ad for a bike tire might be placed on your local bike repair shop’s webpage because the page is about repairing bikes, not because you searched for bike tires two weeks ago.
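To illustrate the distinction, here is a toy TypeScript sketch of contextual ad selection; the inventory and keyword matching are invented and far simpler than any real ad server.

```typescript
// Toy contextual ad picker: the only input is the text of the page
// being viewed. All inventory and keywords are invented.
const adInventory: Record<string, string[]> = {
  "bike-tire-ad": ["bike", "bicycle", "tire", "repair"],
  "running-shoe-ad": ["running", "marathon", "sneaker"],
};

function pickContextualAd(pageText: string): string | undefined {
  const words = new Set(pageText.toLowerCase().split(/\W+/));
  return Object.keys(adInventory).find((adId) =>
    adInventory[adId].some((keyword) => words.has(keyword))
  );
}

// The bike shop's page matches the bike tire ad purely from its own
// text; no cross-site profile or search history is consulted.
pickContextualAd("We repair every kind of bicycle, same-day service.");
// -> "bike-tire-ad"
```

A behavioral system would instead key the decision off a cross-site user profile, which is exactly the input the BSAA would prohibit.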

Along with the legislative momentum is a push from constituents. A few organizations, including The Center for Digital Democracy, Accountable Tech, The Center for Humane Technology, Common Sense Media, and EPIC.org, have created a coalition (which can be found at https://www.bansurveillanceadvertising.com/) to petition for the passage of the BSAA.

Now, I’m sure it’s unsurprising to hear, but Google and other marketers think outright bans go too far: that this will (1) harm US companies and not foreign companies, (2) harm data privacy and security, and (3) hurt small businesses who rely on targeted advertising. These arguments are not new; in fact, they were all used against Epic in Epic Games v. Apple. I’ll refrain from commenting here and let you be your own judge.

So What Is Adtech’s Next Move?

NAI and DAA have been inadequate for a decade. If adtech companies want to keep their industry alive, they need to do more to regulate themselves. The momentum isn’t stopping, and if adtech doesn’t provide basic privacy safeguards, legislatures may end the industry altogether by banning targeted advertising.

HIPAA Privacy Rule & SB 3620 Health Data Use and Privacy Commission Act

By Natalie Marcell, CIPP/US

Final Rule Changes to HIPAA Privacy Rule Anticipated Later This Year

Lisa Pino, Director of the U.S. Department of Health and Human Services (HHS) Office for Civil Rights (OCR), confirmed in an interview with Information Security Media Group, published on March 21, 2022, that OCR is still reviewing comments received in response to proposed changes to the Health Insurance Portability and Accountability Act (HIPAA) Privacy Rule as it prepares the final rule changes anticipated later this year.

Back in December 2018, HHS issued a Request for Information seeking input on changes to the HIPAA Privacy Rule.  In December 2020, OCR announced proposed changes to the HIPAA Privacy Rule. Proposed changes included, among others:

  • Changes to strengthen individuals’ right of access, including allowing individuals, as part of their right to inspect their own protected health information (PHI), to take notes, videos, or pictures to view or capture their PHI
  • Addition of express prohibition against entities imposing unreasonable measures on individuals exercising their right of access
  • Shortening the required response time to no more than 15 calendar days, with the opportunity for one 15-day extension, regardless of the form of PHI
  • Added requirement that covered health care providers and covered health plans submit an individual’s access request to another health care provider and receive back the requested electronic copies of PHI
  • Amendments to permissible fee structures for responding to requests to direct records to a third party
  • Added requirements that covered entities post estimated fee schedules on their websites for access and disclosure requests, and that covered entities provide individualized fee estimates for requests for copies upon request
  • Added requirement to respond to clear, conspicuous, and specific requests to direct electronic copies of PHI to a third party designated by an individual, which may include oral requests and electronically executed requests
  • Added definitions including “Personal Health Application” or “Personal Health App”
  • Clarification on the form and format of responding to requests for PHI which would include transmitting PHI securely to an individual’s personal health application
  • Changes reducing the burden of identity verification for individuals exercising their right of access
  • Elimination of the requirement that covered entities obtain written acknowledgement of receipt of Notice of Privacy Practices
  • Modification of the content in Notices of Privacy Practices
  • Added individual right to discuss the Notice of Privacy Practices with a designated person

The public comment period for the proposed changes expired on May 6, 2021.  As of June 22, 2021 the HHS has published all comments received.  Pino noted that OCR received comments from over 1,400 organizations/individuals.

The HHS’s Rule List/Agenda lists final action for the HIPAA Privacy Rule in October 2022; however, there is no statutory deadline for the publication of the final rule changes.

Proposed changes will go into effect 60 days after they are published.  OCR has proposed a compliance date of 240 days after the publication date, at which point enforcement would commence; however, public comment was requested on the proposed compliance date.

Proposed Federal Legislation: Health Data Use and Privacy Commission Act

Legislation related to health information privacy may be on the horizon.  On February 9, 2022 the Health Data Use and Privacy Commission Act (SB 3620) was introduced into the Senate. 

The Health Data Use & Privacy Commission Act would create a Commission of 17 members appointed by the Comptroller General.  The Commission would be tasked with conducting a study on striking the right balance between protecting individual privacy and allowing and advancing appropriate uses of personal health information.  Within 6 months after the appointment of all 17 members, the Commission would have to approve a report, to be presented to Congress and the President, that addresses or includes:

  • Potential threats posed to individual health privacy and legitimate business and policy interests
  • Analysis of the purposes for which sharing of health information is appropriate and beneficial to consumers
  • Analysis of the threat to health outcomes and costs if privacy rules are too stringent
  • Analysis of the effectiveness of existing statutes, regulations, private sector self-regulatory efforts, technological advances, and market forces in protecting individual health privacy
  • Recommendations on whether federal legislation is necessary
  • Specific suggestions on proposals to reform, streamline, harmonize and unify or augment current laws and regulations
  • Analysis of whether additional regulation may impose costs or burdens, or cause unintended consequences in other areas
  • Cost analysis of the changes proposed by the Commission
  • Recommendations on non-legislative solutions such as education, market-based measures, industry best practices and new technologies
  • Review of the effectiveness and utility of third-party statements of privacy principles and private sector self-regulatory efforts, and third-party certification or accreditation programs

While there is no companion legislation in the House of Representatives, the introduction of the Health Data Use and Privacy Commission Act follows a recent trend of increased attention to data privacy concerns related to health information. On September 15, 2021, the Federal Trade Commission (FTC) issued a policy statement on “Breaches by Health Apps and Other Connected Devices,” emphasizing that while apps and connected devices capturing health data may not be covered under HIPAA, developers of health apps or connected devices are health care providers under, and thus subject to, the FTC’s Health Breach Notification Rule. In January 2022, the FTC also published the guidance resources “Health Breach Notification Rule: The Basics for Business” and “Complying with FTC’s Health Breach Notification Rule,” which also focus on businesses with mobile apps, websites, and connected devices that hold consumers’ health information.

The Health Data Use and Privacy Commission Act has been referred to the Senate’s Health, Education, Labor and Pensions Committee for further action.

Federal Trade Commission’s Increasing Use of Algorithmic Destruction Orders

By Natalie Marcell, CIPP/US

In last month’s settlement with WW International (formerly known as Weight Watchers) and Kurbo, Inc., the Federal Trade Commission (“FTC”) made use of an algorithm destruction order for the third time within the last three years, in this particular instance for violations of the FTC Act and the Children’s Online Privacy Protection Act (“COPPA”).  Prior to the March 2022 settlement order, algorithm destruction orders were made part of the stipulated settlement agreements involving violations of § 5 of the FTC Act by Cambridge Analytica and Everalbum.

In the two most recent instances in which algorithm destruction orders have been made part of a stipulated order, the defendant company has been required, within 90 days, to (1) destroy data that was improperly collected; (2) destroy “any models or algorithms developed in whole or in part” using said data; and (3) provide a written statement to the Commission, sworn under penalty of perjury, confirming the deletion and destruction of the data, models, and algorithms.  These orders have included an exception allowing for the retention of data, models, or algorithms for the purpose of complying with government requests, court orders, or other legal obligations, including the safeguarding of evidence in pending litigation, provided that the written statement to the Commission describes in detail the information that has been retained as well as the basis and specific obligation that prohibits the deletion of such information.  The first algorithm destruction order, directed at Cambridge Analytica in December 2019, included similar provisions; however, Cambridge Analytica was given a ten-day window to destroy the data collected, including “any algorithms or equations, that originated in whole or in part” from the data collected, and to provide a written statement to the Commission.

The three cases involving algorithm destruction orders are summarized below.

WW International (formerly known as Weight Watchers) and Kurbo, Inc.

On March 4, 2022, the FTC entered a settlement order with WW International and its subsidiary Kurbo, Inc. over violations of the FTC Act and COPPA.  The Kurbo app, which was marketed to children, provided help with tracking food intake.  In exchange, it required personal information from users such as age, gender, height, weight, and food and exercise choices.  According to the FTC, until late 2019, users could sign up for the service by either indicating that they were a parent signing up for their child or that they were over the age of 13 and registering themselves. From 2014 through late 2019, Kurbo failed to ensure that the users signing up were actually parents rather than children pretending to be adults to bypass the age restriction.  During this timeframe, hundreds of users who signed up for the Kurbo app originally claiming to be at least 13 years old later changed their profiles, listing birthdates indicating that they were actually under the age of 13, with no effect on their ability to access the app.  Further violations included failure to provide complete notice to parents about data collection practices and the indefinite retention of data, with deletion occurring only when requested by a parent.

Under the settlement order entered, both California corporations are required to pay a fine of $1.5 million, they must destroy the illegally collected personal information, and they must destroy “any models or algorithms developed in whole or in part” using the data collected from children under the age of 13. The companies are also prohibited from retaining data collected in the future from children under the age of 13 for more than one year after the last time a child uses the Kurbo app.

Everalbum, Inc.

In May 2021, Everalbum Inc., a California-based developer of a photo sharing app, was charged by the FTC with violating § 5 of the FTC Act for misleading users of its Ever app.  According to the FTC’s complaint, Everalbum led its app users to believe that Everalbum’s facial recognition technology would not be applied to their content unless they affirmatively activated the feature; however, the company had automatically activated the feature on all accounts.  Users were unable to turn off the facial recognition feature unless they lived in one of three U.S. states or the European Union.  Everalbum used the photos uploaded by its app users to help build its facial recognition technology. Further violations included the indefinite retention of photos and videos of users, as Everalbum did not delete the data when users deactivated their accounts.

As part of its settlement with the FTC, Everalbum was required to destroy the photos, videos, and facial and biometric data it had collected from app users that had deactivated their accounts. Additionally, Everalbum was required to delete products built using such data, including “any models or algorithms developed in whole or in part.”

Cambridge Analytica

In December 2019, the FTC found that Cambridge Analytica violated § 5 of the FTC Act through deceptive conduct involving (1) the collection of personal information from Facebook users who were asked to answer survey questions and share some of their Facebook profile data, (2) false claims that Cambridge Analytica was a participant in the EU-US Privacy Shield framework after its certification had lapsed, and (3) failure to adhere to Privacy Shield requirements.  In its Final Order, the FTC called on Cambridge Analytica to destroy the data it had gathered about Facebook users through deceptive means, along with the “information or work product, including any algorithms or equations” built using that data, within 10 days of the Order.  Separate settlements reached with Cambridge Analytica’s former chief executive officer Alexander Nix and app developer Aleksandr Kogan likewise included provisions requiring the destruction of data as well as “any algorithms or equations that originated, in whole or in part” from the collected data.

Conclusion

Last year, FTC Commissioner Rebecca Slaughter expressed support for algorithmic destruction orders as a penalty for unfair and deceptive data practices, indicating that they are a tool the FTC could use to foster economic and algorithmic justice.  FTC Chair Lina Khan is scheduled to make her first public address focused on privacy issues at the upcoming IAPP Global Privacy Summit in Washington, D.C. Justin Brookman, Consumer Reports Director of Consumer Privacy and Technology Policy, has expressed that, given the recent privacy settlements reached under her tenure, he expects Khan to make clear that the FTC will insist on behavioral changes before settling cases.  With two algorithm destruction orders in the last 10 months, it appears that we will increasingly see algorithmic disgorgement orders in future FTC actions.

President Biden’s Executive Order on Digital Assets

By Hina Moheyuddin

Cryptocurrencies, digital assets, and blockchain technology have intensified development across the national economy, and the United States is now acknowledging the role of digital assets in the global market. On March 9, 2022, President Joe Biden signed the Executive Order on Ensuring Responsible Development of Digital Assets (EO).

While there are existing regulations and laws governing the activities of digital assets, the Order calls for a consistent and united national approach to digital assets. Recognizing the influence digital assets have on global market relations, society, and modern technologies in general, the EO endorses six policy objectives with respect to digital assets: (1) protect consumers, investors, and businesses from significant financial risks due to the current absence of adequate safeguards; (2) maintain global financial stability and mitigate unprecedented systemic risks posed by the illicit use of digital assets; (3) ensure responsible development and design in efforts to mitigate illicit financial risks and set standards to secure national security; (4) reinforce United States leadership in global finance and technological and economic competitiveness; (5) promote access to safe and affordable financial services; and (6) support technological advances that encourage responsible development and use of digital assets.

The Administration is exploring a U.S. Central Bank Digital Currency (CBDC). The EO directs the U.S. Government to study the potential for a U.S. CBDC. More specifically, it requests various agencies to submit reports within 90, 120, 180, or 210 days addressing digital assets. More information on the reports can be found here. The EO “also encourages the Federal Reserve to continue its research, development, and assessment efforts for a U.S. CBDC, including development of a plan for broader U.S. Government action in support of their work.”

This EO is a big step forward in the digital assets space and while it places an urgency for U.S. leadership in digital assets and blockchain technology, it does not compel agencies to adopt any specific rules or approaches.

Statement by President Biden on our Nation’s Cybersecurity

By Oliver Kiefer

On March 21, 2022, President Biden issued a statement warning of the critical need for the private sector to take cybersecurity defenses and resiliency seriously.  The statement comes on the heels of the recent Russian invasion of Ukraine and the sanctions issued in response by the United States and other Western nations.  The Biden Administration is concerned that malicious Russian cyber actors may use the sanctions as a reason to escalate already pervasive attacks against public and private sector entities in the United States, which the statement describes as part of “Russia’s playbook.”

Among the commonsense steps businesses can take to protect themselves are:

  • Mandatory use of multi-factor authentication
  • Ensuring IT systems have all of the latest patches
  • Frequently creating offline backups of company data to reduce vulnerability to ransomware attacks
  • Use of proper encryption to protect data
  • Required frequent password updates to render compromised credentials less useful to attackers

A full list of immediate actions and long-term activities that companies can take to bolster their cybersecurity defenses can be found on this fact sheet accompanying President Biden’s statement.

SEC Gets Serious About Cybersecurity Disclosure Rules

By M. Scott Koller

On March 9, 2022, the Securities and Exchange Commission (SEC) published a set of proposed rules that would require public companies to make certain disclosures related to cybersecurity incidents and preparedness.

To be clear, the SEC already requires public companies to disclose cybersecurity incidents.  However, the SEC believes that companies are under-reporting, not reporting, or reporting on an inconsistent basis.  In addition, the SEC believes that investors would benefit from more consistent and timely reporting of cybersecurity incidents and of the policies and procedures related to managing cybersecurity risks.  Therefore, the proposed rules are intended to both expand and standardize cybersecurity disclosures.

The proposed rules add a new item to Form 8-K requiring companies to disclose information within four business days after the company determines that it experienced a material cybersecurity incident.  In addition, the company must also disclose:

  • When the incident was discovered and whether it is ongoing;
  • A brief description of the nature and scope of the incident;
  • Whether any data was stolen, altered, accessed, or used for any other unauthorized purpose;
  • The effect of the incident on the company’s operations; and
  • Whether the incident has been remediated or the company is currently remediating the incident.

It is important to note that the proposed rules do not change what constitutes “materiality” for purposes of the proposed cybersecurity incidents disclosure.  In addition, the reporting deadline is within four business days after the company determines that it has experienced a material cybersecurity incident, and not within four days after discovery of the incident.  This is a critical distinction because few companies will be able to properly investigate and evaluate a cybersecurity incident in just four days. Nevertheless, the SEC does expect that the company will make a materiality determination as soon as reasonably practicable after the discovery of the incident. 

One surprising omission is that the proposed rules do not allow for a company to delay notification if requested by law enforcement.  Nearly every state and federal breach notification statute includes a provision that allows law enforcement to direct an organization to delay public disclosures of an incident if doing so would jeopardize their investigation or their ability to apprehend the perpetrators or prevent future cybersecurity incidents.  It is the view of the SEC that timely disclosure of cybersecurity incidents for investors outweighs the needs of law enforcement, even in the interest of national security.  Admittedly, one of the questions directed to the public by the SEC is whether there should be a law enforcement delay. 

Apart from the new reporting requirements, the proposed rules would also require “enhanced and standardized disclosure on registrants’ cybersecurity risk management, strategy, and governance.”  Specifically, they would require companies to describe the procedures they have for the “identification and management of risks from cybersecurity threat,” including a discussion of whether “[t]he registrant engages assessors, consultants, auditors, or other third parties in connection with any cybersecurity risk assessment program” and whether “[c]ybersecurity related risks and previous cybersecurity-related incidents have affected or are reasonably likely to affect the registrant’s strategy, business model, results of operations, or financial condition and if so, how.”  See proposed 17 C.F.R. §229.106(b)(1)(ii), (vii).

As to cybersecurity governance, companies would have to describe their board’s “oversight of cybersecurity risk,” including identifying which board members or committees oversee cybersecurity risks and the frequency with which the board discusses cybersecurity risks.  Id. § 229.106(c)(1).  Outside of the boardroom, the proposed rules would also require disclosure of how the company’s management assesses cybersecurity-related risks, including a description of the persons or committees managing cybersecurity risk and a description of the expertise of any chief information security officer.

Finally, the proposed rules would require companies to disclose information about the cybersecurity expertise of members of the board of directors.  “If any member of the board has cybersecurity expertise, the registrant would have to disclose the name(s) of any such director(s), and provide such detail as necessary to fully describe the nature of the expertise.”  Having this expertise, however, would not impose any additional duty or liability on that board member for having that specialized knowledge.

Overall, the proposed rules and the recent enforcement actions highlight the SEC’s focus on cybersecurity issues.  Companies should take this opportunity to strengthen their procedures around cybersecurity incidents, including incident reporting and risk management. The risks associated with cybersecurity are significant and must be factored into the overall risk assessment and management process of public companies.

The UK’s New International Data Transfer Agreements

By Paul Lanois

On 22 March 2022, the United Kingdom (“UK”) adopted a new International Data Transfer Agreement (“IDTA”) and an international data transfer addendum (“Addendum”) to the European Commission’s standard contractual clauses (“SCC”) for use in connection with transfers of personal data from the UK to locations outside the UK.

The IDTA and the Addendum were prepared to serve as an “appropriate safeguard” under Section 119A of the UK Data Protection Act 2018 for the transfer of personal data under the UK’s implementation of the General Data Protection Regulation, tailored by the Data Protection Act 2018 (the “UK GDPR”), which is essentially the same law as the EU GDPR, but adapted to take into account UK specificities.

The new UK framework serves as the UK’s post-Brexit replacement for the European Commission’s SCCs used to transfer personal data from the UK to third countries that do not offer an “adequate level of data protection” in compliance with the UK GDPR. By way of a reminder, the UK GDPR requires that a transfer of personal data to third countries incorporate appropriate safeguards to ensure the protection of such personal data, unless the transfer is covered by a UK adequacy regulation or an applicable exception.

One such appropriate safeguard is the Standard Contractual Clauses (SCCs). The European Commission adopted new versions of the SCCs in June 2021, replacing the previous three 10-year-old sets of SCCs that had been adopted under the EU Data Protection Directive 95/46/EC. The new SCCs take into account the judgment of the European Court of Justice (ECJ) in the case commonly known as the ‘Schrems II’ decision, issued back in July 2020. Among other things, that decision required organizations to carry out further diligence before making a transfer of personal data to a third country without an adequacy decision. However, because the UK left the EU prior to the issuance of the new SCCs, the new SCCs are not recognized under UK law, so data exporters in the UK had to continue to rely on the old SCCs.

The Information Commissioner’s Office (ICO) ran a consultation on data transfers under the UK GDPR from 11 August 2021 to 11 October 2021, which included both the draft IDTA and the Addendum. The IDTA and Addendum were laid before Parliament by the Secretary of State on 2 February 2022, and came into force on 21 March 2022.

For six months, i.e. until 21 September 2022, organizations may choose between the legacy SCCs (i.e. the old EU SCCs, not those issued by the European Commission in 2021) and the new UK transfer mechanisms for new data transfers. Transfer arrangements already in place that incorporate the old SCCs will remain valid for a further 24 months, until 21 March 2024, as long as the processing operations remain unchanged. In many cases, there will be a good argument for switching to one of the new UK transfer tools before that date. Because this final deadline falls less than fifteen months after the deadline for repapering under the new EU SCCs, it will make sense for international organizations to harmonize their repapering projects to cover both EU and UK data flows at the same time.

The use of the EU SCCs in conjunction with the UK Addendum (which essentially tweaks the EU SCCs to make them work for UK data transfers) will likely be the preferred mechanism for organizations that process data originating from both the UK and the EU. This approach allows organizations to use a single set of SCCs for transfers of all their European data (i.e. the EU SCCs, with the UK Addendum added for UK data) and helps reduce the complexity that Brexit introduced to data transfers.

Alongside the release of the new data transfer tools, the ICO has made a small but important update to its Guide to UK GDPR, clarifying its approach to ‘restricted transfers’, and announced that further detailed guidance on international data transfers will be published soon. In particular, the ICO has clarified that all transfers to recipients located in a “non-adequate” country outside the UK (including recipients who are themselves subject to the UK GDPR under the law’s long-arm jurisdictional reach) will be treated as ‘restricted transfers’.

Importantly, organizations should bear in mind that neither the IDTA nor the Addendum is a ‘silver bullet’ that eliminates all issues relating to data transfers. Adopting either instrument does not automatically mean that no further steps are required before a transfer can occur: a transfer risk assessment will still be needed, and supplementary measures may also be necessary depending on the destination country and the circumstances.

The EU and the US Agree in Principle to a Trans-Atlantic Data Privacy Framework

By Paul Lanois and Andrew Scott

On March 25, 2022, Ursula von der Leyen, President of the European Commission (EC), and U.S. President Joe Biden released a joint statement revealing that the two leaders had agreed “in principle” to develop a new transatlantic data flow agreement between the European Union and the U.S.  In the joint statement, President Biden referred to this new agreement as an enhanced Privacy Shield Framework, but subsequent materials, including the White House’s Fact Sheet (the “Fact Sheet”), refer to it as the Trans-Atlantic Data Privacy Framework.

In speaking about the new agreement, President von der Leyen emphasized that the new framework will “enable predictable and trustworthy data flows between the EU and US, safeguarding privacy and civil liberties.”  Similarly, President Biden emphasized that the leaders had agreed “to unprecedented protections for data privacy and security for our citizens.” He also noted that “[t]his new arrangement will enhance the Privacy Shield framework, promote growth and innovation in Europe and in the United States and help companies, both small and large, compete in the digital economy.”

While no text of the framework has yet been released, both the EU and the U.S. have offered high-level overviews of what can be expected and how the framework will come together.  For example, the U.S. issued a press release and the Fact Sheet.  Similarly, the European Commission released its own overview of the Framework, including insight into its key principles, benefits, and next steps.

The breakthrough agreement is the result of ongoing negotiations between the U.S. and the EU since the Court of Justice of the European Union (CJEU) invalidated the previous transatlantic data flow arrangement (Privacy Shield) on July 16, 2020, in the Schrems II decision.

In the Schrems II decision, the CJEU called into question the adequacy of the safeguards used to protect EU citizens’ data from U.S. government surveillance and the adequacy of the redress offered to EU citizens.  Without question, the new framework will have to address these two issues in order to remain in force and withstand the almost-certain legal challenges that will likely end up before the CJEU (perhaps a future Schrems III decision?).

It appears that the new framework will take into consideration the issues raised in Schrems II.  Both the U.S. Fact Sheet and the EU’s overview of the Framework give early indications that the U.S. will align its data privacy practices with the European approach.  For example, the Fact Sheet identifies the general commitments that would be adopted by the U.S. by way of a presidential Executive Order, which will form the basis of the European Commission’s assessment in its future adequacy decision. Through that order, the U.S. would create a “new multi-layer redress mechanism that includes an independent Data Protection Review Court, which would have full authority to adjudicate claims and direct remedial measures as needed” and would ensure “that signals surveillance activities are necessary and proportionate [in] the pursuit of defined national security objectives.” In particular, the Fact Sheet notes that signals intelligence collection would only be undertaken “where necessary to advance legitimate national security objectives”.

While many of the details remain unclear, the U.S. and the EC have indicated that the next step will be to translate the agreement in principle into formal legal documents.  That process will likely take time: the last two adequacy decisions adopted by the EU ran 93 pages (the UK) and 122 pages (South Korea), both significantly longer than the current Privacy Shield Framework, and the mechanisms the U.S. must implement by way of the Executive Order are not trivial, especially the creation of a Data Protection Review Court. We will continue to monitor developments and look forward to updating you when more details on the specific requirements of the new framework have been released.

Interestingly, the new framework appears to carry on from where the Privacy Shield Framework left off: according to the U.S. Fact Sheet, “participating companies and organizations that take advantage of the Framework to legally protect data flows will continue to be required to adhere to the Privacy Shield Principles, including the requirement to self-certify their adherence to the Principles through the U.S. Department of Commerce”. 

Once prepared, the draft adequacy decision will be submitted to the European Data Protection Board for review, as contemplated by the General Data Protection Regulation’s adequacy procedure. The specific timeframe for that review has not been communicated.

Privacy Talks: Interviews with California Privacy Leaders

This month, the Privacy Law Section is proud to continue Privacy Talks: Interviews with California Privacy Leaders.

We interviewed Cody Venzke.  Cody is Senior Counsel for the Center for Democracy and Technology’s (CDT) Equity in Civic Technology Project, where he works to ensure that education agencies and other civic institutions use technology responsibly and equitably while protecting individuals’ privacy and civil rights.

Cody not only serves on the Executive Committee of the Privacy Law Section but also serves as the Section’s Treasurer.  You can watch our interview with Cody.

