PRIVACY LAW REVIEW – WHAT YOU NEED TO KNOW (September 2021)

CLA’s Privacy Law Section summarizes important developments in California privacy and beyond. 

Privacy Law Section Year-End Review

By Jeewon Serrato

We are excited to announce the six (6) new Executive Committee members who will be joining the current Executive Committee to serve the CLA Privacy Law Section for the 2021-2022 term.  The full list of current and new members appears below.  The 2021-2022 term will begin after the CLA Annual Meeting.  I had the privilege of serving as Chair this past year, and I know the Executive Committee, under Sheri Porath Rockwell’s leadership as Chair, will continue to build on our inaugural year.

As we come to the close of our first year as a new section, I wanted to take a moment to recognize some of our milestones and accomplishments.

Membership

Since launching as the newest CLA section in November 2020, we have grown into an organization with over 1,000 members.

Events

In the last 10 months, we have produced 15 CLE seminars and events dedicated to education on privacy law.

Publications

Members of the Privacy Law Section will publish the California Consumer Privacy Act Handbook this fall.  The Handbook, an update to the California Privacy Law Treatise that we will refresh annually, provides practical guidance on how to comply with the CCPA as well as the California Privacy Rights Act.  In addition, we published the monthly newsletters you have been receiving in your inboxes, which include timely write-ups on the latest privacy law developments.

Networking and Advocacy

The section’s events and publications were complemented by networking, community-building, and advocacy programs, including our participation in Legislative Day in April, where members of the section met with staff from the California Governor’s Office and the Assembly, as well as the Office of the California Attorney General.  This fall, we will be actively participating in rulemaking proceedings for the California Privacy Rights Act, which goes into effect on January 1, 2023, and the section has been actively interfacing with the new California Privacy Protection Agency.

Young Lawyers Programming

For attorneys and law students who are looking to become privacy practitioners, we have launched the “Intro to Privacy” series and a mentoring program for law students. 

Privacy Law Specialization

For practitioners who specialize in privacy, we submitted an application to the California Board of Legal Specialization to explore the creation of a Legal Specialty for Privacy in California.  This was a CLA-wide effort, which included practitioners from the Business Law Section, the IP Law Section, and the Los Angeles County Bar Association.  The Board just voted last week to move forward with our application, and we will let you know how to get involved in the committee that will establish the standards for certification.

These accomplishments and more would not have been possible without the tireless dedication of the inaugural Executive Committee.  We thank Jake Snow, who will be stepping down from the Executive Committee, and welcome him as an Advisor next year.  We welcome the new Executive Committee members who, together with committee members, will be expanding and enhancing the section’s service offerings in the coming year, including the Annual California Privacy Law Conference, which will take place in January 2022.

Representing the privacy law community in California for the past year has been an honor.  I look forward to continuing to work with all of you to make California Lawyers Association the best community for privacy lawyers and practitioners.

If you are interested in joining our committees or have questions about any of the workstreams we highlighted above, please send us a note at privacy@calawyers.org.

With a grateful heart, I’m signing off.  Onward!

Jeewon Kim Serrato

Executive Committee

  • Jeewon Kim Serrato, Chair
  • Sheri Rockwell, Chair-Elect
  • Mark Aldrich, Vice Chair
  • Brett Cook, Vice Chair
  • Cody Venzke, Treasurer
  • Jennifer Oliver, Secretary
  • Aaron Lawson
  • Alisa Hall
  • Christian Hammerl
  • Elaine Harwell
  • Hailun Ying
  • Irene Jan
  • Jake Snow (Outgoing)
  • Joshua de Larios-Heiman
  • Mary Stone Ross
  • Smita Rajmohan

New Executive Committee Members

  • Andrew Serwin
  • Donald Yoo
  • Joy Peacock
  • Nicholas Ginger
  • Paul Lanois
  • Robert Tookoian

September Legislative Update

By Mallory Jensen

The California Legislature recently unanimously passed SB 41, a bill that our committee has been following. As previously described here, SB 41 would require a direct-to-consumer genetic testing company (e.g., 23andMe) to provide consumers with certain information on the company’s use of genetic data, as well as to obtain express consent for collecting and using such data. It also requires genetic testing companies to provide a way for consumers to revoke their consent, and to implement “reasonable security practices and procedures” to protect the genetic data. SB 41 now goes to Governor Newsom for his signature.

In addition, the Privacy Section’s Legislation Committee is planning to set up a working group to draft recommendations regarding CPRA regulations. The California Privacy Protection Agency has recently released an invitation for preliminary comments that will serve as a starting point for the group’s discussion. Watch this space and your email inboxes for an invitation to complete a survey of interest and relevant expertise as we start to form the group.

Back to School Review: Big Tech Anticipates Congressional Moves on Teen Privacy

By Cody Venzke

August capped off a summer of activity by Congress and major technology companies to reform online privacy for individuals under 18, especially for teenagers, who currently lack the protections guaranteed to children under 13. As members of Congress dug into the issue more deeply, major tech companies with large userbases of minors rolled out new protections for children and teenagers.

Congressional Action

From late spring and throughout the summer, members of Congress pressured technology companies to bolster privacy protections for teenagers and children. The summer kicked off with a hearing by the Senate Subcommittee on Consumer Protection, Product Safety, and Data Security, titled “Protecting Kids Online: Internet Privacy and Manipulative Marketing.” The hearing included testimony from Baroness Beeban Kidron, a cross-bench peer in the United Kingdom’s House of Lords. Baroness Kidron testified about the United Kingdom’s then-forthcoming Age-Appropriate Design Code, a set of “practical measures and safeguards to ensure” compliance with the UK’s Data Protection Act 2018. The Code’s fifteen principles require “information society services likely to be accessed by children” under 18 to design their services and data practices in the “best interests of the child” and take steps to mitigate risks to children. The Code went into effect on September 2.

Members of Congress took notice of the Code’s potentially far-reaching effects. Sen. Edward Markey and Reps. Kathy Castor and Lori Trahan wrote letters to major tech and gaming companies, including Amazon, Facebook, Google, Microsoft, Snapchat, TikTok, Disney, Activision Blizzard, Epic Games, Niantic, and Nintendo, urging them to extend the Code’s protections domestically. In their letter, Sen. Markey and Reps. Castor and Trahan specified several risks to children that necessitated the expansion, including increased screen time, a lack of transparency, collection and sharing of children’s “sensitive information,” “social engineering,” exposure to “cybercriminals,” “nudging,” and insufficient parental controls.

Those same members of Congress also introduced legislation based on many of the Code’s principles:  

  • Sen. Markey, along with Sen. Bill Cassidy, introduced the Children and Teens Online Privacy Protection Act, S. 1628, which would extend the protections of the Children’s Online Privacy Protection Act (COPPA) from children under 13 to teenagers under 16, expand the scope of “operators” subject to COPPA, and ban targeted advertising directed toward children, among other things.
  • Similarly, Rep. Castor introduced the Protecting the Information of our Vulnerable Children and Youth Act, H.R. 4801, which would extend COPPA’s protections to teenagers under 18, ban targeted advertising on services “likely to be accessed by children or teenagers,” create a private right of action under COPPA, and incorporate the Code’s “best interests of the child” standard.
  • Rep. Trahan published a draft bill that would prohibit targeted advertising in services used for “K-12 purposes” or using data collected by those services for targeted advertising.

Tech’s Response

Major tech companies have taken notice of the legislative landscape. Over the summer, Google, YouTube, Instagram, and TikTok announced overhauls of their protections for children and teenagers.

Google announced that it would begin setting “SafeSearch” as the default for users ages 13-18 and that users under 18 could begin to flag images of themselves appearing in search results for removal. The company also announced that it would no longer collect location history for users ages 13-18. Google also announced that Workspace for Education would implement new controls for administrators to specify age limits for students accessing different Google services outside the core educational services. The new Education features will require administrators to indicate which student users are 18 or older.

Similarly, Google-owned YouTube released new features aimed at protecting children and teenagers. Videos uploaded by users ages 13-17 are now set to private by default, with the option to make the video public. YouTube has also rolled out digital wellbeing features, including disabling autoplay and implementing reminders to take breaks or to go to sleep.

YouTube competitor Instagram also updated its privacy protections for children and teenagers, with a focus on creating safer interactions between users. The Facebook-owned social network announced it would begin restricting direct messages (DMs) “between teens and adults they don’t follow” and providing “safety notices in DMs” to notify “young people when an adult . . . has been exhibiting potentially suspicious behavior” such as “sending a large amount of friend or message requests to people under 18.” Instagram will also make it more difficult for adults “exhibiting potentially suspicious behavior to interact with teens” by restricting teenagers from appearing in the adults’ search results or in suggested users. Each of these features will use “new artificial intelligence and machine learning technology” to not only identify users’ ages, but also flag potentially inappropriate interactions. Neither Instagram nor Facebook has detailed this technology.

Finally, TikTok released changes to teens’ default privacy settings, building on updates released in January 2021. In January, the company changed the default privacy settings for all registered accounts of users ages 13-15: it made those accounts private by default, disabled direct messaging, permitted comments only from those users’ friends, and disabled the option for others to download their videos. In August, TikTok expanded on those protections for users ages 16-17, disabling direct messaging by default and prompting users to confirm the privacy and visibility settings for each post. The company has also disabled push notifications late at night – after 9 pm for users ages 13-15 and after 10 pm for users ages 16-17.

Conclusion

Although the changes implemented by technology companies address many of the issues raised by the Age-Appropriate Design Code and U.S. legislators, several issues – such as targeted advertising and the scope of businesses subject to existing law – remain unresolved. Members of Congress may continue to exhibit interest in this area, perhaps as a stopgap in place of a general federal privacy law.

California Attorney General Reminds Healthcare Providers of Obligations

By Brandon M. Jasso, CIPP/US

On August 24, 2021, California Attorney General Rob Bonta (“AG”) issued guidance to healthcare facilities and providers to remind them of their existing and continuing obligations under state and federal health data privacy laws, including health data breach reporting requirements. The guidance came via a bulletin, which was sent to a variety of stakeholder organizations, including the California Hospital Association, the California Medical Association, and the California Dental Association (see here). The AG reminded healthcare providers that the California Department of Justice is committed to enforcing consumer protections and health privacy laws.

The AG further reminded providers and organizations that attacks on the healthcare sector have interrupted services to patients, which has adversely affected, and will continue to adversely affect, patients’ trust. The AG further pointed out that data breaches result in long-term effects that outlast the initial breach, such as “fraudulent use of [patients’] personal information obtained from a breach of health data.” Providers and organizations have a duty to be “proactive and vigilant” about protecting themselves against ransomware attacks and breaches and must “meet their health data breach notification obligations to protect the public.”

The AG’s bulletin comes as the healthcare industry, already suffering under the strain of the increased demands caused by COVID-19, has been subject to a continuing increase in ransomware attacks since 2020. In general, there has been a 102% overall increase in organizations affected by ransomware, with the healthcare industry globally sustaining an average of 109 attack attempts per organization each week. According to IBM, healthcare has had the highest data breach costs of any industry for the eleventh consecutive year, with the average total cost rising from $7.13 million in 2020 to $9.23 million in 2021, a 29.5% increase.

In his bulletin, the AG reminded recipients that state and federal laws obligate healthcare entities and organizations to establish policies and security measures concerning protected health information. To emphasize the importance of compliance, the AG also reminded recipients that the California Attorney General has authority to bring civil actions on behalf of California residents under the Health Insurance Portability and Accountability Act (“HIPAA”), as amended by the Health Information Technology for Economic and Clinical Health Act (“HITECH”). See 42 U.S.C. § 1320d-5(d).

The AG’s bulletin also added that entities should put in place the following minimum preventative measures to protect against ransomware attacks:

  • keep all operating systems and software housing health data current with the latest security patches;
  • install and maintain virus protection software;
  • provide regular data security training for staff members that includes education on not clicking on suspicious web links and guarding against phishing emails;
  • restrict users from downloading, installing, and running unapproved software; and
  • maintain and regularly test a data backup and recovery plan for all critical information to limit the impact of data or system loss in the event of a data security incident.

Organizations, providers, and professionals should take note of the measures the AG has identified as the minimum preventative measures, because if an AG investigation occurs, the listed measures will most likely be scrutinized when determining any penalties or claims against an organization or provider.

Organizations, providers, and privacy professionals must also be familiar with breach notice and reporting requirements under the Confidentiality of Medical Information Act (“CMIA”) and HIPAA (as amended by HITECH). The HIPAA breach notification rules can be found at 45 C.F.R. §§ 164.400-414 and 42 U.S.C. § 17932, and apply to covered entities and business associates following a breach of unsecured protected health information. The California breach reporting requirements can be found in Civ. Code § 1798.82. Additionally, the same parties must be familiar with the requirements under the HIPAA Security and Privacy Rules (and the California corollaries, see Health & Safety Code §§ 1280.15 and 1280.18) in order to actively and effectively secure patients’ protected health information.

For information about recent breaches, a list of breach notices provided by entities to the AG’s Office can be found here; and breach notices provided to the United States Department of Health and Human Services, Office of Civil Rights, can be found here.

T-Mobile’s Breach Highlights the Importance of Data Minimization and CPRA

By Andrew Scott

On August 17, 2021, a criminal cyberattack hit T-Mobile and compromised the personal information of more than 50 million people held by the communications carrier.

Three days later, T-Mobile shared updated information regarding the ongoing investigation into the cyberattack.   According to the update, three categories of customers had their data compromised:  current customers, former customers, and prospective customers. Depending on the type of customer, however, different types of personal information were compromised.

  • The 13 million current customers had the following information compromised: first and last names, dates of birth, SSNs, and driver’s license/ID information, as well as phone numbers and IMEI and IMSI numbers, the typical identifiers associated with a mobile phone.  In addition, 850,000 phone numbers and account PINs were exposed.
  • The 40 million former or prospective customers had the following personal information compromised: first and last names, dates of birth, SSNs, and driver’s license/ID information.

On a positive note, T-Mobile has stated that there is no indication that the data contained in the stolen files included any customer financial information, credit card information, debit card information, or other payment information.  The company’s breach response included an around-the-clock forensic investigation and an update that the access point used in the breach had been closed.  The business also appears to have taken action to comply with its notification duties and to offer its customers remediation, helping the company rebuild customer trust.

As expected, a class-action lawsuit has been filed against T-Mobile in federal district court in Washington.  Among the claims alleged, the suit argues that, pursuant to Section 1798.150 of the California Consumer Privacy Act (CCPA), T-Mobile violated its duty “to implement and maintain reasonable security procedures and practices appropriate to the nature of the information to protect the personal information….”  Essentially, when insufficient security results in a breach, the CCPA provides for statutory damages of between $100 and $750 per consumer per incident, or actual damages, whichever is greater.

Was T-Mobile’s security insufficient under the CCPA?  In a recent article from The Wall Street Journal, a 21-year-old claiming responsibility for the hack told the news outlet that the company’s security is “awful.”  The hacker went on to say he accessed the customer (current, former, and prospective) data by scanning for an unprotected router, which gave him access to a Washington state data center that stored credentials for over 100 servers.

While the company’s CEO did not respond directly to the comments made by the hacker, he released a statement recognizing that “significant steps” are needed to enhance the business’s approach to cybersecurity, including taking its cybersecurity expertise “to the next level” and developing “improved security measures.”

Putting aside the discussion of whether T-Mobile had reasonable security measures in place, the more important question is why the company found it relevant or necessary to retain the personal information of the 40 million people it was not currently doing business with.  Had the company retained only the personal information of people it was doing business with, 40 million people would not have to worry about their identities being stolen.

In the United States, a requirement to minimize the amount of data collected to what is “necessary and relevant” does not exist in any generally applicable enacted law (compare HIPAA’s Minimum Necessary Rule, which applies only to protected health information).  To adhere to this data minimization principle, a company would have to voluntarily hold itself to a higher standard.

The Importance of Data Minimization

The concept of limiting the amount of data that is collected and retained is known as data minimization, a core privacy principle.  The principle can help businesses respect their customers’ privacy, reduce liability, save money, and mitigate the impact of breaches.

Generally, the principle asks businesses to collect personal information only for specified, explicit, and legitimate purposes; to not collect more personal information than is needed; and to not store personal information for longer than is necessary.

The U.S. has supported this principle.  In March 2012, the Federal Trade Commission issued a report that made recommendations to businesses and policymakers about how to protect consumer privacy.  In the report, the commission recommended that “companies should implement reasonable restrictions on the retention of data and should dispose of it once the data has outlived the legitimate purpose for which it was collected.”  The commission also recommended enacting baseline privacy legislation incorporating these data principles, but that (as we know) has yet to happen.

Across the Atlantic, the European Union (EU) has codified six data principles into its General Data Protection Regulation (GDPR).  In Article 5, Principles relating to processing of personal data, the six data principles provide the framework for how personal data must be processed. Three of those principles, purpose limitation (Article 5(1)(b)), data minimization (Article 5(1)(c)), and storage limitation (Article 5(1)(e)), impose some form of data minimization requirement.

In particular, Article 5(1)(c) states that personal data of an EU citizen that is processed must be “adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed.”  Examples of processing irrelevant data could be anything from an employer asking a job applicant about health conditions that are not relevant to his or her prospective job, to a website that asks visitors for their social security number in order to sign up for its mailing list, or, as in T-Mobile’s case, collecting and retaining sensitive data of prospective and former customers.

Unlike the GDPR, the United States does not currently have any enacted privacy law with a similar data minimization requirement, although some laws do recommend data minimization as a best practice.  Privacy laws worldwide, however, are becoming stricter.

The California Privacy Rights Act (CPRA), passed in 2020, contains the first “data minimization” requirements of any U.S. privacy law.  On January 1, 2023, the CPRA will become operative, adding several data minimization requirements to the California Consumer Privacy Act (CCPA):

  • “A business shall not collect additional categories of personal or sensitive personal information or use personal information collected for additional purposes that are incompatible with the disclosed purpose for which the personal information was collected without providing the consumer with notice consistent with this section.” (See Civ. Code § 1798.100(a)(1)-(2).)
  • “…a business shall not retain a consumer’s personal information or sensitive personal information for each disclosed purpose for which the personal information was collected for longer than is reasonably necessary for that disclosed purpose.” (Civ. Code § 1798.100(a)(3).)

It is clear that the CPRA’s data minimization requirements are not as restrictive as the GDPR’s requirements under Article 5.  For example, the GDPR requires that a business not collect or retain irrelevant personal information.  In comparison, the CPRA more broadly requires that additional categories of personal information or sensitive personal information not be collected for an incompatible purpose.  Additionally, the CPRA does not define what constitutes a “reasonably necessary” period of storage.

Even though the CPRA may not demand as much of companies as the GDPR, it is a step in the right direction for privacy legislation in the U.S.

California Businesses Should Not Wait to Implement CPRA’s Data Minimization Requirements

It seems unfortunate that T-Mobile could legally keep detailed records on millions of people who may never have been its customers, but expecting businesses to hold themselves to a higher standard than the law requires is unrealistic.  Proactively investing in a privacy department, or voluntarily adopting a framework that calls for more compliance than necessary, is still a foreign concept for many businesses.

For example, the Department of Commerce’s Privacy Shield Framework is our nation’s only federal privacy framework. Although it was created shortly before the GDPR took effect, the framework was designed with the GDPR in mind.  Participation in the Privacy Shield is voluntary and requires adhering to the Framework, which may be enforced by the FTC.  The Framework’s Data Integrity and Purpose Limitation Principle provides, in part, that “personal information must be limited to the information that is relevant for the purposes of processing.”  In a footnote, the text states that the purpose of the processing “must be consistent with the expectations of a reasonable person given the context of the collection.” It is hard to imagine that the former and prospective T-Mobile customers had a reasonable expectation that the business was going to retain their data.

Until the U.S. adopts an omnibus federal privacy law similar to the EU’s GDPR, Brazil’s General Data Protection Law (known as the LGPD), or China’s Personal Information Protection Law (PIPL), or any state-level legislation like the CPRA that requires stronger adherence to privacy principles, data minimization will not be a priority.

Businesses subject to the CCPA might consider implementing the CPRA’s data minimization requirements early.  Applying the more restrictive GDPR framework could be even more helpful. There is nothing stopping any company from holding itself to a higher standard and making sure that it collects only data that is relevant and necessary.  After all, the more compliant a company is, the more trust it will engender from business partners and consumers.

Privacy Considerations for Employers Collecting Employee Vaccine Data

By Brett Cook and Oliver Kiefer

Disclaimer: This article reflects the thoughts and opinion of the authors and not their law firms and/or employers.

With the rise of the highly contagious Delta variant, employers across the United States are increasingly concerned with collecting and recording COVID-19 vaccination information from their employees.  In the United States, vaccine information may be legally collected for several reasons, including where required pursuant to state law.  For example, California currently requires employers to document the vaccination status of fully vaccinated employees if the employees do not wear face coverings indoors.[1]  Differing federal and state employment laws and regulations require a carefully tailored approach to vaccination data collection.

Varying state laws have injected uncertainty into the collection of vaccination information as employers seek to reopen traditional workplaces and comply with applicable public health and privacy laws.  Nonetheless, foundational privacy principles can be used to mitigate risk.  In general, employers should (1) define the purpose of collection, (2) define how data will be used, (3) determine how long data will be retained in accordance with retention policies/laws, (4) determine who will have access to the data, and (5) implement appropriate security measures to safeguard information. 

In the United States, the Equal Employment Opportunity Commission has provided guidance regarding the collection of an employee’s COVID-19 vaccination status by their employer.  Businesses that collect this information should consider whether it is required to be treated as confidential medical information under the Americans with Disabilities Act, which would require the data to be kept confidential and stored separately from the employee’s personnel files.[2]  Thus, broadly speaking, federal law does not prohibit employers from reasonably and responsibly collecting vaccination information from their employees.

In California, employers should consult the California Department of Industrial Relations Division of Occupational Safety & Health’s (CAL OSHA’s) Model COVID-19 Prevention Program, which provides a framework for compliance with the most current Emergency Temporary Standards in place for COVID-19.  The Model COVID-19 Prevention Program contains a template that can be used to track employee vaccination status.  It also provides guidance regarding how employers should document employee vaccination status. 

However, not all states have adopted the same approach as California.  In Florida, for example, Governor Ron DeSantis signed an executive order on April 2, 2021, that prohibited businesses in Florida from requiring so-called “vaccine passports.”[3]  These “passports” are shorthand for any piece of documentation that would allow businesses to determine whether a customer was fully vaccinated against COVID-19.  Left unclear, however, was how the executive order regulates vaccine data that businesses collect from employees.  Florida also has not addressed how employers should treat that data, once collected.  Despite this lack of guidance, at least one Florida county has determined the executive order does not prohibit it from mandating that county employees show proof of vaccination.[4]  The executive order is also currently facing a challenge in the courts.

As outlined above, this area of privacy law is rapidly changing, and regulations vary substantially across the states.  Compliance is critical for all employers given the high stakes at issue.  Risk can be limited by working with regional attorneys and privacy leaders to identify legitimate reasons for collection before collecting employee vaccine data, and by collecting the minimum amount of data necessary.


[1] https://www.dir.ca.gov/dosh/dosh_publications/06-16-21-ETS-Revisions.pdf

[2] https://www.eeoc.gov/wysk/what-you-should-know-about-covid-19-and-ada-rehabilitation-act-and-other-eeo-laws (Section K.4)

[3] https://www.flgov.com/wp-content/uploads/2021/04/EO-21-81.pdf

[4] https://www.tallahassee.com/story/news/2021/07/28/leon-county-florida-covid-vaccine-mandate-mandatory-tallahassee/5408362001/

