CLA’s Privacy Law Section summarizes important developments in California privacy and beyond.
Privacy Talks – Interviews with California Privacy Leaders, Hailun Ying
The Privacy Law Section is proud to continue Privacy Talks: Interviews with California Privacy Leaders.
This month, we interviewed Hailun Ying. Hailun is Lead Privacy Counsel of Roblox and currently serves on the Section’s Executive Committee. Please find our interview with Hailun here.
AB-1651: As ‘Workplace’ Extends to Our Homes, Can Employers Still Conduct Worker Monitoring?
By: Jeewon Kim Serrato, Jerel Pacis Agatep, and Jenny Ha
If there is one thing that could be considered a silver lining of the COVID-19 pandemic, it is the accelerated discussion of employee privacy issues and workplace monitoring. The mass movement to remote work in recent years has put the spotlight on whether and how employees have privacy rights. In our March 2021 newsletter, we highlighted the top six workplace monitoring issues employers should be considering: email and text, telephone and voicemail, video surveillance, GPS and cellphone location, social media, and keystroke and productivity monitoring. These topics are now being pointedly addressed in a new California bill, AB-1651, aptly labeled the “Workplace Technology Accountability Act,” as the debate surrounding employee privacy heats up again in California.
Introduced by Assemblymember Ash Kalra, AB-1651 seeks to amend parts of the Government Code and Labor Code in line with the intent of the California Consumer Privacy Act (CCPA) and the California Privacy Rights Act (CPRA). The CPRA extends the CCPA’s exemption for personal information collected from “workplace members,” but that exemption is set to expire on January 1, 2023. AB-1651 would confer corresponding “worker” rights and employer obligations in connection with the collection and use of worker data. Additionally, the bill addresses work-from-home monitoring and the use of algorithms and artificial intelligence with employee data. Lastly, and quite unlike the CPRA, which excludes government agencies, AB-1651 applies not only to businesses but also to state and local government entities.
Below, we provide a high-level summary of some key provisions of AB-1651 and what to expect in terms of employee privacy issues in the next year as the debate continues in the California legislature.
WORKER DATA RIGHTS
Notice Requirement – Similar to CPRA’s notice requirement, AB-1651 would require employers, at or before the point of collection, to inform workers of the categories of worker data collected, the purpose of collecting the data, whether the data is related to the worker’s essential job functions, and whether it will be used to make or assist in making employment-related decisions. Additionally, employers must inform workers whether the data will be deidentified, used in an aggregated form, or shared with third parties. Lastly, employers must also inform workers of the employers’ data retention policies, the workers’ rights to access and to correct their data (discussed further below), and any data protection impact assessments (DPIA) or worker information systems (WIS) that are being actively investigated by the Labor and Workforce Development Agency (LWDA).
Right to Access – Similar to CPRA’s right to access, AB-1651 would require employers to provide, upon receipt of verifiable request, the categories and pieces of worker data retained, the purpose and sources of data collection, whether the data is related to the worker’s essential job functions or employment decisions, whether the data is involved in an automated decision system (ADS; discussed further below), and the names of any third parties from whom the data is obtained or to whom the data is disclosed.
Right to Correct – Similar to CPRA’s right to correct inaccurate information, AB-1651 would require employers to keep worker data accurate, which in turn creates a worker’s right to request correction of inaccurate worker data. However, this does not apply to “subjective information, opinions, or other nonverifiable facts.” In our opinion, this may include performance reviews and performance improvement plans.
Limited Use of Worker Data – AB-1651 would impose on employers the obligation to limit data collection and use to that which is “strictly necessary” for certain allowable purposes, including but not limited to allowing workers to accomplish an essential job function, administering wages and benefits, and assessing worker performance. Additionally, employers cannot sell or license worker data, including deidentified or aggregated data. Lastly, employers can disclose or transfer worker data to a third party only pursuant to a contract that prohibits sale or licensing of that data and that stipulates that the third party must have reasonable data security protections. However, biometric, health or wellness data cannot be disclosed or transferred to any third party unless required by law.
ELECTRONIC MONITORING
Notice Requirement – AB-1651 would require employers to provide workers notice of electronic monitoring before monitoring begins. This section of the bill details the notice requirements, including but not limited to listing the purpose of the monitoring; describing the technology used; providing the dates, times, and frequency of the monitoring; and explaining why that form of monitoring is “strictly necessary” and the “least invasive means” for an allowable purpose. The notice must also be “clear and conspicuous”; it cannot simply state electronic monitoring “may” take place. Note that while these are stringent requirements to keep workers aware of how employers monitor them, they fall short of requiring consent.
Ongoing Notice Obligation – Employers would also have ongoing notice obligations annually and when their monitoring significantly changes.
Limited Use of Workplace Monitoring – AB-1651 would prohibit the use of workplace monitoring unless all of the following conditions are met: the monitoring is “strictly necessary” to accomplish one of the listed allowable purposes and is the “least invasive means” to accomplish that purpose; it is limited to the smallest number of workers and collects the least amount of data necessary; and the worker data collected will be accessed only by authorized agents and within the notified duration of monitoring.
Essentially, this section of the bill would codify some of the best practices we have described in our newsletter. For instance, (1) limit monitoring to the minimum number of workers and the minimum amount of data necessary, (2) do not monitor workers who are not doing work-related tasks, (3) do not monitor workers who are exercising their legal rights, and (4) do not monitor workers where they have a “reasonable expectation of privacy,” including locker rooms, changing areas, and break rooms.
Facial Recognition Prohibited – AB-1651 would prohibit the use of electronic monitoring systems that incorporate facial recognition, gait recognition, or emotion recognition technology.
Remote Workplace Monitoring Must Be “Strictly Necessary” – As pointed out in our newsletter, generally, California employers that want to monitor their remote workers can do so by balancing their legitimate business reasons with the worker’s “reasonable expectation of privacy.” That will change if AB-1651 passes. The bill would prohibit the use of audio-visual monitoring of a workplace in a worker’s residence, personal vehicle, or property unless it is “strictly necessary” to ensure worker health and safety or the security of employer data, or fulfills other similarly compelling purposes.
In line with that, workers would have the right to decline to install data collection or transmission applications on personal devices, unless the monitoring is “strictly necessary” to perform essential job functions. Lastly, GPS applications and devices would also have to be proactively disabled outside of work times and activities.
Workplace Monitoring and Employment Decision-Making – AB-1651 would prohibit relying solely on worker data collected through electronic monitoring when making employment decisions such as hiring, promotion, termination, and other disciplinary actions. The employer must have independent information from its own assessment to corroborate the electronic monitoring data before making any employment decision.
AUTOMATED DECISION SYSTEMS
A large portion of this bill is dedicated to automated decision systems, or ADS. An ADS is “a computational process, including one derived from machine learning, statistics, or other data processing or artificial intelligence techniques, that makes or assists an employment-related decision.” Whether or not this bill passes, the specific inclusion of such technology signals that legislation geared toward regulating the use of artificial intelligence (AI) in the workplace is just starting. For example, in March 2022, the California Fair Employment & Housing Council drafted modifications to California’s employment anti-discrimination laws that would impose liability on companies or third parties administering AI tools that have a discriminatory impact on screening job applicants or that make other employment decisions.
Continuing the theme of the entire bill, AB-1651 would require employers using ADS to (1) provide notice, which would be an ongoing obligation, (2) limit the use of ADS in making employment-related decisions, and (3) not solely rely on ADS output to make an employment-related decision and instead conduct an independent evaluation to corroborate the ADS output.
This evaluation includes a new term, “meaningful human oversight,” which requires a designated internal reviewer who has sufficient power, resources, and expertise to investigate and understand the ADS and its outputs. When the employer does use ADS output as part of a decision, the employer must explain the decision to the affected worker. Furthermore, ADS outputs regarding a worker’s health cannot be the basis for any employment-related decision.
Lastly, employers must conduct impact assessments, known as an “Algorithmic Impact Assessment” (AIA), by independent assessors with relevant experience. AB-1651 describes the requirements and process of the assessment, including a review and comment period for workers potentially affected by the ADS. An employer must also publish an AIA summary on its website upon submitting the AIA to the LWDA.
ENFORCEMENT AND PRIVATE RIGHT OF ACTION
AB-1651 would give workers a private right of action for injunctive relief and recovery of civil penalties and attorney’s fees. The bill would also give the LWDA the authority to enforce the act, assess penalties, and collect copies of the notices required under the bill’s reporting provisions. Penalties range from $100 to $20,000 per violation.
Employers have not previously faced significant regulation when it comes to monitoring their workers. However, the new business obligations in the CPRA and as contemplated in AB-1651 signal that employers should carefully review their privacy controls before rolling out any business practices that impact employees’ privacy rights. Prior to the COVID-19 pandemic and the proliferation of work-from-home arrangements, most employees had more separation between their homes and their workplaces. Now that the “workplace” has crept into every corner of our homes and lives and is not limited to a building downtown, worker privacy and remote monitoring are again front and center. And just as with the CPRA and AB-1651 in California, employers should review new and evolving state and local requirements as they are enacted, including the new regulations in New York and New Jersey.
As we have previously advised as best practice, we recommend employers (1) review definitions of “expectation of privacy” in the workplace, (2) notify employees of the types and purposes of workplace monitoring, (3) obtain voluntary consent when possible, and (4) have in place reasonable protocols and safeguards to secure employee information.
Note: As of the posting of this blog, AB-1651 had been read a second time, amended, and re-referred to the Assembly Committee on Privacy and Consumer Protection. If it is to pass, the bill must have its third reading, reach a majority vote, and continue on to the Senate. The CPRA, on the other hand, will become operative on January 1, 2023, and will provide full consumer rights to employees if no amendments to the law are passed before the legislature goes into recess in August.
 For purposes of this blog, “workplace member” is a “job applicant to, an employee of, owner of, director of, officer of, medical staff member of, or independent contractor of a business.” Cal. Civ. Code §1798.145(m).
 “Worker” means “any natural person or their authorized representative acting as a job applicant to, an employee of, or an independent contractor providing service to, or through, a business or a state or local governmental entity in any workplace.” AB-1651 §1522(n).
CPPA Releases Notice of Proposed Regulatory Action Implementing New Consumer Privacy Rights
By Andrew Scott
On July 8, the California Privacy Protection Agency (CPPA) started the formal rulemaking process to adopt proposed regulations implementing the California Privacy Rights Act of 2020 (CPRA), which amended the California Consumer Privacy Act (CCPA).
With a goal of strengthening consumer privacy, the proposed regulations aim to do three things: (1) update existing CCPA regulations to harmonize them with CPRA amendments to the CCPA; (2) operationalize new rights and concepts introduced by the CPRA to provide clarity and specificity to implement the law; and (3) reorganize and consolidate requirements set forth in the law to make the regulations easier to follow and understand. Regarding the third point, CPPA Executive Director Ashkan Soltani has explained that one of the goals is to improve the readability of the regulations by centralizing the definitions and other subject areas in one place rather than leaving them spread throughout the text.
Below are further details on the following: availability of the rulemaking file, sections affected by the rulemaking, the written comment period, the effect of the proposed rulemaking, what is not included in the proposed rulemaking, public hearing details, anticipated benefits of the proposed regulations, disclosures regarding the proposed actions, consideration of alternatives, and contact persons.
Availability of Statement of Reasons, Text of Proposed Regulations and Rulemaking File
A copy of the proposed regulations and supporting documents can be found on the Agency’s website at https://cppa.ca.gov/regulations/consumer_privacy_act.html.
The rulemaking file consists of the Notice, the Text of Proposed Regulations, the Initial Statement of Reasons (ISOR), and any information upon which the proposed rulemaking is based. The entire rulemaking file is available for inspection and copying throughout the rulemaking process upon request to the contact persons (listed below). This information is available on the CPPA’s website at https://cppa.ca.gov/regulations/. If you would like to receive notifications regarding rulemaking activities, you may subscribe to the CPPA’s email list here.
With regard to the proposed regulations, text that is added is underlined. The ISOR is a summary of specific sections, and the document explains the necessity of each revision (an element looked for by the Office of Administrative Law). The ISOR provides insight into the intent of the agency and includes reflection from the CPPA on the nearly 900 pages of public comments and expert advice received on specific topic areas. Acting General Counsel Brian Soublet has referred to the ISOR as a great primer for the proposed regulations.
It should be noted that the proposed draft regulations were not leaked. As Mr. Soltani and Mr. Soublet explained in our Section’s fireside chat hosted by Jeewon Serrato on June 30, 2022, the Bagley-Keene Open Meeting Act applies to the CPPA’s board meetings, so the CPPA, like any board or commission of a state entity, must conduct its business in the open. Any time a majority of the board meets to deliberate on an issue, the meeting must be noticed and open to the public, which means draft rules or a draft statement of reasons will likely have to be proposed and noticed each time before a decision is made.
SECTIONS AFFECTED BY THE RULEMAKING
The CPPA proposes to amend sections 7000, 7001, 7010, 7011, 7012, 7013, 7016, 7020, 7021, 7022, 7024, 7026, 7028, 7050, 7060, 7061, 7062, 7063, 7070, 7071, 7072, 7080, 7081, 7100, 7101, and 7102, adopt sections 7002, 7003, 7004, 7014, 7015, 7023, 7025, 7027, 7051, 7052, 7053, 7300, 7301, 7302, 7303, and 7304, and repeal section 7031 of title 11, division 6, chapter 1 of the California Code of Regulations concerning the California Consumer Privacy Act.
WRITTEN COMMENT PERIOD
The CPPA has invited “any interested person or their authorized representative” to submit written comments relevant to the proposed regulatory action. The written comment period closes on August 23, 2022, at 5:00 p.m. Only written comments received by that time will be considered.
Submissions may be made electronically or by mail. Comments may be submitted electronically to firstname.lastname@example.org; please include “CPPA Public Comment” in the subject line. Comments submitted by mail should be sent to the following:
California Privacy Protection Agency
Attn: Brian Soublet
2101 Arena Blvd., Sacramento, CA 95834
The CPPA notes that written and oral comments, attachments, and associated contact information (e.g., address, phone, email, etc.) become part of the public record and can be released to the public upon request.
In the fireside chat on June 30, Mr. Soublet and Mr. Soltani stated that comments should be submitted as an attachment to an email and should not include any personal information. Mr. Soublet noted that unhelpful comments include general criticism and requests for changes to the statutes themselves. The CPPA will respond to every comment.
After the Comment Period Ends
The CPPA will analyze “all timely and relevant comments received during the 45-day public comment period.” The CPPA may then either adopt the regulations substantially as proposed or modify them based on the comments.
Any modifications made must be “sufficiently related to the originally-proposed text.” The modified text will be available to the public for at least 15 days before the CPPA adopts the regulations as revised. Requests for copies of any modified regulations may be sent to the attention of the contact. The CPPA will accept written comments on the modified regulations for 15 days after the date on which they are made available.
Upon the completion of the regulations, a Final Statement of Reasons will be available on the CPPA’s website: https://cppa.ca.gov/regulations/.
EFFECT OF PROPOSED RULEMAKING
The CPPA “is directed to adopt regulations to further the purposes of the Act, including promulgating regulations on 22 specific topics.” (§ 1798.185). Specifically, the proposed regulations aim to establish the following:
- Rules defining purpose limitations to ensure that a business’s data practices are consistent with consumers’ expectations. (§ 1798.185, subd. (a)(10).);
- Rules, procedures, and any exceptions necessary to ensure that the required notices of a business’s data practices under the CCPA are provided in a manner that may be easily understood by the average consumer. (§ 1798.185, subd. (a)(6).);
- Rules and procedures to facilitate and govern the submission of a consumer’s request to opt out of sale/sharing and request to limit, and a business’s compliance with those requests. (§ 1798.185, subd. (a)(4).);
- Rules and procedures to ensure that consumers have the ability to exercise their choices without undue burden and to prevent businesses from engaging in deceptive or harassing conduct. (§ 1798.185, subd. (a)(4).);
- Rules and procedures to facilitate a consumer’s right to delete, correct, or obtain personal information. (§ 1798.185, subd. (a)(7).);
- Rules on the right to request a correction. (§ 1798.185, subd. (a)(8).);
- Procedures on how to extend the 12-month period of disclosure of information after a verifiable consumer request pursuant to section 1798.130, subdivision (a)(2)(B). (§ 1798.185, subd. (a)(9).);
- Defining the requirements and specifications for an opt-out preference signal. (§ 1798.185, subd. (a)(19)(A) & (B).);
- Establishing regulations governing how businesses respond to an opt-out preference signal where the business has elected to comply with section 1798.135, subdivision (b). (§ 1798.185, subd. (a)(20).);
- Establishing regulations governing the use or disclosure of a consumer’s sensitive personal information. (§ 1798.185, subd. (a)(19)(C).);
- Defining and adding to the business purposes for which businesses, service providers, and contractors may use personal information consistent with consumer expectations, and further define the business purposes for which service providers and contractors may combine personal information. (§ 1798.185, subd. (a)(10).);
- Identifying the business purposes for which service providers and contractors may use consumers’ personal information pursuant to a written contract with a business, for the service provider or contractor’s own business purpose. (§ 1798.185, subd. (a)(11).);
- Establishing procedures for filing complaints with the Agency (§ 1798.199.45) and procedures necessary for the Agency’s administrative enforcement of the CPRA. (§ 1798.199.50);
- Defining the scope and process for the exercise of the Agency’s audit authority as well as the criteria for selecting those that would be subject to an audit. (§ 1798.185, subd. (a)(18).); and
- Harmonizing regulations governing opt-out mechanisms, notices, and other operational mechanisms to promote clarity and functionality. (§ 1798.185, subd. (a)(22).)
WHAT IS NOT INCLUDED
The CPPA “will not be promulgating rules on cybersecurity audits (§ 1798.185, subd. (a)(15)(A)), risk assessments (§ 1798.185, subd. (a)(15)(B)), or automated decisionmaking technology (§ 1798.185, subd. (a)(16)) at this time. These areas will be the subject of a future rulemaking and are not within the scope of this Notice of Proposed Rulemaking.”
In our fireside chat, Mr. Soublet and Mr. Soltani indicated that although these topics are not addressed in this iteration of rulemaking, they will be taken up in future rulemaking on a timeline that remains unknown. Moreover, it seems likely that rulemaking will continue after the January 1, 2023, deadline.
PUBLIC HEARING DETAILS
The CPPA will hold a public hearing to provide an opportunity to present statements or arguments, either orally or in writing, with respect to the proposed regulations, at the following dates and time at the physical location identified below and via Zoom video and telephone conference:
Dates: August 24 and 25, 2022
Time: 9:00 a.m. Pacific Time
Elihu M. Harris State Building
1515 Clay Street
Oakland, CA 94612
Auditorium (1st floor)
To join this hearing by Zoom video conference: https://cppa-ca-gov.zoom.us/j/89421145939
USA (216) 706-7005 US Toll
USA (866) 434-5269 US Toll-free
Conference code: 682962
The CPPA requests that members of the public who wish to speak at the hearing should RSVP in advance on the Agency’s website at https://cppa.ca.gov/regulations/.
Of note, Mr. Soltani has stated that it will be much easier to join the Zoom meeting than to drive up to Sacramento.
Further, the public hearings will be transcribed.
Anticipated Benefits of the Proposed Regulations
Operationalizing the CPRA Amendments – The CPPA stated that the proposed regulations provide comprehensive guidance on how to implement and operationalize new consumer privacy rights and other changes to the law introduced by the CPRA amendments to the CCPA.
Helping Consumers and Businesses – With the goal of strengthening consumer privacy, the CPPA stated that it proposed regulations “that support innovation in pro-consumer and privacy-aware products and services while also helping businesses efficiently implement privacy-aware goods and services. (Id., § 3(C)(1) & (5).) The proposed regulations take into consideration how privacy rights are being implemented in the marketplace presently and build upon the development of privacy-forward products and services.”
Harmonizing with Other Jurisdictions: The CPPA stated that the “proposed regulations take into consideration privacy laws in other jurisdictions and implement compliance with the CCPA in such a way that it would not contravene a business’s compliance with other privacy laws, such as the General Data Protection Regulation (GDPR) in Europe and consumer privacy laws recently passed in Colorado, Virginia, Connecticut, and Utah. In doing so, it simplifies compliance for businesses operating across jurisdictions and avoids unnecessary confusion for consumers who may not understand which laws apply to them.”
With respect to other state or federal laws, the CPPA went on to say that it has “determined that these proposed regulations are not inconsistent or incompatible with existing State regulations” and that “[t]here are no existing federal regulations or statutes comparable to these proposed regulations.”
DISCLOSURES REGARDING THE PROPOSED ACTIONS
The CPPA disclosed many initial determinations it had made, including the following:
- There are no mandates on local agencies or school districts;
- There is no fiscal impact anticipated on the CPPA;
- There may be an impact to the Department of Justice’s (DOJ) expenditures for enforcement because the DOJ is currently enforcing CCPA and maintains civil enforcement authority;
- There is no cost to any local agency or school district;
- The proposed regulations are estimated to have a cost impact of $127.50 per business, representing “the labor cost of updating certain website information to comply with the proposed regulations”;
- There is no significant, statewide adverse economic impact directly affecting businesses, including ability to compete: “The Agency has made an initial determination that the proposed action will not have a significant, statewide adverse economic impact directly affecting businesses, including the ability of California businesses to compete with businesses in other states”;
- Conclusions from the Economic Impact Assessment (EIA) include “(1) unlikely that the proposal will create or eliminate jobs within the state, (2) unlikely that the proposal will create new businesses or eliminate existing businesses within the state, (3) unlikely that the proposal will result in the expansion of businesses currently doing business within the state”;
- Business report requirement: Section 7102 requires businesses collecting large amounts of personal information to annually compile and disclose certain metrics. The CPPA proposes to amend section 7102 to require these businesses to additionally disclose information about requests to correct and requests to limit.
- Small business determination: The Agency has determined that the proposed action affects small businesses.
CONSIDERATION OF ALTERNATIVES
The CPPA determined that the proposed regulations are the most effective way to operationalize the CPRA amendments to the CCPA. The CPPA considered both a more stringent and a less stringent regulatory alternative. Interestingly, the less stringent alternative would, among other things, allow a limited exemption for GDPR-compliant firms, with the exemption limited to areas where the GDPR and CCPA conform in both standards and enforcement, subject to auditing as needed. The CPPA rejected this alternative because of key differences between the GDPR and CCPA.
Inquiries concerning the proposed administrative action may be directed to:
California Privacy Protection Agency
Attn: Brian Soublet
2101 Arena Blvd., Sacramento, CA 95834
California Privacy Protection Agency
Attn: Von Chitambira
2101 Arena Blvd., Sacramento, CA 95834
Significant State Law Developments in Maine, New York, and Massachusetts
By Evan Enzer
Northeastern states are making major changes to their privacy laws. Maine enacted a narrow bill, New York significantly amended one of its pending comprehensive privacy proposals, and the Massachusetts House shelved its flagship privacy bill.
A. MAINE
Maine’s governor recently signed the Maine Data Collection Protection Act (MDCPA). Perhaps unexpectedly, the MDCPA is a relatively narrow law compared to many other state proposals.
The heart of the act prohibits “data collectors” from aggregating, selling, or using some documents to determine “a consumer’s eligibility for consumer credit, employment or residential housing.” In short, the MDCPA only allows data collectors to consider records that are part of a “court case or government action or investigation” when they indicate wrongdoing or liability on the consumer’s part. For example, a data collector may not consider documents resolved in the consumer’s favor, documents that do not allege wrongdoing on the consumer’s part, or documents resolved by the consumer’s agreement, as opposed to an adverse judgment. Data collectors also may not use an eviction judgment against a consumer if the court issued the ruling during the COVID pandemic.
Additionally, the MDCPA requires “data collectors” to obtain a license before operating a business in the state. The statute defines data collectors as (1) “A person that collects or attempts to collect data, directly or indirectly, from publicly maintained records and sells that data to 3rd parties for any purpose,” (2) any person that obtains data from such a person, and (3) any person that uses public records to “determine an individual’s eligibility for consumer credit, employment or residential housing.”
Consumers can enforce the act through individual lawsuits as well as class actions. In such a suit, the data collector is liable for actual and punitive damages as the court sees fit. However, the data collector can avoid liability if it shows that the violation was not intentional and resulted from a “bona fide error” that occurred despite taking reasonable precautions. Similarly, the court may award attorney’s fees and other costs to the defendant if it finds the plaintiff brought the suit in bad faith.
B. NEW YORK
Down the East Coast, New York’s Senate made significant amendments to its marquee privacy proposal, the New York Privacy Act. The changes make the bill significantly more business friendly.
Where the previous bill required covered entities to obtain consumers’ opt-in consent, the new proposal uses an opt-out framework in most cases. Now, covered entities need to collect opt-in consent only for “sensitive data,” such as geolocation, biometric identifiers, and some demographic information.
The amendments may be bittersweet for consumer privacy advocates. While they watered down the proposal’s opt-in framework, they may also represent a compromise, suggesting a renewed push to enact comprehensive privacy law in the state.
C. MASSACHUSETTS
After an amendment process that significantly rewrote the original bill, the Massachusetts Information Privacy and Security Act (MIPSA), the state’s high-profile comprehensive privacy proposal, is unlikely to pass this session. Reports show that the Joint Committee on Health Care Financing sent the House bill to “study,” a designation that effectively shelves the bill. The Senate version remains under consideration, so it is still technically possible that MIPSA could advance this session.
While there are clearly still disagreements about how to best approach privacy statutes on the East Coast, the Northeastern states continue to develop their laws at a quick pace.
FTC Sets Robust Data Privacy Protection Agenda
By Kewa Jiang
In May 2022, Georgetown University law professor Alvaro Bedoya was confirmed to the fifth and last open commissioner’s seat on the Federal Trade Commission (FTC). With the appointment of Commissioner Bedoya, a Democrat, the Democratic appointees on the Commission now have a 3-2 majority. Consequently, many analysts observe that FTC Chair Lina Khan is clear to push forward a very robust data privacy protection agenda. In interviews, Chair Khan noted how increased consumer use of online platforms during the COVID-19 pandemic has led to a wider “commercial surveillance” industry. Below are some highlights of the FTC’s forthcoming data privacy agenda.
RENEWED FOCUS ON CHILDREN’S ONLINE PRIVACY AND ADVERTISING
On May 19, 2022, the FTC released a policy statement reaffirming the agency's commitment to protecting children's online privacy, especially in this era of virtual learning. The policy statement warns educational technology companies that the agency "intends to scrutinize compliance with the full breadth of the substantive prohibitions and requirements of the COPPA Rule and statutory language." In particular, the FTC will focus on the prohibition against mandatory collection, along with use prohibitions, retention prohibitions, and security requirements. The statement also emphasizes that the onus of protecting children's data privacy should fall on businesses, not on schools and parents. Additionally, young students should not have to surrender their data privacy and submit to online surveillance in order to do online schoolwork.
Another area of focus for the FTC is child-directed stealth advertising. Children may respond differently to advertisements and endorsements than adults do, especially given the blurred lines between content and advertisement on social media. The FTC will host a virtual event on October 19, 2022 titled "Protecting Kids from Stealth Advertising in Digital Media." The event will bring together experts from fields such as law, child development, and consumer advocacy to discuss children's capacity to distinguish advertisements and ways to protect children from stealth digital marketing.
DECEPTIVE DIGITAL ADVERTISING AND DARK PATTERNS
The FTC's increased scrutiny of children's digital advertising is part of its broader effort to combat deceptive digital advertising. One component of that effort is modernizing its guide for businesses titled ".com Disclosures: How to Make Effective Disclosures in Digital Advertising." The guide provides resources about advertising and marketing in digital spaces, but the agency is concerned that some businesses have been improperly invoking the guide as justification for deploying dark patterns and digital deception in advertising. The FTC is accepting public comments on updates to the guide through August 2, 2022.
Modernization of the guide continues the FTC's increased efforts to crack down on illegal dark patterns that deceive consumers. For instance, in November 2021 the agency released an enforcement policy statement on the use of illegal dark patterns to trick consumers into subscription services, targeting practices such as automatic renewal subscriptions, continuity plans, free-to-pay conversions, and pre-notification plans. The statement warned businesses that they must provide clear, up-front information about the services provided and their fees, obtain consumers' express informed consent, and make cancellation easy.
HARMFUL USE OF ARTIFICIAL INTELLIGENCE TECHNOLOGY AND ALGORITHMS
In June 2022, the FTC also issued a report to Congress warning against relying on artificial intelligence (AI) as a policy solution to online harms. The report notes that AI is already used by most major tech companies and that adopting AI tools as policy solutions may introduce additional harms. The report highlights inherent design flaws and inaccuracies in AI tools, bias and discrimination, and commercial surveillance incentives. The agency recommended that lawmakers focus on "developing legal frameworks that would ensure AI tools do not cause additional harm" rather than attempting to use AI to solve online harms.
In response to the concerns raised about biases in AI and algorithmic tools, the FTC is considering initiating rulemaking to "curb lax security practices, limit privacy abuses, and ensure that algorithmic decision-making does not result in unlawful discrimination." Regulation of AI and algorithmic tools is supported by several U.S. Senators, including Elizabeth Warren (D-Massachusetts) and Bernie Sanders (I-Vermont), who sent a letter to Chair Khan in June 2022 expressing their support. The letter highlights the Senators' concern about how such tools may discriminate against marginalized and minority communities. Two of the Senators who signed the letter, Ron Wyden (D-Oregon) and Cory Booker (D-New Jersey), along with Representative Yvette Clarke (D-New York), also introduced the Algorithmic Accountability Act in Congress in February 2022. The Act aims to create more transparency and oversight in the use of AI and algorithmic tools.
Besides consumer data privacy, the FTC also has its eyes set on enforcing antitrust laws against numerous big tech companies. Given all the FTC intends to achieve, there is sure to be more news and developments to come as we enter the second half of the year.
My Body, My Data Act: Reclaiming Control Over Reproductive and Sexual Health Data
By Kewa Jiang
On June 24, 2022, the Supreme Court officially overturned Roe v. Wade and Planned Parenthood v. Casey. Even before the decision was announced, alarms were raised about how health information collected by mobile apps and tech companies might be abused to penalize women for seeking or considering an abortion. On June 16, 2022, Representative Sara Jacobs (D-California), Senator Mazie Hirono (D-Hawaii), and Senator Ron Wyden (D-Oregon) responded to such concerns by introducing a federal privacy bill, the My Body, My Data Act ("the Act").
The Act is meant to bridge the gaps in data privacy protection left by the current patchwork of federal privacy laws. For instance, the Health Insurance Portability and Accountability Act (HIPAA) applies only to covered entities and their business associates, which leaves many mHealth apps uncovered. At the state level, the Act affirms that state laws providing greater protections are not preempted by its provisions.
SPOTLIGHTING PROVISIONS OF THE ACT
The Act will apply to "regulated entities," defined as "any entity (to the extent such an entity is engaged in activities in or affecting commerce…" A regulated entity does not, however, include a "covered entity" or "business associate" under the HIPAA privacy regulations; those entities would continue to be regulated by HIPAA. The Act defines "personal reproductive or sexual health information" to include personal information related to the past, present, or future reproductive or sexual health of an individual. This includes efforts to research or obtain reproductive or sexual health services or supplies, whether an individual is sexually active, the ability to conceive a pregnancy, ovulation, menstruation, and reproductive and sexual health related surgeries or procedures, such as termination of a pregnancy.
The Act limits the collection, retention, use, or disclosure of personal reproductive or sexual health information to what is strictly necessary to provide the service, unless the user gives express consent. The Act also specifically states that regulated entities must limit their own employees' and service providers' access to users' personal reproductive or sexual health information to what is necessary to provide a product or service.
RIGHT OF ACCESS AND DELETION
The Act requires the regulated entity to create reasonable mechanisms to allow users the ability to request deletion of and access to any personal reproductive or sexual health information retained by the entity. The information provided to the user must be in both “human-readable format” and “machine-readable format.” Users may access or delete information that includes:
- Information the regulated entity collected from third parties, including how and from which specific third parties it was collected;
- Information the regulated entity inferred about the user; and
- A list of the specific third parties to which the regulated entity disclosed any personal reproductive or sexual health information.
FTC ENFORCEMENT AND PRIVATE RIGHT OF ACTION
The Act delegates the power to promulgate rules and enforce regulations to the Federal Trade Commission (FTC). The FTC's enforcement powers under the Act derive from the unfair or deceptive acts or practices provisions of the Federal Trade Commission Act.
Another important provision is that the Act supplies plaintiffs with an injury-in-fact: it states that an alleged violation involving a plaintiff's "personal reproductive or sexual health information constitutes a concrete and particularized injury in fact…" This is crucial given the difficulty plaintiffs have faced in establishing standing in data breach and privacy lawsuits. Under the Act, courts may award plaintiffs no less than $100 and no more than $1,000 per violation per day, or actual damages, whichever is greater. Additionally, plaintiffs may be awarded punitive damages, reasonable attorney's fees and litigation costs, and other relief, such as equitable or declaratory relief.
Today the fight for reproductive and sexual freedom and access to healthcare no longer ends at physical bodily autonomy; it extends to digital autonomy and control over one's own data. The My Body, My Data Act would give users an opportunity to reclaim control over their reproductive and sexual health data, but much remains to be seen as the bill winds its way through Congress.