Antitrust and Unfair Competition Law

Competition: Spring 2019, Vol 29, No. 1

SOCIAL MEDIA, RIGHT TO PRIVACY AND THE CALIFORNIA CONSUMER PRIVACY ACT

By Dominique-Chantale Alepin1

I. INTRODUCTION

By the spring of 2018, media and consumer outrage over social media companies’ misuse of personal information had reached a fever pitch. On March 17, 2018, three news organizations, including the New York Times, published stories revealing that Cambridge Analytica had harvested personal data from millions of people’s Facebook profiles without their consent and used it for political purposes.2 It was a watershed moment in the public understanding of how much personal data was being stored on social media, and of the use and misuse of that data.

In California that spring, the group "Californians for Consumer Privacy" had been putting together a sweeping privacy initiative to place on the ballot for California voters. Given the mounting concern over the use and misuse of personal data, the initiative quickly garnered 629,000 signatures, nearly twice the minimum required to appear on the statewide ballot in the November 2018 election.3

As it collected signatures, Californians for Consumer Privacy told California lawmakers that it would remove its initiative from the ballot in exchange for the passage and signing of a reasonable privacy bill by June 28, 2018. The California legislature was under immense pressure to meet this demand and pass legislation, because a privacy law enacted through the ballot process could prove unworkable both for industry and for consumers. Once a ballot initiative passes and is enacted, it cannot be amended by the state legislature; instead, any amendments generally must be made through further initiatives. Practically speaking, that makes it very difficult to amend ballot initiatives once they are voted into law. And for a privacy law, where things are always in flux, being unable to amend the law would be impracticable.

On June 21, 2018, Californians for Consumer Privacy and the California Legislature struck a deal: in exchange for withdrawing the initiative, the state legislature would pass an agreed version of the California Consumer Privacy Act ("CCPA").4 The initiative was withdrawn and on June 28, 2018, the CCPA was signed into law by Governor Jerry Brown.

The CCPA appears to be a consumer success story—citizens pressured their government to enact privacy laws that would help protect them from the misuse of their data by social media and other companies. But the story does not end there.

First, the CCPA has already been amended once, and its implementation has been delayed until January 1, 2020, to allow for further amendments and to give the California Attorney General the opportunity to promulgate rules and regulations interpreting the Act’s various provisions. As the discussion below reflects, certain provisions of the CCPA rely on very broad definitions and may have unintended effects on products and applications, market power, and other market dynamics.

Second, the CCPA does not address all uses and misuses of personal data by social media companies—there is still more work to do. Greater transparency for consumers about the workings of social media companies, including the collection and use of personal data, would help remedy other identified problems with social media platforms.

PANELISTS

The panelists below discuss those issues and more:

  • Jennifer Lynch, Electronic Frontier Foundation. Jennifer Lynch is the Surveillance Litigation Director with the Electronic Frontier Foundation, where she works to protect user privacy and civil liberties at both the federal and state level. While at EFF, Jennifer founded the EFF Street Level Surveillance Project, which informs advocates, defense attorneys, and decision makers about potentially invasive police tools.
  • Tracy Shapiro, DLA Piper LLP. Tracy Shapiro is a Partner with DLA Piper LLP. Her practice focuses on privacy, data security, advertising, and marketing practices. She regularly represents and counsels social media companies on privacy issues. Tracy also worked at the Federal Trade Commission, in the Division of Privacy and Identity Protection and the Division of Advertising Practices. Following her time at the FTC, she worked at Yahoo!, advising in-house clients on privacy, advertising, and marketing issues.
  • Moderators—Dominique Alepin and Elaine Call. Dominique Alepin is the Assistant Director for the Western Region of the Federal Trade Commission. Elaine Call is Senior Privacy Counsel at LinkedIn.

II. PANEL DISCUSSION

MS. CALL: Four or five years ago, when I told people that I was a privacy lawyer, I would get blank stares. I think that has definitely changed today. You cannot open a newspaper without reading something about privacy, for better or worse, or about the latest breach; it seems to be in the headlines truly daily. Ms. Shapiro, what do you see as the most important or challenging issue that social media has with respect to privacy?

MS. SHAPIRO: I think that, as we saw with Facebook and Cambridge Analytica, it would have to be the ways social media companies are providing access to consumers’ information. Social media companies have a good grasp on what they collect and how they are using the information, but once they start making data available, for example via APIs, to third-party applications, or to other data providers, then there is a risk that they start to lose control over who is obtaining, collecting, and using that data.

MS. ALEPIN: Ms. Lynch, coming from a consumer perspective, what do you see as the biggest privacy issues arising from social media?

MS. LYNCH: There are a few notable ones. First, we do not know what companies are doing with our data. We do not know what they are collecting about us or with whom they are sharing it. And we have very little control as consumers over what happens to the data. Companies can track us all across the Internet—whether we are reading an article in the New York Times, searching for a wedding dress, or sharing information with our friends. Companies are collecting a vast amount of information.

More troubling perhaps, companies like Facebook are collecting biometric data as well, including facial recognition data. Most people do not realize that those companies are collecting biometric data. And if people do realize it, they do not understand the implications of it.

Going back to the issues revealed by the Cambridge Analytica incident: sensitive personal data collected by some companies is being shared with third parties, and it is very difficult to track where the data is going. With Cambridge Analytica, not only did Facebook users have little control over what data was shared when they interacted with Cambridge Analytica directly, but those users’ friends had no knowledge that their data was being shared too. These issues go beyond the consumer’s direct relationship with a social media company.

MS. ALEPIN: You mentioned biometric data. Recently, Microsoft president Brad Smith called for federal legislation on facial recognition.5 He urged tech companies to exercise more responsibility when implementing facial recognition technology, in order to preserve privacy and control over personal information. As social media companies begin to collect this data and implement these technologies, do you see a similar need for increased transparency and caution? And is legislation the only way to achieve this?

MS. LYNCH: I definitely think we need more transparency and caution, and I am at the point where I think that legislation is the only way to achieve that. A few years ago, EFF was involved in a working group process at the NTIA, the National Telecommunications and Information Administration. The working group brought together a number of consumer groups and biometric companies and tried to come up with meaningful regulations for the use of face recognition. But the process was flawed from the beginning, and not because the federal agency intended it that way: the agency genuinely tried to involve consumer and advocacy groups. But consumer groups are small; we have small budgets, and we cannot be involved in every meeting. Social media companies and biometric associations are large, and they can fund people to attend all of these meetings.

What the consumer groups tried to do in those meetings was to set a baseline. We asked companies just to sign on to say that they would not allow tracking or collection of facial recognition data in public without a consumer’s opt-in consent. The companies were not even willing to go that far, even though we knew that people around the country were concerned about the collection of facial recognition data and really did not want companies holding onto this data.

And so all the consumer advocacy groups could do at that point was walk out. We could no longer be part of a process where we could not have a meaningful voice. That is why I think legislation is, at this point, really the only way we can see change. We are starting to see that happen with the Illinois Biometric Information Privacy Act (BIPA), passed in 2008,6 which requires opt-in consent from consumers before a company can collect and share their biometric data. BIPA is also unique because, while three states have biometric privacy statutes on the books, only BIPA allows for a private right of action. Legal questions surrounding that law are starting to be litigated, both in Illinois and in California,7 and there are ongoing attempts in Illinois to roll back the law’s protections. I hope that law can be a model for future legislation.

MS. ALEPIN: In LabMD v. Federal Trade Commission,8 the Eleventh Circuit vacated the FTC’s order because its requirement of "reasonable" data protection measures was, in the court’s view, too vague to be enforceable.9 In essence, regulators and private plaintiffs have been relying on companies to implement "reasonable" data measures. Ms. Shapiro, is there a need, as articulated by the Eleventh Circuit, to be more prescriptive or specific about what "reasonable" data protections mean and what companies should or should not do?

MS. SHAPIRO: Well, I think being more prescriptive is a very big challenge. The FTC has given a lot of guidance as to what it means by "reasonable" data security, both in its consent decrees, where it has laid out the elements of what a reasonable security program looks like, and in its business guidance. Having said that, from counseling a lot of companies, I sympathize that data security is challenging to navigate. You get a report back from your outside consultant listing many things that you need to fix immediately. But you don’t have the resources, and it can be hard to figure out what to prioritize: which are the real security threats, and what will the FTC really care about?

If Congress were to implement a law that spells out what "reasonable security" means, those requirements would likely be outdated within six months. And I think companies would be more frustrated by an overly prescriptive law that tells them what their data security programs need to look like than by a broad standard that gives them the flexibility to keep up with the times.

MS. CALL: We are definitely seeing some legislative activity around biometric information and facial recognition technologies in the US. Two other states, Washington and Texas, have followed Illinois in passing their own biometric data protection statutes.

Recently, Europe passed its General Data Protection Regulation (GDPR),10 which became effective on May 25, 2018. It specifically calls out biometric information as a sensitive category of data that merits increased privacy and security guardrails. Generally, consumers in Europe will need to opt in before companies can use that information, similar to BIPA in Illinois.

Ms. Lynch, can you explain the differences between an opt-in and opt-out regime, and do you think that opt-in schemes are necessary to protect consumers?

MS. LYNCH: I think that is an important point. The difference between an opt-in and an opt-out regime is that in an opt-in system, Facebook has to come to me and ask whether it would be okay to collect my personal information, and I have to reply in the affirmative. Facebook has to do that for each specific thing covered by the regulation, whether it is personal information collected under the GDPR or specific biometric information collected under BIPA.

In an opt-out scheme, Facebook is already collecting my facial recognition data, and I have to hunt through the privacy settings on my Facebook account to find what used to be called "tag suggestions" and uncheck that. It was very confusing for consumers—the setting was not even called "face recognition," and consumers were never prompted to opt out.

I think the key is that there needs to be meaningful opt-in. It cannot just be hidden somewhere in the terms of service. It needs to be a brand-new pop-up that tells the consumer the company is collecting more information, describes what that information is, and obtains the consumer’s meaningful consent to its collection.

MS. CALL: Ms. Shapiro, can you provide a brief overview of the protections afforded consumers under the CCPA?

MS. SHAPIRO: The CCPA was introduced and enacted in the span of a couple of weeks earlier this summer, which is rather remarkable. It was recently amended, and it goes into effect on January 1, 2020. The CCPA gives consumers several basic rights: (1) the right to know what a company’s data practices are, including what information it collects about consumers;11 (2) the right to opt out of the sale of their personal information;12 (3) the right to access certain data and have it deleted;13 and (4) the right to receive full service from companies at an equal price even if they exercise those privacy rights.14

So first, the right to know carries a disclosure obligation. A company will need to disclose what personal information it collects, how it will use that information, what types of personal information will be shared with third parties, what categories of third parties those are, and what rights consumers have to access their data.15

Second, as to the right to opt out of the sale of personal information,16 most companies are relieved at first because they think they do not "sell" personal information. But the way "sale" is defined by the CCPA is very broad: it is any disclosure of personal information to a third party in exchange for valuable consideration.17 Companies will need to inventory all the third parties they work with, including all their service providers, and then do a legal analysis of what they receive from those third parties to see whether they are receiving valuable consideration in exchange for the disclosure.

Third, on access rights, the CCPA gives consumers the ability to learn what categories of data have been collected and sold about them.18 It also gives consumers the right to the specific pieces of information that a company has collected about them. They can find out the sources of that information, how the company will use it, and the third parties with whom it has been shared.

Fourth, the CCPA requires companies to dispose of a consumer’s personal information upon request,19 though there are numerous exceptions. For example, if the company’s use of the data is consistent with the context in which it was collected, and consistent with consumer expectations, then the company does not have to delete the data. There are other exceptions as well, such as preventing fraud, maintaining data security, conducting research, and exercising free speech.20

Finally, there is the right to receive equal service and price even if you opt out of the sale of your data or request access to it.21 Companies can, however, offer different prices or quality of goods and services if the difference is "reasonably related to the value provided to the consumer by the consumer’s data."22

MS. CALL: What is the scope of application of the CCPA? Does the CCPA govern the personal data of all Californians? How will it affect businesses outside of California?

MS. SHAPIRO: Businesses are concerned that the CCPA is in fact a national law because most online companies are dealing with at least one California consumer who comes to their website or uses their app.

There are a few prerequisites for the CCPA to apply to a business. First, the business must be for-profit; nonprofits are excluded.23 Second, the business must be collecting California residents’ personal information and must be doing business in California; it is unclear at this point what "doing business in California" means. Finally, one of three conditions has to be met: (1) the business has gross annual revenues of over $25 million; (2) it buys, receives, or sells, alone or in combination, the personal information of 50,000 or more California consumers per year; or (3) it derives 50% or more of its annual revenue from selling consumers’ personal information.24 These criteria make it easy for even a small app developer or a local California newspaper to fall under the CCPA.
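[Editor’s note: To make these thresholds concrete, the applicability screen Ms. Shapiro describes can be sketched in a few lines of code. This is a minimal, hypothetical illustration only; every name below is invented, and statutory terms such as "doing business in California" call for legal judgment that a boolean flag cannot capture.]

```python
# Hypothetical sketch of the CCPA applicability screen described above.
# All names are illustrative; none come from the statute itself.

def ccpa_may_apply(
    is_for_profit: bool,             # prerequisite: nonprofits are excluded
    does_business_in_ca: bool,       # prerequisite: statutory meaning still unclear
    collects_ca_residents_pi: bool,  # prerequisite: collects CA personal information
    gross_annual_revenue: float,     # in dollars
    ca_consumer_records_per_year: int,
    revenue_share_from_selling_pi: float,  # fraction from 0.0 to 1.0
) -> bool:
    """Rough screen for whether a business may fall under the CCPA."""
    if not (is_for_profit and does_business_in_ca and collects_ca_residents_pi):
        return False
    return (
        gross_annual_revenue > 25_000_000           # (1) revenue threshold
        or ca_consumer_records_per_year >= 50_000   # (2) volume threshold
        or revenue_share_from_selling_pi >= 0.5     # (3) revenue-share threshold
    )

# Example: a small ad-supported app with modest revenue but many CA users
print(ccpa_may_apply(True, True, True, 2_000_000, 75_000, 0.1))  # True
```

As the example shows, a business well under the revenue threshold can still qualify on user volume alone, which is the point Ms. Shapiro makes about small apps and local newspapers.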

MS. CALL: When does the CCPA go into effect?

MS. SHAPIRO: It will go into effect on January 1, 2020.25 But the law was just amended in September 2018, and there has been an extension on enforcement. It is very likely that it will not be enforced until July of 2020.

Also, the California Attorney General has been tasked with promulgating rules under the law, with a deadline of July 2020.26 Enforcement starts six months after those rules come into effect, or in July 2020, whichever comes first.

MS. CALL: And it was amended once. Do we anticipate additional changes being made?

MS. SHAPIRO: We do. This law is a mess. It has sentences that are not complete—they just cut off at the end. It references sections that do not exist, and many portions contradict each other.

So the legislature has indicated that they would spend some time cleaning it up. They started with a first round of amendments. There is still a lot of lobbying happening in Sacramento by businesses, and it is expected that when they go back into session in January, the legislature will look at further amendments.

MS. ALEPIN: Ms. Lynch, the CCPA is the first piece of legislation here in the U.S. geared toward developing consumer rights and addressing data privacy. What are its greatest strengths and weaknesses?

MS. LYNCH: I fully agree that the law is a mess. But looking at the history of the law, there’s an explanation for that. The CCPA was originally put on the ballot as part of the initiative process with more than 600,000 signatures. The initiative process in California is a particularly challenging way to enact a law, because it is extremely difficult to change an initiative once it is on the books. Nobody in California really wanted this to be on the ballot as an initiative because privacy is complicated, as we all know. That is why we are all fully employed as privacy attorneys. Ballot initiatives should be straightforward and simple—they should not contain long provisions that can only be amended by passing additional initiatives.

The reason this was written and passed through the legislature so quickly—in less than two weeks—was that, if it had not been passed before the end of June, it was going to be on the ballot as an initiative, Californians were going to vote on it, and they would likely have passed it. That is part of the reason why it is such a big mess.

I think that there are areas where the law can be improved. For example, the CCPA prohibits businesses from charging more for privacy protections, but businesses can offer incentives and payments to people who waive their privacy rights,27 which in essence means that people who do not waive their privacy rights pay more, because they do not receive those incentives.

Another shortcoming is that there is no private right of action except in the case of data breaches.28 The CCPA is otherwise to be enforced by the California Attorney General. But the right of consumers to sue over a data breach already existed in California. So you could never get lawsuits like those we have seen under the Illinois biometric privacy law, which are important for vindicating consumer rights.

The CCPA also requires no user consent for data collection, which is a problem, and the same issue arises for data sales. And while the right-to-know provision is quite nice, in that we can learn that certain, very specific data is collected about us, we can only find out about certain categories of data. For example, if my social media company is sharing my data with Bank of America, I can learn that it is sharing my data with banks, but not necessarily with Bank of America.

MS. CALL: Ms. Shapiro, as you counsel clients across industries, what do you see as being the challenges companies may encounter with the CCPA?

MS. SHAPIRO: One is the incredibly broad definition of personal information.29 It is very similar to the GDPR in that respect. You are not just talking about names, addresses, and phone numbers, but also persistent identifiers and IP addresses. The CCPA may be quite burdensome, especially when it comes to things like access and deletion rights. We have seen companies struggle with this in the context of the GDPR: when a company gets an access or deletion request from a customer, it may have only that customer’s IP address, and it is incredibly challenging to figure out how to authenticate them.

Then there is the risk that, if all the consumer provides is an IP address, the company could end up turning over sensitive personal information to a different individual: for example, your roommate requests information using the IP address you share in your apartment. I think that is a concern for many people.

Another issue is the one Ms. Lynch raised. You can incentivize users to opt in, but the incentive you provide needs to be reasonably related to the value of their data. You need to assess what the data is worth, and you can only charge that amount. But this seems impractical. For example, let’s say I shared with an analytics provider that you clicked on these three links and entered these search terms. How would you estimate the value of that? I think that will be really challenging.

Many of my clients are tech companies that became successful by offering both a freemium model and a premium model. By offering a free tier supported by ads, they attracted a large base of users, because those users did not have to commit to anything. Eventually, the service became popular, and people started to pay a monthly fee for it. That path may not be possible under this law, with its restrictions requiring that everyone be treated equally and offered the same price.

MS. CALL: Going back to the EU’s GDPR: I think that in the last two weeks before this law was signed by the governor, much was imported or borrowed from the GDPR and incorporated into the CCPA, including data subject rights. Are there similarities between the GDPR and the CCPA?

MS. SHAPIRO: Definitely. There are many similarities, including the broad definition of personally identifiable information (PII). Traditionally, the U.S. has had a more limited definition of PII.

One big difference, which I think Ms. Lynch touched on, is that the GDPR focuses on the processing of personal information, so it sets regulations around the collection of information, including your use of it. In contrast, the CCPA is really about the disclosure of information, so its regulations are set around the disclosure and sharing of personal information.

Under the GDPR, you need a legal basis to collect and use data, whether that is consent, performance of a contract, compliance with a legal obligation, or a legitimate interest. The CCPA does not take that approach. You are still free to collect and use whatever data you want, as long as that is disclosed in your privacy policy.

In terms of disclosures to service providers, on the GDPR side, for those who have done GDPR compliance, it is quite a burdensome undertaking to get data protection agreements in place for all of your service providers. The CCPA is less onerous: you do not need a full data protection agreement in place; you just need to make sure there is a contract provision saying that the service provider cannot make secondary use or disclosure of the data.

On access and deletion, the two laws are similar, though the GDPR is actually a bit more burdensome on access and likely requires the identification of specific third parties, which, as Ms. Lynch noted, the CCPA lacks.

There is one area where the CCPA is more onerous. Under the GDPR, as long as you have a legal basis for collecting the data, you have a legal basis for sharing the data. The exception is where your legal basis is consent; then the user has the right to withdraw consent at any time, so there is essentially an opt-out there. But if you are relying on legitimate interests, users do not have the ability to opt out of the sharing of their data under the GDPR. The CCPA does give users that opt-out, so in that respect the CCPA is a bit tougher.

MS. ALEPIN: Ms. Lynch, we talked about how, under the GDPR, a personal data breach is defined as a breach of security leading to the accidental or unlawful destruction, loss, alteration, or unauthorized disclosure of personal data. This would include both accidental and deliberate causes. Take, for example, Cambridge Analytica and the Russian hackers. In your mind, should data breaches be limited to traditional notions of cyberattacks resulting in personal data being lost or stolen, or should that definition be expanded? Would that provide necessary rights to consumers?

MS. LYNCH: One of the things we saw right after Cambridge Analytica was that a lot of people were really upset: Facebook had known about this sharing of data with Cambridge Analytica, and about Cambridge Analytica’s misuse of the data, for a year or two by the time the story made its way into the press, and Facebook never told any of the users whose data had been shared and co-opted by Cambridge Analytica.

Those writing about the situation said it was a data breach and that Facebook should have told users. But Facebook contends it was not a data breach, because it was not as if Facebook was hacked—Facebook knew exactly what was going on. It knew that Cambridge Analytica and other companies had access to the data; it just did not feel it had an obligation to tell its customers. And the incident did not meet the criteria for a data breach under California’s data breach law or any of the other 49 states’ data breach laws, so there was no obligation to tell users.

I wonder where we would be if Facebook had just come out and said that companies had access to people’s personal data, that it was sorry, and that it would never let it happen again.

They did not, so here we are. So should we change the law to cover situations like Cambridge Analytica? I think that if companies are not going to be proactive about telling their customers about this kind of misuse of their data, then perhaps we need to change the law.

MS. CALL: Ms. Shapiro, your thoughts.

MS. SHAPIRO: I do think it makes sense for there to be customer notification when there is unauthorized access to personal information. But perhaps it makes sense for these laws to include an element of harm analysis, so that not every access of personal information becomes an issue. As I have seen with my clients, there are accidental disclosures all the time that pose minimal risk of harm. For example, a business-to-business ("B2B") company sends a document to one customer instead of another; the mistake is immediately flagged, and the file is deleted. Requiring customer notification for incidents like that strikes me as unnecessary, and not particularly useful, because the user will immediately want to know, "What type of information?" I think some harm consideration would make sense for a breach provision.

MS. ALEPIN: Ms. Lynch, we talked before about some specific privacy issues that arise in the social media context. We talked about users not knowing what data is being collected about them and where that information is going. How does the CCPA address those issues?

MS. LYNCH: The CCPA covers some of those issues. As Ms. Shapiro said, the CCPA has a broad definition of personal information, so it can cover things like biometric data. But I think we will really see over the next year to eighteen months how the law is amended.

And we will see whether the CCPA will be preempted by federal legislation. There is a big push right now to enact some form of federal privacy legislation, and a number of members of Congress have proposed privacy bills. Companies are certainly pushing for federal legislation so that it preempts the CCPA. Preemption also makes sense if you are a company operating in all 50 states or around the world. From a consumer perspective, however, the fear is that preemptive federal legislation could provide weaker privacy protections than what we have been able to secure in California and Illinois.

MS. CALL: Turning to some of the operational challenges with the right to delete—also known in Europe as the right to be forgotten, which is codified in the EU’s GDPR. This comes up when, for example, I post a picture of myself and someone reposts it, maybe with a comment that I do not appreciate, and I reach out to the company asking that it delete the reposting.

For companies, this process can be complicated and require a lot of engineering work and resources. Large companies that deal with millions of users on their platforms are looking for ways to create self-serve tools where users can do this on their own, including accessing their information and downloading it in an acceptable, easy-to-read format. I think the struggle there is not only the actual technical work of deleting the content, but also the balancing that needs to take place with respect to the reposter’s First Amendment rights.

MS. LYNCH: I think the right to be forgotten or the right to delete would be extremely challenging to implement in the United States because we have the First Amendment. The First Amendment not only protects your right to speak, but it also protects your right to hear and see information. Especially when that information is a matter of public concern.

So in the EU, if there is data about a person that is embarrassing, the person can go to the data custodian and request that the information be taken down. Under the way the GDPR is structured, the company needs to erase the personal data without undue delay. Now there are exceptions where the company is not required to erase the data. But it is very difficult for, say, Google to figure out what information should be left up and what should be taken down.

What we have already seen in countries that do not have the First Amendment or a right to free speech, but do have this right to be forgotten, is that the people making requests to delete information are people with a lot of money and power, whether politicians or wealthy people with oil interests. They are trying to hide matters concerning the environment or their past that we in the US would consider matters of public concern.

The other thing we have seen in the EU is that people are able to ask for truthful information about them to be taken down. So, perhaps you were arrested in your 20s, and now you are in your 50s and trying to run for office. That arrest record is there because you were actually arrested; that file is somewhere within a government agency. What EU citizens have been requesting in legal proceedings is not for the information to be purged from government records, but for search engines like Google to "de-list" it—meaning remove a user’s ability to find the information unless you know the exact website and web page it is on.

We have also seen the EU try to enforce this right globally, which has raised issues in the US because of the conflict with the First Amendment. I think the right to be deleted or forgotten is one aspect of the GDPR that will not be implemented in the US.

MS. ALEPIN: Ms. Shapiro, you mentioned there has been quite an outcry from Silicon Valley and tech companies about the CCPA. Can you discuss the reaction of tech companies and the newfound focus on federal legislation?

MS. SHAPIRO: I think there has been a lot of concern with the law on behalf of startups, about their ability to monetize their products. To get off the ground, startups rely on advertising in their apps or products, or on offering free tools supported by analytics. For products offered for free, consumers allow some use of their data to perform analytics. But if you cannot ask for data to perform the analytics, it is very possible that you cannot offer your product for free, and therefore cannot use that business model to start your company.

Also, compliance obligations can be a barrier to entry for many small companies. We see this with the GDPR all the time: many of the clients I work with had been operating in the EU, and with the advent of the GDPR they decided to exit and work only in the US. They did not have the resources to undertake GDPR compliance.

I think many companies will not have the option to just pull out of California. Compliance with CCPA could be prohibitive in terms of companies getting into the market. That raises concerns that the CCPA could hurt competition because it would favor large Silicon Valley giants who have enormous teams of privacy lawyers.

Another concern, not just about the CCPA, relates to the patchwork of state legislation that has started to be pieced together. In 2003, California enacted a data breach notification law, and a number of other states followed suit. It is possible that, when all is said and done, we will end up with 47 or so different state privacy laws that are not always consistent, which will make compliance really hard. That issue has led a number of companies to push for federal privacy legislation.

MS. ALEPIN: Ms. Lynch, the EFF has called for greater transparency from social media companies about their practices, including disclosure of their algorithms and ad practices. For example, because social media companies have the capability of targeting specific classes of consumers with specific ads, there have been concerns that housing ads or financial aid ads can be deliberately targeted to exclude subsets of the population.

Generally speaking, when it comes to social media, is greater transparency about companies’ practices what is needed?

MS. LYNCH: I think greater transparency would help all around. You mentioned housing and financial services. Within the last year, the Department of Housing and Urban Development (HUD) filed a Fair Housing Act complaint against Facebook because of discrimination in the way the company’s algorithm determined who saw ads.30 Facebook’s ad platform allowed advertisers to display housing ads only to men or only to women, to hide ads from Facebook users interested in disability assistance, to hide ads from users interested in childcare or parenting, and to hide ads from users interested in certain countries. Most of us would not think of Facebook as an employment or housing agency, but that is where most people see ads for jobs and housing these days. There are specific laws at the state and federal levels prohibiting age, disability, and gender discrimination in employment and housing. Yet employers and landlords can place ads on Facebook using exactly those criteria.

Even more than that, Facebook’s algorithm can take an ad that was not targeted in a discriminatory fashion, learn from its analytics that only white men aged 18 to 30 were looking at the ad, and then start showing the ad only to white men aged 18 to 30. So even if an employer or financial institution tries not to discriminate, the way Facebook’s ad-targeting algorithm works can have a discriminatory effect. If there were more transparency from Facebook about how it targets ads, that would really help advertisers and law enforcement determine whether there are problems.

On another front, researchers have tried to dig into Facebook’s algorithms to understand its ad targeting and to figure out the social and psychological effects on society, but Facebook has really pushed back. Facebook has claimed such research is a form of computer hacking or a violation of its terms of service, and it has threatened legal action against researchers who are really just trying to study this important platform. Going forward, I think it will be important to protect the researchers who are trying to figure out how companies are collecting and using data and targeting ads, and how social media is impacting our society.

MS. ALEPIN: How can we as consumers demand more transparency from social media platforms?

MS. LYNCH: We have to push our legislators to make changes. That is what happened with the CCPA. As I said, more than 600,000 people signed the initiative petition. When people are concerned, they can really effect change.


Notes:

1. Dominique Alepin is the Assistant Director for the Western Region of the Federal Trade Commission. Special thanks to Joshua Le, University of California, Berkeley Law School (Boalt), Class of 2019, for his work drafting and editing this article.

2. Matthew Rosenberg, Nicholas Confessore & Carole Cadwalladr, "How Trump Consultants Exploited the Facebook Data of Millions," The New York Times (March 17, 2018), https://www.nytimes.com/2018/03/17/us/politics/cambridge-analytica-trump-campaign.html; Emma Graham-Harrison & Carole Cadwalladr, "Revealed: 50 million Facebook profiles harvested for Cambridge Analytica in major data breach," The Guardian (March 17, 2018), https://www.theguardian.com/news/2018/mar/17/cambridge-analytica-facebook-influence-us-election.

3. Nicholas Confessore, "The Unlikely Activists Who Took on Silicon Valley—and Won," The New York Times (August 14, 2018), https://www.nytimes.com/2018/08/14/magazine/facebook-google-privacy-data.html.

4. California Consumer Privacy Act of 2018 ("CCPA"), Cal. Civ. Code §§ 1798.100 et seq. (2018).

5. Brad Smith, "Facial recognition: It’s time for action," Microsoft blog (December 6, 2018), https://blogs.microsoft.com/on-the-issues/2018/12/06/facial-recognition-its-time-for-action/.

6. 740 ILCS 14/1, et seq.

7. See In re Facebook Biometric Information Privacy Litigation, 185 F. Supp. 3d 1155 (N.D. Cal. 2016) (holding the Illinois law applies and the plaintiffs had stated a claim under BIPA); Rosenbach v. Six Flags Entertainment Corp., No. 123186, 2019 WL 323902 (Ill. Jan. 25, 2019) (holding that a plaintiff does not need to have suffered damages in order to recover for violations of BIPA).

8. LabMD, Inc. v. Federal Trade Commission, 894 F.3d 1211 (11th Cir. 2018). The FTC had accused LabMD of failing to maintain "basic" data security practices, which resulted in the unauthorized disclosure of over 9,000 individuals’ personal information. As a result, the FTC determined that LabMD had engaged in "unfair" business practices under the FTC Act and entered an order mandating that the company maintain a "reasonable" data-security program.

9. Id. at 1236.

10. Regulation (EU) 2016/679 (April 27, 2016).

11. Cal. Civ. Code § 1798.100.

12. Cal. Civ. Code § 1798.120.

13. Cal. Civ. Code § 1798.105.

14. Cal. Civ. Code § 1798.125.

15. See Cal. Civ. Code §§ 1798.100, 1798.110, 1798.115, 1798.130.

16. Cal. Civ. Code § 1798.120.

17. Cal. Civ. Code § 1798.140(t)(1).

18. See Cal. Civ. Code §§ 1798.110, 1798.115.

19. Cal. Civ. Code § 1798.105.

20. Cal. Civ. Code § 1798.105(d).

21. Cal. Civ. Code § 1798.125(a)(1).

22. Cal. Civ. Code § 1798.125(a)(2).

23. Cal. Civ. Code § 1798.140(c).

24. Cal. Civ. Code § 1798.140(c)(1)(A)-(C).

25. Cal. Civ. Code § 1798.198.

26. Cal. Civ. Code § 1798.185.

27. See Cal. Civ. Code § 1798.125(b)(1).

28. See Cal. Civ. Code § 1798.150(a)-(c).

29. Cal. Civ. Code § 1798.140(o)(1).

30. Assistant Secretary for Fair Housing & Equal Opportunity v. Facebook, Inc., No. 18-085 (U.S. Dep’t of Housing and Urban Development Aug. 17, 2018) (Housing Discrimination Complaint).
