Antitrust and Unfair Competition Law

Competition: Spring 2015, Vol. 24, No. 1

Content

NOWHERE TO RUN, NOWHERE TO HIDE: IN THE AGE OF BIG DATA IS DATA SECURITY POSSIBLE AND CAN THE ENFORCEMENT AGENCIES AND PRIVATE LITIGATION ENSURE YOUR ONLINE INFORMATION REMAINS SAFE AND PRIVATE? A ROUNDTABLE

Moderated by Niall E. Lynch1

We have a distinguished group of panelists to discuss the very topical issue of privacy, computer security and data breaches. It is certainly interesting and encouraging to hear Judge Kathryn Mickle Werdegar, Justice of the California Supreme Court, say that privacy law will be a major legal issue in the next decade in California and the United States.

The title of today’s program is "Nowhere to Run, Nowhere to Hide: In the Age of Big Data Is Data Security Possible and Can the Enforcement Agencies and Private Litigation Ensure Your Online Information Remains Safe and Private?" Every day we open the newspaper there are stories about data breaches. We are constantly being told to change our password, which is inconvenient. Apparently the number one password used in the United States is the word "password." In this country, we obviously have a problem with privacy and keeping our data secure.

We have a great list of panelists for this discussion who come from a variety of backgrounds, all of whom have experience in the privacy area. Our illustrious panel consists of:

  • Laura Berger is an attorney in the Division of Privacy and Identity Protection at the Federal Trade Commission ("FTC"). She enforces federal laws that protect consumer privacy. Recently her law-enforcement work has focused on the privacy and security standards applicable to social media and the Internet of Things. She also has worked on the agency’s efforts to educate app developers about privacy, including the recent guide "Marketing Your Mobile App: Get It Right from the Start." In addition, she was the author of the Commission’s Safeguards Rule. She works in the FTC regional office in San Francisco. The FTC does a substantial amount of work in this area and has a useful website with all sorts of tools and resources. The FTC has filed cases against every single major Internet company or computer company in the last couple of years.
  • Adam Miller is from the California State Attorney General’s office. Adam has worked for the California Attorney General’s office in San Francisco since 1997. He is the inaugural supervising Deputy Attorney General for the Privacy Enforcement and Protection Unit, which was created in 2012. From 1997 until 2001, he worked in the licensing section, where he prosecuted hundreds of vocational licensees for professional misconduct. From 2001 through 2012, he worked in the Antitrust Law Section, where he investigated and prosecuted mergers and anticompetitive conduct involving markets such as computer software (Microsoft), hardware (flat panels), search advertising, oil and gas, and film exhibition. Adam, too, comes from an enforcement agency that has been very, very active in this area, and they have a host of resources on their website.

[Page 178]
  • Ara Jabagchourian is a partner at Cotchett, Pitre & McCarthy, where he has litigated and tried cases in numerous areas. A class action he tried was selected as one of the top verdicts by impact by the Daily Journal, and one of the Top 100 verdicts in the United States by the National Law Journal. He has been selected as a finalist for the Trial Lawyer of the Year of the Consumer Attorneys of California in 2011 and 2012. Ara was formerly a staff attorney with the Federal Trade Commission’s Bureau of Competition in Washington, D.C. Ara has had experience in private litigation in the privacy area, and he’ll talk a little bit about that.
  • Jim Snell is a partner at Perkins Coie. Jim represents and counsels clients on a wide range of complex commercial matters including privacy and security, Internet and marketing and intellectual property litigation matters. Jim’s extensive experience includes counseling and defense of class action and other litigation relating to privacy policies, terms of use, behavioral advertising, data collection and use, Telephone Consumer Protection Act ("TCPA"), call recording statutes, commercial e-mails, spyware and adware, data breach investigations and responses, data security, social media and computer crime statutes, including many different industries.

Niall E. Lynch, a partner in Latham & Watkins’s Antitrust and Competition Practice Group, moderated this discussion. This discussion builds upon the recent panel discussion at the Antitrust Section’s 2014 Golden State Institute in San Francisco. Mr. Lynch moderated the combined panel discussion and was joined by Ms. Berger and Messrs. Miller, Jabagchourian, and Snell.

Moderator: Let me begin with Ms. Berger. What has been happening at the FTC in terms of enforcement in the area of data security, privacy, and data breaches?

Ms. Berger: Well, I want to say how much I appreciate the chance to be here and just, of course, say that while I’m going to be speaking as a staff person and not on behalf of the Commission or any particular commissioner, I’m excited to talk to all of you about what the Commission has been up to lately in the area of privacy and data security.

With an audience like this — and I’m always excited to talk to an audience that is not primarily focused, in their professional careers, on the issues that I am immersed in — I’d love to see if folks are willing at this late hour to do a show of hands. How many of you have a professional focus on privacy or data security?

[No hands raised]

[Page 179]

That’s amazing. A show of fatigue, or the actual audience makeup.

So let me just cover a few basics. As many of you may know from your other work, our primary authority is Section 5 of the FTC Act, which prohibits unfair or deceptive acts or practices.2 And we use this tool a lot in the privacy and data security context. As Niall pointed out, we’ve been extremely active. Privacy and data security are for everybody. They are relevant even to companies that you represent that may not be technology companies or may not be on the cutting edge of developing technologies.

So, we’ve continued to focus on enforcing the deceptive or unfair standard, but we’ve started to apply that concept to lots of new technology. If time permits, I’ll talk about a few of these cases.

With deception, we’re looking at the truthfulness of what you say to consumers about your privacy or data security practices, which may entail evaluating whether your privacy settings live up to what they are supposed to do or may entail looking at a document like a privacy policy. Adam will tell you that you absolutely have to have a privacy policy here in California if you collect personally identifiable information.

But even if you think: "You know what? We don’t have to have a privacy policy and we don’t make any promises about privacy," chances are, you still say something to consumers about privacy. We look at a website and we look at the signs that companies post in their storefronts and we look at settings and at other communications with consumers. And we may think your statements give consumers reasonable expectations about privacy, even if those statements don’t take the form of a privacy policy.

We have been applying these principles regarding deception or unfair conduct to some key areas, which I’ll highlight for you. So, this is what we’ve been up to lately.

One of those areas, of course, is Big Data. And there, as some of you may know, we issued a report on the activities of data brokers in May. I won’t have time to get into the details of that report, but it recommended, among other things, that Congress consider legislation in this area so that consumers can be more aware of data brokers’ activities and possibly exercise some control over those activities.

But Big Data has also been a key area for us in law enforcement. And these cases will highlight another law that we enforce, the Fair Credit Reporting Act ("FCRA").3 There are cases where, depending on what a data broker is doing, if it is doing something that bears on your eligibility for employment, a credit decision, or other FCRA-protected activity, it is going to be subject to the requirements of that law.

Moderator: Can you define "data broker"?

Ms. Berger: A data broker for easy reference is a company that’s in the business of collecting or compiling information about consumers. And if you are engaged in FCRA-covered activities, you’re going to be subject to the specific requirements of that law.

[Page 180]

Recently, we have alleged that a couple of companies that tell merchants whether or not to cash a consumer’s check, based on their past financial transactions, failed to live up to their obligation under the FCRA to make sure the information they are using is accurate.

By comparison, you might not need accurate information to make a decision about marketing. But if you’re making a decision that is covered by the FCRA, you need to take steps to maintain the accuracy of the information you are using. So, we alleged that, in the check cashing context, these companies were not living up to their accuracy obligations. We also had four recent cases in the employment context, where we alleged that data brokers who sold information to employers, to help them make hiring decisions, weren’t living up to their obligations to make sure that the information they provided was accurate and to make sure that the people they provided it to had the right kind of purpose, known as a "permissible purpose," to acquire it.

Moderator: So they passed on false information, and someone didn’t get a job as a result?

Ms. Berger: That’s exactly the type of thing that can happen. The FCRA requires companies that are engaged in the business of providing information that is used or expected to be used for certain purposes — like employment screening or making decisions about credit or insurance — to have procedures to maintain its accuracy and to fulfill other obligations under the law.

In addition to Big Data, we’ve also been very active in the area of mobile and connected devices. We’ve been working in the mobile space for quite a while, looking at the types of disclosures, privacy disclosures, that are effective over a mobile device. You may want to look at our updated guidance on online disclosures, the Dot.Com Disclosures, which now has screen shots showing many examples of mobile disclosures.

We’ve also been active in law enforcement in that area as well, bringing actions against app developers like Snapchat who make promises that they don’t live up to — in that case, promises about the disappearing nature of photos and videos sent through the app.

And then, of course, we’ve had our first case in the Internet of Things area. The Internet of Things is what we talk about when there is a device that maybe traditionally wasn’t connected to the Internet but now is connected, like the surveillance cameras at issue in the case of TRENDnet. This increased connectivity can have significant privacy implications for consumers, whose information could be at risk of being disclosed more broadly than they thought. In TRENDnet, we alleged that an IP camera maker failed to take reasonable steps to make sure that the code they used to operate their camera was secure against hackers. And as a result, hackers could access the video feeds of cameras inside consumers’ homes and observe people going about their daily activities or observe children who were being monitored by the cameras.

The third key area I’d like to highlight is, of course, our safeguards program. We’re continuing to really focus on that area. We have well over fifty cases where we allege companies were deceptive about their data security practices, or that the practices were unfair because they caused or were likely to cause substantial injury to consumers, which consumers could not reasonably avoid and which was not outweighed by countervailing benefits to consumers or to competition.

[Page 181]

And many of those cases have involved more sensitive data, like financial or health information; but, you know, other cases have simply been based on the fact that companies failed to live up to their promises to provide at least reasonable data security.

Moderator: I’m struck by the stories I read of the Chinese, North Korean, or Russian governments actively trying to hack our files and so forth. I am sympathetic to the companies. And given all the tech companies that have been charged with violations, the question is: Is the law being enforced too broadly? What do you think of that? And also, what is the remedy typically imposed by the FTC?

Ms. Berger: Niall has raised a really important question. How much data security is enough that you’re not going to be the subject of law enforcement? If you are subject to a really insidious attack that nobody has ever thought of and it’s really on the cutting edge, that is not likely going to be the basis for an action by us. We look at many, many data breaches where we see no reason to believe that the company failed to maintain reasonable data security, particularly in circumstances where the attack would have been — even with a very, very high level of data security — difficult to avoid.

If you look at the cases that we have brought, they are often cases where we allege that the company had multiple failures to defend against what we refer to as "widely-known vulnerabilities" by using tools that are often commonly available, or relatively low cost.

Under Section 5, we have only equitable remedies at our disposal.4 In the privacy and data security context, our orders frequently impose an injunction against misrepresentations and a requirement to implement a comprehensive privacy or data security plan that will be audited by a third party. There may also be other appropriate injunctive relief — for example, regarding the use of cookies or the updating of security software. It just depends. Of course, there could be equitable monetary remedies if consumers suffer losses or there is unjust enrichment or something of that nature. Once a company is subject to an order, we may seek civil penalties for any violations of that order.

And under the FCRA, which I also mentioned, there are civil penalties, as there are under other statutes, like the kids’ privacy law, known as COPPA (Children’s Online Privacy Protection Act).

Moderator: Thank you, Laura. It sounds like the FTC has been very busy. I do recommend that everyone go to the website. It not only lists the cases they file but also best practices that companies can look to and read to make sure they have their own adequate data protections.

Let’s now turn to state enforcement. The State of California is as busy as, if not busier than, the FTC and seems to have more tools available to prosecute some of these cases. Adam, maybe you can talk a little bit about what the state agency has been up to.

Mr. Miller: I also want to say my remarks are my own and do not represent the Office of the Attorney General. The Privacy Enforcement and Protection Unit is still a fairly new unit in the State Attorney General’s Office. We’re dedicated to protecting

[Page 182]

consumer privacy. It’s one of a small but growing number of such dedicated units across the country. And our basic tool, which a lot of you are familiar with, is the unfair competition law.

We can use that law to get equitable remedies and penalties, significant penalties, against companies that violate state or federal privacy laws.

One of the strongest tools at our disposal is the California Online Privacy Protection Act, Business and Professions Code sections 22575-22579 ("CalOPPA").5 That’s a provision some people have already referred to and probably know: if you go to a website and see a privacy policy link at the bottom, that’s probably a result of that law, which was passed in 2003. And it was the first, and I think the only, law that specifically requires you to have a privacy policy. What it requires is that if you are a commercial operator and you collect personally identifiable information — which is defined in the statute and includes such things as first and last name, home or physical address, e-mail address, telephone number, and Social Security number — from the users of your website or online service, you have to have a privacy policy.

And I agree with what Niall said, though I have to qualify it: even if you don’t collect that information, it’s probably still good policy to have a privacy policy. You never know. If you have some third-party code on your website, you might be collecting more information than you think.

So one way we use this tool is we prosecuted Delta Airlines for their mobile app. And they had a mobile app called "Fly Delta" which at the time had somewhat limited functionality. You could go look at schedules; you could do a certain amount of stuff. And it had features that allowed Delta to track where you were, your geographical location, but there was no privacy policy within the app. And CalOPPA applies to both websites and online services, which is not defined in the statute, but our interpretation is that it includes mobile apps because mobile apps are online. If they are online, they are online services.

While Delta did have a privacy policy on their website, it was specific to the website; it didn’t mention the mobile app at all. And it didn’t say anything about the geographical tracking that was going on when you used Fly Delta.

What’s interesting is even if you weren’t a customer of Delta, you could download the app, even if you didn’t buy a ticket, and it would still track you and collect information from you.

So that complaint was successfully demurred to by Delta. That is up on appeal right now. It’s been briefed. There is no oral argument set yet.

But what’s interesting, if you were here this morning and heard Tom Papageorge, he mentioned a recent California Supreme Court case, Harris v. Pacific Anchor Transportation, 329 P.3d 180 (Cal. 2014).

[Page 183]

The reason the Delta lawsuit was demurred to was federal preemption under the Airline Deregulation Act ("ADA").6 And what’s interesting about Pacific Anchor is that it was decided under the FAAAA (the Federal Aviation Administration Authorization Act of 1994);7 I usually just refer to it as F quadruple A. That is basically a transportation regulation statute, and the cases have basically said that the preemption provision in the Airline Deregulation Act, the ADA, is equivalent to the FAAAA’s. So, in the same way that the California Supreme Court in Pacific Anchor decided that a California state law was not preempted by the FAAAA,8 our argument in Delta would be that CalOPPA is similarly not preempted by the Airline Deregulation Act.

What have we been up to? We’ve only been around for a couple of years. We’ve filed a couple of cases and joined several multistate settlements, including a case that’s probably going to be discussed in a little bit, the Joffe v. Google case involving Google Street View.9 We actually settled that case with Google, together with several other state attorneys general.

We also filed a case against Citibank Online. And this dovetails with what Laura was talking about in terms of what type of hacking or access you go after. We hear all the time that there are foreign actors, state actors, hacking the government and JP Morgan; the federal government reports being hacked. And some businesses say, "We’re the victim," which they are. "Why should we be punished because we suffered some brand-new, innovative hack that no one ever thought of?" And I agree with Laura: if that’s really the case and that’s what the evidence shows, then we would likely not investigate or enforce any kind of case against that company. But the problem is that this is sometimes used as an excuse, and there may be other aspects of how a company designs and implements its network, or its security design, that can facilitate the unusual hack. Sometimes it’s just a simple hack.

For instance, we settled the case against Citibank Online with a stipulated final judgment. They had a very trivial, known exploit: if you had a Citibank credit card, you could log in to the Citibank Online website and access other accounts. What this attacker did was log in with a valid credential; then, in the Uniform Resource Locator ("URL") shown in the address bar at the top of the page, he would see a number of digits. By changing those digits one at a time, he could bring up account information for other individuals.

The hacker, or hackers, were able to access over 300,000 accounts nationwide, over 80,000 of them in California. We concluded there wasn’t adequate security, and we were able to reach a settlement with Citibank Online.
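The exploit described here is what security practitioners call an "insecure direct object reference": the server trusts the account number embedded in the URL without verifying that the logged-in user owns that account. A minimal, hypothetical Python sketch of the flaw and the fix (the data and function names are illustrative, not Citibank’s actual code):

```python
# Hypothetical sketch of the flaw described above: a handler that looks up an
# account purely by the numeric ID taken from the URL, with no check that the
# logged-in user owns that account (an "insecure direct object reference").

# Illustrative data; not real account information.
ACCOUNTS = {
    100001: {"owner": "alice", "balance": 2500},
    100002: {"owner": "bob", "balance": 9100},
}

def view_account_insecure(session_user, account_id):
    # Vulnerable: any authenticated user can read any account simply by
    # changing the digits in the URL, one at a time.
    return ACCOUNTS.get(account_id)

def view_account_fixed(session_user, account_id):
    # Fixed: the server verifies ownership before returning any data.
    account = ACCOUNTS.get(account_id)
    if account is None or account["owner"] != session_user:
        return None
    return account

# "alice" logs in with a valid credential, then increments the account ID:
leaked = view_account_insecure("alice", 100002)  # another customer's data leaks
denied = view_account_fixed("alice", 100002)     # access refused
```

The fix is a single ownership check on the server side; changing the digits in the URL then returns nothing rather than another customer’s account.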

[Page 184]

Another thing that we are looking at — this is not so much an enforcement issue yet — dovetails with what Justice Werdegar mentioned in her remarks. She was talking about privacy, and in particular about tracking activity on the Internet. This is a subject that’s very important to the privacy unit and also to the California legislature. The do-not-track signal is essentially a signal sent by a browser that is supposed to tell a website whether a user does or does not want to be tracked, but it has not been established what that standard is or what the proper response to the signal is supposed to be. The California legislature recently decided to pass a bill to try to do something about do-not-track, AB370.10 It modifies CalOPPA and was effective at the beginning of 2014.

Now, AB370 does not say whether you can or cannot track. As the California Supreme Court noted in Apple Inc. v. Superior Court (Krescent), 292 P.3d 883 (Cal. 2013) — a case involving a purchase of downloadable material over the Internet, and one of the few cases that cites CalOPPA — CalOPPA is merely a disclosure regime.11 CalOPPA doesn’t say you can’t collect personal information. It just says that if you’re going to collect the information, tell the users what you are going to collect. And AB370 builds upon that by saying that if you collect PII, you have to tell the user how your website or online service responds to a do-not-track signal, or to some other mechanism for tracking users over time, because the do-not-track signal addresses only one type of tracking. There are other possible ways of tracking as well now, and more may be developed in the future.

AB370 also requires you to disclose whether third parties might collect personally identifiable information. It is concerned with tracking people across time and across websites, such as by cookies. Another way to comply with AB370 is to link to some kind of third-party protocol that offers users a choice of how to opt out. Not to recommend one program or another, but some examples are the Network Advertising Initiative ("NAI") and the Digital Advertising Alliance ("DAA"). But our best-practice guidance is that you should actually say yes or no: we do, or we do not, comply with a do-not-track signal.
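For reference, the do-not-track preference discussed here travels as a simple HTTP request header named "DNT" ("1" means the user asks not to be tracked, "0" means tracking is permitted). A minimal, hypothetical Python sketch of reading that header (the function and wording are illustrative; AB370 governs only what a site must disclose, not what it must do with the signal):

```python
# Hypothetical sketch: interpreting the "DNT" request header. AB370 does not
# dictate how a site must respond to the signal; it only requires the privacy
# policy to disclose how the site responds.

def describe_dnt(headers):
    """Return a plain-English reading of a request's do-not-track header."""
    value = headers.get("DNT")
    if value == "1":
        return "user asks not to be tracked"
    if value == "0":
        return "user permits tracking"
    return "no preference expressed"

print(describe_dnt({"DNT": "1"}))  # user asks not to be tracked
print(describe_dnt({}))            # no preference expressed
```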

And I also reinforce what Niall said: the California Attorney General was able to import a lot of material and resources from the former Office of Privacy Protection, including Joanne McNabb, who is our privacy unit’s Director of Privacy Education and Policy. A lot of the materials that formerly resided with the Office of Privacy Protection are now on the AG’s website at oag.ca.gov/privacy.

Moderator: Thank you, Adam. I think we’ll look back at this time as the infancy of privacy law. Virtually any company that has an online presence on the Internet, which is basically every company, is going to be subject to some of these rules and laws in some regard. They are data collectors. And I think this is going to be a growth area for those who are interested: for compliance counsel and for creating effective compliance programs.

[Page 185]

It is not overstating it to say that many Internet companies are woefully unprepared for these types of laws and there’s a substantial role for counsel to provide the advice necessary for companies to avoid some of these penalties.

Next I would like to address private litigation. It’s different from the antitrust side, where a government investigation will inevitably trigger a follow-on class action lawsuit. There is active private litigation in the privacy area, but it does not always track government investigations. In terms of private litigation, think of the situation in which you write an e-mail to someone saying you want to buy a pair of shoes, and suddenly a shoe ad pops up on the Internet. That’s the world we’re looking at. It’s the use and misuse of your data by these companies to sell you things that maybe you don’t want to buy. Let me ask Ara to talk about the plaintiff’s perspective. What’s happening in private litigation in the privacy space?

Mr. Jabagchourian: I’m just going to go through some of the statutory schemes and kind of where the issues are. I’m not going through this entire presentation. But ultimately what you’re going to see on these statutory schemes is you’re going to see laws that were enacted twenty, thirty, forty years ago being applied to the Internet age, either through e-mail, through Facebook "Like" clicks, through Hulu video selections. You’re seeing laws that were enacted in the sixties, seventies, and eighties being applied now to the Internet age. And I’m sure we’ll have a lively discussion, I hope we do, as to whether or not that makes a difference or not.

The first one I put up is the Federal Wiretap Act.12 And you’ll see there’s a lot of litigation being brought related to the scanning of the contents of e-mails, and to the scanning of people’s personal Wi-Fi by Google vehicles that go around taking pictures and capturing Wi-Fi data. Issues are arising and actions are being brought under the Federal Wiretap Act, which prohibits the interception of wire, oral, or electronic communications. It was first enacted in 1968 and amended in 1986. And the fight, obviously, is over electronic communications and other issues.

The Federal Wiretap Act sets forth statutory damages of $10,000 per violation.13 And that’s another issue that you’ll see come up in these types of cases: actual damages versus statutory damages, and, in the context of class actions, whether statutory damages violate due process if you’re sticking it to millions and millions of folks.

The next one is the California parallel version of this, the California Invasion of Privacy Act,14 which is in the Penal Code. There are some differences, and one of the differences between the federal and California statutes is that under the Federal Wiretap Act you need the consent of only one party; in California, you need the consent of both parties. So that might explain why, on your phone calls, you hear "this call is being recorded for training and technical purposes." I guess my silence is consent to that. I don’t know what I’m supposed to do.

But that’s the California Invasion of Privacy Act. These actions are usually brought in parallel to one another in the complaint.

[Page 186]

The next one I have here is the Video Privacy Protection Act.15 This came out of Judge Bork’s confirmation hearings, when a video store leaked to the Washington Post what he was watching. And I’m sure Congress got nervous that they might be the guys going behind the special curtained room at the video store, and made sure that law was enacted so their video selections would not be put up in the Washington Post. It was enacted in 1988 under Ronald Reagan, and it is being applied in active litigation against Hulu.

A couple more I’ll go through quickly: the Fair Credit Reporting Act,16 enacted in 1970, and the Telephone Consumer Protection Act,17 enacted in 1991. These also contain statutory damages: $2,500 per hit under the Fair Credit Reporting Act, and $500 under the Telephone Consumer Protection Act.

Let’s get to the beef of this. I apologize for going so slow.

(Addressing reporter): And if I’m going too fast, I apologize there.

The issues that are cropping up in these cases — and I’ll tell you right off the bat, these cases have not been going too well for the plaintiffs — center on consent. Consent comes in two forms: explicit consent and implied consent. Explicit consent is when you press "okay" on that ninety-page form; if you want to use Gmail or Yahoo e-mail, "You’ve read our terms and conditions, click yes." Guess what? You just agreed to something.

The interesting part of some of these e-mail cases was not so much the Google or Gmail users; what was interesting to me were the non-Gmail users. Say you’re a Yahoo user and you send an e-mail to somebody who has a Gmail account. Guess what? Google scans that e-mail. That Yahoo person did not consent to that Google policy, and issues came up there.

The arguments that were raised were: where are you allowed to find implied consent? And apparently, under one decision I just saw by Judge Lucy Koh, implied consent can come from your knowledge from newspapers, your knowledge from talking to friends.18 It’s almost so wide open that I guess the thinking is that if you know from any source that they are scanning your e-mail, you agree to it, which is a bit troublesome from a libertarian standpoint, but I’ll save that argument for later.

Under the Federal Wiretap Act, one argument that is raised consistently is the ordinary-course-of-business exemption. The Wiretap Act indicates that an interception done in the ordinary course of business — I have the language and can give it to you later; I’m not going to waste your time — is not deemed to be a violation of the Act. So the argument that gets raised is: "Look, we’ve always been scanning this; we’ve always been doing this." And you’ll see split decisions in two Google cases. One was a decision from Magistrate Judge Grewal,19

[Page 187]

and one was from Judge Lucy Koh. One was the Gmail case, and one dealt with Google’s universal policy of taking Gmail, Google Maps, and (inaudible) and exchanging information.

The difference was that Judge Koh found that there is a restriction on what ordinary course means; it has a narrow meaning. It can’t be subjective. It can’t be "I do this, or the industry does this, therefore it’s okay."

Judge Grewal took a different perspective. He said, "Look, it’s work." And his analysis was: "Look, this is what they have done, and this is what they have always done, and that’s the ordinary course; sorry, it’s not a violation of the Act."

I don’t know if that issue is going to percolate up, because Judge Koh’s case effectively just got stripped down when class certification was denied. So I don’t know if that issue is going up or not.

Other issues that come up: standing; the issue of damages; the distinction between actual injury and statutory damages. The argument that gets raised is, "Look, you’re getting $10,000 a head, but that’s not your actual damage. What’s the harm to you?" And the response is, "Look, the legislature recognized there was going to be a problem trying to prove damages, and it provided the statutory damages claim to provide Article III standing."

So those arguments always come up. On that argument, plaintiffs have been pretty successful. It's the other issues that they get slaughtered on. I haven't seen it go up to this point, but if somebody gets past summary judgment and keeps going, the question is whether statutory damages of $10,000 or $5,000 a violation are going to cause due process concerns, because on a class-action basis they lead to big damages divorced from any actual injury. And there are California Supreme Court decisions and circuit decisions on whether or not that is permitted.

Moderator: Thank you Ara. Let’s hear from Jim. Jim represents Internet companies and technology companies faced with these issues. Jim, from your perspective, what’s the landscape on civil litigation and also compliance? And do you agree with Ara that the bar is too high for plaintiffs in these cases?

Mr. Snell: Well, first of all, on panels like this, they usually don't sit the plaintiff's lawyer right next to the defense lawyer . . . I hope you don't have to jump between us at some point, Niall.

[Laughter.]

I’d say that from a 10,000-foot level, this is a really fascinating area. Fascinating for folks like me, but also frustrating for clients. As Niall said, we’re just at the beginning, I think, of privacy issues and litigation. Drones, webcams and other technologies are being introduced on a daily basis. There is now a thermal camera available for $300 that you can connect to your iPhone and take pictures of things in the shadows and in the dark. I think we are just beginning to grapple with a lot of these new technology issues.

I think litigation is a pretty blunt instrument for dealing with privacy concerns. And I'll give one example: The California Invasion of Privacy Act has been interpreted to prohibit recording of phone calls without the consent

[Page 188]

of both parties. When the legislature passed that statute in 1967, the year I was born, there were really only landline phones. We didn't have cell phones; we didn't have cloud telephony services. And the legislature addressed "calls recorded for quality assurance," saying it did not intend for the statute to cover such calls, which is pro-consumer. Right? It's something that businesses should be doing.

Well, the exception the legislature added in 1967 to address recording of calls for quality assurance states that if you get your call-recording equipment pursuant to a tariff of a public utility, you're exempt, because at that time the place to get your quality-assurance call-recording equipment was from the public utility. So now plaintiffs' lawyers are applying the statute to today's robust call technology, and every case I'm aware of that I've handled for defendants has been one where the challenged recording was made for a pro-consumer reason — quality assurance. But the argument is that the statutory exception does not apply to today's technology. And plaintiffs' counsel are creative in their allegations. In one instance, the statement "Your call may be monitored for quality assurance" was given on the English-language recording but not on the Spanish-language one; somebody missed that, and a claim may be filed.

The damages under that statute are $5,000 a call. It doesn’t take that many calls to make it a billion dollar case. So this is an area where there’s been a real mushrooming of litigation. But I think one of the things to echo what Ara says is what we see is new technology being applied to old statutes, and that’s going to continue, I think.

There are four basic types of cases that I think we see. First, the alleged misuse of data. What is somebody doing with data? Is it a wiretap concern? Is it a computer fraud and abuse concern? A second type of litigation we see is misrepresentation claims. Companies should be very careful about what they say in their privacy policies. Make sure you do what you say and say what you do. Don't make promises where you don't need to; where you do, make sure they are accurate. Because one of the places where plaintiffs are able to overcome the hurdle of standing is where there's an alleged misrepresentation, and, in fact, that can add up.

Third, alleged statutory violations. There's a host of statutes in this area that give rise to privacy litigation. Under the Telephone Consumer Protection Act, there are probably five or six cases a week filed in this country, with damages of $500 or $1,500 per call or text for violations. The Video Privacy Protection Act is another.

There's a very interesting issue in front of the U.S. Supreme Court right now in a case called Spokeo.20 It's a Fair Credit Reporting Act case, and a lot of amicus briefs have been filed by other parties. The issue is whether a plaintiff who has suffered no concrete harm, but who alleges a violation of a statute with a private right of action, has Article III standing to pursue the case. The Supreme Court is considering that issue right now.

[Page 189]

If the U.S. Supreme Court decides that for standing the plaintiff has to have actual harm, notwithstanding what the statute says, I think there will be fewer plaintiffs who can allege these types of claims.

I think the fourth type of action that we're seeing is data breach and data security cases. In those cases, the trend is that it's pretty hard to establish standing where there's just a fear of how your data might be misused; but where there's actual misuse, you can actually plead a case, although it might be hard to plead a class action in that instance.

Just briefly on the litigation trends: I think privacy litigation is generally increasing, not decreasing. Plaintiffs, in general, are having trouble establishing injury, although there's been some recent case law, including a California Supreme Court case, that lends some credence to injury claims by plaintiffs in class actions.

Arbitration clauses are being upheld, so that's something you see a lot of companies instituting. Where you do that, I think you want to make sure there's a clickbox on the terms of use, or explicit consent from the customer that they agreed to the terms of use, including the arbitration provision.

The decisions in this area are uneven and not uniform. We see some cases finding injury, while on similar facts other courts find there was no injury. Class certification is generally hard to achieve right now, I think. That's a trend we're seeing. But where class certification is achieved, we're seeing pretty high settlements. The highest settlement of a TCPA case, last month, was $53 million. And there was another recent TCPA case where the damages per class member were $1,600, and there were many class members.

Moderator: So let me get this straight. The panelists have described a "parade of horribles" for any company involved in the Internet: theories of law both old and new, some outdated and shoe-horned into conduct that was never anticipated; private litigation; government enforcement. This seems like a really dangerous area to be operating in.

Laura and Adam, what advice do you have for companies on how they can operate in a way that stays within the borders of the law and still is able to function in this current environment?

Ms. Berger: So I think some of the advice has been mentioned already. Jim was talking about "do what you say and say what you do." I think that's really important. Just to follow on something that Ara said about cases where private litigation can be defended by saying, "Oh, but the consumer agreed to this; they clicked a box": that is not necessarily going to prevent an FTC action.

We are looking to see if there's a material claim or omission that's likely to mislead a reasonable consumer. And, as in the Sears case in 2009,21 we have alleged that deception occurred where the relevant information was buried in a lengthy end-user licensing agreement.

[Page 190]

The key, if you're going to try to be truthful with consumers, is to put yourself in their shoes. Think about the consumer on that website. Are you putting the material information in a place where a reasonable consumer is going to see it? If it's a mobile app, you have to think about that pretty creatively. If you're doing a connected device, like an IP camera that has no screen at all, you're going to have to think even more carefully about how to make sure you meet consumer expectations. But I think in the privacy area, that's a really important thing to think about.

For the start-up community, I think a common mistake we see is launching a new site or service without really being ready, without having thought through all these issues. There is a push to get things done and get them out there, but you need to make sure they are going to meet consumers' reasonable expectations of privacy and security.

I think, as has been mentioned, our website, www.business.ftc.gov, is useful if you're looking for information to guide your business. Incidentally, if you're looking for a case list, you can find that, too. On that page there's a tab for privacy and security, and if you go to legal resources you can find both our relevant business education and the case highlights, which are in reverse chronological order.

We’ve been at this a long time, starting with the FCRA and our online privacy and security work since the 1990s, so there is quite a lot of information.

Moderator: Adam?

Mr. Miller: Without going into the full breadth of what privacy might involve, focusing on CalOPPA and breach issues, I would agree with what Laura said. If you’re going to collect someone’s information, you need to tell them what you’re collecting.

I think companies also need to have a better sense of what data they are actually collecting. Sometimes data is stored in multiple storage areas, and I get the sense that companies don't always realize what they are doing, or there's just a button on a website control panel and they decide: "Let's just push the button and collect more data."

In this era of Big Data, where the volume and velocity of data are going to increase, a colleague of mine told me recently that the data being collected in automobiles is going to be terabytes a day. When you have that much data, you really have to think, "Do I really need it?" A lot of companies say you never know, this might come in handy sometime, or we might discover something. And I agree it might come in handy, and certainly for certain public policy reasons that the FTC has pointed out in some of its Big Data reports, it might be useful to know a lot about school children or medical or health issues. For a business, it's really important to know what data it has, make sure the customer knows it is being collected, and keep it safe.

In addition, if you're talking about keeping data safe and about breaches, California passed the first breach law in 2003, and now nearly every state in the country has such a data breach law; the exception is the federal government, which still has none.

So what that means is you’ve got to know what kind of technology you have. And this has already been mentioned, if there’s a patch out there for a known exploit and you don’t do anything about it, that’s going to be a problem.

[Page 191]

Moderator: I would like to hear from each of our panelists: what do you predict will be the top issues or trends in privacy law in the next year or five years? Let's start with Ara Jabagchourian.

Mr. Jabagchourian: I don’t know; I’m no Nostradamus. Maybe I can pick what I would like to see.

With these privacy laws I put up there, I always go back to what was the intent, what was the purpose of the law, to try to understand whether or not it applies to new technology. So even with the California Invasion of Privacy Act, the legislative history included this statement: "Use of such devices and techniques has created a serious threat to the free exercise of personal liberties and cannot be tolerated in a free civilized society."22

So then we take the issue of: all right, obviously they weren't thinking of e-mail or the Internet at that time, but how is this different? How is it different from mail coming in from my mother or a call from my wife? How is the content of an e-mail different?

And the position that's been taken is, well, the ordinary course exception. The ordinary course exception I find a little strange. You can take one of two positions. One is: "Look, our industry has always done it this way"; or, two, "This is the way our company has always done it." If it's the first, that's just not the case. If you look at the inception of Yahoo, AOL, Hotmail, NetZero, they weren't scanning the contents of people's e-mail to do marketing. So let's take the second position: "Look, Google, Gmail has always done this since they started." Well, take this circumstance: I want to open up my own phone line. I want to call AR&J, and I'm going to listen to all your phone calls. Nobody is going to say, "Sorry, ordinary course of business, that's the way Ara has always done it and therefore it's okay."

So I'm hoping that this notion of privacy, the sixties and seventies notion of increasing rights, even outside of any pecuniary interest, the position of being a citizen, is how privacy rights come to be seen. I would really like to see some judges say: "Wait a minute." When eight companies put an open-letter editorial in the New York Times saying, we're against the NSA scanning your e-mails and doing all this, you're violating civil liberties, but it's okay for them to do it for profit,23 I'm taken aback by both. And I'm hoping, and I don't know what the prediction will be, but I'm hoping the circuits start to bring in some of these privacy considerations and come up with a balance. Until legislation is enacted — and I don't know if any is going to be enacted on this. I suspect, given the way decisions are going, Silicon Valley is going to be sending a lot of money to the Beltway to prevent any laws on this issue, since the case law is going its way.

So, again, I don’t know if that’s a prediction or not, but that’s kind of where I hope things go.

Moderator: Jim Snell?

[Page 192]

Mr. Snell: Thank you. A couple of predictions. The Internet of Things is going to be a huge area, I think. This involves interconnected devices collecting, storing and processing various types of data, including video: being able to watch what's going on in your home from your iPhone, mobile devices with cameras, and so on. Workplace productivity, monitoring employees, the wired community. Think about how many times you're staring at a camera in our society. If you're looking at your phone, you're staring at a camera. If you're looking at your computer, most likely you're staring at a camera. iPad, same thing. Throughout my whole day — my neighbor just installed cameras outside his house to prevent theft. His cameras are picking up every time I drive to and from work. I am staring at a camera now, recording this talk. There are multiple cameras at every intersection. So the Internet of Things is going to be a big issue.

Big Data and analytics. I think this is going to be a big issue because it really turns privacy on its head. As somebody was saying earlier today, businesses feel that they need to keep data based on the possibility that there may be analytics, perhaps in the future, that they can run on that data to better serve customers; and if they get rid of the data, their competitors may keep it, and those competitors may be running analytics and serving customers better and being more competitive.

Well, the traditional privacy notion is when you collect data, don’t keep it longer than you need for the purpose it was collected. So the tension between those two concepts, I think, is going to be an interesting area.

Data breaches, I think, are getting more sophisticated. There’s sort of an arms race right now in security technology and the criminals that are breaching it. Issues of government surveillance, I think. Right now, foreign companies are using the Snowden revelations as ways to be competitive with U.S. companies, and those foreign governments in many cases have as many or more surveillance tools than the U.S. government. And finally, just global privacy and security legal issues generally. I think that’s going to be an interesting area.

Moderator: Laura Berger?

Ms. Berger: I already mentioned Big Data, mobile, data security and the Internet of Things. So I would just add that I agree: I think consumer demand for privacy and security of their information is going to continue to go up as awareness of these issues mounts.

Moderator: Adam Miller?

Mr. Miller: I would echo the Internet of Things. Also wearable devices, both medical and nonmedical. And I recently saw some advertisements and some stuff on TV about coding camps, where people are able to learn how to program in a couple of months. As a former programmer (computer science was my major), I'm happy to see this happening, but I'm wondering if they are teaching privacy by design. I'm concerned you're going to end up with software developers who just don't care about privacy issues, and that's going to create a lot of problems.

Moderator: My prediction is that there will be pressure coming from two directions. First, from the people, everyday consumers, concerned about this issue. Although young people may have

[Page 193]

different attitudes about privacy than older people, there will be pressure from consumers to strengthen data security laws. Second, there will be pressure from industry for clearer rules. We currently have a patchwork of rules between the FTC and the states. So I think these two constituencies are going to join together to encourage the creation of a national privacy law. There have been efforts in Congress to push that forward in the past, and they have failed. But the bigger this issue becomes, and the longer it goes unresolved, the greater the demand will be for a national universal standard from the federal government.

I want to thank all of the panelists, and hopefully you found this program informative and enjoyable.

[Page 194]

——–

Notes:

1. Niall E. Lynch is a partner at Latham & Watkins in San Francisco, California. Mr. Lynch is an antitrust lawyer who defends companies and individuals in criminal price fixing cases as well as FTC investigations and class-action litigation. Before joining Latham & Watkins, Mr. Lynch spent fifteen years at the Department of Justice, and was Assistant Chief of the San Francisco Office of the Antitrust Division. This article reflects the views of the authors and not necessarily those of Latham & Watkins, its attorneys, or its clients.

2. 15 U.S.C. § 45.

3. 15 U.S.C. § 1681.

4. 15 U.S.C. § 45(l).

5. Cal. Bus. & Prof. Code §§ 22575-22579.

6. Airline Deregulation Act of 1978, Pub. L. No. 95-504, 92 Stat. 1705.

7. 49 U.S.C. § 40101.

8. Pacific Anchor, 329 P.3d at 188 (Cal. 2014).

9. Joffe v. Google, Inc., 746 F.3d 920 (9th Cir. 2013).

10. Cal. Bus. & Prof. Code § 22575.

11. Krescent, 292 P.3d at 895.

12. 18 U.S.C. §§ 2510-2522.

13. Id. § 2520(c)(2)(B).

14. Cal. Penal Code §§ 630-638.

15. 18 U.S.C. § 2710.

16. 15 U.S.C. § 1681.

17. 47 U.S.C. § 227.

18. In re Google Inc., No. 13-MD-02430-LHK, 2013 WL 5423918 (N.D. Cal. Sept. 26, 2013).

19. In re Google, Inc. Privacy Policy Litig., No. C-12-01382-PSG, 2013 WL 6248499 (N.D. Cal. Dec. 3, 2013).

20. Spokeo, Inc. v. Robins, 135 S.Ct. 23 (2014).

21. In re Sears Holdings Management Corp., No. C-4264 (F.T.C. 2009), available at http://www.ftc.gov/enforcement/cases-proceedings/082-3099/sears-holdings-management-corporation-corporation-matter

22. Cal. Penal Code § 630.

23. See Global Government Surveillance Reform, https://www.reformgovernmentsurveillance.com/ (last visited Feb. 16, 2015).
