Antitrust and Unfair Competition Law
Competition: Spring 2016, Vol 25, No. 1
Contents
- 2015: A Year of Big Plaintiff Wins in Antitrust and Privacy Cases
- Big Stakes Antitrust Trials: O'Bannon v. National Collegiate Athletic Association
- California Antitrust and Unfair Competition Law Update: Procedural Law
- California Antitrust and Unfair Competition Law Update: Substantive Law
- Chair's Column
- Considerations, Not Limitations: An Argument Against Defining the Anticompetitive Harm Under F.T.C. v. Actavis as the "Elimination of the Risk of Potential Competition"
- Editor's Note
- Golden State Institute 25th Anniversary Retrospective and Prospective Views on California Antitrust and Unfair Competition Law
- Keynote Address: A Conversation with the Honorable Tani Cantil-Sakauye, Chief Justice of California
- Managing Antitrust and Complex Business Trials: A View from the Bench
- Masthead
- Royal Printing and the FTAIA
- Settlement Negotiation Tactics, Considerations and Settlement Agreement Provisions in Antitrust and UCL Cases: A Roundtable
- The Decision of the Supreme People's Court in Qihoo v. Tencent and the Rule of Law in China: Seeking Truth From Facts
- The Nexium Trial Pioneers Actavis' Activation: A Roundtable of Nexium's Counsel Reflect on Their Six-Week Trial
- The UCL: Now a Money Back Guarantee?
- FTC Data Security Enforcement: Analyzing the Past, Present, and Future
FTC DATA SECURITY ENFORCEMENT: ANALYZING THE PAST, PRESENT, AND FUTURE
By Crystal N. Skelton1
The "Internet of Things" has recently come to dominate the consumer products market. Businesses are connecting nearly everything to the Internet — from homes and cars to clothing and even yoga mats. The shift from consumers using basic computers to interacting with mobile apps and other connected devices has continued to generate massive amounts of consumer data online. This trend is not expected to end anytime soon.
The U.S. Federal Trade Commission ("FTC") defines the Internet of Things as "the ability of everyday objects to connect to the Internet and to send and receive data," a definition that includes both consumer- and non-consumer-facing devices.2 Some analysts describe it as the "third wave of the Internet," following the fixed Internet wave of the 1990s and the mobile wave in the 2000s.3 DHL and Cisco report that there are 15 billion connected devices in the world today and predict that there will be 50 billion by 2020.4 By that time, computers (including PCs, tablets, and smartphones) are expected to represent only 17 percent of all Internet connections, while the other 83 percent will result from the Internet of Things, including wearables and smart-home devices.5 Intel’s estimates are even more generous, forecasting that more than 200 billion devices will be connected by 2020.6
The collection of vast amounts of consumer data, however, does not always go hand in hand with increased efforts to protect the security of such data online. High-profile data breaches and security lapses draw increased scrutiny from consumers, lawmakers, and federal and state regulators. The data security regulatory environment is in constant flux, with regulators and legislators alike proposing varying frameworks to protect personal information online. Nonetheless, there is currently no comprehensive federal privacy or data security law in the United States, as only sector-specific laws are present at the federal level. Meanwhile, forty-seven states and the District of Columbia have separate laws governing data security breach notification, while only a handful of states have implemented data security requirements applicable to any entity collecting information about residents of their state.7 Businesses are thus subject to a patchwork of statutory and regulatory data security-related requirements, which creates a complex environment for entities with a national or regional presence.
[Page 305]
In an era where federal law appears to trail behind technology, the FTC’s robust body of data security enforcement actions and business guidance can provide entities with a basic understanding of the types of minimum security controls that should be in place. This article will explore the current legal landscape on data security, the FTC’s legal authority to regulate data security practices, efforts to increase enforcement, and predictions for the future.
I. FTC’S DATA SECURITY ENFORCEMENT AUTHORITY
The FTC is the most active government enforcer with respect to business compliance with data security obligations. Its primary authority comes from Section 5(a) of the Federal Trade Commission Act ("FTC Act"), which prohibits unfair or deceptive acts or practices in or affecting commerce.8 Although the FTC Act does not expressly grant the FTC authority to regulate data security, the FTC has interpreted its Section 5(a) authority to reach data security practices, an interpretation affirmed by at least one federal appellate court.9 To date, the FTC has brought more than 55 actions against entities for alleged data security lapses.
When the Commission has "reason to believe" that a statutory violation has occurred (usually as a result of an investigation into the company’s practices), the FTC may issue a draft complaint setting forth its charges. If the parties are unable to resolve the matter through settlement, the FTC may pursue alleged Section 5(a) violations either in federal court pursuant to Section 13(b),10 or in its administrative capacity under Section 5(b).11 The processes differ; most significantly, the FTC cannot obtain monetary relief in an administrative proceeding (without further proceedings in a federal district court).12 Under Section 13(b), the Commission may not only seek preliminary and permanent injunctions to halt unfair and deceptive practices, but may also freeze assets, appoint receivers, obtain disgorgement of profits associated with the challenged conduct, and seek restitution and other relief to redress injury. In both federal and administrative actions, the FTC may also assert violations of other consumer protection statutes enforced by the Commission. A respondent may elect either to settle or to contest the charges brought by the FTC.13
[Page 306]
A Commission order generally becomes final sixty days after it is served on a respondent, unless the order is stayed by the Commission or by a reviewing court.14 If a respondent violates a term of its order, it can be liable for a civil penalty of up to $16,000 for each violation, which the FTC can construe as each day of noncompliance.15 In that case, a court may also issue mandatory injunctions and further equitable relief as is deemed appropriate.16
A. Unfair and Deceptive Acts and Practices in Data Security
The FTC’s primary focus under Section 5 relates to whether: (1) an entity misrepresented its privacy or security practices or the privacy or security controls of a product (a "deception" claim), or (2) failed to implement or maintain "reasonable" and "appropriate" controls to secure sensitive personal information in a way that causes or is likely to cause substantial consumer injury that is neither (a) outweighed by benefits to consumers or competition nor (b) reasonably avoidable by consumers (an "unfairness" claim).17
The FTC will consider an act or practice to be deceptive "if there is a representation, omission, or practice that is likely to mislead the consumer acting reasonably in the circumstances, to the consumer’s detriment."18 Certain claims, such as express claims, are presumed to be material.19 Even implied claims can be deceptive. In the data security context, the FTC generally finds an act deceptive if an entity makes materially misleading statements (e.g., in a privacy policy, or other public-facing materials) or deceptive omissions of material facts concerning its security measures and how it would handle, protect, or otherwise treat personal information that is inconsistent with the entity’s actions.20
Separately, the FTC will deem an act or practice to be "unfair" if it (1) causes or is likely to cause substantial consumer injury (2) which is not reasonably avoidable by consumers themselves and (3) not outweighed by countervailing benefits to consumers or to competition.21 The FTC’s evaluation for an unfairness claim will generally focus on whether the practices at issue were "reasonable" under the circumstances and in light of industry standards, the cost and ease of having various security controls in place, and the known vulnerabilities of not having such controls.22
[Page 307]
B. Additional FTC Authority to Regulate Data Security Practices
Companies collecting and storing consumer information must also comply with a host of other federal laws targeting the use and collection of certain particularly sensitive information. Unlike the FTC Act, many of these laws provide the Commission with the authority to issue civil penalties for statutory violations.
For instance, the Children’s Online Privacy Protection Act ("COPPA") and FTC’s COPPA Rule are designed to protect online users under the age of 13.23 COPPA applies to an operator of a website or online service directed to children under 13, or an operator that has actual knowledge that its website or online service is collecting personal information from a child. The COPPA Rule requires operators to, among other things, "establish and maintain reasonable procedures to protect the confidentiality, security, and integrity of personal information collected from children."24 An operator may retain personal information collected from a child for only as long as is reasonably necessary to fulfill the purpose for which the information was collected, and must delete such information using reasonable measures to protect against unauthorized access or use of the data in connection with its deletion.25 Violators of the COPPA Rule can face civil penalties of up to $16,000 per violation.26
The Gramm-Leach-Bliley Act ("GLBA") requires financial institutions to protect consumers’ nonpublic personal information, including by preventing disclosure to unauthorized third parties.27 As part of its implementation of the GLBA, the Commission promulgated the Safeguards Rule, which requires subject entities to develop and maintain a comprehensive written information security program that addresses administrative, technical, and physical safeguards appropriate to the business based on its size and complexity, nature and scope of activities, and sensitivity of the personal information collected.28 The Safeguards Rule requires businesses to "[i]dentify reasonably foreseeable internal and external risks to the security, confidentiality and integrity of customer information that could result in [its] unauthorized disclosure," among other requirements.29 A violation of the Safeguards Rule constitutes an unfair or deceptive act or practice in violation of Section 5(a) of the FTC Act.30
[Page 308]
Entities that use or provide consumer reports must also comply with the Fair Credit Reporting Act ("FCRA").31 The FCRA requires consumer reporting agencies to maintain reasonable procedures designed to avoid disclosing consumer information, and imposes safe disposal obligations on entities that maintain or otherwise possess information used in consumer reports.32 Regulations implementing the safe disposal obligations provide that reasonable measures include, but are not limited to, "[i]mplementing and monitoring compliance with policies and procedures that require the destruction or erasure of electronic media containing consumer information so that the information cannot practicably be read or reconstructed."33 The FCRA authorizes the Commission to commence a civil action to recover monetary civil penalties in a district court in the event of a knowing violation when it constitutes a pattern or practice of violations.34
II. CHALLENGING THE FTC’S AUTHORITY TO REGULATE DATA SECURITY PRACTICES
Although the FTC’s statutory authority to prohibit unfair and deceptive acts and practices is quite broad, the FTC Act makes no specific reference to data security or liability for data breaches. Nonetheless, since at least 2002, the FTC has asserted its authority over inadequate data security practices as unfair and/or deceptive in violation of Section 5, whether or not these practices lead to a breach of customer information.35
Only recently has the FTC’s authority to regulate data security practices under Section 5 been challenged. In separate actions filed against Wyndham Worldwide Corporation (in federal court) and LabMD Inc. (as an administrative proceeding), the FTC alleged that the companies’ failures to employ reasonable and appropriate measures to prevent unauthorized access to consumers’ personal information violated the FTC Act. Both Wyndham and LabMD challenged the FTC’s authority to regulate data security practices as "unfair."
A. Wyndham Worldwide Corporation
In June 2012, the FTC filed suit against global hospitality company Wyndham Worldwide Corporation and three of its subsidiaries, Wyndham Hotel Group, LLC, Wyndham Hotels and Resorts, LLC, and Wyndham Hotel Management, Inc. (collectively, "Wyndham").36 The Commission alleged that Wyndham failed to implement reasonable and appropriate data security safeguards for the personal information collected and maintained at its franchise locations. According to the Commission, this allowed computer hackers to breach franchise computer systems and the company’s centralized property management center on three separate occasions between 2008 and 2010. The complaint further alleged that the breaches resulted in more than 619,000 consumer payment card account numbers being compromised, many account numbers being exported to a domain registered in Russia, fraudulent charges on consumers’ accounts, and more than $10.6 million in fraud loss.
[Page 309]
The FTC alleged the following deficiencies in Wyndham’s cybersecurity practices: (i) failure to use "readily available security measures" (such as firewalls) to limit access between the property management systems, the corporate network, and the Internet; (ii) failure to implement adequate information security policies and procedures prior to connecting their local computer networks to the broader Wyndham computer network; (iii) failure to require strong usernames and passcodes to access property management systems; (iv) failure to employ reasonable measures to detect and prevent unauthorized access or to conduct security investigations; (v) failure to follow proper incident response procedures; and (vi) failure to adequately restrict third-party vendor access to its network and company servers. Given the breadth of alleged deficiencies, the FTC also claimed Wyndham’s privacy policy deceptively misrepresented the extent to which the company safeguarded consumer data.
In August 2012, Wyndham filed a motion to dismiss the FTC’s complaint on four grounds.37 First, Wyndham challenged the FTC’s authority to assert an unfairness claim in the data-security context. Second, Wyndham asserted that the FTC must formally promulgate rules or regulations before bringing an unfairness claim and, by failing to do so, the FTC violated fair notice principles. Third, Wyndham argued that the FTC’s allegations were insufficiently pleaded to support either an unfairness or deception claim. Lastly, Wyndham challenged the FTC’s deception claim that Wyndham’s privacy policy misrepresented measures taken by the company to protect consumers’ personal information.
In April 2014, the district court denied the motion, but certified its decision for interlocutory appeal on two key questions: (1) whether the FTC had the authority to regulate data security under the unfairness prong of Section 5(a); and (2) whether Wyndham had fair notice that its specific practices could run afoul of that provision.38 Prior to the decision, no federal court had adjudicated whether the FTC had authority under Section 5(a) to bring actions against companies for allegedly deficient cybersecurity practices.
The U.S. Court of Appeals for the Third Circuit granted interlocutory appeal. On August 24, 2015, the Third Circuit confirmed that the FTC has authority to bring an action focused on a company’s data security practices under the "unfairness" prong, and that the FTC is not required to articulate a specific cybersecurity standard for companies to follow.39
[Page 310]
1. FTC’S Authority to Regulate Data Security Practices as "Unfair"
In challenging the FTC’s authority to bring an unfairness action for allegedly deficient cybersecurity practices, Wyndham advanced a novel theory: the three requirements of an unfairness claim that are codified at 15 U.S.C. § 45(n)—(i) substantial injury, (ii) that is not reasonably avoidable by consumers, and (iii) that is not outweighed by the benefits to consumers or to competition—were "necessary but insufficient conditions" of an unfair practice. Specifically, Wyndham argued that the plain meaning of the word "unfair" imposes independent requirements that were not met by the FTC.40 For example, Wyndham argued that conduct could only be unfair when it injured consumers "through unscrupulous or unethical behavior" or was otherwise "marked by injustice, partiality, or deception."41
In rejecting Wyndham’s arguments, the court opined that the FTC Act contemplated a theory of liability based on tortious negligence. In the court’s view, the FTC Act "expressly contemplates the possibility that conduct could be unfair before actual injury occurs."42 Even if a company’s conduct is not the most proximate cause of an injury, this generally will not immunize the entity from liability if the harm was foreseeable or likely.43 Thus, the Third Circuit confirmed that companies may be liable under an unfairness theory for a reasonably foreseeable data breach, even absent any evidence of actual injury or harm.
In the alternative, Wyndham argued that Congress intended to exclude cybersecurity from the FTC’s unfairness authority by enacting more targeted federal privacy legislation (i.e., FCRA, GLBA, and COPPA). The Third Circuit rejected this theory as well, finding that the various federal privacy laws were enacted to expand the FTC’s authority over corporate cybersecurity, not merely to establish the FTC’s authority in the first instance.
2. "Fair Notice" is Not Required
Wyndham claimed that, notwithstanding whether its conduct was unfair under Section 5(a), the Commission "failed to give fair notice of the specific cybersecurity standards that the company was required to follow."44 Wyndham claimed that the court could not defer to the agency’s interpretation of its own regulations unless private parties had "ascertainable certainty" as to those interpretations. Because the company was not made aware with "ascertainable certainty" of the specific cybersecurity standards on which it would be held accountable, Wyndham asserted that the FTC’s interpretation of what constituted minimum security standards was not entitled to deference.
The court rejected this argument, noting that the Commission was not relying on an agency interpretation, rule, or adjudication of minimum cybersecurity standards under Section 5 of the FTC Act. Rather, no such precedent existed because the FTC had not yet declared that cybersecurity practices could be unfair (i.e., its numerous cybersecurity-related administrative settlements could not be considered precedential). Thus, the Third Circuit found that the company was not entitled to "ascertainable certainty" of the FTC’s interpretation of the specific cybersecurity practices required by the FTC Act. As a result, the relevant question was not "whether Wyndham had fair notice of the FTC’s interpretation of the statute, but whether Wyndham had fair notice of what the statute itself requires."45
[Page 311]
The Third Circuit concluded that the FTC’s previous adjudication and interpretive guidance provided the requisite notice to Wyndham that its actions could be considered "unfair" under the FTC Act. The court reasoned that Wyndham was entitled to a comparatively low level of statutory notice because no constitutional rights were implicated, the statute was civil instead of criminal, and it regulated economic activity.46 Moreover, the cost-benefit analysis provided in Section 5(n) informed Wyndham that it should consider the probability and magnitude of harms to consumers caused by its data security practices and whether these costs outweighed any savings from not employing more secure practices.47 The court noted that Wyndham was hacked three times and, at least after the second attack, it "should have been painfully clear to Wyndham that a court could find its conduct failed the cost-benefit analysis."48 Based on these factors, the court rejected Wyndham’s fair notice claim.
B. LabMD
At the same time that Wyndham was being litigated in federal court, the action against LabMD, Inc., a clinical testing laboratory, proceeded on a parallel track as an administrative adjudication. The LabMD saga began in May 2008, after Tiversa Holding Company (a third-party cybersecurity consultant specializing in peer-to-peer network searches) informed LabMD that a company report containing patient information was publicly available on a peer-to-peer network called LimeWire. The report contained names, dates of birth, Social Security numbers, and health insurance information of approximately 9,300 patients. Tiversa claimed that it had linked this report to four IP addresses associated with known identity thieves.
The FTC commenced an investigation in 2010, shortly after Tiversa informed the FTC of its findings. Following several years of public sparring between LabMD and the FTC, the Commission commenced an administrative enforcement action in August 2013, alleging that LabMD failed to reasonably protect the security of consumers’ personal data.49 The complaint contends that in two separate incidents, LabMD collectively exposed the personal information of approximately 10,000 consumers. The first breach related to the findings by Tiversa. Then, in October 2012, LabMD "day sheets" and a small number of copied checks containing sensitive personal information of at least 500 consumers were allegedly found in the possession of individuals determined to be identity thieves.
[Page 312]
1. Procedural Background
Following the FTC’s administrative complaint, LabMD contested the FTC’s allegations through the administrative process while pursuing a parallel challenge in federal court.50 Namely, LabMD asserted that the administrative action was an improper expansion of FTC jurisdiction, was retaliatory, and violated the Due Process Clause. Although the actions in federal court were eventually dismissed, LabMD continued to pursue its challenge through the administrative process.
LabMD filed a motion to dismiss the administrative complaint in November 2013. The motion alleged that: (1) the FTC lacks Section 5 "unfairness" authority to regulate patient-information data and data security practices; (2) the enforcement action violates LabMD’s due process rights because the Commission has not provided fair notice of the data security standards that it believes Section 5 prohibits or requires; (3) the acts or practices alleged in the complaint do not affect interstate commerce; and (4) the alleged acts and practices have not caused, and are not likely to cause, substantial injury that is not reasonably avoidable by consumers themselves and not outweighed by countervailing benefits to consumers or to competition.51 Under the FTC’s Rules of Practice, the motion was decided by the Commission. The Commission rejected LabMD’s defenses, holding that the statutory prohibition against unfair trade practices in Section 5 could be applied to allegedly unreasonable and injurious data security practices, and declined to dismiss the complaint.52
The administrative proceeding proved lengthy and very contentious. With more than 200 entries on the administrative docket, including numerous discovery motions, motions for sanctions, and motions to dismiss, the three-year battle culminated in a full administrative trial before Chief Administrative Law Judge D. Michael Chappell ("ALJ"). Days after the administrative trial began in May 2014, it was reported that the House Committee on Oversight and Government Reform ("OGR") had begun an investigation of Tiversa regarding the company’s disclosures to the FTC. The OGR intended to call LabMD’s key witness, former Tiversa employee Richard Wallace, to testify in exchange for immunity. It was further disclosed that, if called to testify in the administrative proceedings, Wallace would invoke his constitutional privilege against self-incrimination, pending his effort to obtain a grant of prosecutorial immunity. The action was stayed to allow Wallace to seek prosecutorial immunity for the OGR testimony and for testimony in the administrative proceedings.53
[Page 313]
On May 30, 2014, Wallace testified before OGR that Tiversa had fabricated evidence linking the LabMD report to identity thieves’ IP addresses.54 According to Wallace, Tiversa never found any evidence that anyone other than LabMD or Tiversa had accessed the report. Wallace testified that Tiversa’s business model was to "monetize" documents that it downloaded from peer-to-peer networks, by using those documents to sell data security remediation services to affected entities, and by manipulating Tiversa’s internal database of peer-to-peer network downloads to make it appear that a business’ information had been found at IP addresses belonging to known identity thieves. Furthermore, Wallace testified that Tiversa allegedly fabricated this information and reported it to the FTC after it unsuccessfully solicited business from LabMD.
On December 29, 2014, on the Commission’s motion, and pursuant to authority granted by the Attorney General of the United States on November 14, 2014, the ALJ issued an order granting Mr. Wallace immunity pursuant to Commission Rule 3.39 and directing Mr. Wallace to testify in these proceedings.55 Proceedings reconvened for Mr. Wallace’s testimony on May 5, 2015. Following the testimony, FTC staff chose not to cross-examine Wallace, indicated it would not rely on certain Tiversa-related testimony and evidence in its proposed findings of fact, and opted not to offer any rebuttal to Wallace’s testimony.
2. FTC Failed to Demonstrate Likelihood of Consumer Injury
On November 13, 2015, the ALJ handed down the decision, dismissing the case in its entirety. The decision turned on whether the FTC had demonstrated that the alleged unreasonable conduct caused or is likely to cause substantial injury to consumers, as required by the first prong of the three-part "unfairness" test under Section 5(n).56 The decision explains that the FTC’s evidence failed to prove that the limited exposure of the report resulted or was likely to result in any identity theft-related harm. Moreover, the evidence failed to prove that embarrassment or similar emotional harm is likely to be suffered from the exposure of the report alone. The decision notes that, even if there were proof of such harm, this would constitute only subjective or emotional harm without further proof of other tangible injury. The ALJ determined that this does not amount to a "substantial injury" within the meaning of Section 5(n).
[Page 314]
In addition, with respect to the exposure of "day sheets" and check copies, the decision notes that the FTC failed to prove that the exposure of these documents was causally connected to any failure of LabMD to reasonably protect data maintained on its computer network, because the evidence failed to show that these documents were maintained by, or taken from, LabMD. The decision further explains that the FTC failed to prove that the exposure of day sheets and check copies caused, or was likely to cause, any consumer harm. Notably, the decision states that:
To impose liability for unfair conduct under Section 5(a) of the FTC Act, where there is no proof of actual injury to any consumer, based only on an unspecified and theoretical "risk" of a future data breach and identity theft injury, would require unacceptable speculation and would vitiate the statutory requirement of "likely" substantial consumer injury.57
At best, the ALJ found that the FTC had proven the "possibility" of harm, but not any "probability" or likelihood of harm. Therefore, because the Commission failed to prove its case on the merits, the ALJ dismissed the case.
The ALJ’s decision in LabMD is significant in that it clarifies the scope of the FTC’s authority. By requiring some probability or likelihood of harm for unfairness claims, the decision presents a potentially restricting interpretation of Section 5(a)’s reach into data security practices. Regardless, the Third Circuit’s decision in Wyndham, finding that the FTC has regulatory authority in this arena, is not disturbed.58 Moreover, the FTC staff has appealed the ALJ’s decision before the full Commission, so the LabMD saga likely will continue.59
III. FTC DATA SECURITY ENFORCEMENT AND TRENDS
The FTC is empowered under Section 5(a) to ensure that business representations and practices are not deceptive and/or unfair. In recent years, the FTC has focused its data security enforcement efforts in two primary areas. First, the FTC has reviewed the extent to which a business protects the personal information collected and shared so that it is not misused by the business, third-party affiliates, service providers, or by hackers. Second, the FTC has scrutinized public statements made concerning how such personal information is protected.
[Page 315]
The FTC has found fault with a number of common business practices for placing customer information at risk of misuse, including but not limited to:
- Permitting common or weak passwords to the company’s systems and databases (including using a system administrative default log-in and password);
- Failing to implement internal measures appropriate under the circumstances to protect sensitive consumer information;
- Failing to sufficiently train employees on how to protect the confidentiality and privacy of customer information;
- Retaining customer information without restriction (i.e., keeping data longer than necessary) when there is no legitimate business reason to maintain the data;
- Allowing for inadequate data disposal practices for both electronic and hard copy data containing sensitive customer information;
- Failing to secure mobile and other devices with sensitive customer information, including laptops, PDAs, mobile phones, and flash drives;
- Failing to protect against Structured Query Language (SQL) injection attacks on electronic databases (a basic mitigation is sketched after this list);
- Storing sensitive information in clear, readable, unencrypted text that could be accessed through commonly-known log-ins and passwords (both during transmission and while the data is at rest);
- Failing to regularly monitor and control connections to the company’s network, including via wireless connections;
- Failing to employ proper means to detect unauthorized access to sensitive personal information; and
- Failing to regularly perform security investigations or audits (or ignoring the results).
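Two of the practices above, exposure to SQL injection and storage of credentials in readable text, lend themselves to a brief illustration. The following Python sketch is purely illustrative and is not drawn from any FTC complaint or order; the table layout, function names, and parameters are hypothetical. It shows two of the kinds of low-cost controls the FTC has treated as readily available: parameterized queries, which keep user input out of the SQL statement itself, and salted password hashing, which avoids storing credentials in plain text.

```python
# Illustrative sketch only; hypothetical schema and helpers, not from any FTC order.
import hashlib
import hmac
import os
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (email TEXT PRIMARY KEY, pw_hash BLOB, salt BLOB)")

def store_user(email: str, password: str) -> None:
    # Store a salted hash so the database never holds the password in readable text.
    salt = os.urandom(16)
    pw_hash = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    # Parameter placeholders ("?") keep user input out of the SQL statement itself,
    # the basic defense against SQL injection.
    conn.execute("INSERT INTO users VALUES (?, ?, ?)", (email, pw_hash, salt))

def check_password(email: str, password: str) -> bool:
    row = conn.execute(
        "SELECT pw_hash, salt FROM users WHERE email = ?", (email,)
    ).fetchone()
    if row is None:
        return False
    stored_hash, salt = row
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, stored_hash)

store_user("user@example.com", "correct horse battery staple")
assert check_password("user@example.com", "correct horse battery staple")
# An injection-style input is treated as an ordinary (wrong) password, not as SQL.
assert not check_password("user@example.com", "password123' OR '1'='1")
```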
Data security settlements have not typically resulted in monetary penalties, unless the FTC can show that affected consumers were injured monetarily, that the business obtained a direct financial benefit from use of the consumers’ personal information in a manner inconsistent with the relevant privacy policy (such as from a sale of such information), or that the business violated another statute that contains civil penalty provisions, such as FCRA or COPPA. Although the FTC cannot assess monetary civil penalties directly for violations of Section 5(a), settling companies are typically under FTC order for 20 years. This means that if the company violates a term of its settlement agreement with the FTC, it can be liable for a civil penalty of up to $16,000 for each violation, which the FTC can construe as each day of noncompliance.
FTC data security settlements have typically been limited to robust injunctive provisions, third-party audits every two years during the term of the settlement, and even individual liability for company owners. The FTC also regularly requires businesses to have a comprehensive written information security program.60 In general, an FTC consent order will require that the company establish, implement, and thereafter maintain a written comprehensive information security program that is reasonably designed to protect the security, confidentiality, and integrity of personal information collected from or about consumers. Such programs are typically required to maintain administrative, technical, and physical safeguards appropriate to the entity’s size and complexity, the nature and scope of the entity’s activities, and the sensitivity of the personal information collected from or about consumers, including:
[Page 316]
- The designation of an employee or employees to coordinate and be accountable for the information security program;
- The identification of reasonably foreseeable, material internal and external risks to the security, confidentiality, and integrity of personal information that could result in the unauthorized collection, use, disclosure, misuse, loss, alteration, destruction, or other compromise of such information, and assessment of the sufficiency of any safeguards in place to control these risks. At a minimum, this risk assessment should include consideration of risks in each area of relevant operation, including, but not limited to: (1) employee training and management, including secure engineering and defensive programming; (2) information systems, including network and software design, information processing, storage, transmission, and disposal; (3) product design, development, and research; and/or (4) prevention, detection, and response to attacks, intrusions, or other systems failures;
- The design and implementation of reasonable safeguards to control the risks identified through risk assessment;
- Regular testing or monitoring of the effectiveness of the safeguards’ key controls, systems, and procedures;
- The development and use of reasonable steps to select and retain service providers capable of appropriately safeguarding personal information received from the company, and requiring service providers by contract to implement and maintain appropriate safeguards; and
- The evaluation and adjustment of the information security program in light of the results relating to the testing and monitoring of the effectiveness of the safeguards, any material changes to any operations or business arrangements, or any other circumstances that may have an impact on the effectiveness of the information security program.61
The FTC’s past enforcement in this area can provide entities with a basic understanding of the types of minimum security controls that should be in place. Moreover, these enforcement efforts identify four general categories of practices or products where FTC enforcement action has been and remains prevalent.
[Page 317]
A. Corporate Data Security Practices
From its first data security action in 2002 against Eli Lilly involving the unauthorized disclosure of sensitive personal information by an employee,62 the FTC has continued to insist that entities remain vigilant in safeguarding sensitive consumer information maintained on company servers, corporate and computer networks, and employee devices.
For instance, the FTC has brought several enforcement actions when unsecured corporate devices containing customers’ personal information were misplaced or stolen. Notably, the FTC settled with Accretive Health, Inc. after an employee’s laptop computer, containing sensitive information on 23,000 patients, was stolen from the passenger compartment of an employee’s car.63 Similarly, the operator of a cord blood bank, Cbr Systems, Inc., settled with the FTC after several backup tapes, a laptop, an external hard drive, and a USB drive were allegedly stolen from an employee’s personal vehicle.64 The backup tapes were unencrypted and contained personal information from approximately 300,000 customers. The laptop and external hard drive were also unencrypted, and contained enterprise network information, including passwords and protocols, which could have facilitated an intruder’s access to Cbr’s network. In each case, the Commission alleged that the entities created unnecessary risks by transporting portable media containing consumer information in a manner that was vulnerable to theft or other misappropriation, and failed to employ reasonable procedures to ensure that consumers’ personal information was removed or deleted when there was no longer a business purpose for retaining the data.
The Commission also settled with PLS Financial Services, Inc., a corporate manager of more than 300 payday loan and check cashing stores, and an affiliated owner and operator of several stores, following the unsecured disposal of consumers’ sensitive personal information. Specifically, the FTC’s complaint alleged that the companies disposed of documents containing sensitive personal identifying information (including Social Security numbers, employment information, loan applications, bank account information, and credit reports) in unsecured dumpsters near several PLS Loan Stores or PLS Check Cashers locations. In addition to violations of Section 5, the FTC alleged that the companies violated the Gramm-Leach-Bliley Safeguards Rule, which requires financial institutions to develop and use safeguards to protect consumer information.
The FTC has also kept a close watch over the data security practices of social networking sites. For example, in 2010, Twitter settled claims that it failed to protect users’ personal information, allowing hackers to obtain unauthorized administrative control of Twitter, including access to non-public user information, private tweets, and the ability to send out phony tweets from any account.65 Specifically, the FTC’s complaint asserted that an intruder used an automated password guessing tool to derive an employee’s administrative password, after submitting thousands of guesses into Twitter’s public login webpage. The password was a weak, lowercase, letter-only, common dictionary word. Using this password, the intruder reset user passwords and publicly posted the information, allowing other hackers to send unauthorized tweets from user accounts. This included one tweet, purportedly from President Barack Obama, offering his more than 150,000 followers a chance to win $500 in free gasoline in exchange for filling out a survey.
[Page 318]
Although the FTC may investigate an entity’s corporate data security practices when a breach occurs, an investigation does not always lead to an enforcement action. For instance, in August 2015, the FTC sent a closing letter to Morgan Stanley Smith Barney LLC relating to the Commission’s investigation into whether Morgan Stanley engaged in unfair or deceptive acts or practices by failing to secure certain account information related to its Wealth Management clients.66 The investigation examined allegations that a Morgan Stanley employee misappropriated client information by transferring data from the Morgan Stanley computer network to a personal website accessed at work, and then onto other personal devices. The exported data subsequently appeared on multiple Internet websites, causing the potential for misuse of the data. The FTC decided to informally close the case without further action because Morgan Stanley had established and implemented comprehensive policies and access controls designed to protect against insider theft of personal information.
B. Software Providers
Both companies that design and use their own software and those that use third-party software in their business operations have been subject to FTC enforcement. Often, the FTC will allege that the software itself, or the company’s use of the software, either caused, or had the potential to cause, unauthorized third parties or hackers to access customer data. This can be due to vulnerabilities in the software design or the way in which the software was used. As discussed above in connection with LabMD, the FTC alleged that the company report was placed on a peer-to-peer file-sharing network that was publicly accessible. The FTC has also brought similar actions against a number of other companies.
For instance, web analytics company Compete Inc. settled FTC charges that it used its web-tracking software to collect personal data without disclosing the extent of the data collected through its software and third-party web tools, and failed to honor representations made about the extent to which it would protect the data.67 Specifically, Compete’s web-tracking software allegedly captured information consumers entered into websites, including usernames, passcodes, search terms, and sensitive information, such as credit card and financial account information, security codes and expiration dates, and Social Security numbers. Compete made representations concerning the security of its software, including that all personal information was stripped before being sent to the company’s servers. Despite these representations, the FTC’s complaint alleged that Compete failed to remove personal data before transmitting it; failed to provide reasonable and appropriate data security; transmitted sensitive information from secure websites in readable text; failed to design and implement reasonable safeguards to protect consumers’ data; and failed to use readily available measures to mitigate the risk to consumers’ data.
[Page 319]
In December 2015, Oracle agreed to settle charges that it deceived consumers about the security of Java software updates.68 The FTC alleged that, since acquiring Java in 2010, Oracle had been aware of significant security issues affecting older versions of Java SE.69 These issues allowed hackers to craft malware that could access consumers’ usernames and passwords for financial accounts and acquire other sensitive personal information through phishing attacks. The FTC alleged that Oracle represented that, by installing its Java SE updates, the consumer’s system would be "safe and secure" with the "latest… security updates." During the update process, however, the Java SE update removed only the most recent prior version of the software but did not remove any earlier versions that might be installed. As a result, consumers could still have older, insecure versions of the software on their computers that were vulnerable to being hacked.
Most recently, Henry Schein Practice Solutions, Inc., a provider of office management software for dental practices, agreed to settle FTC charges that it falsely advertised the level of encryption provided to protect patient data.70 The FTC’s complaint alleged that Schein marketed its Dentrix G5 software to dental practices with deceptive claims that the software provided industry-standard encryption for sensitive patient information to ensure that practices using its software would protect patient data, as required by the Health Insurance Portability and Accountability Act (HIPAA). The software, however, used a proprietary algorithm that had not been tested publicly, and was less secure and more vulnerable than widely-used, industry-standard encryption algorithms.
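For readers unfamiliar with the distinction the complaint draws, the brief sketch below is a hypothetical illustration, not Dentrix code, and it relies on the third-party Python "cryptography" package. It shows what relying on a vetted, publicly analyzed, industry-standard algorithm such as AES-GCM looks like, in contrast to a proprietary scheme that has not been tested publicly.

```python
# Hypothetical illustration; not Dentrix code. Requires the "cryptography" package.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_record(key: bytes, plaintext: bytes) -> tuple[bytes, bytes]:
    # AES-GCM is a standardized authenticated cipher: it provides both
    # confidentiality and integrity, so tampering is detected on decryption.
    nonce = os.urandom(12)  # fresh 96-bit nonce for every record
    return nonce, AESGCM(key).encrypt(nonce, plaintext, None)

def decrypt_record(key: bytes, nonce: bytes, ciphertext: bytes) -> bytes:
    return AESGCM(key).decrypt(nonce, ciphertext, None)

key = AESGCM.generate_key(bit_length=256)  # key storage and management are separate problems
nonce, ct = encrypt_record(key, b"patient record: Jane Doe, DOB 1980-01-01")
assert decrypt_record(key, nonce, ct) == b"patient record: Jane Doe, DOB 1980-01-01"
```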
[Page 320]
C. Mobile Apps and Mobile Devices
Since the FTC brought its first enforcement action involving a mobile app in 2011, the Commission has continued its active enforcement efforts, bringing more than 20 total cases involving mobile applications or mobile devices, including four relating to data security. In the data security context, the FTC has generally alleged that developers maintained inadequate security measures on their mobile apps or mobile devices.
The FTC kicked off its mobile app data security enforcement in 2011, settling with Frostwire LLC, a peer-to-peer (P2P) file-sharing app developer and its principal, over charges of unauthorized public disclosure of sensitive information and misrepresentation to its users concerning the sharing of downloaded user files.71 Frostwire offered two free P2P file-sharing applications, which enabled users to share files—photos, videos, documents, and music—with other users on the same network. The FTC alleged that Frostwire had configured the default settings to allow the app to publicly share personal files that were stored on the users’ mobile devices without proper notice or consent.
In June 2013, the FTC charged HTC America with failing to take reasonable steps to secure the software developed for the company’s smartphones and tablets.72 According to the complaint, the software’s security flaws placed sensitive consumer information at risk of disclosure. The security vulnerabilities allowed the installation of malware onto devices without the consumer’s knowledge or consent. Such malware could record and transmit information stored on the device, such as financial account and medical information. In light of these flaws, the FTC alleged, the device user manuals and the interface for the company’s Tell HTC app deceptively represented that HTC would notify users of third-party application access to device information and obtain their consent.
The FTC settled with two mobile app developers in March 2014, alleging that the companies failed to secure the transmission of consumers’ sensitive personal information collected via their mobile apps and misrepresented the security precautions that the companies took for each app.73 Specifically, the FTC asserted that Fandango and Credit Karma disabled the SSL (Secure Sockets Layer) certificate validation procedure for each of their mobile apps. By doing so, the FTC claims, the apps were open to attackers positioning themselves between the app and the online service by presenting an invalid SSL certificate to the app—i.e., "man-in-the-middle" attacks. The FTC contends that Fandango and Credit Karma engaged in a number of practices that, when taken together, failed to provide reasonable and appropriate security in the development and maintenance of their mobile apps.
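As a rough illustration of the control at issue, the sketch below is hypothetical, uses only Python's standard library, and is not the apps' actual code. It shows a TLS connection made with certificate and hostname verification enabled; disabling that verification, as the FTC alleged the apps effectively did, would let an attacker presenting an invalid certificate sit between the app and the online service.

```python
# Hypothetical sketch; not Fandango or Credit Karma code.
import socket
import ssl

def fetch_over_tls(host: str, request: bytes) -> bytes:
    # create_default_context() turns on certificate validation and hostname
    # checking, so a "man-in-the-middle" presenting an invalid certificate
    # causes the handshake to fail instead of silently succeeding.
    context = ssl.create_default_context()
    with socket.create_connection((host, 443)) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            tls.sendall(request)
            return tls.recv(4096)

# Setting context.check_hostname = False and context.verify_mode = ssl.CERT_NONE
# would recreate the alleged vulnerability: any certificate would be accepted.
```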
Most recently, the FTC announced a settlement with Snapchat resolving allegations that the app deceived consumers over the disappearing nature of users’ "snaps" and made false and misleading representations concerning its privacy and information security practices. The FTC took issue with several of Snapchat’s practices and representations, including that Snapchat failed to securely design its "Find Friends" feature by failing to verify the phone number of the user upon registration. As a result, an individual could create an account using a phone number belonging to another consumer. In addition, Snapchat had represented in its privacy policy that it takes "reasonable steps" or "reasonable measures" to protect users’ information. The FTC asserted, however, that Snapchat failed to implement any restrictions on serial and automated account creation, which allowed attackers to create multiple accounts and send millions of Find Friends requests using randomly generated phone numbers. According to the complaint, the attackers were able to compile a database of 4.6 million Snapchat usernames and associated mobile phone numbers.74
[Page 321]
D. Internet-of-Things and Other Connected Devices
The FTC has expanded its enforcement efforts to cover the "Internet of Things" ("IoT") and other connected devices. In the FTC’s first enforcement action regarding IoT, the FTC settled with TRENDnet, maker of Internet-connected home security cameras and baby monitors.75 The FTC alleged that TRENDnet failed to employ reasonable and appropriate security in the design and testing of the software that it provided consumers for its IP cameras and failed to implement a process to actively monitor security vulnerability reports. The FTC alleged that, due to the company’s failure to properly secure the cameras, hackers were able to access and then post online the private video and even audio feed of nearly 700 TRENDnet cameras, including live feeds displaying private areas of users’ homes.76
The FTC followed up in February 2016 with its second IoT enforcement action, charging that ASUSTeK Computer, Inc. failed to secure its connected routers and "cloud" services.77 The FTC alleged that ASUS misrepresented the products’ security through claims such as "the most complete, accessible, and secure cloud platform" and "safely secure and access your router." Nonetheless, the router and cloud services contained significant vulnerabilities and design flaws that allegedly allowed unauthorized access to router login credentials and consumer files. For example, the complaint alleged that hackers could exploit pervasive security bugs in the consumer’s web-based control panel to change the router’s security settings, turn off the router’s firewall, permit public access to the consumer’s "cloud" storage, and configure the router to redirect consumers to malicious websites. Attackers could also access users’ cloud storage without any login credentials and gain complete access to a consumer’s connected storage device. This led to the compromise of thousands of consumers’ connected storage devices, exposing consumers’ personal files and sensitive information. At least one ASUS customer was the victim of identity theft.
[Page 322]
IV. PREDICTIONS FOR THE FUTURE
The FTC will continue its data security enforcement efforts for new and innovative products and connected devices when companies fail to enact reasonable data security policies and procedures to protect the security of users’ personal or sensitive information. FTC investigations and enforcement activity often will follow on the heels of the Commission’s workshops, reports, testimony before Congress, and other activities, and can provide insight into future trends. For instance, the FTC held a public workshop in November 2015 on cross-device tracking to examine privacy issues associated with tracking consumers’ activities over time and across different devices for advertising and marketing purposes.78 In March 2016, the FTC followed up on lessons learned in this area when it sent warning letters to a dozen mobile app developers alleging that audio monitoring software used in their apps had the potential to track consumers’ activities for advertising purposes without being clearly disclosed to consumers.79
The FTC’s pre-enforcement activities can provide a window into issues that likely will emerge in the data security context. For example, the FTC has already announced that it will host a series of seminars in fall 2016 to examine several new consumer protection issues.80 Notably, one workshop will address the issue of ransomware, whereby hackers can gain access to consumer and business computers, encrypt files containing photos, documents, and other important data, and then demand a ransom in exchange for the decryption key. The FTC, through its prior workshops, reports, speeches, and Congressional testimony, has already revealed its interest in big data, connected cars, and consumer-generated and controlled health data (such as through health and fitness apps and devices). Although the Commission has yet to announce any enforcement actions in these areas, we should expect to see them in the near future.
Nonetheless, the fact that a company may suffer a data breach or data security lapse does not necessarily mean that the same company will face an FTC enforcement action. What the FTC requires is that entities implement reasonable data security measures, taking into account the sensitivity and volume of the consumer data collected, the size and complexity of the entity’s operations, and the cost of available tools to secure the data.81 The Commission has reiterated that "reasonable security" does not necessarily mean perfect security.82 It is only when a company deceives consumers about its data security practices or protections, or fails to provide reasonable security, that the FTC may intervene.
[Page 323]
The Third Circuit’s Wyndham decision is significant when analyzing future trends for several reasons. First, the decision provides support that companies may be liable under an FTC unfairness theory for inadequate cybersecurity measures on the basis of likely (rather than actual) injury to consumers. Second, the decision underscores that companies have "fair notice" that a cybersecurity program may fall within the FTC’s jurisdictional scope of Section 5(a), and whether such program is reasonable will turn on the extent to which the program survives a cost-benefit analysis. Lastly, the FTC’s business guidance, administrative complaints, and consent decrees can guide entities as to what practices may give rise to unfairness claims based on inadequate corporate cybersecurity, and help to identify what security controls should be in place.
The general takeaway from the FTC’s data security enforcement actions is that a company’s data security practices may be considered "reasonable" by the FTC (even if not perfect, and even within the context of a breach), if the company can demonstrate that it implemented comprehensive policies and access controls designed to protect consumer information; that the potential costs of more robust data security measures would outweigh any additional benefit to consumers in the aggregate and to competition; and that it did not misrepresent these practices in statements to consumers.
The FTC’s deception and unfairness principles will continue to apply to Internet-connected devices, web services, and mobile applications, just as they have always applied to basic computers, corporate practices, and employee devices. As connected devices and the "Internet of Things" expand into new areas, so too will FTC enforcement.
[Page 324]
——–
Notes:
1. Crystal N. Skelton is an associate at Kelley Drye & Warren LLP, and a member of Kelley Drye’s Privacy & Information Security and Advertising & Marketing groups. This article reflects the views of the author and not necessarily those of Kelley Drye, its attorneys, or its clients.
2. Fed. Trade Comm’n, Staff Report, Internet of Things: Privacy & Security in a Connected World (Jan. 2015), https://www.ftc.gov/system/files/documents/reports/federal-trade-commission-staff-report-november-2013-workshop-entitled-internet-things-privacy/150127iotrpt.pdf.
3. The Goldman Sachs Group, Inc., The Internet of Things: Making Sense of the Next Mega-Trend 1 (Sept. 2014), http://www.goldmansachs.com/our-thinking/outlook/internet-of-things/iot-report.pdf.
4. DHL Trend Research & Cisco Consulting Servs., The Internet of Things in Logistics 4 (2015), http://www.dhl.com/content/dam/Local_Images/g0/New_aboutus/innovation/DHLTrendReport_Internet_of_things.pdf; see also Dave Evans, CISCO Internet Bus. Sols. Grp., The Internet of Things: How the Next Evolution of the Internet is Changing Everything, 3 (Apr. 2011), http://www.cisco.com/c/dam/en_us/about/ac79/docs/innov/IoT_IBSG_0411FINAL.pdf.
5. DHL Trend Research, supra note 4, at 4 (citing David Mercer, Strategy Analytics, Connected World: The Internet of Things and Connected Devices in 2020 (Oct. 2014)).
6. Intel Corp., Intel IoT Gateway (2014), http://www.intel.com/content/dam/www/public/us/en/documents/product-briefs/gateway-solutions-iot-brief.pdf.
7. See, e.g., 201 Mass. Code Regs. § 17.00 et seq. (West 2016) (requires entities handling personal information to implement a written comprehensive information security program); Md. Code Ann., Com. Law § 14-3503 (West 2008) (requires a business that owns or licenses personal information of an individual residing in the state to implement and maintain reasonable security procedures and practices that are appropriate to the nature of the personal information owned or licensed and the nature and size of the business and its operations); Nev. Rev. Stat. § 603A.215(1) (2011) (requires businesses that accept payment cards to comply with the most current version of the Payment Card Industry Data Security Standard).
8. 15 U.S.C. § 45(a) (2012).
9. Fed. Trade Comm’n v. Wyndham Worldwide Corp., 799 F.3d 236 (3d Cir. 2015).
10. 15 U.S.C. § 53(b).
11. Id. § 45(b).
12. Id. § 57b(a)(2).
13. A respondent electing to settle the charges will typically enter into a consent agreement without admitting liability, consent to entry of a final order, and waive all right to judicial review. A respondent electing to contest the charges will often result in the Commission adjudicating the complaint in federal court or before an administrative law judge, with an opportunity to appeal any final decision.
14. 15 U.S.C. § 45(g)(2).
15. Id. § 45(l) (2012); see also 16 C.F.R. § 1.98(c) (2015) (increasing the maximum civil penalty for such statutory violations to $16,000).
16. 15 U.S.C. § 45(l).
17. Id. §§ 45(a)(1), 45(n); see also Complaint for Injunctive and Other Equitable Relief, Fed. Trade Comm’n v. Wyndham Worldwide Corp., Civ. No. 2:13-cv-01887 (D. Ariz. Jun. 26, 2012).
18. See Letter from James C. Miller III, Chairman, Fed. Trade Comm’n, to the Honorable John D. Dingell, Chairman, Comm. on Energy & Commerce, U.S. House of Representatives (Oct. 14, 1983) (appended to In re Cliffdale Associates, Inc., 103 F.T.C. 110, 174 (1984)), http://www.ftc.gov/bcp/policystmt/ad-decept.htm [hereinafter Miller, Deception Statement].
19. Id. at 182.
20. See, e.g., Complaint, In re Snapchat, Inc., Docket No. C-4501 (F.T.C. Dec. 23, 2014), https://www.ftc.gov/system/files/documents/cases/141231snapchatcmpt.pdf.
21. 15 U.S.C. § 45(n) (2012); see Fed. Trade Comm’n, Commission Statement of Policy on the Scope of the Consumer Unfairness Jurisdiction, 104 F.T.C. 1070 (1984) (appended to In re Int’l Harvester Co., 104 F.T.C. 949 (1984)) [hereinafter Fed Trade Comm’n, Unfairness Policy Statement].
22. See, e.g., Complaint at 3, In re CVS Caremark Corp., Docket No. C-4259 (F.T.C. June 18, 2009), https://www.ftc.gov/sites/default/files/documents/cases/2009/06/090623cvscmpt.pdf (alleging that the failure to use reasonable and appropriate measures to prevent unauthorized access to personal information caused or was likely to cause substantial injury to consumers that was not offset by countervailing benefits to consumers or competition, and thus was an unfair act or practice).
23. 15 U.S.C. § 6501 et seq.; 16 C.F.R. § 312.2 (2015).
24. 16 C.F.R. § 312.8.
25. Id. § 312.10.
26. 15 U.S.C. § 45(m)(1); see also 16 C.F.R. § 1.98(d) (2015).
27. 15 U.S.C. § 6801 et seq. (2012); 16 C.F.R. §§ 313-314 (2015).
28. FTC Standards for Safeguarding Customer Information, 16 C.F.R. § 314.1 et seq.
29. Id. § 314.3(a).
30. Similarly, the FTC cannot impose civil penalties for violations of GLBA’s safeguarding provisions.
31. 15 U.S.C. § 1681 et seq.; 16 C.F.R. § 602.1 et seq.
32. See 16 C.F.R. § 682.3(a) (2015).
33. Id. § 682.3(b)(2).
34. 15 U.S.C. § 1681s(a)(2)(A) (2012).
35. See, e.g., Decision and Order, In re Eli Lilly & Co., Docket No. C-4047 (F.T.C. May 8, 2002), https://www.ftc.gov/sites/default/files/documents/cases/2002/05/elilillydo.htm.
36. See Complaint for Injunctive and Other Equitable Relief, Fed. Trade Comm’n v. Wyndham Worldwide Corp., Civ. No. 2:13-cv-01887 (D. Ariz. Jun. 26, 2012).
37. Motion to Dismiss by Defendant Wyndham Hotels & Resorts LLC, Fed. Trade Comm’n v. Wyndham Worldwide Corp., No. 12-cv-1365-PHX-PGR (D. Ariz. Aug. 27, 2012).
38. See Fed. Trade Comm’n v. Wyndham Worldwide Corp., 10 F. Supp. 3d 602, 607 (D.N.J. 2014); see also id. at 636 (granting Defendant’s motion to certify issues for interlocutory appeal).
39. Fed. Trade Comm’n v. Wyndham Worldwide Corp., 799 F.3d 236 (3d Cir. 2015). Following the Third Circuit’s decision, Wyndham agreed to settle the FTC’s charges in December 2015. Stipulated Order for Injunction, Fed. Trade Comm’n v. Wyndham Worldwide Corp., Civ. No. 2:13-CV-01887-ES-JAD (D.N.J. Dec. 9, 2015), https://www.ftc.gov/system/files/documents/cases/151209wyndhamstipulated.pdf.
40. Wyndham, 799 F.3d at 244.
41. Id. at 245.
42. Id. at 246 (citing 15 U.S.C. § 45(n) (2012) ("[An unfair act or practice] causes or is likely to cause substantial injury" (emphasis in original))); see also id. (holding unfairness claims could "be brought on the basis of likely rather than actual injury" (quoting In re Int’l Harvester Co., 104 F.T.C. 949, 1061 (1984))).
43. Id.
44. Id. at 249.
45. Id. at 253-254.
46. Wyndham, 799 F.3d at 255 (citing Vill. of Hoffman Estates v. Flipside, Hoffman Estates, Inc., 455 U.S. 489, 498-99 (1982)).
47. Id. at 255-256.
48. Id. at 256.
49. See Complaint at 5, In re LabMD Inc., Docket No. 9357 (F.T.C. Aug. 28, 2013), https://www.ftc.gov/sites/default/files/documents/cases/2013/08/130829labmdpart3.pdf.
50. LabMD filed suit in the District Court for the District of Columbia, seeking an injunction to stay the administrative action from going forward on the grounds that it was an improper expansion of FTC jurisdiction, was retaliatory, and violated the Due Process Clause. See LabMD Inc. v. F.T.C., No. 1:13-cv-01787-CKK (D.D.C. Nov. 14, 2013). LabMD filed a similar action in the United States Court of Appeals for the Eleventh Circuit, making the same allegations. See LabMD Inc. v. F.T.C., No. 13-15267-F (11th Cir. Feb. 18, 2014). The Eleventh Circuit denied LabMD’s claim, "citing [its] lack of jurisdiction over a non-final agency action, but declined to address whether the District Court could hear any of the claims." LabMD, Inc. v. F.T.C., 776 F.3d 1275, 1277 (11th Cir. 2015). LabMD voluntarily dismissed its District of Columbia suit. On March 20, 2014, LabMD filed another suit in the Northern District of Georgia, alleging that: "(1) the FTC’s administrative action against LabMD is arbitrary and capricious in violation of the [Administrative Procedure Act (APA)] because the FTC has no authority to regulate protected health information (PHI); (2) the action is ultra vires and exceeds its statutory authority; (3) the FTC’s application of Section 5 to LabMD’s security protocols violates the Due Process Clause of the U.S. Constitution because it did not provide fair notice or access to a fair tribunal and a hearing; and (4) the FTC violated LabMD’s First Amendment right to free speech." Id. at 1277-78. The FTC filed a motion to dismiss, which the District Court granted, and LabMD appealed the decision, once again, to the Eleventh Circuit. The Eleventh Circuit eventually upheld the district court’s dismissal of LabMD’s complaint against the FTC, finding that the complaint did not stem from a "final" agency action as required under the APA. See id. at 1279.
51. See, e.g., Respondent LabMD, Inc.’s Motion to Dismiss Complaint with Prejudice and to Stay Administrative Proceedings, In re LabMD Inc., Docket No. 9357 (F.T.C. Nov. 12, 2013), https://www.ftc.gov/sites/default/files/documents/cases/131112respondlabmdmodiscomplaintdatyadminproceed.pdf.
52. See Commission Order Denying LabMD’s Motion to Dismiss at 2, In re LabMD, Inc., Docket No. 9357 (F.T.C. Jan. 16, 2014), https://www.ftc.gov/sites/default/files/documents/cases/140117labmdorder.pdf.
53. See Order on Respondent’s Unopposed Motion at 6, In re LabMD, Inc., Docket No. 9357 (F.T.C. Oct. 9, 2014), https://www.ftc.gov/system/files/documents/cases/141009labmdaljorder.pdf.
54. See Letter from Rep. Darrell E. Issa, Chairman, House Oversight & Gov’t Reform Comm., to The Honorable Edith Ramirez, Chairwoman, Fed. Trade Comm’n (June 11, 2014).
55. See Order Granting Respondent’s Renewed Motion, In re LabMD, Inc., Docket No. 9357 (F.T.C. Dec. 29, 2014), https://www.ftc.gov/system/files/documents/cases/141229labmdorder.pdf.
56. Section 5(n) of the FTC Act states that "[t]he Commission shall have no authority to declare unlawful an act or practice on the grounds that such act or practice is unfair unless [1] the act or practice causes or is likely to cause substantial injury to consumers [2] which is not reasonably avoidable by consumers themselves and [3] not outweighed by countervailing benefits to consumers or to competition." Initial Decision at 13, In re LabMD Inc., Docket No. 9357 (F.T.C. Nov. 13, 2015) (citing 15 U.S.C. § 45(n)), https://www.ftc.gov/system/files/documents/cases/151113labmd_decision.pdf.
57. Id. at 14.
58. Given that LabMD was an administrative proceeding, the ALJ’s decision does not have binding precedential effect on federal or state courts.
59. Complaint Counsel’s Appeal Brief, In re LabMD, Inc., Docket No. 9357 (F.T.C. Dec. 22, 2015), https://www.ftc.gov/system/files/documents/cases/complaint_counsels_appeal_brief_-_labmd_580407.pdf.
60. We note, additionally, that the FTC requires financial institutions subject to the GLBA Safeguards Rule to maintain a comprehensive written information security program that addresses administrative, technical, and physical safeguards appropriate to the business based on its size and complexity, nature and scope of activities, and sensitivity of the personal information collected. 16 C.F.R. § 314.3(a) (2015).
61. See, e.g., Decision and Order at 3, In re Snapchat, Inc., Docket No. C-4501 (F.T.C. Dec. 23, 2014), https://www.ftc.gov/system/files/documents/cases/141231snapchatdo.pdf; Agreement Containing Consent Order, In re ASUSTeK Computer, Inc., File No. 142-3156 (F.T.C. Feb. 23, 2016), https://www.ftc.gov/system/files/documents/cases/160222asusagree.pdf.
62. See Decision and Order, In re Eli Lilly & Co., Docket No. C-4047 (F.T.C. May 8, 2002), https://www.ftc.gov/sites/default/files/documents/cases/2002/05/elilillydo.htm.
63. See Decision and Order, In re Accretive Health, Inc., Docket No. C-4432 (F.T.C. Feb. 24, 2014), https://www.ftc.gov/system/files/documents/cases/140224accretivehealthdo.pdf; see also Complaint at 2, In re Accretive Health, Inc., Docket No. C-4432 (F.T.C. Feb. 5, 2014), https://www.ftc.gov/system/files/documents/cases/140224accretivehealthcmpt.pdf.
64. Decision and Order, In re Cbr Systems, Inc., Docket No. C-4400 (F.T.C. Apr. 29, 2013), https://www.ftc.gov/sites/default/files/documents/cases/2013/05/130503cbrdo.pdf; see also Complaint at 3, In re Cbr Systems, Inc., Docket No. C-4400 (F.T.C. Apr. 29, 2013).
65. Decision and Order, In re Twitter, Inc., Docket No. C-4316 (F.T.C. Mar. 2, 2011), https://www.ftc.gov/sites/default/files/documents/cases/2011/03/110311twitterdo.pdf; see also Complaint at 3-4, In re Twitter, Inc., Docket No. C-4316 (F.T.C. Mar. 2, 2011), https://www.ftc.gov/sites/default/files/documents/cases/2011/03/110311twittercmpt.pdf.
66. Letter from Maneesha Mithal, Assoc. Dir. of the FTC Div. of Privacy and Identity Prot., to Lisa J. Sotto, Counsel for Morgan Stanley Smith Barney LLC (August 10, 2015), https://www.ftc.gov/system/files/documents/closing_letters/nid/150810morganstanleycltr.pdf.
67. Decision and Order, In re Compete, Inc., Docket No. C-4384 (F.T.C. Feb. 20, 2013), https://www.ftc.gov/sites/default/files/documents/cases/2013/02/130222competedo.pdf; see also Complaint at 6, In re Compete, Inc., Docket No. C-4384 (F.T.C. Feb. 20, 2013), https://www.ftc.gov/sites/default/files/documents/cases/2013/02/130222competecmpt.pdf.
68. Agreement Containing Consent Order, In re Oracle Corp., File No. 132 3115 (F.T.C. Dec. 21, 2015), https://www.ftc.gov/enforcement/cases-proceedings/132-3115/oracle-corporation-matter.
69. See Complaint at 2-3, In re Oracle Corp., File No. 132 3115 (F.T.C. Dec. 21, 2015), https://www.ftc.gov/system/files/documents/cases/151221oraclecmpt.pdf.
70. Agreement Containing Consent Order, In re Henry Schein Practice Sols., Inc., File No. 142-3161 (F.T.C. Jan. 5, 2016), https://www.ftc.gov/system/files/documents/cases/160105scheinagreeorder.pdf; see also Complaint at 3, In re Henry Schein Practice Sols., Inc., File No. 142-3161 (F.T.C. Jan. 5, 2016), https://www.ftc.gov/system/files/documents/cases/160105scheincmpt.pdf.
71. Complaint for Permanent Injunction and Other Equitable Relief, Fed. Trade Comm’n v. Frostwire LLC, Civ. No. 1:11-cv-23643-DLG (S.D. Fla. Oct. 7, 2011), https://www.ftc.gov/sites/default/files/documents/cases/2011/10/111011frostwirecmpt.pdf.
72. Complaint at 7, In re HTC America, Inc., Docket No. C-4406 (F.T.C. June 25, 2013), https://www.ftc.gov/sites/default/files/documents/cases/2013/07/130702htccmpt.pdf.
73. See Decision and Order, In re Fandango, LLC, Docket No. C-4481 (F.T.C. Aug. 13, 2014), https://www.ftc.gov/system/files/documents/cases/140819fandangodo.pdf; Decision and Order, In re Credit Karma, Inc., Docket No. C-4480 (F.T.C. Aug. 13, 2014), https://www.ftc.gov/system/files/documents/cases/1408creditkarmado.pdf.
74. Complaint at 8, In re Snapchat, Inc., Docket No. C-4501 (F.T.C. May 8, 2014), https://www.ftc.gov/system/files/documents/cases/140508snapchatcmpt.pdf.
75. Decision and Order, In re TRENDnet, Inc., Docket No. C-4426 (F.T.C. Feb. 7, 2014), https://www.ftc.gov/system/files/documents/cases/140207trendnetdo.pdf.
76. See Complaint at 5, In re TRENDnet, Inc., Docket No. C-4426 (F.T.C. Jan. 16, 2014), https://www.ftc.gov/system/files/documents/cases/140207trendnetcmpt.pdf.
77. Complaint at 2-3, In re ASUSTeK Computer, Inc., File No. 142-3156 (F.T.C. Feb. 23, 2016), https://www.ftc.gov/system/files/documents/cases/160222asuscmpt.pdf.
78. Fed. Trade Comm’n, Cross-Device Tracking, An FTC Workshop (Nov. 15, 2015), https://www.ftc.gov/news-events/events-calendar/2015/11/cross-device-tracking.
79. Fed. Trade Comm’n, Press Release, FTC Issues Warning Letters to App Developers Using ‘Silverpush’ Code (Mar. 17, 2016), https://www.ftc.gov/news-events/press-releases/2016/03/ftc-issues-warning-letters-app-developers-using-silverpush-code.
80. Fed. Trade Comm’n, Press Release, FTC to Host Fall Seminar Series on Emerging Consumer Technology Issues (Mar. 31, 2016), https://www.ftc.gov/news-events/press-releases/2016/03/ftc-host-fall-seminar-series-emerging-consumer-technology-issues.
81. See, e.g., Fed. Trade Comm’n, Statement Marking the FTC’s 50th Data Security Settlement (Jan. 31, 2014), https://www.ftc.gov/system/files/documents/cases/140131gmrstatement.pdf.
82. See Julie Brill, Comm’r, Fed. Trade Comm’n, Keynote at FTC’s Start with Security Event, Do Try This at Home: Starting Up with Security at 2 (Feb. 9, 2016), https://www.ftc.gov/system/files/documents/public_statements/915043/160208swsseattle.pdf.