By Anita Inagandla[i]
Many FinTech companies’ missions are similar: democratize financial services[ii], create an open financial system[iii], and enable every individual to participate and thrive[iv] in the economy. FinTech companies have leveraged the rise of smartphones, IoT, and computer algorithms so that all consumers have the ability to access the global economy. However, COVID-19 has accelerated the shift to contactless payments[v], with many consumers wary of transacting in cash. Yet this is not a new trend – 34% of adults under the age of 50 make no cash purchases in a typical week.[vi] The growing acceptance of a cashless society may lead U.S. FinTech companies to implement facial recognition software to authenticate payments[vii], following countries like Denmark and China[viii] that are rapidly adopting this technology. Though facial recognition software may simplify payments, it also creates new privacy concerns that professionals will have to address. (It is important to note that using facial recognition to authenticate payments differs from the multi-factor authentication financial institutions may provide.[ix])
The FTC, in its recent consent order[x] against Everalbum, Inc., adopted a definition of biometric information that is more expansive than the one in Illinois’ Biometric Information Privacy Act (BIPA) but aligns more closely with the California Consumer Privacy Act (CCPA). The consent order defines biometric information as “data that depicts or describes the physical or biological traits of an identified or identifiable person”[xi] including, but not limited to, depictions, descriptions, or copies of an individual’s facial or other physical features. FTC Commissioner Rohit Chopra, who, if confirmed by the Senate, will lead the Consumer Financial Protection Bureau in the Biden Administration[xii], stressed the effectiveness of state governments in regulating facial recognition technology. Chopra noted that Everalbum took greater care with individuals’ data in states with laws related to facial recognition and biometric identifiers (e.g., IL, WA, TX) than in states without such laws.[xiii] Therefore, both BIPA and the CCPA may offer guidance for FinTech companies that choose to implement facial recognition technology to process payments.
First, BIPA applies to private entities that process an individual’s biometric information. A popular, and increasingly common, defense to BIPA litigation in Illinois is that “federal courts in Illinois do not have personal jurisdiction over companies headquartered and incorporated in other states, and conducting business nationally.”[xiv] However, a recent decision has widened the jurisdictional reach of BIPA claims.[xv] In January 2021, Judge James Donato of the Northern District of California approved a $650 million settlement between Facebook and approximately seven million Illinois residents over claims that the social media giant violated their rights under BIPA.[xvi] The court declined to apply California law because BIPA “manifests Illinois’ substantial policy of protecting its citizens’ right to privacy in their personal biometric data.”[xvii] In light of this holding, California-based companies should be aware of five distinct obligations under BIPA: (1) a written retention and destruction policy; (2) a written release; (3) a prohibition against profiting (even with consent); (4) restrictions on disclosure; and (5) security requirements.[xviii] Even if California passes a BIPA-equivalent law in the future, it is unlikely that California law would govern a BIPA claim in a California court. As Judge Donato noted, “if California law is applied, the Illinois policy of protecting its citizens’ privacy interests in their biometric data . . . would be written out of existence.” Therefore, California-based companies that provide services to Illinois residents are advised to evaluate whether their collection and processing of biometric information complies with the requirements of BIPA.
Second, the CCPA defines biometric information as “an individual’s physiological, biological or behavioral characteristics . . . that can be used, singly or in combination with each other or with other identifying data, to establish individual identity,” including one’s face, from which a faceprint can be extracted.[xix] The definition of biometric information is much broader under the CCPA than under BIPA. Therefore, companies subject to BIPA may additionally fall under the CCPA. Companies utilizing biometric data can comply with the CCPA by: (1) data mapping so they are aware of what privacy disclosures are required; (2) updating privacy policies; (3) providing consumers with the CCPA’s mandatory “notice at collection” prior to collecting biometric information; (4) allowing consumers to opt out of collection of their biometric information; (5) maintaining systems to comply with consumers’ rights requests; (6) maintaining “reasonable security practices and procedures” in their data security processes; (7) updating service provider contracts to include provisions that allow companies to share biometric information with vendors; and (8) consulting with experienced biometric privacy counsel.[xx] With the passage of the California Privacy Rights Act (CPRA) comes new guidance regarding the processing of biometric information. First, the CPRA creates a new sub-category of personal information called “sensitive personal information”[xxi] which includes “the processing of biometric information for the purpose of uniquely identifying a consumer.”[xxii] Second, the category of “publicly available” information does include information “lawfully made available from . . . 
government records”[xxiii] but does not include “biometric information collected by a business about a consumer without the consumer’s knowledge.”[xxiv] (This provision is worth noting since it may prohibit companies from scraping “widely available images of individuals in order to populate facial recognition databases for law enforcement use.”[xxv]) Therefore, though California may not have privacy legislation dedicated solely to biometric information, both the CCPA and the newly passed CPRA can offer guidance. Maintaining compliance with the guidance offered by both California laws may mitigate the risks of processing biometric data.
With only a handful of states passing biometric privacy regulations (e.g., IL, TX, WA, CA, NY, AK), many consumers believe that a lack of regulation in the facial recognition space has created a “Wild West” in the industry.[xxvi] Further, a case study conducted in China revealed that first-time users who did not understand the technology and payments process felt that their privacy was at risk during the onboarding experience.[xxvii] Facial recognition software has the ability to make the payments-processing world more efficient, but the lack of regulation combined with the lack of trust has created a reluctance to accept this technology. To combat hesitancy in adopting this software, companies can look to Disney’s recent endeavor to gain widespread acceptance of facial recognition software at Walt Disney World. Disney allowed guests to volunteer to use the software and then asked for feedback via a survey, with questions regarding the ease of use and how the software impacted their visit.[xxviii] Once FinTech companies have done their due diligence to comply with biometric-related privacy laws, they can then give consumers the option to use facial recognition software. By requesting feedback and maintaining transparency, FinTech companies may be able to create a world where using facial recognition software to process payments is widely accepted.
[i] Anita Inagandla is a Spring 2021 J.D. and Privacy Law Certificate Candidate at Santa Clara University School of Law.