Litigation
Cal. Litig. VOLUME 38, ISSUE 1, MAY 2025
Contents
- CACI 1805: A Comedy of Errors
- California Gun Violence Restraining Order Blueprint
- Chair's Column
- Editor's Foreword: A Supra-eme Farewell
- How to Improve (and Not Blow!) Your Chances for Obtaining Appellate Writ Relief
- Inside This Issue
- Interview with Magistrate Judge Michelle M. Pettit
- Litigating Sex Trafficking Cases Involving Online Platforms: Strategies and Considerations for Advocacy on Behalf of Survivors
- Motions in Limine: The Right to a Fair Trial
- PAST SECTION CHAIRS & EDITORS-IN-CHIEF
- Psych Records in Emotional Distress Cases: Where We Went Wrong and How to Fix It
- SECTION OFFICERS & EDITORIAL BOARD
- Settlement Conferences v. Mediations: A Distinction Without a Purpose?
- Success in Trial: The Work After the Courtroom Closes
- Table of Contents
- AI in Criminal Cases in 2025: Use of Facial Recognition Technology
AI IN CRIMINAL CASES IN 2025: USE OF FACIAL RECOGNITION TECHNOLOGY
Written by Hon. Abraham C. Meltzer
It would take a book to comprehensively discuss how AI is likely to impact criminal cases in 2025, so this article will focus on two cases that have received recent attention regarding the use of facial recognition technology (FRT) in criminal investigations.
Recent developments in FRT indicate that caution is needed. Live facial recognition has been deployed in the United Kingdom to search for wanted persons in real time in public spaces. In the United States, however, facial recognition cannot yet, by itself, satisfy the standard of probable cause to arrest. A significant hurdle to using live facial recognition is the difficulty of detecting low base-rate events. Rite Aid Corporation's foray into FRT provides a cautionary lesson in how live facial recognition can go astray: after Rite Aid's failed experiment with installing FRT in its stores to catch shoplifters, the Federal Trade Commission sued it, and Rite Aid agreed to a five-year ban on using any facial recognition systems.
Nevertheless, FRT has already inserted itself into criminal cases, typically as the first link in a chain of identification leading to a defendant. This raises the question whether a defendant is entitled to discover the other matches a facial recognition system returned alongside the match to the defendant. The first court to address the issue, in Lynch v. State of Florida (2018) 260 So.3d 1166, held that the alternative matches are not discoverable. In contrast, in State v. Arteaga (2023) 476 N.J.Super. 36, a New Jersey court held that a defendant is entitled to discover not only the alternative matches, but also the source code and error rates of the facial recognition system used to identify the defendant. Other courts are likely to revisit this question in 2025.