Abstract
Law enforcement investigations are increasingly driven by hidden algorithmic tools, without disclosure of those tools to the people being prosecuted or to the public at large. Facial Recognition Technology, for example, has been used by police to identify suspects in investigations since 2001. Despite its widespread and growing use in police investigations, its scientific validity has never been tested in court, and its secret use has shielded it from constitutional challenge. This Article coins the term “witness-washing” to describe the mechanism by which this immense evasion has occurred. Witness-washing occurs when law enforcement uses algorithmic technology such as facial recognition, incorporates the results into a human decision-making process, and then conceals the algorithm’s role by presenting the result as an exclusively human decision. For facial recognition, this means that a law enforcement officer runs an image from a surveillance video through an algorithm designed to find similar faces in a database of mug shots and driver’s license photos. One of the images the algorithm returns is then used as part of a traditional photo lineup procedure, but the State discloses only the results of the lineup, without mentioning the use of the algorithm. Through this technique, prosecutors have presented tens or hundreds of thousands of cases as if they originated in run-of-the-mill eyewitness identifications, while evading disclosure and examination of the use of speculative technology.
This Article uses Facial Recognition Technology as a case study to demonstrate law enforcement’s increasing, hidden use of untested algorithmic tools. In doing so, it makes three central contributions. First, it offers a thorough descriptive account of witness-washing and the flawed technology it has hidden. Second, it shows how witness-washing allows algorithmic tools to evade the traditional legal and practical limitations on investigative techniques—statutes, constitutional litigation, and limited resources—and how witness-washing distorts the assumptions that underlie ongoing attempts to restrict these tools. Finally, this Article argues that the lack of examination caused by witness-washing has allowed the carceral logic of modern law enforcement and the commercial logic of technology vendors to override the inherent logic of the tools themselves. Ultimately, this Article shows that the transition to a criminal legal system driven by algorithms will not be announced—it is happening already, under the cover of witness-washing.
First Page
715
Recommended Citation
Nathan E. Rouse, Witness-Washing Facial Recognition Technology, 102 Denv. L. Rev. 715 (Spring 2025).
Publication Date
2025-03-01