
A Tennessee grandmother spent over three months in jail because a facial recognition algorithm made a catastrophic mistake. Angela Lipps never set foot in North Dakota, yet software analyzing bank surveillance footage identified her as a fraud suspect, setting in motion a chain of events that would cost her freedom, home, and livelihood.
The Algorithm’s Fatal Error
Between April and May 2025, someone used a fraudulent US Army military ID to withdraw tens of thousands of dollars from Fargo banks. When detectives fed surveillance footage into facial recognition software, the system returned one name: Angela Lipps, a 50-year-old mother and grandmother living over 1,200 miles away in Tennessee.
A detective wrote in court documents that Lipps appeared to match the suspect based on facial features, body type, and hairstyle. That assessment, made by software and rubber-stamped in a report, became sufficient cause for arrest. No one from Fargo police contacted Lipps to verify her whereabouts before US marshals arrived at her door with guns drawn in July.
108 Days of Injustice
Lipps was arrested while babysitting four children at her Tennessee home. Booked as a fugitive from justice, she faced four counts of unauthorized use of personal identifying information and four counts of theft. The algorithm’s identification was treated as evidence enough to hold her without bail for 108 days while North Dakota arranged transport.
“I’ve never been to North Dakota, I don’t know anyone from North Dakota,” Lipps told WDAY News. Her protests fell on deaf ears as the system processed her like any other suspect flagged by the technology.
The Real Evidence That Set Her Free
What eventually cleared Lipps was not another algorithm or advanced investigation technique, but simple bank records. Her attorney, Jay Greenwood, obtained documentation proving she had been in Tennessee during every transaction investigators claimed she committed in North Dakota. Only after presenting this irrefutable evidence were the charges dropped and Lipps released on Christmas Eve.
“If the only thing you have is facial recognition, I might want to dig a little deeper,” Greenwood told InForum. Fargo police had not dug deeper.
The Devastating Aftermath
The algorithm’s error cost Lipps far more than 108 days of freedom. Unable to pay bills while incarcerated, she lost her home, car, and dog. When Fargo police released her, they provided no assistance for her return to Tennessee. Defense attorneys helped cover a hotel room and food over Christmas, while a local nonprofit, the F5 Project, funded her journey home.
As of the reporting from InForum, no one from the Fargo Police Department had apologized to Lipps for the ordeal caused by its reliance on flawed technology.
A Pattern of Algorithmic Failures
The Lipps case illustrates a disturbing pattern in law enforcement's relationship with facial recognition and related AI tools. In October 2025, an AI weapons-detection system at a Baltimore-area high school mistook a bag of Doritos for a firearm, leading to the armed detention of student Taki Allen. Officers forced the teenager to his knees, handcuffed him, and searched him before discovering their mistake.
A 2018 test by the ACLU of Northern California found that Amazon's Rekognition facial recognition software falsely matched 28 members of Congress with faces in a mugshot database, with members of color misidentified at disproportionately higher rates. The National Institute of Standards and Technology has likewise documented persistent racial disparities in facial recognition accuracy.
The Burden of Disproving Machines
What makes these cases particularly troubling is how the burden of proof shifts once an algorithm generates a match. As the Lipps case demonstrates, law enforcement acts on computer-generated identifications while the accused must produce documentary evidence to disprove the machine’s assessment.
There is no laboratory test that represents the conditions under which police use facial recognition in real-world scenarios. Testing labs cannot access the exact databases of mugshots, licenses, and surveillance photos that police search through in specific communities. They cannot account for the full range of variables that affect accuracy in practice.
The Human Cost of Algorithmic Policing
Angela Lipps’ story reveals the devastating human consequences when law enforcement treats algorithmic suggestions as facts. The software was wrong, but Lipps still had to spend 108 days proving it while her life fell apart around her.
Her case stands as a stark reminder that facial recognition technology, no matter how sophisticated it appears, remains prone to errors that can destroy innocent lives. Until law enforcement agencies acknowledge these limitations and implement proper safeguards, more people like Angela Lipps will find themselves imprisoned by the mistakes of machines they never knew existed.
This article draws on reporting from Activist Post, ACLU, and Boston University.
