AI Facial Recognition Error Gets Grandma Arrested At Gunpoint And 6 Months In Jail

In a quiet suburb of Tennessee, the AI and physical worlds collided with devastating consequences for a grandmother whose only crime was existing in a database. Law enforcement's increasing reliance on facial recognition technology has shifted it from a futuristic tool for catching fugitives into a bureaucratic nightmare that treats algorithmic matches as absolute truth, regardless of the human cost. In this case, the grandmother was wrongfully arrested and jailed for almost six months.

The ordeal began when facial recognition software used to investigate a series of fraud cases in Fargo, North Dakota, flagged Angela Lipps, a grandmother living hundreds of miles away in Tennessee. Despite her having no connection to the crime and a lack of physical evidence linking her to the scene, the software hit was treated as a definitive ID. Lipps was arrested at gunpoint by U.S. Marshals on July 14 last year and subsequently incarcerated in a Fargo jail, where she remained for months, separated from her family and struggling to clear her name of a crime she did not commit.

Credit: WDAY

This incident is not an isolated glitch but part of a growing pattern of automation bias within law enforcement agencies. Experts argue that while facial recognition is marketed as a precision tool, it often struggles with demographic accuracy, misidentifying women and people of color at higher rates. When a computer provides a potential match, investigators sometimes bypass traditional verification steps. In Lipps' case, no one checked her alibi or other evidence of innocence. Records showed that even as police placed her in Fargo, she was actually in Tennessee the entire time, depositing her Social Security checks and buying cigarettes.

It was found that the software used to identify the criminal attempted to find patterns where they may not exist. In this case, the grandmother's facial structure, body type, and hair were deemed similar enough to the suspect's that the system generated a high-confidence match. Unfortunately, once those data points entered the legal system, they triggered a chain of events that the victim was powerless to stop. For the 50-year-old Lipps, the eventual dismissal of charges came too late to prevent the trauma of being uprooted and jailed.

So far, at least eight other Americans have been wrongfully arrested due to erroneous facial recognition matches. Proponents argue that the technology has saved thousands of man-hours and solved cold cases that would otherwise remain dormant. However, civil liberties advocates point out that legal safeguards have not kept pace with the technology. In many jurisdictions, there are no mandatory standards for how confident an algorithm must be before an arrest is made, nor are there requirements for independent human review of digital matches.

Main photo credit: Fargo Police Department

Aaron Leong

Tech enthusiast, YouTuber, engineer, rock climber, family guy. 'Nuff said.