Faulty facial recognition leads to false imprisonment
On March 26, 2022, a violent assault occurred on a Maryland Transit Administration bus in suburban Baltimore. The attacker punched the female bus driver multiple times after an argument over COVID masking rules, then fled with her cell phone.
According to a recent New Yorker report, surveillance cameras captured images of the assailant. Transit police used these pictures to create a Be on the Lookout (BOLO) bulletin distributed to law enforcement agencies.
An analyst at the Harford County state’s attorney’s office ran one surveillance image through facial recognition software. The algorithm matched the image to Alonzo Cornelius Sawyer, a Black man in his 50s from Abingdon, MD.
Sawyer was arrested days later while appearing in court for an unrelated case.
Police interrogated him and showed him the BOLO images, which he insisted were not of him. Detectives dismissed his denials after his probation officer, Arron Daugherty, viewed the footage and positively identified Sawyer as the attacker. Daugherty had met Sawyer only twice before, briefly, and while Sawyer was wearing a mask.
Sawyer’s wife, Carronne Jones-Sawyer, also adamantly denied the images showed her husband, citing differences in age, build, clothing and other physical details. She provided potential alibis placing Sawyer away from the scene when the assault occurred. Detectives, however, conducted no further investigation to corroborate the facial recognition match.
AI racial bias
This case exemplifies the risks of overreliance on AI tools without sufficient standards.
Racial bias leads facial recognition systems to misidentify people of color at much higher rates. The algorithmic match outweighed contradictory eyewitness testimony in the police investigation.
After Sawyer spent a month in jail, the charges were dropped when Daugherty admitted doubts following a meeting with Sawyer’s wife. The use of facial recognition was never disclosed to Sawyer, and neither he nor his wife was notified when another man was eventually arrested.
The story highlights concerns about inadequate training in facial recognition, lack of corroboration, failure to disclose the technology’s use, and confirmation bias that led police to dismiss contradictory evidence.
Critics argue that facial recognition usage should be banned or strictly limited, given its potential, without oversight, to enable abuse and entrench injustice. Sawyer believes he would have falsely pleaded guilty without his wife’s intervention, showing how the practice can enable overzealous prosecution.
As rapid AI advancements spread, the public needs protection against unproven technologies. Sawyer’s experience underscores the urgent need for reform, transparency, and accountability to prevent more wrongful arrests.