How could facial recognition be abused? In Detroit, police used it to arrest an innocent man.
The January arrest of Robert Williams, a black man, is the first known case of someone being mistakenly arrested in the US due to facial-recognition technology, according to the American Civil Liberties Union. On Wednesday, Williams took his story public.
“Why is law enforcement even allowed to use such technology when it obviously doesn’t work?” he wrote in an op-ed piece for The Washington Post.
The ACLU agrees. Citing Williams’ wrongful arrest, the group is calling on Congress to ban law enforcement from using facial-recognition technologies. “Lawmakers need to stop allowing law enforcement to test their latest tools on our communities, where real people suffer real-life consequences,” says Neema Singh Guliani, ACLU senior legislative counsel.
In October 2018, a Shinola watch store was robbed and the store’s video surveillance captured somewhat blurry footage of the culprit, an unknown black man, according to the ACLU. Five months later, police tried to identify the suspect using a facial-recognition system owned by the Michigan State Police. The result pointed to Williams, who was arrested in January 2020.
However, the ACLU says the facial-recognition match was wrong; the only commonalities between Williams and the suspect were their race and large build.
Williams says it was obvious Detroit police got the wrong person. “This is not me. I hope you all don’t think black people all look alike,” he recounts telling the police officers after his arrest. Nevertheless, police investigators told him the “computer” had identified him as the suspect, making Williams realize the role facial recognition had played in his detention.
Detroit’s police department hasn’t commented on the incident. However, Michigan State Police told PCMag the facial-recognition technology is only meant to provide an investigative lead. “Further investigation is needed to develop probable cause to arrest,” the department added.
According to the ACLU, Detroit police tried to confirm Williams’ connection to the crime by asking a security guard at the Shinola store to identify him in a photo lineup. The guard identified Williams as the culprit. However, the guard had not been present during the theft; he had merely reviewed the incident on the store’s surveillance footage.
Nonetheless, police arrested Williams on Jan. 9, handcuffing him at his house in front of his family. He was detained for 30 hours; at one point, he heard a police officer admit that “the computer must have gotten it wrong.”
In response, the ACLU filed a formal complaint with the Detroit Police Department, demanding it stop using facial recognition as an investigation tool. “The facts of Mr. Williams’ case prove both that the technology is flawed and that DPD investigators are not competent in making use of such technology,” it says.
The complaint goes on to accuse Detroit authorities of stonewalling requests for public records regarding Williams’ arrest.
As for Williams, he’s concerned the technology will exacerbate discriminatory policing practices. “Even if this technology does become accurate (at the expense of people like me), I don’t want my daughters’ faces to be part of some government database. I don’t want cops showing up at their door because they were recorded at a protest the government didn’t like,” he wrote in his op-ed piece.
According to Michigan State Police, the facial-recognition software is made by DataWorks Plus, a provider of law enforcement technologies. The company did not respond to a request for comment. Past studies, however, have found that facial-recognition software often misidentifies people of color.
The news comes as the Boston City Council voted unanimously on Wednesday to ban the use of facial-recognition technology by police and other city agencies, Boston.com reports.