How AI-Powered Tech Landed Man In Jail With Scant Evidence

Michael Williams sits for a portrait in his South Side Chicago home Tuesday, July 27, 2021. Williams was behind bars for nearly a year before a judge dismissed the murder case against him in July at the request of prosecutors, who said they had insufficient evidence. (AP Photo/Charles Rex Arbogast)

CHICAGO (AP) — Michael Williams’ wife pleaded with him to remember their fishing trips with the grandchildren, how he used to braid her hair, anything to jar him back to his world outside the concrete walls of Cook County Jail.

His three daily calls to her had become a lifeline, but when they dwindled to two, then one, then only a few a week, the 65-year-old Williams felt he couldn’t go on. He made plans to take his life with a stash of pills he had stockpiled in his dormitory.

Williams was jailed last August, accused of killing a young man from the neighborhood who asked him for a ride during a night of unrest over police brutality in May. But the key evidence against Williams didn’t come from an eyewitness or an informant; it came from a clip of noiseless security video showing a car driving through an intersection, and a loud bang picked up by a network of surveillance microphones. Prosecutors said technology powered by a secret algorithm that analyzed noises detected by the sensors indicated Williams shot and killed the man.

“I kept trying to figure out, how can they get away with using the technology like that against me?” said Williams, speaking publicly for the first time about his ordeal. “That’s not fair.”

Williams sat behind bars for nearly a year before a judge dismissed the case against him last month at the request of prosecutors, who said they had insufficient evidence.

Williams’ experience highlights the real-world impacts of society’s growing reliance on algorithms to help make consequential decisions about many aspects of public life. Nowhere is this more apparent than in law enforcement, which has turned to technology companies like gunshot detection firm ShotSpotter to battle crime. ShotSpotter evidence has increasingly been admitted in court cases around the country, now totaling some 200. ShotSpotter’s website says it’s “a leader in precision policing technology solutions” that helps stop gun violence by using “sensors, algorithms and artificial intelligence” to classify 14 million sounds in its proprietary database as gunshots or something else.
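
ShotSpotter treats the details of that pipeline as a trade secret, but the general shape of an acoustic-trigger system can be sketched. The snippet below is a minimal, purely hypothetical illustration of how a sensor might flag an impulsive sound for review; every feature, threshold and name in it is invented here and stands in for the trained models a production system would use, not for ShotSpotter's actual code.

```python
# Purely hypothetical sketch: ShotSpotter's classifier is proprietary and
# undisclosed. This shows only the generic idea of flagging an impulsive
# "bang" for human review; all features and thresholds are invented.
import numpy as np

SAMPLE_RATE = 8000  # Hz, assumed sensor sampling rate

def impulse_features(audio: np.ndarray) -> dict:
    """Crude features separating sharp bangs from steady background noise."""
    envelope = np.abs(audio)
    peak = envelope.max()
    # Crest factor: an impulse peaks far above its average level.
    crest = peak / (envelope.mean() + 1e-9)
    # Decay time: how long the signal stays above half the peak after it.
    tail = envelope[int(envelope.argmax()):]
    decay_s = np.count_nonzero(tail > 0.5 * peak) / SAMPLE_RATE
    return {"crest": crest, "decay_s": decay_s}

def classify(audio: np.ndarray) -> str:
    """Invented decision rule; a real system would use a trained model."""
    f = impulse_features(audio)
    if f["crest"] > 10 and f["decay_s"] < 0.05:
        return "possible_gunshot"  # would then be routed to human reviewers
    return "other"

# Demo on synthetic audio: a sharp, fast-decaying bang vs. a steady hum.
rng = np.random.default_rng(0)
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE  # one second of samples
bang = np.exp(-80 * t) * rng.standard_normal(t.size)
hum = 0.3 * np.sin(2 * np.pi * 120 * t)
print(classify(bang), classify(hum))  # -> possible_gunshot other
```

As the AP's findings below suggest, the hard part is not flagging a bang but telling a gunshot apart from fireworks or a backfiring car — the step where trained models and human reviewers come in.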

But an Associated Press investigation, based on a review of thousands of internal documents, emails, presentations and confidential contracts, along with interviews with dozens of public defenders in communities where ShotSpotter has been deployed, has identified a number of serious flaws in using ShotSpotter as evidentiary support for prosecutors.

AP’s investigation found the system can miss live gunfire right under its microphones, or misclassify the sounds of fireworks or cars backfiring as gunshots. Forensic reports prepared by ShotSpotter’s employees have been used in court to improperly claim that a defendant shot at police, or provide questionable counts of the number of shots allegedly fired by defendants. Judges in a number of cases have thrown out the evidence.

ShotSpotter’s proprietary algorithms are the company’s primary selling point, and it frequently touts the technology in marketing materials as virtually foolproof. But the private company guards how its closed system works as a trade secret, a black box largely inscrutable to the public, jurors and police oversight boards.

The company’s methods for identifying gunshots aren’t always guided solely by the technology. ShotSpotter employees can, and often do, change the source of sounds picked up by its sensors after listening to audio recordings, introducing the possibility of human bias into the gunshot detection algorithm. Employees can and do modify the location or number of shots fired at the request of police, according to court records. And in the past, city dispatchers or police themselves could also make some of these changes.

Amid a nationwide debate over racial bias in policing, privacy and civil rights advocates say ShotSpotter’s system and other algorithm-based technologies used to set everything from prison sentences to probation rules lack transparency and oversight and show why the criminal justice system shouldn’t outsource some of society’s weightiest decisions to computer code.

When pressed about potential errors from the company’s algorithm, ShotSpotter CEO Ralph Clark declined to discuss specifics about the company’s use of artificial intelligence, saying it’s “not really relevant.”

“The point is anything that ultimately gets produced as a gunshot has to have eyes and ears on it,” said Clark in an interview. “Human eyes and ears, OK?”



2 Comments

James L. Farrell
2 years ago

If the evidence is based on time differences only, that isn’t enough. Velocity is a vector, dependent on direction as well as speed. A judge should reject evidence that doesn’t take direction into account.
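
The "time differences" the commenter refers to are the basis of time-difference-of-arrival (TDOA) localization, the standard technique for acoustic sensor networks in general: each pair of sensors' timing gap constrains the source to a hyperbola, and several pairs together intersect at a position, so direction falls out of the geometry rather than being measured separately. The sketch below illustrates that general method with invented sensor coordinates and timings; it is not a description of ShotSpotter's proprietary system.

```python
# Illustration of time-difference-of-arrival (TDOA) localization, the general
# technique acoustic sensor networks rely on. Sensor positions, the source
# location and the timings below are all invented for this example; this is
# not ShotSpotter's actual method.
import numpy as np
from scipy.optimize import least_squares

C = 343.0  # speed of sound in air, m/s

# Four sensors at the corners of a 500 m block, positions in meters.
sensors = np.array([[0.0, 0.0], [500.0, 0.0], [0.0, 500.0], [500.0, 500.0]])
true_source = np.array([180.0, 260.0])  # where the bang actually happened

# Arrival time at each sensor; the unknown emission time cancels out
# when working with differences relative to sensor 0.
arrivals = np.linalg.norm(sensors - true_source, axis=1) / C

def residuals(p: np.ndarray) -> np.ndarray:
    """Predicted minus measured arrival-time differences for a guess p."""
    d = np.linalg.norm(sensors - p, axis=1) / C
    return (d - d[0]) - (arrivals - arrivals[0])

# Solve for the source position by least squares from a rough initial guess.
fix = least_squares(residuals, x0=np.array([250.0, 250.0])).x
print(fix)  # -> approximately [180. 260.]
```

With noisy timings, echoes and missed sensors, the hyperbolas no longer intersect cleanly, which is why systems of this kind typically report an estimated location with some margin of error rather than an exact point.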