A man spent a year in jail on a murder charge involving disputed AI evidence. Now the case has been dropped
Plus: Intel winds down RealSense, and more
In brief The case against a man accused of murder has been thrown out by a judge after prosecutors withdrew disputed evidence of an AI-identified gunshot sound.
Michael Williams, 65, who denied any wrongdoing, sat in jail for 11 months awaiting trial for allegedly killing Safarian Herring, 25.
It's said that in May last year, Williams was driving through Chicago one night hoping to buy some cigarettes. Herring waved him down for a ride, and Williams, recognizing the younger man from the neighborhood, let him into his car. Soon after, another vehicle pulled up alongside, and someone in the passenger seat took out a gun and shot Herring in the head, Williams told police. Herring's mother said her son, an aspiring chef, had been shot at two weeks earlier at a bus stop.
Herring, who was taken to hospital by Williams, died from the gunshot wound, and Williams ended up being charged with his murder. A key piece of evidence against him came from ShotSpotter, a company that operates microphones spread across US cities including Chicago that, with the aid of machine-learning algorithms, detect and identify gunshot sounds to immediately alert the cops.
Prosecutors said ShotSpotter picked up a gunshot sound where Williams was seen on surveillance camera footage in his car, putting it all forward as proof that Williams shot Herring right there and then. Police did not cite a motive, had no eyewitnesses, and did not find the gun used in the attack. Williams did have a criminal history, though, having served time for attempted murder, robbery, and discharging a firearm when he was younger, and said he had turned his life around significantly since. He was grilled by detectives, and booked.
Williams' lawyers asked the trial judge to probe the ShotSpotter evidence, alleging in a court filing that ShotSpotter initially registered the sound as a firecracker a mile away from where police said Herring was shot, and that ShotSpotter employees later changed those details to gunfire at the spot where officers said Herring was hit. ShotSpotter said it had not improperly altered any data to favor the police's case.
Editor's note: ShotSpotter has responded to the allegations raised by Williams' lawyers, stating that, for its court evidence, its algorithm identified two data points: the exact coordinates where Herring was shot at the junction of South Stony Island Avenue and East 63rd Street, and the street address to the entrance of Jackson Park, the edge of which is where Herring was hit. The park's entrance is a mile from where the shooting occurred. These data points were not changed at any time.
ShotSpotter also said the reclassification of the sound from a firecracker to gunfire was innocuous: a human reviewer checked the audio and changed it from a possible firework to gunshot within a minute of its detection.
Thus, though Williams' lawyers sought to paint ShotSpotter's location and classification as ambiguous and unreliable, it is clear from the evidence why two data points were given – the precise coordinates of the actual shot; and what the algorithm thought was the nearest relevant street address, the adjacent park – and that these data points were not changed by ShotSpotter staff, and also how the sound was reclassified immediately by an employee. We are happy to clarify this situation.
After Williams' lawyers asked the judge in the case to carry out an inquiry, the prosecution last month withdrew the ShotSpotter report and asked for the case to be dismissed on the basis of insufficient evidence, which the judge agreed to. Williams is a free man again.
The internet used our AI to make NSFW images!
Startup Kapwing, which built a web application that uses computer-vision algorithms to generate pictures for people, is disappointed netizens used the code to produce NSFW material.
The software employs a combination of VQGAN and CLIP – made by researchers at the University of Heidelberg and OpenAI, respectively – to turn text prompts into images. This approach was popularised by artist Katherine Crowson in a Google Colab notebook; there's a Twitter account dedicated to showing off this type of computer art.
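Under the hood, the VQGAN+CLIP approach is an optimisation loop: CLIP scores how well a candidate image matches the text prompt, and that score is used as an objective to nudge VQGAN's latent codes by gradient ascent until the decoded image agrees with the prompt. Here's a toy Python sketch of that loop's shape, with small random linear maps standing in for both networks – the real pipeline needs the actual VQGAN decoder and CLIP encoders, plus automatic differentiation, none of which is shown here:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins: in the real pipeline, DECODE is VQGAN's decoder and
# EMBED is CLIP's image encoder. Here both are fixed random linear maps,
# just to illustrate the optimisation pattern.
DECODE = rng.normal(size=(64, 16))  # latent (16) -> "image" features (64)
EMBED = rng.normal(size=(8, 64))    # "image" features -> CLIP-like embedding (8)
M = EMBED @ DECODE                  # composed map, latent -> embedding

def clip_similarity(latent, text_emb):
    """Cosine similarity between the embedded decoded image and the prompt."""
    emb = M @ latent
    return float(emb @ text_emb / (np.linalg.norm(emb) * np.linalg.norm(text_emb)))

def similarity_grad(latent, text_emb):
    """Analytic gradient of the cosine similarity w.r.t. the latent.
    Real implementations get this via backpropagation through both models."""
    v = M @ latent
    u = v / np.linalg.norm(v)
    t = text_emb / np.linalg.norm(text_emb)
    return M.T @ ((t - (u @ t) * u) / np.linalg.norm(v))

# Pretend CLIP embedding of a text prompt, and a random starting latent.
target = rng.normal(size=8)
latent = rng.normal(size=16)
before = clip_similarity(latent, target)

# Gradient-ascent loop: nudge the latent to better match the prompt.
for _ in range(300):
    latent = latent + 0.1 * similarity_grad(latent, target)

after = clip_similarity(latent, target)
print(f"similarity before: {before:.2f}, after optimisation: {after:.2f}")
```

In the real system the loop is the same in spirit: decode the latent into pixels, embed the pixels with CLIP, compare against the embedded prompt, and backpropagate to update the latent.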
Kapwing had hoped its implementation of VQGAN and CLIP on the web would be used to make art from users' requests; instead, we're told, it was used to make filth.
“Since I work at Kapwing, an online video editor, making an AI art and video generator seemed like a project that would be right up our alley,” Eric Lu, co-founder and CTO at Kapwing, said.
“The problem? When we made it possible for anyone to generate art with artificial intelligence, barely anyone used it to make actual art. Instead, our AI model was forced to make videos for random inputs, trolling queries, and NSFW intents.”
Submitted prompts ranged from “naked woman” to the downright bizarre, such as “thong bikini covered in chocolate” and “gay unicorn at a funeral.” The funny thing is, the images made by the AI aren't even that realistic or sexually explicit. Below is an example output for "naked woman."
“Is it that the internet just craves NSFW content so much that they will type it anywhere? Or do people have a propensity to try to abuse AI systems?" Lu continued. "Either way, the content outputted must have [been] disappointing to these users, as most of the representations outputted by our models were abstract."
Intel 'winds down' RealSense biz
Intel is shuttering its RealSense computer-vision product wing. The business unit's chips, cameras, LiDAR, hardware modules, and software were aimed at things like digital signage, 3D scanning, robotics, and facial-authentication systems.
Now the plug's been pulled, and RealSense boss Sagi Ben Moshe is departing Intel after a decade at the semiconductor goliath.
“We are winding down our RealSense business and transitioning our computer vision talent, technology and products to focus on advancing innovative technologies that better support our core businesses and IDM 2.0 strategy,” an Intel spokesperson told CRN.
All RealSense products will be discontinued, though it appears its stereo cameras for depth perception will stay, to some degree, according to IEEE Spectrum. ®