Who knows the secret of the black magic box? Boffins probe AI learning by mapping digital neurons

And Zoox settles with Elon's Musketeers over purloined IP

Roundup OpenAI Microscope: Neural networks, often described as “black boxes”, are complicated; it’s difficult to understand how all the neurons in the different layers interact with one another. As a result, machine learning engineers have a hard time trying to interpret their models.

OpenAI Microscope, a new project launched this week, shows that it is possible to see which groups of neurons are activated in a model when it processes an image. In other words, it’s possible to see what features these neurons in the different layers are learning. For example, the tools show what parts of a neural network are looking at the wheels or the windows in an image of a car.
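
The underlying idea is simple enough to sketch at home. The snippet below is not OpenAI's Microscope tooling, just a hedged illustration assuming PyTorch and a stock torchvision ImageNet model: it hooks the network's convolutional layers and ranks which channels (groups of neurons) fire hardest on a given image.

```python
# Not OpenAI's Microscope tooling: a minimal sketch, assuming PyTorch and
# a stock torchvision ImageNet model, of inspecting which channels
# ("groups of neurons") in each conv layer activate for a given image.
import torch
from torchvision import models, transforms
from PIL import Image

model = models.resnet50(pretrained=True).eval()
activations = {}

def make_hook(name):
    def hook(module, inputs, output):
        activations[name] = output.detach()  # stash this layer's output
    return hook

# Hook every convolutional layer so we can see activations at each depth.
for name, module in model.named_modules():
    if isinstance(module, torch.nn.Conv2d):
        module.register_forward_hook(make_hook(name))

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# "car.jpg" is a hypothetical input image.
img = preprocess(Image.open("car.jpg").convert("RGB")).unsqueeze(0)
with torch.no_grad():
    model(img)

# Rank channels in one mid-network layer by mean activation; the top ones
# are candidate detectors for features such as wheels or windows.
acts = activations["layer3.0.conv2"]   # any hooked layer name works
strength = acts.mean(dim=(0, 2, 3))    # average over batch and spatial dims
print(strength.topk(5).indices)
```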

There are eight different visualisations that take you through eight popular models - you can explore them all here.

At the moment, it’s more of an educational resource. The Microscope tools won’t help you interpret your own models because they can’t be applied to custom neural networks.

“Generating the millions of images and underlying data for a Microscope visualization requires running lots of distributed jobs,” OpenAI explained. “At present, our tooling for doing this isn't usable by anyone other than us and is entangled with other infrastructure.”

The researchers hope that their visualisation tools might inspire people to study the connections between neurons. “We’re excited to see how the community will use Microscope, and we encourage you to reuse these assets. In particular, we think it has a lot of potential in supporting the Circuits collaboration—a project to reverse engineer neural networks by analyzing individual neurons and their connections—or similar work,” OpenAI concluded.

Don't stand so close to me: Current social distancing guidelines require people to stay at least six feet away from each other to prevent the spread of the novel coronavirus.

But how do you enforce this rule? Well, you can’t really, but you can try. Landing AI, a Silicon Valley startup led by Andrew Ng, has built what it calls an “AI-enabled social distancing detection tool.”

Here’s how it works: Machine learning software analyses camera footage of people walking around and translates the frames into a bird’s eye view, where each person is represented as a green dot. A calibration tool estimates how far apart these people or dots are from one another by counting the pixels between them in the images. If they’re less than six feet apart, the dots turn red.
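
In code terms, that pipeline boils down to a homography plus a pairwise distance check. Here's a hedged sketch, not Landing AI's code: it assumes OpenCV, a calibration already in hand, and that a detector has already reduced each person to a foot position in image coordinates.

```python
# Hedged sketch of the distance check described above - not Landing AI's
# code. Assumes OpenCV, a calibrated ground-plane homography, and person
# detections reduced to foot positions in image coordinates.
import itertools
import cv2
import numpy as np

# Hypothetical calibration: four image points and where they land on the
# ground plane, measured in feet. In practice this comes from the
# calibration step mentioned above.
image_pts  = np.float32([[100, 600], [1180, 600], [900, 300], [380, 300]])
ground_pts = np.float32([[0, 0], [30, 0], [30, 40], [0, 40]])
H = cv2.getPerspectiveTransform(image_pts, ground_pts)

def to_birds_eye(points_px):
    """Map image-plane foot positions to ground-plane coordinates (feet)."""
    pts = np.float32(points_px).reshape(-1, 1, 2)
    return cv2.perspectiveTransform(pts, H).reshape(-1, 2)

def flag_violations(points_px, min_dist_ft=6.0):
    """Return index pairs of people closer than min_dist_ft on the ground."""
    ground = to_birds_eye(points_px)
    violations = []
    for i, j in itertools.combinations(range(len(ground)), 2):
        if np.linalg.norm(ground[i] - ground[j]) < min_dist_ft:
            violations.append((i, j))   # these are the dots that turn red
    return violations

# Example: three detected people; the first two stand too close together.
people = [(400, 550), (430, 545), (900, 560)]
print(flag_violations(people))  # -> [(0, 1)]
```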

Landing AI said it built the tool to help the manufacturing and pharmaceutical industries. “For example, at a factory that produces protective equipment, technicians could integrate this software into their security camera systems to monitor the working environment with easy calibration steps,” it said.

“The detector could highlight people whose distance is below the minimum acceptable distance in red, and draw a line between to emphasize this. The system will also be able to issue an alert to remind people to keep a safe distance if the protocol is violated.”

“Landing AI built this prototype at the request of customers whose businesses are deemed essential during this time,” a spokesperson told The Register.

“The productionization of this system is still early and we are exploring a few ways to notify people when the social distancing protocol is not followed. The methods being explored include issuing an audible alert if people pass too closely to each other on the factory floor, and a nightly report that can help managers get additional insights into their team so that they can make decisions like rearranging the workspace if needed.”

You can read more about the prototype here.

Amazon improves Alexa’s reading voice: Amazon has added a new speaking style for its digital assistant Alexa.

The “long-form speaking style” will supposedly make Alexa sound more natural when it’s reading webpages or articles aloud. The feature, built from a text-to-speech AI model, introduces “more natural pauses” as it recites paragraphs of text or switches from one character to another in dialogues.

Unfortunately, this function is only available for customers in the US at the moment. To learn how to implement the long-form speaking style, follow the instructions here.
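
For developers, the style is applied via SSML. Below is a hedged sketch of a custom-skill response wrapping text in the long-form domain tag; the article text and scaffolding are placeholders, not code from Amazon.

```python
# Hedged sketch of an Alexa custom-skill response using the long-form
# speaking style. The <amazon:domain name="long-form"> SSML tag is the
# documented hook (US English only); the article text is a placeholder.
article_text = (
    "Neural networks are often described as black boxes. "
    "A new tool aims to peer inside them..."
)

ssml = (
    "<speak>"
    '<amazon:domain name="long-form">'
    + article_text +
    "</amazon:domain>"
    "</speak>"
)

# Standard custom-skill JSON response shape.
response = {
    "version": "1.0",
    "response": {
        "outputSpeech": {"type": "SSML", "ssml": ssml},
        "shouldEndSession": True,
    },
}
```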

Zoox settles with Tesla over IP use: Self-driving car startup Zoox announced it had settled its lawsuit with Tesla and agreed to pay Musk’s auto biz an undisclosed sum in damages.

“Zoox acknowledges that certain of its new hires from Tesla were in possession of Tesla documents pertaining to shipping, receiving, and warehouse procedures when they joined Zoox’s logistics team, and Zoox regrets the actions of those employees,” according to a statement. “As part of the settlement, Zoox will also conduct enhanced confidentiality training to ensure that all Zoox employees are aware of and respect their confidentiality obligations.”

The case [PDF], initially filed by Tesla’s lawyers last year, accused the startup and four of its employees of stealing proprietary documents describing Tesla’s warehouses and operations, and of trying to poach more Tesla staff for Zoox.

NeurIPS deadline extended: Here’s a bit of good news for AI researchers amid all the doom and gloom of the current coronavirus pandemic: the deadline for submitting research papers to the annual NeurIPS AI conference has been extended.

Now, academics have until 27 May to submit their abstracts and 3 June to submit their finished papers. Working under lockdown can be tough as people juggle childcare with their day jobs.

“Due to continued COVID-19 disruption, we have decided to extend the NeurIPS submission deadline by just over three weeks,” the program chairs announced this week. ®
