

Boffins teach cars to listen for the sound of a wet road

Artificial intelligence to make autonomous cars safer

The sound a tyre makes on a wet road could become part of the road safety arsenal, if a proposal submitted to an IEEE publication becomes widespread.

Published as a preprint on arXiv, the idea comes from a group of German and US boffins, who reckon deep learning techniques can help cars detect not only that the road is wet, but how wet it is.

As the paper states, autonomous and semi-autonomous vehicles need to know about road conditions “to automatically adapt vehicle speed while entering the curve or keep a safe distance to the vehicle in front”.

The researchers are hoping to overcome a key shortcoming of using video cameras to detect a wet road: cameras are too dependent on light conditions.

The paper (lead author is the IEEE's Irman Abdi of MIT's AgeLab) says 74 per cent of weather-related crashes in the US are down to a wet road surface, causing more than 380,000 injuries and 4,700 deaths each year, so better sensing is a big public-safety issue.

Supported by Toyota and the New England University Transportation Center, the researchers collected sound samples in wet and dry conditions, at varying speeds, in different traffic conditions, and on a variety of pavements.

After some pre-processing, the researchers passed the audio data through "long short-term memory" (LSTM) recurrent neural networks to produce rankings of road wetness.
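The shape of that pipeline, with audio features going in frame by frame, an LSTM's gated state carrying context across frames, and a wetness score coming out, can be sketched in miniature. The following is a toy one-dimensional LSTM cell with hypothetical hand-picked weights, purely for illustration; it is not the multi-layer CURRENNT networks or the spectral features the researchers actually used:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h, c, W):
    """One scalar LSTM step: x is the current audio feature,
    h/c are the previous hidden and cell states."""
    i = sigmoid(W["i"][0] * x + W["i"][1] * h + W["i"][2])   # input gate
    f = sigmoid(W["f"][0] * x + W["f"][1] * h + W["f"][2])   # forget gate
    o = sigmoid(W["o"][0] * x + W["o"][1] * h + W["o"][2])   # output gate
    g = math.tanh(W["g"][0] * x + W["g"][1] * h + W["g"][2]) # candidate
    c = f * c + i * g            # cell state: forget some old, admit some new
    h = o * math.tanh(c)         # hidden state passed to the next time step
    return h, c

def classify(sequence, W, W_out):
    """Run a feature sequence through the cell; squash the final
    hidden state into a 0..1 'wetness' score."""
    h, c = 0.0, 0.0
    for x in sequence:
        h, c = lstm_step(x, h, c, W)
    return sigmoid(W_out[0] * h + W_out[1])

# Illustrative weights only; a real system learns these from training data.
W = {k: (0.5, 0.1, 0.0) for k in ("i", "f", "o", "g")}
score = classify([0.2, 0.8, 0.5], W, (1.0, 0.0))
```

The recurrence is the point: because the cell state persists across frames, the network can judge wetness from how the tyre noise evolves over time rather than from any single instant of audio.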

Classifications were produced by software called CURRENNT, the Munich Open-Source CUDA RecurREnt Neural Network Toolkit.

The researchers reckon they achieved 93.2 per cent accuracy at all speeds. Even when the test car was stationary, they write, the road noise of other passing vehicles through the microphone was enough for the software to predict how wet the pavement was. ®
