Want to hear our beloved David Attenborough narrate your life? Thanks to the power of machine learning, you can

Plus: AI app for the visually impaired – and Clearview lawyers up

In brief A tech-savvy Reddit dweller has trained a machine-learning model to mimic David Attenborough’s world-famous plummy voice.

The award-winning 94-year-old British naturalist is well known for narrating nature documentaries. Now the UK national treasure's voice has been taken on a joyride to read Reddit comments, rather than describe the feeding behavior of whales or the politics within a cackle of hyenas.

Software developer Garett MacGowan trained the model on audio samples of Attenborough taken from TV clips on YouTube, Vice reported. The results aren’t bad: there’s definitely an Attenborough-esque quality to the voice, though a slight tinny edge gives it away as computer-generated.

You can listen for yourself below – posts to Reddit's relationship and life advice message boards are automatically narrated by the Attenborough-bot here.

[YouTube video]

Clearview argues that face scraping is free speech

The controversial facial-recognition-for-cops startup Clearview has hired a leading expert in US constitutional law to defend it against multiple class-action lawsuits.

Floyd Abrams will argue that scraping billions of photos from people’s public social media profiles to build a massive face-recognition database is protected under the First Amendment, The New York Times first reported. America's Constitution protects the rights to free speech and expression.

Clearview’s other lawyer, Tor Ekeland, and CEO Hoan Ton-That have made similar arguments before. They argued that people’s selfies posted on the internet count as public information, and that revoking access to public information violates the company's First Amendment rights.

Google built an ML-based app to help visually impaired people

Lookout, an Android app, assists people with low vision or blindness with tasks like buying food in supermarkets and reading mail.

The app, built by Google, uses computer vision algorithms to identify common food items by inspecting the packaging, and reads the labels out loud. Lookout can also scan barcodes. When it comes to paying for the groceries, it has a feature to help people find the right bank notes to hand to a cashier.

The software is trained on about two million products, Google said. The app also has two other modes that scan the text on documents, like letters or pages of a book, and read it aloud. You can read more about the technical details here. ®
