Bumble open sources AI code to automatically blur NSFW photos

Plus: Why some manga and anime fans hate AI-generated art, and ex-Google boss funds AI students

In brief Bumble has open sourced an AI image classifier model designed to automatically blur nude pictures sent on its dating app.

The tool, dubbed Private Detector, was launched on Bumble in 2019. If it detects fleshy skin tones and the characteristic shapes of sensitive body parts, the model blurs the image to shield users from potentially unsolicited NSFW pictures. Recipients of these naughty nudes are warned and can decide whether or not to open an unfiltered version of the image.
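The detect-then-blur flow described above can be sketched roughly as follows. The released Private Detector is a trained deep-learning model; here `nsfw_score` is a hypothetical stand-in (a toy skin-tone heuristic, not Bumble's classifier), and `box_blur` and `filter_incoming` are illustrative names invented for this sketch.

```python
def nsfw_score(image):
    """Hypothetical stand-in for the classifier: returns a 0..1 score.
    Toy proxy only -- fraction of roughly flesh-toned pixels."""
    flesh = sum(1 for row in image for (r, g, b) in row
                if r > 180 and 120 < g < 190 and 100 < b < 180)
    total = sum(len(row) for row in image)
    return flesh / total if total else 0.0

def box_blur(image, radius=2):
    """Simple box blur over an RGB pixel grid (list of rows of tuples)."""
    h, w = len(image), len(image[0])
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            acc, n = [0, 0, 0], 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        for c in range(3):
                            acc[c] += image[ny][nx][c]
                        n += 1
            row.append(tuple(v // n for v in acc))
        out.append(row)
    return out

def filter_incoming(image, threshold=0.5):
    """If the classifier flags the image, deliver it blurred and mark it
    so the app can warn the recipient; otherwise pass it through."""
    flagged = nsfw_score(image) >= threshold
    return (box_blur(image), True) if flagged else (image, False)
```

In a real deployment the original image would be kept so that a warned recipient can still opt in to the unfiltered version, as the app allows.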

Bumble said only 0.1 percent of users send lewd images. Nevertheless, the company was able to collect a large "best-in-the-industry dataset" featuring naked and non-naked photographs. It also included pictures of edge cases to make the Private Detector more robust against benign images of people's arms or legs.

"As just one of many players in the world of dating apps and social media at large, we also recognize that there's a need to address this issue beyond Bumble's product ecosystem and engage in a larger conversation about how to address the issue of unsolicited lewd photos – also known as cyberflashing – to make the internet a safer and kinder place for everyone," it said in a blog post.

Some manga and anime fans hate AI art

AI-generated art is dividing manga and anime artists and fans in Asia who are concerned over just how easy it is to recreate and mimic a particular style.

A former French game developer, going by the name 5you online, told tech news site Rest of World he received death threats after sharing a model he had trained to produce images in the style of the late South Korean cartoonist Kim Jung Gi.

Kim is known for his highly intricate ink drawings. He died from a heart attack on October 3, and although 5you thought his model paid homage to the late artist, some thought it was disrespectful and crass. Japanese AI startup Radius5 faced similar backlash and withdrew its image-generation service, mimic, after people began uploading artists' images without permission.

Radius5 is planning to release a new version that prevents people "uploading images for which one does not own the rights nor have permission to use," according to Automaton. Whether AI-made content copying artists' styles violates copyright or not will remain unclear until a legal case is fought in court.

Still, many seem uneasy over the fact that it takes seconds for machines to rip off a style from a human who has spent years mastering their craft.

Schmidt Futures to fund $148m worth of AI postdocs

Philanthropy fund Schmidt Futures, led by former Google CEO Eric Schmidt and his wife Wendy, has announced $148 million in funding to support hundreds of postdoctoral positions across nine universities.

The Eric and Wendy Schmidt AI in Science Postdoctoral Fellowship will initially fund approximately 160 postdoc researchers working in the field of AI every year. Participating universities will open up a maximum of 20 positions to new fellows every year for six years.

The nine universities funded by this program are the University of Michigan, Cornell University, the University of Chicago, the University of California, San Diego, Imperial College London, the University of Oxford, the National University of Singapore, Nanyang Technological University, and the University of Toronto.

Schmidt Futures said the current adoption of AI in STEM subjects in academia is "slow" and that resources and expertise are "unevenly distributed."

The fellowship program aims to "change how science is done by accelerating the incorporation of AI techniques into the natural sciences, engineering, and mathematical science (STEM), providing access to AI tools and training to the sharpest minds on the frontlines of scientific innovation." ®
