Finally, an AI that can reliably catch and undo Photoshop airbrushing. Who made it? Er, Photoshop maker Adobe

Talk about poacher turned gamekeeper


Video Artificial intelligence built by Adobe can detect how an image may have been manipulated using, er, Adobe Photoshop, and predict what a doctored picture should have looked like.

Eggheads at the software giant teamed up with academics at the University of California, Berkeley, to develop, and hopefully soon release, tools that, essentially, reverse the effects of Photoshop.

Adobe’s Face Aware Liquify feature allows users to warp the appearance of facial features: smiles can be turned into frowns, and faces can be made slimmer and more symmetrical. However, Adobe – the tech goliath behind a family of applications that can digitally edit and enhance images for marketing, art, graphic design, and mischief – is now interested in catching out, er, digitally altered pictures.

The research team crafted a convolutional neural network (CNN) to detect geometric alterations in portraits edited using Face Aware Liquify. The model highlights the altered regions with a heat map showing which parts were adjusted the most. Armed with that information, the CNN can also undo the changes to predict what the original image looked like.
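The undo step reportedly boils down to estimating a per-pixel warping field and resampling the image with that field reversed. Here is a minimal NumPy sketch of the resampling step alone – the function name, nearest-neighbour interpolation, and constant test flow are illustrative assumptions, not Adobe's implementation:

```python
import numpy as np

def backward_warp(img, flow):
    """Resample img so out[y, x] = img[y + flow[0, y, x], x + flow[1, y, x]].

    Nearest-neighbour sampling, clipped at the image border.
    """
    h, w = img.shape
    yy, xx = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    sy = np.clip(np.rint(yy + flow[0]).astype(int), 0, h - 1)
    sx = np.clip(np.rint(xx + flow[1]).astype(int), 0, w - 1)
    return img[sy, sx]

# Toy "edit": shift every pixel two columns to the right.
img = np.arange(32 * 32, dtype=float).reshape(32, 32)
flow = np.zeros((2, 32, 32))
flow[1] = 2.0
edited = backward_warp(img, flow)

# Undo: resample with the negated flow. For a spatially varying field this
# is only an approximation of the true inverse, but it is exact here.
restored = backward_warp(edited, -flow)
assert np.array_equal(restored[:, 4:-4], img[:, 4:-4])  # interior matches
```

In the real system the flow field is the CNN's prediction rather than a known constant, so the reversal is an estimate; hence the paper's framing of the output as a *prediction* of the original image.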

Here’s an example in which the algorithm successfully recreates the original photo, along with a video explaining how the CNN works:


In the photoshopped image, the woman has a slimmer face and is smiling; in reality, she has a wider jawline and isn't smiling. First image from left: the edited photo. Second: a heat map highlighting where the CNN predicts Photoshop warping was applied. Third: the CNN's attempt to reverse the changes. Fourth: the original image.

YouTube video

The CNN was trained on 1.1 million images: 157,000 were unaltered, and 942,000 had been digitally touched up using the Face Aware Liquify tool. It learned to recognise the subtle adjustments typically used to make someone look more conventionally attractive, such as widened eyes and a slimmed face.

An artist was then tasked with editing 50 images scraped from Flickr, the popular image-sharing site. These were mixed with another 50 Flickr photos that had not been edited, to test the CNN’s ability to correctly detect and reverse the effects of photoshopping.

The model recognised faked images with 93.9 per cent accuracy, and could restore them with 98.9 per cent precision. That's pretty impressive, considering humans often struggle to realise images have been digitally altered at all. The researchers ran a series of experiments comparing how well humans and computers spot photoshopped images.

Adobe just wants to keep it real

A group of 40 participants, recruited via Amazon Mechanical Turk, was asked to study 35 pairs of images placed side by side for six seconds each, then pick out which one in each pair had been photoshopped. They chose correctly only 53.5 per cent of the time – barely better than flipping a coin.
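Taken at face value, those numbers put the humans only a whisker above chance. A quick back-of-the-envelope check – assuming each of the 40 participants judged all 35 pairs independently, which the article's coarse figures don't confirm:

```python
import math

n = 40 * 35        # total judgements, if every participant saw all 35 pairs
p_hat = 0.535      # reported human accuracy
p_chance = 0.5     # chance level when guessing between two images

# z-score of the observed accuracy against pure guessing
se = math.sqrt(p_chance * (1 - p_chance) / n)
z = (p_hat - p_chance) / se
print(round(z, 2))  # ≈ 2.62: detectably above chance, but close to guessing
```

In other words, humans weren't quite guessing at random, but they were nowhere near the CNN's 93.9 per cent.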

“It might sound impossible because there are so many variations of facial geometry possible,” said Alexei Efros, co-author of the study emitted via arXiv and a computer science professor at UC Berkeley in the US. “But, in this case, because deep learning can look at a combination of low-level image data, such as warping artifacts, as well as higher level cues such as layout, it seems to work.”

Although the CNN performs well, it was trained specifically to detect changes made with Adobe's Face Aware Liquify tool, and probably won't work as well on images doctored with other editing software.

“The idea of a magic universal ‘undo’ button to revert image edits is still far from reality,” said Richard Zhang, co-author of the paper and a research scientist at Adobe Research, late last week. “But we live in a world where it’s becoming harder to trust the digital information we consume.” ®


Biting the hand that feeds IT © 1998–2022