
Finally, an AI that can reliably catch and undo Photoshop airbrushing. Who made it? Er, Photoshop maker Adobe

Talk about poacher turned gamekeeper

Video Artificial intelligence built by Adobe can detect how an image may have been manipulated using, er, Adobe Photoshop, and predict what a doctored picture should have looked like.

Eggheads at the software giant teamed up with academics at the University of California, Berkeley, to develop, and hopefully soon release, tools that, essentially, reverse the effects of Photoshop.

Adobe’s Face Aware Liquify feature allows users to warp the appearance of facial features: smiles can be turned into frowns, faces made slimmer and more symmetrical. However, Adobe – the tech goliath behind a family of applications that digitally edit and enhance images for marketing, art, graphic design, and mischief – is now interested in catching out, er, digitally altered pictures.

The research team crafted a convolutional neural network (CNN) to detect geometrical alterations in portraits edited using Face Aware Liquify. The model highlights the regions of the picture that have been altered using a heat map that shows which parts have been adjusted the most. Armed with that information, the CNN can also undo the changes to predict what the original image looks like.
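At its core, the approach boils down to predicting a per-pixel displacement (warp) field and then resampling the image along that field to reverse the edit. Here’s a minimal numpy sketch of the unwarping step – the `unwarp` function, the nearest-neighbour sampling, and the toy two-pixel shift are our own illustrative assumptions, not Adobe’s code:

```python
import numpy as np

def unwarp(image, flow):
    """Undo a warp given a per-pixel displacement field.

    flow[y, x] = (dy, dx) says where to sample in the edited image to
    recover the original pixel at (y, x). Nearest-neighbour sampling
    keeps the sketch dependency-free; a real model would interpolate.
    """
    h, w = image.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    src_y = np.clip(np.round(ys + flow[..., 0]).astype(int), 0, h - 1)
    src_x = np.clip(np.round(xs + flow[..., 1]).astype(int), 0, w - 1)
    return image[src_y, src_x]

# Toy check: "edit" an image by pushing its content 2 px to the right,
# then undo it with the known flow (sample 2 px to the right).
img = np.arange(36).reshape(6, 6)
shifted = np.roll(img, 2, axis=1)
flow = np.zeros((6, 6, 2))
flow[..., 1] = 2
restored = unwarp(shifted, flow)
# restored matches img away from the clipped right edge.

# Magnitude of the displacement per pixel: the basis of a heat map.
heat = np.hypot(flow[..., 0], flow[..., 1])
```

Taking the magnitude of the predicted field at each pixel, as in the last line, gives exactly the kind of heat map described above: larger displacements mean heavier editing.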

Here’s an example in which the algorithm managed to successfully recreate the original photo, along with a video explaining how the CNN works:


In the photoshopped image, the woman has a slimmer face and is smiling; in reality she has a wider jawline and isn't smiling. From left: the edited image; a heat map highlighting which Photoshop effects the CNN predicts were applied; the CNN's attempt to reverse the changes; and the original image.

YouTube video

The CNN was trained on 1.1 million images: 157,000 of them unaltered, and 942,000 digitally touched up using the Face Aware Liquify tool. It learned to recognise the subtle patterns typically used to make someone look more conventionally attractive, such as widening the eyes and slimming the face.

An artist was then tasked with editing 50 images scraped from Flickr, the popular image-sharing site. These were mixed in with another 50 Flickr photos that had not been edited, to test the CNN’s ability to correctly detect and reverse the photoshopping.

The model was able to recognise the fake images with 93.9 per cent accuracy, and could restore them with 98.9 per cent precision. That’s pretty impressive, considering humans often struggle to tell when images have been digitally altered – something the researchers confirmed by comparing the ability of humans and computers to recognise photoshopped images in a series of experiments.
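For a sense of what those figures mean, here’s a toy sketch of how detection accuracy is scored on a balanced set of edited and untouched photos – the labels and the error count below are invented for illustration, not the researchers' evaluation code:

```python
import random

# Toy evaluation: 50 edited and 50 untouched photos, scored as the
# fraction of correct real-vs-fake calls.
random.seed(0)
labels = [1] * 50 + [0] * 50              # 1 = photoshopped, 0 = untouched
predictions = list(labels)
for i in random.sample(range(100), 6):    # pretend the model errs on 6 images
    predictions[i] = 1 - predictions[i]

accuracy = sum(p == y for p, y in zip(predictions, labels)) / len(labels)
print(f"accuracy = {accuracy:.1%}")       # accuracy = 94.0%
```

A coin-flip classifier would score around 50 per cent on such a set, which is why the human result below is so close to chance.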

Adobe just wants to keep it real

A group of 40 participants was recruited via Amazon Mechanical Turk and asked to study 35 pairs of images, each placed side by side for six seconds. Asked to pick which image in each pair had been photoshopped, they were right only 53.5 per cent of the time – barely better than chance.

“It might sound impossible because there are so many variations of facial geometry possible,” said Alexei Efros, co-author of the study emitted via arXiv and a computer science professor at UC Berkeley in the US. “But, in this case, because deep learning can look at a combination of low-level image data, such as warping artifacts, as well as higher level cues such as layout, it seems to work.”

Although the performance of the CNN is good, it’s trained specifically to detect changes made using Adobe’s Face Aware Liquify feature, and probably won’t work as well on images edited with other tools.

“The idea of a magic universal ‘undo’ button to revert image edits is still far from reality,” said Richard Zhang, co-author of the paper and a research scientist at Adobe Research, late last week. “But we live in a world where it’s becoming harder to trust the digital information we consume.” ®
