I like BigGANs but their pics do lie, you other AIs can't deny

Gaze at the computer-created horror of Dogball

Pics Images generated by AI have traditionally been fairly easy to spot, since they tend to look slightly off to the human eye, but it’s getting harder to tell what’s real and what’s fake.

Researchers from DeepMind and Heriot-Watt University in the UK have managed to significantly boost the quality of images generated by a generative adversarial network (GAN) by scaling up the size of the machine learning model, a system they dubbed BigGAN.

The best results, including pictures of a brown dog with floppy ears, an island landscape, a butterfly, and a cheeseburger, look like real photos at first glance.

Image credit: Brock et al.

Keep staring, however, and you will begin to see some slight inconsistencies. The dog’s eyes are glazed over, and there is a weird patch on the butterfly’s wing that doesn’t belong. These are nevertheless the best images created by a GAN so far, according to the results published on arXiv late last week.

GANs are made up of two separate neural networks working against each other. The generator network produces images, and the discriminator network tries to determine whether they are real or fake. During training, the generator learns to produce ever more convincing images in an attempt to fool the discriminator.
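
To make that adversarial setup concrete, here is a minimal sketch of a GAN training loop in PyTorch. It is purely illustrative and not the BigGAN code itself; the toy fully connected networks and hyperparameters are placeholder assumptions.

# Minimal GAN training loop sketch (illustrative only, not the BigGAN implementation).
# The tiny fully connected networks and hyperparameters are placeholder assumptions.
import torch
import torch.nn as nn

latent_dim, image_dim = 64, 28 * 28  # e.g. flattened 28x28 greyscale images

generator = nn.Sequential(
    nn.Linear(latent_dim, 256), nn.ReLU(),
    nn.Linear(256, image_dim), nn.Tanh(),
)
discriminator = nn.Sequential(
    nn.Linear(image_dim, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1),  # raw logit: "how real does this image look?"
)

opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

def train_step(real_images):
    batch = real_images.size(0)
    real_labels = torch.ones(batch, 1)
    fake_labels = torch.zeros(batch, 1)

    # 1) Discriminator: learn to tell real images from generated ones.
    z = torch.randn(batch, latent_dim)
    fake_images = generator(z).detach()  # don't backprop into the generator here
    d_loss = bce(discriminator(real_images), real_labels) + \
             bce(discriminator(fake_images), fake_labels)
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # 2) Generator: produce images the discriminator scores as "real".
    z = torch.randn(batch, latent_dim)
    g_loss = bce(discriminator(generator(z)), real_labels)
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
    return d_loss.item(), g_loss.item()

# Usage: feed batches of real training images; a random stand-in batch here.
d_loss, g_loss = train_step(torch.randn(32, image_dim))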

The trick to getting more realistic results is to make everything bigger. “We demonstrate that GANs benefit dramatically from scaling, and train models with two to four times as many parameters and eight times the batch size compared to prior art,” the paper said.

BigGAN is trained on ImageNet, a popular dataset used for image classification tasks containing millions of images of different objects. The best-performing model has a batch size of 2,048, meaning it slurps up that number of images from the dataset during each training iteration. Neural networks go through many such iterations, processing the whole dataset several times over the course of training.
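
As a rough back-of-the-envelope illustration of what that batch size means (the 1.28 million figure below is ImageNet’s standard training-set size, not a number quoted in the paper):

# Rough arithmetic: weight updates needed for one full pass over ImageNet at a
# batch size of 2,048. Illustrative only; 1,281,167 is ImageNet's standard
# training-set size, not a figure taken from the BigGAN paper.
dataset_size = 1_281_167
batch_size = 2_048
print(dataset_size // batch_size)  # roughly 625 iterations to see the whole dataset once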

It also has over 158 million parameters, the internal weights the network learns during training, and required 128 cores of a Google TPUv3 Pod to train a model in about one to two days.
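
For a sense of what “parameters” means in practice, this snippet counts the learnable weights in a small toy PyTorch network; the architecture is a made-up example and nothing like BigGAN’s.

# Counting learnable parameters (weights) in a small toy network, not BigGAN's architecture.
import torch.nn as nn

toy = nn.Sequential(nn.Linear(128, 256), nn.ReLU(), nn.Linear(256, 3 * 128 * 128))
print(sum(p.numel() for p in toy.parameters()))  # ~12.7 million weights for this toy model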

Another technique, which the researchers call the “truncation trick,” forces the generator to create images that are more similar to the training dataset, making them look more realistic.

“The output of the generator is controlled by how much variability its input has. Our technique makes the output less variable, but higher quality, by reducing the variability of the input,” Andrew Brock, a PhD student at the Edinburgh Centre for Robotics at Heriot-Watt University, told The Register.
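
In code, the idea looks roughly like the sketch below: sample the generator’s latent noise from a normal distribution, but resample any component whose magnitude exceeds a cut-off. This is a simplified illustration of the truncation idea rather than DeepMind’s implementation, and the 0.5 threshold is just an example value.

# Sketch of the "truncation trick": draw the generator's latent vector from a
# normal distribution, but resample any component whose magnitude exceeds a
# threshold. Smaller thresholds give less varied but more "typical" outputs.
# The 0.5 threshold is just an example value.
import numpy as np

rng = np.random.default_rng()

def truncated_noise(latent_dim, threshold=0.5):
    z = rng.standard_normal(latent_dim)
    while True:
        outliers = np.abs(z) > threshold
        if not outliers.any():
            return z
        z[outliers] = rng.standard_normal(outliers.sum())

z = truncated_noise(128)     # this vector would then be fed to the generator
print(z.min(), z.max())      # all components lie within [-0.5, 0.5]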

Using AI to create fake content that’s increasingly realistic has raised concerns. There are numerous cases where GANs have been used to create images mimicking someone else’s face. Pictures of politicians like Barack Obama and Donald Trump have been manipulated to make them say things they haven’t said. Internet perverts have also used similar technology to paste their favourite actresses’ faces onto the bodies of porn actors.

Brock told El Reg he is also worried about how GANs can be used maliciously. “It's part of why I chose to focus on more general image modeling rather than faces - it's a lot harder to use images of Dogball for political or unethical purposes than it is to use an image of another person.”

Dogball! A cross between some kind of dog and a tennis ball. Image credit: Brock et al.

GANs have helped developers create art, and although they might not seem to have much practical importance, they’re interesting to study.

“Neural nets which can generate convincing samples have to learn the rich structure that underlies our complex visual world - you have to ‘understand’ something in order to draw it. If we can build models that understand that thoroughly then there's a lot of interesting things we can do with the representations they learn,” he added. ®
