Creatives up in arms over claim that AI is killing human art

Plus: Cruise expanding self-driving robotaxi operations, and more

In brief Everyone agrees that text-to-image models are here to stay, though opinions are divided over AI-generated art.

Some artists are enthralled by the ability to create completely new digital images using text prompts and see it as a new tool to be creative. Other people who make their living from art, however, detest the technology – believing it will cost them their jobs and devalue their work.

A machine can be trained to recreate a particular artist's style and outpace human artists, concept artist RJ Palmer told the BBC. "Right now, if an artist wants to copy my style, they might spend a week trying to replicate it. That's one person spending a week to create one thing. With this machine, you can produce hundreds of them a week."

AI is "directly stealing their essence in a way", Palmer said, and artists are currently powerless to stop it from happening.

Developers train these models on large datasets of images scraped from the internet, so it is not surprising for an artist to find their own work in a model's training data.

Emad Mostaque, creator of Stable Diffusion, the popular open model taking the internet by storm, said he didn't believe AI would take away artists' ability to make a living. Excel "didn't put the accountants out of work; I still pay my accountants," he said.

He said the tool will give artists new jobs: "This is a sector that's going to grow massively. Make money from this sector if you want to make money, it'll be far more fun."

Jason Allen, who controversially won a state fair art competition with an AI-generated image, has previously said: "Art is dead, dude. It's over. AI won. Humans lost."

Cruise is expanding its AI robotaxi service

Self-driving car biz Cruise will launch its autonomous taxi service in cities in Texas and Arizona by the end of this year.

Co-founder and CEO Kyle Vogt told TechCrunch the company is planning to operate a small fleet of self-driving vehicles on the roads of Austin, Texas, and Phoenix, Arizona "in the next 90 days and before the end of 2022". Cruise launched its first robotaxi service without human drivers in San Francisco, California.

The service runs only in a few select areas late at night – from 2200 until 0530 – to avoid rush hour traffic. Not everyone can hail a car, however; only a small group of pre-screened riders can. A waitlist is open for members of the public who want to be considered.

Vogt said Cruise is also hoping to start driving newly designed Origin vehicles next year. These boxy cars will have no steering wheel or pedals and will be fully automated. "Looking at 2023, next year, things get really interesting on the growth side," he said. 

"There's gonna be thousands of AVs rolling out of the General Motors plant, including the first Origins. We'll be using those to light up many more markets and to start to generate meaningful revenue in those markets."

Will the AI community be stuck with transformers?

A creator of the popular AI library PyTorch warned that the current trend of optimizing hardware for transformer models will make it harder for new architectures to succeed.

Transformers were first used in natural language processing and are behind the most powerful generative models yet, capable of creating text and images. They have been applied to everything from gaming to drug design. Hardware companies like Nvidia are optimizing their chips to accelerate transformer-based models, which could hamper innovation in the future.

Soumith Chintala, who helped build PyTorch, told Business Insider he hopes another type of model will emerge.

"We're in this weird hardware lottery. Transformers emerged five years ago, and another big thing has yet to come up, so it may be that companies think 'we should just optimize hardware for transformers.' That then makes going in any other direction much harder," he said.

Architectures that differ from transformers will not run as efficiently on such chips, which could dissuade developers from experimenting with new types of models.

"It's gonna be much harder for us to even try other ideas if hardware vendors end up making the accelerators more specialized to the current paradigm," Chintala warned. ®
