Forget dumping games designers for AI – turns out it takes two to tango

Machines still need humans to build decent game levels

AI can get pretty good at churning out content like images and videos, so researchers are now trying to get the machines to design game levels too.

Machines working on their own are okay at regurgitating the material seen in the numerous training examples fed to them by their human creators. That's fine if what you're after is more of the same thing, but it makes for boring games. Game-designing bots need more creativity, and the best place to learn it is from humans.

A team of researchers from the Georgia Institute of Technology conducted a series of experiments where humans partnered up with bots to come up with new levels in Super Mario, a popular Nintendo platform game. They call the process “co-creation level design” and have described it in more detail in a paper on arXiv.

Using AI to generate game levels has spun off into its own niche known as “Procedural Content Generation via Machine Learning” (PCGML). Researchers have tinkered with making simple games like Super Mario and Doom, but the results haven’t been too exciting. The layout of the levels tends to closely mimic the designs the computer has been trained on, so playing them isn’t much fun.

Co-creation level design, however, might help. The researchers called in 84 participants to work with an AI partner to spawn new Super Mario levels. Different objects in a game of Super Mario, like blocks, pipes, or types of sprites, are encoded as tiles. Bots based on three different architectures were tested: a Markov Chain, a Bayes Net, and a long short-term memory (LSTM) network.

The Markov Chain and LSTM agents inspect the sequence of tiles in a training level in order to predict a new set of tiles for the generated level. The Bayes Net uses probability to work out where to add a sprite, and what type it should be.
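To give a rough idea of the sequence-prediction approach, here's a minimal Markov-chain sketch. The tile alphabet, the column-by-column encoding, and the function names are our own illustration, not the paper's code.

```python
import random
from collections import Counter, defaultdict

# Toy tile alphabet: 'X' = ground, '-' = sky, 'P' = pipe, '?' = question block.
# Each "column" string stacks tiles bottom-to-top; a level is a left-to-right
# list of columns. This flattened encoding is illustrative only.

def train_markov(levels):
    """Count how often each column follows another across the training levels."""
    transitions = defaultdict(Counter)
    for level in levels:
        for prev_col, next_col in zip(level, level[1:]):
            transitions[prev_col][next_col] += 1
    return transitions

def generate(transitions, start, length):
    """Sample a new column sequence; fall back to a random known column if the
    chain hits a dead end (a column never seen followed by anything)."""
    level = [start]
    for _ in range(length - 1):
        options = transitions.get(level[-1])
        if not options:
            level.append(random.choice(list(transitions)))
            continue
        cols, counts = zip(*options.items())
        level.append(random.choices(cols, weights=counts, k=1)[0])
    return level

if __name__ == "__main__":
    # Two tiny hand-made "levels" stand in for real Super Mario training data.
    training_levels = [
        ["XX--", "XX--", "XXP-", "XX-?", "XX--"],
        ["XX--", "XX-?", "XX--", "XXP-", "XX--"],
    ]
    model = train_markov(training_levels)
    print(" ".join(generate(model, start="XX--", length=8)))
```

Because the generator only ever reproduces transitions it has already seen, its output ends up looking a lot like the training levels, which is exactly the blandness problem described above.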

Each participant co-created two levels, each with a different type of agent chosen at random. The maximum time given to complete a level was 15 minutes, and the human participants and AI partners took turns adding tiles and sprites to each new level.
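The turn-taking setup amounts to a simple alternating loop. Below is a hedged sketch of that protocol as we read it; `human_edit` and `agent_edit` are placeholder callables, not the study's actual editor software, and the turn cap stands in for the 15-minute clock.

```python
def co_create(level, human_edit, agent_edit, max_turns=20):
    """Alternate human and agent edits on a shared tile-based level.

    The study capped a design session at 15 minutes; this sketch caps on turn
    count instead so it terminates. `human_edit` and `agent_edit` take the
    current level and return an updated copy.
    """
    turn_log = []  # every action gets recorded; this later becomes training data
    for turn in range(max_turns):
        human_turn = (turn % 2 == 0)
        editor = human_edit if human_turn else agent_edit
        level = editor(level)
        turn_log.append(("human" if human_turn else "agent", list(level)))
    return level, turn_log

if __name__ == "__main__":
    # Trivial stand-in editors: the "human" appends a ground column,
    # the "agent" appends a pipe column.
    final_level, log = co_create(
        ["XX--"],
        human_edit=lambda lvl: lvl + ["XX--"],
        agent_edit=lambda lvl: lvl + ["XXP-"],
        max_turns=6,
    )
    print(final_level)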

The bots were ranked on how fun, interesting, challenging, or creative the resulting level was, and on whether the participant would choose to team up with them again. “These initial results of our user study do not indicate a clearly superior agent,” the paper said.

It’s tricky to find the best bot: each type has its own pros and cons, and individual participants had different creative objectives, so it’s difficult to determine the best overall type. When the researchers analysed the levels, they found that “they clearly demonstrate some departures from typical Super Mario Bros. levels, meaning none of these levels could have been generated by any of these agents. No existing agents are able to handle the variety of human level design or human preferences when it comes to AI agent partners.”

Part 2 of the experiment

The team decided that what’s really needed for co-creation level design is a technique built for teamwork, rather than one borrowed from autonomous PCGML.

“In particular, given that none of our existing agents were able to sufficiently handle the variety of participants, we expect instead a need for an ideal partner to either more effectively generalize across all potential human designers or to adapt to a human designer actively during the design task,” they wrote.

So, the team took all the results and data from the experiments and turned them into another dataset, one that captures how humans and machines work together. All the actions taken by both designers as they took turns adding tiles and sprites were recorded.

The final scores were determined by the user rankings. Together, these actions and scores can train a supervised learning system on which features lead to the best Super Mario levels. A total of 1,501 training samples and 242 test samples were fed into a convolutional neural network.
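As a rough idea of that supervised setup, here is a minimal PyTorch sketch of a CNN that maps a tile grid to a quality score. The layer sizes, tile-grid dimensions, and the random stand-in data are our own assumptions; the paper's actual architecture and training pipeline will differ.

```python
import torch
import torch.nn as nn

# Illustrative dimensions only: levels encoded as one-hot tile grids
# (channels = tile types), scores derived from participants' rankings.
N_TILE_TYPES, HEIGHT, WIDTH = 8, 14, 32

class LevelScorer(nn.Module):
    """Tiny CNN that maps a tile grid to a predicted level-quality score."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(N_TILE_TYPES, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(32, 1),
        )

    def forward(self, x):
        return self.net(x)

if __name__ == "__main__":
    # Random stand-in data in place of the 1,501 real training samples.
    levels = torch.rand(64, N_TILE_TYPES, HEIGHT, WIDTH)
    scores = torch.rand(64, 1)

    model = LevelScorer()
    optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()

    for epoch in range(5):
        optimiser.zero_grad()
        loss = loss_fn(model(levels), scores)
        loss.backward()
        optimiser.step()
        print(f"epoch {epoch}: loss {loss.item():.4f}")
```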

They found that training on the new dataset of co-creative interactions led to better level designs than those created by AI autonomously. The new and improved bots created levels diverse enough to reflect the different styles produced by the co-creative teams.

“PCGML methods are insufficient to address co-creation. Co-creative AI level designers must train on datasets or approximated datasets of co-creative level design,” the paper concluded.

But the current method described in the paper relies on scraping together a new dataset from user experiments, which is too time-consuming and tedious to use practically. Tens of new pilot studies with a different rotation of AI agents would have to be run to make levels for games other than Super Mario. The researchers said they planned to see if transfer learning might help the process adapt to different games.
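The paper doesn't spell out what that would look like, but transfer learning in this setting usually means reusing a model trained on one game and fine-tuning a small part of it on a handful of examples from another. The sketch below is purely hypothetical: the frozen backbone, the randomly initialised weights, and the tiny stand-in dataset are our own illustration, not the researchers' planned method.

```python
import torch
import torch.nn as nn

# Hypothetical transfer-learning sketch: take a scorer shaped like one trained on
# Super Mario co-creation data, freeze its convolutional feature extractor, and
# retrain only the final layer on a few levels from a different game.
backbone = nn.Sequential(
    nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
)
head = nn.Linear(32, 1)
model = nn.Sequential(backbone, head)

# In practice the backbone weights would be loaded from the Mario-trained model;
# here they are just randomly initialised for illustration.
for param in backbone.parameters():
    param.requires_grad = False  # keep the learned features fixed

optimiser = torch.optim.Adam(head.parameters(), lr=1e-3)
new_game_levels = torch.rand(16, 8, 14, 32)  # tiny stand-in dataset
new_game_scores = torch.rand(16, 1)

for _ in range(5):
    optimiser.zero_grad()
    loss = nn.functional.mse_loss(model(new_game_levels), new_game_scores)
    loss.backward()
    optimiser.step()
```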

Matthew Guzdial, co-author of the paper and a PhD student at the Georgia Institute of Technology, believed that co-creation tools might be useful in the future.

"This study is with published, practicing game designers. They’ve not only been open but excited to interact with the software and see how it develops in the future. The pros and cons tend to depend on how they try to use the tool and how they perceive the role of the AI."

"If they take the AI as a sort of student of theirs, or as a source of inspiration, they tend to enjoy the experience. If they instead think of the AI as the one who’s supposed to keep them on task [in order to] make a playable level, or is supposed to do all the grunt work, they tend to have a less enjoyable experience," he told The Register. ®
