Just $10 to create an AI chatbot of a dead loved one
How software can help some grieve, perturb others
Feature Death is inescapable. Everyone experiences grief at some point in their lives, whether it's when a relative, friend, or pet passes away.
Many often find comfort in keeping their memories of a loved one alive in some way. As technology progresses, a few have found solace in using artificial intelligence to reconnect with the dead.
Generative AI offers imaginative ways to remember people's lives by simulating their likeness. The story of how one man primed a GPT-3-powered chatbot with text messages from his dead fiancée so that he could talk to her again went viral last year. A software mimic essentially helped Joshua Barbeau come to terms with the death of Jessica Pereira, a woman he met and fell in love with a decade ago. Universal Television reportedly purchased the exclusive rights to develop his story into a TV series.
After that San Francisco Chronicle article, people flocked to Project December, the technology Barbeau used, to spin up their own AI chatbots. The software's creator, Jason Rohrer, an indie games developer, built the code during the COVID-19 pandemic and thought netizens would be willing to pay five dollars a pop to customize the personality of a virtual entity they wanted to speak to. It hadn't occurred to him that people would use Project December to simulate the dead until Barbeau's story blew up.
Now Rohrer has relaunched Project December as a tool specifically geared toward reconnecting with the dead. Users can pay $10 to create a chatbot mimicking the behavior of someone no longer alive.
"I decided to build a special purpose service when I saw such a desire in the community around Project December after the SF Chronicle story," Rohrer told The Register. "I wanted to build something better for those people. Hopefully, they get the help they were looking for out of this experience."
"It's interesting to build something so cutting-edge, crazy, and science fiction-esque. It's fascinating for me as a creator," he added. Rohrer has even created a trailer, see below, to reposition Project December.
Users are asked to fill out a questionnaire about the person they want to simulate and converse with, providing their name, age, hobbies, and specific memories and facts. Project December uses this information to make conversations more personal and the chatbot's replies more convincing. Rohrer's program is powered by AI21 Labs' language model after he lost access to GPT-3 when OpenAI shut down his developer account, citing safety reasons.
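Project December's internals aren't public, but the general technique it describes, turning questionnaire answers into a priming prompt for a language model, can be sketched roughly like this. All names, fields, and wording below are illustrative assumptions, not Rohrer's actual code:

```python
# Illustrative sketch only: Project December's real prompt format is not
# public. This shows the general idea of priming a large language model
# with persona details gathered from a questionnaire.

def build_persona_prompt(name, age, hobbies, memories):
    """Assemble a priming prompt from questionnaire answers."""
    facts = "; ".join(memories)
    return (
        f"The following is a conversation with {name}, age {age}. "
        f"{name} enjoys {', '.join(hobbies)}. "
        f"Things to remember about {name}: {facts}.\n"
        f"Human: Hello, {name}.\n"
        f"{name}:"
    )

prompt = build_persona_prompt(
    name="Alice",
    age=78,
    hobbies=["gardening", "crossword puzzles"],
    memories=["grew up in Ottawa", "taught piano for 40 years"],
)
# The assembled prompt would then be sent to a language model API,
# which completes the text in the persona's voice.
```

A service built this way would append each user message and model reply to the prompt on every turn, so the simulated persona stays consistent across the conversation.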
People usually decide to play with Project December out of curiosity, and a few choose to keep returning to it if they get something positive from talking to a machine. One person who had experience with the software told The Register results vary; he had created conversations with all sorts of dead people, from his grandmother to Steve Jobs.
"Depending on the intention, conversations can be funny, creepy, profound, weird, spiritual, or even comparable to a healing process," we're told. He has even tried spinning up a chatbot to model a conversation with his dead future self.
"It reminds me of astrology in a way. You are looking at a star field in the sky to discover yourself. I did the same looking at a screen of pixels," he said.
Is using AI to simulate the dead a growing industry?
The appeal of using AI to conjure the dead is mixed. Using, for instance, generative adversarial networks to touch up and color old photos is pretty innocuous. Tools such as MyHeritage's Deep Nostalgia go even further, animating images to make people blink and smile. The feeling of seeing dead family members or friends seemingly brought back to life momentarily can be unsettling.
"Some people love the Deep Nostalgia feature and consider it magical, while others find it creepy and dislike it," reads an FAQ from the online genealogy company. "Indeed, the results can be controversial and it's hard to stay indifferent to this technology. This feature is intended for nostalgic use, that is, to bring beloved ancestors back to life."
AI can simulate the dead across various types of data, including audio and video. Amazon demonstrated how its personal digital assistant Alexa could mimic people's voices, causing much controversy. "Alexa, can Grandma finish reading me The Wizard of Oz?" one child asks in a video shown at the internet giant's re:MARS conference over the summer. And so the machine went, impersonating grandma.
Rohit Prasad, head scientist for Alexa AI, said personalizing the technology provides a way to build trust between humans and machines, and added this is especially important when "so many of us have lost someone we love" during the pandemic. He seemed to be saying that Amazon's Alexa could pretend to be a dead relative or friend and converse as them with others on request.
The technology was criticized for being creepy and dystopian; it's not clear whether the audio feature is something the tech titan will make generally available. "We do not have more to share on specific features or availability," a spokesperson for Amazon told El Reg in a statement.
"Personalizing Alexa's voice is a highly desired feature by our customers who could use this technology to create many delightful experiences. We are working on improving the fundamental science that we demonstrated at re:MARS and are exploring use cases that will delight our customers, with necessary guardrails to avoid any potential misuse."
LA-based StoryFile made headlines when pioneering Holocaust educator Marina Smith worked with the company to craft a video that was played at her funeral. Smith pre-recorded video messages, and machine learning algorithms helped select which clips were most appropriate to play when guests asked her questions, as if she could talk to them beyond the grave right there and then.
Reviving the dead using algorithms may seem subversive, weird, or freaky, but it can bring comfort to those who are open-minded enough to try these new types of services. One Project December user told us he thinks twice before admitting he uses the software to hold conversations with the dead since it is "somewhat taboo."
He finds the whole experience "oddly therapeutic," though. He told The Register his mother had just been admitted to a hospice, and he wasn't sure if he would simulate a conversation with her once she passed away.
Rohrer has tested Project December several times, modelling the chatbot on people from his own life who have passed away, including his grandparents, aunt, and childhood piano teacher. He said it gave him a chance to think about them, and relive old memories.
"What are the most important things to say about this person?" he said. "What would I say to them if I had one last time with them?" ®