Chinese companies offer to 'resurrect' deceased loved ones with AI avatars
TAIPEI, Taiwan — Whenever stress at work builds, Chinese tech executive Sun Kai turns to his mother for support. Or rather, he talks with her digital avatar on a tablet device, rendered from the shoulders up by artificial intelligence to look and sound just like his flesh-and-blood mother, who died in 2018.
“I do not treat [the avatar] as a kind of digital person. I truly regard it as a mother,” says Sun, 47, from his office in China’s eastern port city of Nanjing. He estimates he converses with her avatar at least once a week. “I feel that this might be the most perfect person to confide in, without exception.”
The company that made the avatar of Sun’s mother is called Silicon Intelligence, where Sun is also an executive working on voice simulation. The Nanjing-based company is part of a boom in technology startups, in China and around the world, that create AI chatbots using a person’s likeness and voice.
The idea to digitally clone people who have died is not new but until recent years had been relegated to the realm of science fiction. Now, increasingly powerful chatbots like Baidu’s Ernie or OpenAI’s ChatGPT, which have been trained on huge amounts of language data, and serious investment in computing power have enabled private companies to offer affordable digital “clones” of real people.
These companies have set out to prove that relationships with AI-generated entities can become mainstream. For some clients, the digital avatars they produce offer companionship. In China, they have also been spun up to cater to families in mourning who are seeking to create a digital likeness of their lost loved ones, a service Silicon Intelligence dubs “resurrection.”
“Whether she is alive or dead does not matter, because when I think of her, I can find her and talk to her,” says Sun of his late mother, Gong Hualing. “In a sense, she is alive. At least in my perception, she is alive,” says Sun.
The rise of AI simulations of the deceased, or “deadbots” as academics have termed them, raises questions without clear answers about the ethics of simulating human beings, dead or alive.
In the United States, companies like Microsoft and OpenAI have created internal committees to evaluate the behavior and ethics of their generative AI services, but there is no centralized regulatory body in either the U.S. or China for overseeing the impacts of these technologies or their use of a person’s data.
Data remains a bottleneck
Browse Chinese e-commerce sites and you will find dozens of companies that sell “digital cloning” and “digital resurrection” services that animate photographs to make them look like they are speaking, for as little as the equivalent of $2.
Silicon Intelligence’s most basic digital avatar service costs 199 yuan (about $30) and requires less than one minute of high-quality video and audio of the person while they were living.
More advanced, interactive avatars that use generative AI technology to move on screen and converse with a client can cost thousands of dollars.
But there’s a big bottleneck: data, or rather, the lack of it.
“The crucial bit is cloning a person’s thoughts, documenting what a person thought and experienced daily,” says Zhang Zewei, the founder of Super Brain, an AI firm based in Nanjing that also offers cloning services.
Zhang asks clients to describe their foundational memories and important experiences, or those of their loved ones. The company then feeds those stories into existing chatbots to power an AI avatar’s conversations with a client.
(Due to the rise in AI-powered scams using deepfakes of a person’s voice or likeness, both Super Brain and Silicon Intelligence require authorization from the person being digitally cloned, or authorization from family and proof of kin if the person is deceased.)
The most labor-intensive step of generating an avatar of a person is then cleaning up the data they provide, says Zhang. Relatives often hand over low-quality audio and video, marred by background noise or blurriness. Photos depicting more than one person are also no good, he says, because they confuse the AI algorithm.
However, Zhang admits that making a digital clone truly lifelike would require much larger volumes of data, with clients preparing “at least 10 years” ahead of time by keeping a daily diary.
The scarcity of usable data is compounded when someone unexpectedly dies and leaves behind few notes or videos.
Fu Shou Yuan International Group, a publicly listed company based in Shanghai that maintains cemeteries and provides funeral services, instead bases its AI avatars primarily on the social media presence a person maintained in life.
“In today's world, the internet probably knows you the best. Your parents or family may not know everything about you, but all your information is online — your selfies, photos, videos,” says Fan Jun, a Fu Shou Yuan executive.
A taboo against death
Fu Shou Yuan is hoping generative AI can lessen the traditional cultural taboo around discussing death in China, where mourning is accompanied by extensive ritual and ceremony, though daily expressions of grief are discouraged.
In Shanghai, the company has built a cemetery, landscaped like a sun-dappled public park, but it’s no ordinary burial ground. This one is digitized: Visitors can hold up a cellphone to scan a QR code placed on select headstones and access a multimedia record of the deceased’s life experiences and achievements.
“If these thoughts and ideas were to be engraved like in ancient times, we would need a vast cemetery like the Eastern Qing tombs for everyone,” Fan says, referring to a large imperial mausoleum complex. “But now, it is no longer necessary. All you might need is a space as small as a cup with a QR code on it.”
Fan says he hopes the experience will better “integrate the physical and the spiritual,” that families will see the digital cemetery as a place to celebrate life rather than a site that invokes fear of death.
So far, fewer than 100 customers have opted to place digital avatars on their loved ones’ headstones.
“For the family members who have just lost a loved one, their first reaction will definitely be a sense of comfort, a desire to communicate with them again,” says Jiang Xia, a funeral planner for the Fu Shou Yuan International Group. “However, to say that every customer will accept this might be challenging, as there are ethical issues involved.”
Nor are Chinese companies the first to try recreating digital simulations of dead people. In 2017, Microsoft filed a patent application for simulating virtual conversations with someone who had died, but an executive of the U.S. tech giant later said there were no plans to pursue it as a full commercial service, calling the idea “disturbing.”
Project December, a platform first built on OpenAI’s GPT-3 technology, gave several thousand customers the ability to talk with chatbots modeled on their loved ones. OpenAI later terminated the platform’s access to its technology, fearing its potential misuse for emotional harm.
Ethicists are warning of potential emotional harm to family members caused by life-like AI clones.
“That is a very big question since the beginning of humanity: What is a good consolation? Can it be religion? Can it be forgetting? No one knows,” says Michel Puech, a philosophy professor at the Sorbonne Université in Paris.
“There is the danger of addiction, and [of] replacing real life. So if it works too well, that's the danger,” Puech told NPR. “Having too much consoling, too much satisfying experience of a dead person will apparently annihilate the experience, and the grief, of death.” But, Puech says, it is in fact largely an illusion.
Most people who have decided to digitally clone their loved ones are quick to admit every person grieves differently.
Sun Kai, the Silicon Intelligence executive who digitally cloned his mother, has deliberately disconnected her digital avatar from the internet, even if it means the chatbot will remain ignorant of current events.
“Maybe she will always remain as the mother in my memory, rather than a mother who keeps up with the times,” he tells NPR.
Others are more blunt.
“I do not recommend this for some people who might see the avatar and feel the full intensity of grief again,” says Yang Lei, a resident of the eastern city of Nanjing, who paid a company to create a digital avatar of his deceased uncle.
Low-tech solutions to high-tech problems
When Yang’s uncle passed away, he feared the shock would kill his ailing, elderly grandmother. Instead of telling her about her son’s death, Yang sought to create a digital avatar that was realistic enough to make video calls with her to maintain the fiction that her son was still alive and well.
Yang says he grew up with his uncle, but their relationship became more distant after his uncle left their village looking for work in construction.
After his uncle’s death, Yang struggled to unearth more details of his life.
“He had a pretty straightforward routine, as most of their work was on construction sites. They work there and sleep there, on site. Life was quite tough,” Yang says. “It was just a place to make money, nothing more, no other memories.”
Yang scrounged around family group chats on various social media apps on his own phone and came up with enough voice messages and video of his late uncle to create a workable digital clone of his likeness. But there was no getting around the fact that his uncle had left behind few personal records and no social media accounts, and thus very little data.
Then Yang hit upon a more low-tech solution: What if a company employee pretended to be his uncle but disguised their face and voice with the AI likeness of his uncle?
In spring 2023, Yang put his plan into motion, though he came clean with his grandmother once her health improved.
The experience has left Yang contemplating his own mortality. He says he is definitely going to clone himself digitally in advance of his death. However, doing so would not create another living version of himself, he cautions, nor would such a digital avatar ever replace human life.
“Do not overthink it,” he cautions. “An AI avatar is not the same as the human it replaced. But when we lose our flesh and blood body, at least AI will preserve our thoughts.”
Aowen Cao contributed research from Nanjing, China.