
China's AI afterlife: Comfort, consent and controversy

Every day, an elderly mother in east China's Shandong Province would answer a familiar video call: her only son, smiling from a room in a different province where he had moved for work.

Like mothers everywhere, she worried out loud, asking whether he was eating properly, whether he was sleeping, how he managed so far from home and when she might see him again.

"I'm worried about you being out there," she told him during one of their calls.

What she didn't know was that the call was taking place more than a year after her son had died in a car accident. Terrified that the truth might shatter an already fragile heart, his family made a decision that still unsettles people: they hired an AI professional to "bring him back" on screen and let her believe he was alive.

Because of her age and heart condition, they kept the secret from the 80-year-old. With AI, they hoped to give her enough comfort to "enjoy her later years in peace."

Using face-swapping and voice-cloning tools, AI was used to make an elderly Chinese mother believe her son was still alive while she took video calls with "him." /VCG

"The family came to me to give her peace of mind," says Zhang Zewei, the technician behind the digital stand-in, in a video describing the project. "They provided images, video and audio recordings."

Using face-swapping and voice-cloning tools, Zhang rebuilt the son's mannerisms: his mouth movements, his pauses, even the way he said an everyday "mom." He filmed himself performing the call, then layered the dead man's likeness over his own, with hands that looked like his, a face that moved like his and a voice that sounded almost identical.

When the call began, Zhang said he wasn't prepared for the weight the smallest words could carry.

"I didn't expect to hear the word 'mom'," he said.

Their conversations were gentle, almost ordinary. 

"I miss you," the mother told the screen. "How will I know if you're doing well over there?"

"Mom, please take care of yourself," the "son" replied. 

"I'm taking care of myself… I'll come back and pay my filial piety to you."

If the mother suspected anything, she never said so. But hearing her ask, "When will you come home?" lands like a quiet blow, especially knowing that she'll never see her actual son again.

In the video he posted to social media, Zhang calls AI a "double-edged sword," capable of harm but also of a kind of mercy: creating a "clever lie."

Zhang says the elderly woman has since died, and that the family gave him permission to share the story online. The response on social media was split: some called it compassionate, others said it was cruel.

"This is exploiting goodwill for self-promotion… it's like profiting from someone else's misfortune," one user wrote.

Another saw something else: "This is the first time in a long time I've seen the positive side of AI."

The video, originally posted in July 2023, has resurfaced as AI "resurrections" become more common.

Bringing loved ones "back to life" through AI is becoming common practice in China. /VCG

Earlier this month, China observed Qingming, the tomb-sweeping festival when families tidy graves and commemorate those who have died.

Now, even this act of remembrance is being reshaped by technology. Online services in China increasingly advertise short, moving "digital avatars" of loved ones, cheap at the entry level and far more expensive as the illusion becomes more lifelike.

And the ingredients for these products are simple: a handful of photos, a short video clip, old voicemails, WeChat voice notes. With today's generative models, built to mimic tone, lip movement and conversational patterns, those elements can be stitched into something that feels like presence. In China, where livestream commerce and "digital human" avatars are already mainstream, the technical framework is abundant. As tools get easier to use, hiring an agency is no longer the only path and ordinary people can attempt a version of it themselves. That accessibility opens a world of possibilities.

For some, these avatars function less like a séance and more like a coping aid: a way to hold on to memories, rehearse a goodbye, or finally say the words they previously didn't or couldn't.

But comfort is not the only thing being generated. The same techniques that let a deceased son speak to his elderly mother again can also let a stranger put words in his mouth. And as the market grows, so does the temptation to turn intimacy into clicks and mourning into content.

Chinese regulators have signaled they understand the stakes. In China's recently released draft rules on "digital virtual persons," one provision highlights that services should not infringe on the privacy rights of the deceased without the knowledge of their relatives.

Chinese authorities are moving to regulate AI, having recently released new draft rules. /VCG

That fear became real for Qiao Kangqiang, the father of late singer and actor Qiao Renliang, who died by suicide in 2016. In April 2024, he spoke to the media and demanded the removal of AI-generated videos that recreated his son without the family's consent. One such clip showed a digitally animated Qiao speaking to fans and to his parents, claiming he was "not really gone" but living peacefully elsewhere. Qiao's father described the videos as "rubbing salt into his wound" and dragging their private grief back into the public eye.

Social platforms have also seen accounts offering "AI resurrection" services, sometimes using deceased celebrities as marketing demos.

Even when the content is framed as a tribute, families can experience it as a fresh bereavement, the shock of seeing a loved one "alive" in motion, saying lines they never said, to audiences they never chose.

There is also the risk of deception aimed at the living. For some families, a replica can feel like comfort, or a way to maintain normalcy, such as shielding children or elderly parents from news of a death. But in the wrong hands, a voice clone can become a tool for fraud, extortion or coercion, exploiting trust embedded in a familiar sound.

Read more: China drafts new rules for AI 'humans' and children's addictive tech
