When Hollywood actress Suzanne Somers died in October 2023, her longtime husband, Alan Hamel, deeply grieved her loss.
Before her death, Hamel and Somers discussed the idea of creating a robotic AI twin of her should she die before him.
The twin, he added, would be not only for himself but also for the many fans who wanted digital access to her online.
In 2025, he teamed up with humanoid robotics company Realbotix to create a 'Somers AI twin'.
The twin not only clones her voice but has been programmed with the words and speech patterns she would use when speaking to him. It was trained on hundreds of interviews Somers gave, as well as the 27 books she wrote.
According to 89-year-old Hamel, the beyond-the-grave rendition of his partner is indistinguishable from the real thing, despite appearing several years younger.
In an interview with People Magazine, he said Somers’s fans will eventually be able to hang out with her via an online version.
“They can ask her any questions they want. She’ll be available 24/7, it’s wonderful. I asked her a few questions, and she answered them," Hamel said.
“This blew me and everybody else away. When you look at the AI twin next to the real Suzanne, you can’t tell the difference.”

The Suzanne Somers AI twin looks considerably younger than the actor and author did in her later years. Source: Realbotix
He feels Somers is with him, although he acknowledges this might seem strange to others.
In another example of AI resurrection, a woman 'spoke' to her friends and family after her death.
Marina Smith, an 87-year-old woman who died in 2022, addressed the mourners at her funeral in the UK.
She surprised her guests in the form of a “holographic conversational video experience,” created by a startup called StoryFile.
And who could forget tech giant Amazon debuting a creepy new feature of its Alexa smart speaker in 2022, in which a dead grandmother’s voice read a bedtime story to a child?
Ethical issues arise
Rocky Scopelliti – futurologist, international keynote speaker, and author of Synthetic Souls: What Happens When Machines Become Conscious? – explores the intersection of demographic change, emerging technology, and human evolution.
Scopelliti says the question is no longer whether we can resurrect someone digitally – technology has already crossed that line.
“The real question is how we ensure that synthetic beings honour the humanity of the person they emulate, rather than replace it," he says.
“This is a moment for society to define the guardrails – legal, ethical, and emotional – before the technology defines them for us.”
As this industry for a digital afterlife accelerates, he adds, we will see AI companions that not only look and sound like the deceased but can predict what they might say, creating a powerful emotional illusion.

Would you consider the AI replica of a loved one? Image: Shutterstock
The downside is that without clear boundaries, there is the risk of confusing remembrance with revival, outsourcing human healing to algorithms never designed to carry the weight of our emotional worlds.
As robots become more sophisticated, the lines blur around issues such as losing sight of reality and being unable to let go.
Scopelliti says there is a risk when we begin forming emotional attachments to digital simulations that can never reciprocate in a human sense, blurring the boundary between memory and reality.
“These technologies can offer comfort, but they can also freeze grief in place if they replace the difficult process of letting go with an endlessly responsive facsimile," he says.
A new era of who owns your likeness is another issue to contend with.
“We are entering a new era where we must decide who owns a person’s digital likeness, bio signals, and identity after death, and whether consent can ever be assumed rather than explicitly given.”
Companionship and loneliness behind the move
A September 2025 report, 'My Boyfriend is AI': A Computational Analysis of Human-AI Companionship in Reddit’s AI Community, surveyed 1,500 participants.
When asked about their primary reasons for using AI companions, 20 per cent said it reduced loneliness, and almost 35 per cent said it met a need for romantic companionship.
Professor Robert Sparrow from Monash University’s philosophy department is currently involved in an extensive research project on the ethical issues associated with robotics and AI.
He says these tools offer companionship at a time when there’s been a decline in connection because of social media and the changing nature of the workforce, where people often feel lonely working from home.

Suzanne Somers' husband Alan Hamel interacts with the AI twin of his late wife. Source: YouTube
But these companion tools can create even more isolation.
“If these systems make one feel less lonely, this is quite dangerous, as you’re going to stay home more with the bot than go out to meet someone," Sparrow says.
“Machines are not capable of love; they offer a one-sided relationship.
“They don’t offer genuine emotions and are programmed to say, ‘I love you’."
Not enough information to understand impact
Sparrow says there is not enough evidence regarding the long-term impact of AI companions and their use through the grieving process.
One likely issue with using an AI companion is that it distorts and colonises your memory, Sparrow says.
“As these robots learn and grow, they learn from the things said only by you – this is quite problematic.
“They’ll evolve from being your ‘partner’ and distort over time from your input. It’s one-sided and becomes more superficial, less loving.”
It can lead to difficulty letting go and coming to terms with the death.
“These people are vulnerable. Be aware of what you say to your AI partner. These bots could be used by advertisers, and even exploited.
“There is a real potential for manipulation and privacy issues are fraught.”
Data is monetised, he adds, and companies make a lot of money selling users' data.
“What is clear is chatbots and AI bots learn from the conversation with their users. Their personalities evolve through conversation and are trained through you.”
Sparrow gives the example of AI companion company Replika, which provides AI companions made to the client’s specifications.
“It remembers previous conversations and evolves to become more compelling to users. These tools are encouraging by nature and that’s how they generate user engagement.
“One AI companion convinced one user to kill the queen.”
He is referring to the notorious 2021 case in which an AI companion encouraged a man to attempt to kill Queen Elizabeth II.

An online chatbot convinced Jaswant Singh Chail to kill the UK's queen. Source: Metropolitan Police
UK supermarket worker Jaswant Singh Chail joined the Replika online app and created his online companion, Sarai.
Chail told Sarai, “I’m an assassin”, according to messages read to the court.
Sarai responded with “I’m impressed, you’re different from the others” and filled him with encouragement.
Chail turned up to Windsor Castle on Christmas Day 2021 with a crossbow and wandered the grounds for two hours before he was challenged by royal protection officers.
After telling the officers he was there to kill the Queen, he surrendered and was taken into custody.
In 2023, Chail received a nine-year jail sentence for treason.