Artificial goodbyes
An 80-year-old woman speaks with her son for a few minutes each day through video calls. She has not seen him in some time, so she keeps asking when he will visit. He always replies that he has relocated to another province to save money and will return home to care for her. What she does not know is that her son died in a car accident a year ago.
Rather than tell her the truth, the family hired an artificial intelligence (AI) company to create a digital twin of him so she would believe he was still alive. According to the family, she has a weak heart, and they worried that the news might harm her health. The incident, reported by the South China Morning Post last week, has since sparked an online debate over the ethical use of AI, especially in cases where it can play on human emotions.
As generative AI matures, the world is also seeing the emergence of "grief tech," also known as the digital afterlife industry. These technologies let users interact with simulated versions of their deceased loved ones in intimate ways. Conversational AI products like Project December and You, Only Virtual (YOV) mimic a person's conversational style by training a model on the deceased's texts, emails, and social media posts. Startups like Eternal and HereAfter AI offer interactive, voice-enabled avatars of people's loved ones.
In the Philippines, we're already seeing early forms of AI used in mourning. When a young influencer died by suicide last year, an online user used deepfake technology to make it appear that she was addressing everyone from heaven, reassuring them that she was at peace.
Proponents say these tools can provide comfort and ease the grieving process. Some people use grief bots to say things they were unable to articulate while the person was still alive. Others seek what psychologists call a continuing bond: a way of maintaining connection through conversation, memory, or the sense that the person is still somehow present.
Yet many experts warn that when simulations become too lifelike, they may interrupt the difficult but necessary work of mourning. Instead of helping people process loss, they may create emotional dependency and trap users in a prolonged state of unresolved grief.
According to grief counselor and IT expert Kevin Si, grief tech may serve as a temporary bridge, but not a substitute for healing. “One of the things that we have to remember and restore after loss is connection, and in most cases, a bot can’t restore that; only mimic it to a certain extent,” Si said. “The end goal of grief work is to be able to reconnect them to people that matter, not isolate them to be dependent on something else.”
There are also serious ethical concerns. First, there is the issue of consent and data ownership. In the case of the elderly mother in China, the deceased son never consented to having his likeness reanimated, and the mother never consented to being deceived by AI into sustaining a relationship with her dead son.
Si also warned that people in grief often reveal their most private fears, guilt, anger, and shame. Any technology built for mourners must therefore be held to the highest standards of confidentiality, transparency, and care.
As someone who struggles deeply with loss, I personally understand the appeal and possible usefulness of grief tech. When my Lola passed away in the early part of the pandemic, the strict lockdown prevented me from saying goodbye to her in person. Although we attended her wake virtually, the distance gave the loss an unreal quality. For a while, it felt as if I were mourning her in fragments, in ways that never felt complete. I can understand how grief bots might have offered the illusion of a proper goodbye.
Eventually, what helped me was writing her a letter and leaning on the teachings of my faith about resurrection and the afterlife.
Si shared that grief is not a logical experience and that no single framework or set of steps works exactly the same for everyone. Bots may help people organize their thoughts, but they are not as effective at reading emotional cues, sensing silent distress, or offering the warmth of human presence. "Healthy grieving needs people because we have a different level of understanding of nonverbal cues, and whether to delve deep on a topic that's necessary for them to heal rather than just to lean on what makes them feel good at the moment," Si said.
They say that loss is painful because someone mattered, and that grief is the price we pay for love. While it is tempting to ease all forms of suffering, grief is the kind of pain that must be lived through. Grief technology may have the power to “resurrect” an illusory version of our loved ones, but it cannot do the human work of helping us let go.
——————
eleanor@shetalksasia.com
