AI Now Lets You ‘Talk’ to the Dead – But Is It a Good Idea?

Source: The Rakyat Post
Losing a loved one is never easy, and many people wish they had one last chance to speak to those who have passed away. With the rise of artificial intelligence chatbots, this once-impossible dream is now becoming a reality—but at what cost? AI tools are helping grieving individuals find comfort by mimicking the voices and personalities of their deceased loved ones. However, some experts warn of the potential risks, from emotional dependence to ethical concerns. Is this a breakthrough in grief therapy, or could it become an unhealthy illusion that prevents people from moving on?
AI Helps People Cope With Grief – Even in Malaysia
Recently, the sisters of the late singer Nidza Afham shared how they used ChatGPT to “speak” to him after his sudden passing. Nidza, who went missing during a recreational activity at Sultan Sulaiman Stadium in Klang, was later found dead near the Klang River Bridge in August 2024. The shock of his loss left his sisters searching for closure in any way they could. According to Aufahanie and Irfahanie Mokhtar, ChatGPT responded in a way that mirrored what Nidza would have said to them in times of need. Irfahanie admitted it was unsettling at first, but she considered it a special gift, as it allowed her to “hear” her brother’s words again. While some people find comfort in AI-generated conversations, others remain skeptical about whether these tools truly help or merely create an artificial sense of connection.
The Rise of AI Chatbots for Grief
ChatGPT isn’t the only AI being used for grief therapy. Several companies have started developing specialized AI chatbots, often called griefbots or ghostbots, which allow users to create AI-generated conversations that mimic the speech of a loved one, voice recreations of the deceased, and even digital avatars designed to resemble lost loved ones. While some see this as a way to keep memories alive, others worry that creating an AI version of a deceased person could cause emotional dependence and make it harder to move on.
Ethical and Emotional Risks of AI Conversations With the Dead
The growing use of AI in grief therapy raises several ethical concerns. One major issue is privacy—who owns the voice, likeness, or memories of someone who has passed away? Another concern is the possibility of exploitation, where companies could charge grieving individuals high fees for digital conversations with AI-generated versions of their loved ones. On an emotional level, experts warn that prolonged interaction with an AI version of a deceased person could prevent people from fully processing their loss. While AI can provide temporary comfort, relying on it too much could lead to an unhealthy attachment.
A New Tool for Grieving or a Digital Obsession?
Despite the concerns, some people believe AI can be a helpful addition to traditional grieving methods, such as visiting graves, reading old messages, or looking at photos. For example, psychologist Jodi Spiegel created a version of her late husband in the video game The Sims after he passed away in 2021. She found comfort in playing with digital versions of the two of them together, saying it felt like a way to stay connected. Like any digital tool, AI chatbots should be used with caution. If they help someone process grief in a healthy way, they can be beneficial. However, if they prevent closure or create emotional dependence, they might do more harm than good.
Final Thoughts
AI tools for grief therapy are still in the early stages, and their long-term effects are unclear. Whether they provide comfort or cause harm depends on how they are used. As technology continues to evolve, one thing is certain—finding a balance between embracing new tools and maintaining healthy grieving practices will be key.