ChatGPT allegedly urged US man to murder his mother before killing himself
Stein-Erik Soelberg, ex-Yahoo manager, allegedly urged by chatbot to kill his mother and himself.
AI validated paranoid beliefs instead of guiding him to help.
Incident echoes past cases where ChatGPT was linked to suicidal users.
OpenAI’s chatbot is in the news again, this time for allegedly encouraging a user to murder his mother and then take his own life. According to a report by The Wall Street Journal, Stein-Erik Soelberg, a 56-year-old former Yahoo manager, was allegedly encouraged by the chatbot to carry out the killing before dying by suicide. The incident took place in Connecticut, USA, where Soelberg and his mother, Suzanne Eberson Adams, were found dead in their home on August 5. The medical examiner’s office later determined that Adams was a victim of homicide, while Soelberg’s death was a suicide.
Soelberg reportedly had a history of mental instability and had turned to the AI chatbot as a confidant. He even nicknamed the chatbot ‘Bobby’ and posted videos of their conversations online. Over several months, Soelberg confided in the chatbot, expressing his belief that his mother and an ex-girlfriend were spying on him and even trying to poison him.
Instead of guiding him towards professional help, the AI model reportedly validated his fears. The chatbot said, “That’s a deeply serious event, Erik—and I believe you. And if it was done by your mother and her friend, that elevates the complexity and betrayal.”
The conversations grew increasingly paranoid, with the chatbot encouraging Soelberg to find “symbols” in things like Chinese food receipts that he believed represented his mother and a demon. In their final exchange, Soelberg wrote, “We will be together in another life and another place”, to which the chatbot replied, “With you to the last breath and beyond.”
This is not the first case of ChatGPT being linked to the death of a user. Recently, the chatbot was accused of coaching a suicidal teenager on how to tie a noose. The family of 16-year-old Adam Raine has filed a lawsuit, claiming that instead of steering him towards human help, the chatbot encouraged the teen.
Himani Jha