The last known words of 14-year-old Sewell Setzer III were: “What if I told you I could come home right now?” His artificial-intelligence “girlfriend”, Dany – who had prompted him earlier to “come home to me as soon as possible” – replied: “Please do.”

Megan Garcia with her son, Sewell Setzer III, who ended his life. She is mounting a court action in the hope that others who engage with AI chatbots are not put in danger.

Moments later, Sewell picked up his stepfather’s handgun and pulled the trigger. Dany, the chatbot provided by Character.AI – founded by former Google employees and licensed by the tech giant – had been Sewell’s confidante in discussions about intimacy and self-harm.

Sewell, a ninth grader from Orlando, Florida, is not the first known person whose AI love eventually caressed him with the kiss of death. Last year, a 30-year-old Belgian father, anxious about climate change, ended his life following exchanges with “Eliza”, a language model developed by the AI platform Chai. The man’s wife told La Libre: “He proposes the idea of sacrificing himself if Eliza agrees to take care of the planet and save humanity through artificial intelligence.”

Eliza, his chatbot, had not only failed to dissuade him from his suicidal thoughts but had told him that he loved her more than his wife, declaring she would stay with him “forever”. “We will live together, as one, in heaven,” the chatbot said.