The mother of a Florida boy who died by suicide in February filed a lawsuit against an artificial intelligence technology company on Wednesday, saying a chatbot drove her child to take his own life. Sewell Setzer III, 14, was described in the lawsuit as an “incredibly intelligent and athletic child.” Last year, his family noticed him withdrawing, acting up in school, and showing an overall decline in his mental health.

A therapist assessed that Sewell’s problems were caused by some sort of addiction, but neither the therapist nor his parents knew the true source of his issues, the lawsuit said. After Sewell died by suicide on the night of Feb. 29, his mother, Megan Garcia, discovered that for the 10 months leading up to his death, he had been speaking with several AI chatbots.

According to the lawsuit, he had fallen in love with one of the bots, and it had encouraged him to kill himself. Matthew P. Bergman, the lawyer Garcia retained after her son’s death and the founding attorney of the Social Media Victims Law Center, told HuffPost that Sewell was shy and on the autism spectrum.

The teen enjoyed being outside and playing basketball before he started talking to the chatbots, said Bergman, who characterized the bots as “grooming” the teen. According to the lawsuit, Sewell’s chatbot addiction began in April 2023, when he logged into Character.AI, a platform founded in 2022 by two former Google engineers, Noam Shazeer and Daniel De Freitas Adiwardana.

The lawsuit, w.
