
The mother of a 14-year-old Florida boy is suing an AI chatbot company after her son, Sewell Setzer III, died by suicide, something she claims was driven by his relationship with an AI bot. "Megan Garcia seeks to prevent C.AI from doing to any other child what it did to hers," reads the 93-page wrongful-death lawsuit that was filed this week in a U.S. District Court in Orlando against Character.AI, its founders, and Google.



Tech Justice Law Project director Meetali Jain, who is representing Garcia, said in a press release about the case: "By now we're all familiar with the dangers posed by unregulated platforms developed by unscrupulous tech companies, especially for kids. But the harms revealed in this case are new, novel, and, honestly, terrifying. In the case of Character.AI, the deception is by design, and the platform itself is the predator."

Character.AI released a statement via X, noting, "We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. As a company, we take the safety of our users very seriously and we are continuing to add new safety features that you can read about here: https://blog.character.ai/community-safety-updates/."

In the suit, Garcia alleges that Sewell, who took his life in February, was drawn into an addictive, harmful technology with no protections in place, leading to an extreme personality shift in the boy, who appeared to prefer the bot over other real-life connections.
