News
Sewell Setzer III grew ‘dependent’ on an AI chatbot service. Now his mother has filed a lawsuit to hold the creators responsible (US District Court Middle District of Florida Orlando Division) ...
Sewell Setzer fatally shot himself after becoming “addicted” to AI chatbot app, mom alleges Mike DeForest, Investigative Reporter Published: April 28, 2025 at 11:49 AM Updated: April 28, 2025 ...
Lawsuit: Sewell Setzer III sexually abused by 'Daenerys Targaryen' AI chatbot. Throughout Sewell's time on Character.AI, he would often speak to AI bots named after "Game of Thrones" and "House of ...
Fourteen-year-old Sewell Setzer III fell in "love" with a Character.AI chatbot, then killed himself. Now his mom, Megan Garcia, is fighting the popular tech ...
The suit was filed by a mother from Florida, Megan Garcia, who alleges that her 14-year-old son Sewell Setzer III fell victim to a Character.AI chatbot.
Fourteen-year-old Sewell Setzer III killed himself after falling in "love" with a Character.AI chatbot, his family says — now an expert weighs in on the risks that could be associated with the ...
For almost a year, Sewell Setzer III, a 14-year-old from Orlando, spoke to AI-generated chatbots, powered by a company called Character.AI, meant to emulate “Game of Thrones” characters, like ...
Sewell Setzer, a 14-year-old from Florida, took his life after interactions with chatbots on the Character.AI platform, a lawsuit filed by his mother, Megan Garcia, claims. The complaint claims ...
Fourteen-year-old Sewell Setzer III loved interacting with Character.AI's hyper-realistic chatbots—with a limited version available for free or a "supercharged" version for a $9.99 monthly fee ...
Sewell Setzer III, who was 14, died by suicide in February 2024 at his Orlando home, moments after an artificial intelligence chatbot encouraged him to “come home to me as soon as possible.” ...
Sewell Setzer, a 14-year-old from Orlando, used to love sports and science. But then he got sucked into the world of A.I. and ultimately killed himself to escape the real world.
Warning, distressing content: When the teen expressed his suicidal thoughts to his favorite bot, Character.AI ‘made things worse,’ a lawsuit filed by his mother says ...