An AI company is now facing a lawsuit after a 14-year-old killed himself over a Game of Thrones chatbot.
Sewell Setzer isolated himself from the real world to speak to a clone of Daenerys Targaryen dozens of times a day ...
When the teen expressed his suicidal thoughts to his favorite bot, Character.AI ‘made things worse,’ a lawsuit filed by his ...
A grieving mother claims an AI chatbot not only convinced her teen son to commit suicide, but also pushed him ...
His mum is now suing the creators.
Sewell Setzer III had professed his love for the chatbot he often interacted with - his mother Megan Garcia says in a civil ...
ALBAWABA - A 14-year-old boy took his own life after allegedly becoming emotionally attached to a Game of Thrones AI chatbot ...
Sewell Setzer III, a Florida boy, would chat with his online friend, Daenerys Targaryen, a lifelike AI chatbot named after a ...
The AI chatbot was more than a digital companion for Sewell—it became a friend he could talk to about his life, problems, and emotions. According to a New York Times report, while some conversations ...