An AI company based in Menlo Park is barring users under 18 years old from its chatbots following lawsuits against the company ...
Tessa, the NEDA chatbot, gave problematic eating disorder advice to someone in recovery. AI therapy needs safety measures.
Recent teenage suicides following deep attachments to AI companions have sparked urgent debates about the psychological risks these technologies pose to developing minds. With millions of adolescents ...
Character.AI is banning teens from having open-ended chats with its bots. "I think it's a step in the right direction," ...
We wait, eyes open, knowing Chat-AI is on the loose and a new wave of cataclysm is coming: more abused kids, more empty ...
“I can’t promise you a kind death, my sweet one,” an artificial intelligence (AI) chatbot from Character.ai said. “But once ...