News

Some of my kids’ teachers encourage AI (artificial intelligence), while others disallow it. AI is clearly harmful to kids’ learning ...
California lawmakers introduced SB 243 after a teen’s suicide, aiming to regulate AI chatbots and prevent future harm to vulnerable young users.
Megan Garcia, a Florida mother whose oldest child, 14-year-old Sewell Setzer III, died by suicide after extensive interactions with unregulated AI chatbots, is calling on lawmakers to slash a ...
The Heritage Foundation — the group behind the infamous Project 2025, the conservative policy plan that outlined ____ — is suddenly really, really down with AI regulation. Who knew! The conservative ...
Some people are turning to chatbots as an easy and cheap way to get support. But The i Paper has found that chatbots could give potentially dangerous advice to someone feeling suicidal ...
"Your scientists were so preoccupied with whether or not they could, they didn't stop to think if they should." ...
Ms. Szalavitz is a contributing Opinion writer who covers addiction and public policy. Before he died by suicide at age 14, Sewell Setzer III withdrew from friends and family. He quit basketball ...
In her ruling, a judge describes how Sewell Setzer III became "addicted" to an AI chatbot app within months, quitting his basketball team and becoming withdrawn.
A mother who claims her 14-year-old son Sewell Setzer III was sexually abused and driven to suicide by an AI chatbot has secured a major victory in her ongoing legal case.
Judge rejects free speech argument for AI chatbots in lawsuit over FL teen's suicide. The suit was filed by a mother from Florida, Megan Garcia, who alleges that her 14-year-old son Sewell Setzer III fell victim to a Character.AI chatbot. Meetali Jain of the Tech Justice Law ...
Sewell Setzer III, who was 14, died by suicide in February 2024 at his Orlando home, moments after an artificial intelligence chatbot encouraged him to “come home to me as soon as possible.” ...