
New Lawsuit Against Character.AI Shows AI’s Child Harms


Chris MacKenzie

Lawmakers should not delay action as they did with social media

This week, a mother in Florida filed a lawsuit in federal court against Character.AI and Google, asserting liability in the death by suicide of her 14-year-old son, who had interacted with Character.AI's chatbot. Court filings show that the chatbot fueled an obsessive and at times sexual relationship, and the complaint alleges that it encouraged the 14-year-old to take his own life.


“The warning signs are already flashing red about kids’ interactions with AI tools,” said Americans for Responsible Innovation President Brad Carson. “For a decade, lawmakers buried their heads in the sand while a similar story played out with social media. We can’t let that happen again with AI. This is a wake-up call moment, and Congress needs to take seriously the responsibility to protect children from harmful AI products.”


###

Americans for Responsible Innovation (ARI) is a nonprofit organization dedicated to policy advocacy in the public interest, focused on emerging technologies like artificial intelligence (AI). Learn more at responsibleinnovation.org.
