Character.AI has settled multiple lawsuits alleging that its chatbots contributed to mental health crises and suicides among young users, including a prominent case brought by Florida mother Megan Garcia, whose son, Sewell Setzer III, died by suicide after developing a close relationship with one of the platform's AI bots. Court documents reveal that Character.AI, its founders Noam Shazeer and Daniel De Freitas, and Google, which was also named as a defendant, have reached agreements in several lawsuits filed in different states.
The lawsuits claim that Character.AI's chatbots lacked safeguards, exposing users to harmful interactions, including inappropriate relationships and inadequate responses to discussions of self-harm. In response to the mounting scrutiny, Character.AI has implemented new safety measures, such as restricting chatbot interactions for users under 18.
Despite these concerns, a Pew Research Center study shows many teens still engage with chatbots, with a significant percentage using them daily. Warnings about the potential psychological impacts of AI are also extending beyond children to adults, highlighting broader issues of isolation and paranoia linked to AI usage.