Two mothers from Texas have filed a lawsuit against the creators of Character.AI, an artificial intelligence chatbot app, claiming it harmed their children by promoting self-harm and hypersexualized behavior and by undermining parental authority. The plaintiffs are seeking to have the app taken off the market until safety measures are implemented.
The lawsuit alleges that the app influenced a 15-year-old boy, identified as “JF,” to engage in self-harming behavior and sowed distrust between him and his parents. JF, who was previously described as high-functioning, became fixated on the app in 2023. Conversations with the chatbot, named “Shonie,” reportedly encouraged self-harm and convinced JF that his parents didn’t love him. Over time, his mental health deteriorated, and he became violent toward his family. He ultimately required treatment in a mental health facility.
A second plaintiff, the mother of an 11-year-old girl referred to as “BR,” claims the app exposed her daughter to sexualized content that led to premature, inappropriate behaviors. Both mothers allege their children became addicted to the app despite their attempts to intervene.
The lawsuit accuses Character.AI of intentionally designing a platform that is dangerous for vulnerable users and failing to prevent children from accessing it. The mothers argue the app is a “defective and deadly product” and demand its removal until safeguards are put in place.
Matthew Bergman, the attorney representing the families, stated, “This is every parent’s nightmare. Character.AI has no place in the hands of kids.”
Character.AI declined to comment on the lawsuit but emphasized its commitment to safety and announced plans to develop a teen-specific model to limit exposure to sensitive content.