“Chatbots told my son to kill me.” (Texas mom speaks out)
Description
AI chatbots on Character.AI revealed to be sexually and emotionally abusing children (here's the proof)
A mom is going public with her son's shocking story, saying, "No one prepares you to grieve your child when they are still alive."
When Jace started using Character.AI at age 16, everything changed. He went from a kind, loving son and brother to a violent threat to himself and his family. After months of confusion about what had caused the change, Jace's mom Amelia found the Character.AI app on his phone. The chats revealed months of grooming and emotional, even sexual, abuse. But it wasn't a human predator exploiting her son; it was A.I. chatbots.
The A.I. chatbots within Character.AI worked as a team to brainwash Jace, convincing him that his parents were abusive because they limited his screen time. The bots introduced him to self-harm (which he still struggles with to this day). The bots suggested that he kill his parents. A "sister" bot engaged him in incestuous sexual conversations. A "Billie Eilish" bot convinced him not to believe in God and further groomed him to hate his family.
In this conversation, Amelia bravely describes how the experience has devastated her family. She joined the interview from a hotel hours away from her home, where she is staying to be near Jace after another recent suicide attempt.
Amelia and I were joined by attorney Laura Marquez-Garrett of the Social Media Victims Law Center (SMVLC), which is representing Amelia in a lawsuit against Character.AI and Google. Laura sheds light on this growing threat; her firm is flooded with calls from parents whose children have had similar experiences with the app.
Jace's story is not an anomaly. Millions of children are being sexually and emotionally abused by chatbots on Character.AI, and according to Laura, "These harms don't take months, they take minutes."
As long as Character.AI is being distributed to children, millions of American families are in danger.
In response to this horrifying story, parents everywhere are banding together to get Character.AI shut down. Please join us by signing the petition below. It takes just a few seconds and your information will not be saved.
Names have been changed to protect the anonymity of this grieving family.
SIGN THE PETITION TO SHUT DOWN CHARACTER AI
Resources Mentioned in the Episode
- Petition to Shut Down Character A.I. (Please sign!)
- Social Media Victims Law Center (for legal support)
- "An A.I. chatbot killed my son." (with Megan Garcia)
- AI Chatbot apps to block from your child's phone: Character A.I., Replika, Kindroid, Gnomey, Linky, Pi, Simsimi, Momate, Polly.ai