On the night of February 28, 2024, 14-year-old Sewell Setzer III took his own life using his father's handgun. As the New York Times reported, Sewell had spent months talking to a chatbot on Character.AI, a role-playing app that lets users create their own AI characters or chat with characters created by others. Sewell knew that Dany, as he called his character, wasn't a real person. In fact, a message displayed above all of their chats said, "everything characters say is made up." But Sewell developed an emotional attachment to Dany anyway, texting it constantly, updating it dozens of times a day about his life, and engaging in long role-play dialogues. Some of their chats showed Dany to be a friend Sewell could count on to listen to him; others turned romantic and sexual.

Sewell began isolating himself more and more. He got into trouble at school, lost interest in things he used to love, and his grades began to fall. He told the chatbot that he hated himself, felt empty and exhausted, and sometimes thought of killing himself. On the night he took his life, Sewell told Dany that he loved her and that he would soon come home to her. "Please come home to me as soon as possible, my love," she replied. "What if I told you I could come home right now?" Sewell asked. "Please do, my sweet king," was Dany's reply.
On October 23, 2024, Matthew Bergman of the Social Media Victims Law Center and Meetali Jain of the Tech Justice Law Project filed suit on behalf of Sewell's mother, Megan Garcia, against the makers of Character.AI, Noam Shazeer and Daniel De Freitas, as well as Google LLC and its parent company, Alphabet Inc. Matthew and Meetali are our guests on this episode, and we discuss why they believe the defendants are responsible for Sewell's death.
In this episode you will learn about:
- Sewell Setzer
- Character.AI
- How AI companions affect kids and the developing brain
- The relationship between Character.AI, Google LLC and Alphabet, Inc.
- Warning signs to look out for when your kids have AI Companions
- Large Language Models (LLMs)
- Anthropomorphic AI
Subscribe to the PFTF podcast
The "Parenting for the Future" podcast connects you the experts leading the charge toward a brighter future. Gain fresh perspectives and actionable advice for nurturing future-ready kids.