"Child Sexual Abuse Materials: The Heat Initiative's Battle to hold Apple Responsible"
with Sarah Gardner
Season 8, Episode 10

The National Center for Missing and Exploited Children created the CyberTipline to allow members of the public and electronic service providers to report incidents of suspected child sexual exploitation. In 2023 alone, the CyberTipline received over 36.2 million reports of suspected child sexual abuse. Most of those reports involved the circulation of child sexual abuse material, or CSAM, and nearly 90% of them originated from Google, WhatsApp, Facebook and Instagram. In sharp contrast, Apple reported just 234 cases of CSAM in 2023. This strikingly low number reflects Apple's decision to make privacy protection a central feature of its devices, which are used by more than two billion people. These same protections, however, make Apple devices some of the most useful tools for sharing child sexual abuse imagery.

Our guest today, Sarah Gardner, intends to hold Apple accountable. In 2022 she co-founded the Heat Initiative, a collective of concerned child safety experts and advocates that envisions a future in which children's safety is at the forefront of all technological development. This year the Heat Initiative launched a multimillion-dollar campaign to pressure Apple to detect, report and remove child sexual abuse materials from iCloud. Sarah has a long history of fighting for child safety and human rights. Before co-founding the Heat Initiative, she served as Vice President of External Affairs at Thorn, an organization created by Ashton Kutcher and Demi Moore to build technology that combats online child sexual abuse. Before Thorn, Sarah worked at Free the Slaves, a nonprofit that empowers local organizations to end modern slavery, which affects more than 50 million people worldwide. Sarah holds a BA in Art History from the University of California, Berkeley.

In this episode you will learn about:

  • Child Sexual Abuse Materials (CSAM) in the digital world
  • The incredible harms of CSAM
  • How Apple places user privacy above child protection
  • The Heat Initiative’s efforts to push Apple to detect, report and remove CSAM
  • How kids can be exposed to CSAM or even appear in it themselves
  • Sextortion
  • Generative AI and CSAM
  • Online enticement
  • How to talk to your kids about online safety


Subscribe to the PFTF podcast 

The "Parenting for the Future" podcast connects you the experts leading the charge toward a brighter future. Gain fresh perspectives and actionable advice for nurturing future-ready kids.