Today the Tech Coalition convened an industry briefing on the impact of generative AI on online child sexual exploitation and abuse (OCSEA). Generative AI is an emerging technology that offers incredible opportunities to enhance our daily lives. However, predatory actors may exploit generative AI’s capabilities to facilitate online child sexual exploitation and abuse, as they have tried to do with other new technological tools.

We brought together key U.S. stakeholders in the ecosystem to develop a shared understanding of the potential risks predatory actors pose to children through generative AI and the ways companies are currently addressing those threats, as well as to identify and initiate new opportunities for stakeholder collaboration. Representatives from 26 Tech Coalition Member companies, including Adobe, Amazon, Discord, Google, Meta, Microsoft, NAVER Z, Niantic Labs, OpenAI, Pinterest, Snap Inc., TikTok, Verizon, VSCO, Yahoo, and Zoom, joined select child safety experts, advocates, and members of law enforcement.

As generative AI develops and the child safety ecosystem evolves, Tech Coalition Members are building a deeper understanding of the issues and challenges, so they can continue to be proactive in their efforts to reduce risk, incorporate safety by design, and innovate solutions to help keep children safe. 

Additionally, the tech industry and the stakeholders with whom industry engages to thwart OCSEA will continue to adapt their approaches and systems to address this new threat, as they have with past changes in technology. 

For this reason, today’s briefing culminated in the announcement of several new multi-stakeholder efforts, among them: 

  • Red teaming: The Tech Coalition, with input from the U.S. Department of Justice, will help companies explore ways to test for and mitigate OCSEA risks.

  • Information sharing: The Tech Coalition will expand its use of the Lantern program to securely share information that supports robust safety evaluations and mitigation methods for generative AI CSAM and related OCSEA incidents.

  • Industry classification system: The Tech Coalition will review and update the Industry Classification System to address different types of AI-generated OCSEA.

  • Reporting: The Tech Coalition will work with the National Center for Missing and Exploited Children (NCMEC) to help develop a process for efficiently and effectively referring CyberTipline reports of AI-generated OCSEA to NCMEC.

Our work to understand the impact of generative AI on OCSEA began earlier this year, when we started bringing Members together regularly to identify emerging challenges and share learnings. Together with Thorn, we also co-hosted a webinar convening experts on child safety risks posed by generative AI, and at the Crimes Against Children Conference we brought industry together to identify and address generative AI challenges. We look forward to continuing to facilitate discussions about OCSEA and the rapidly changing generative AI landscape.