Sean Litton, Executive Director, The Tech Coalition

The Tech Coalition is launching Lantern, the first cross-platform signal sharing program for companies to strengthen how they enforce their child safety policies. Online child sexual exploitation and abuse (OCSEA) are pervasive threats that can cross various platforms and services. Two of the most pressing dangers today are inappropriate sexualized contact with a child, referred to as online grooming, and financial sextortion of young people. To carry out this abuse, predators often first connect with young people on public forums, posing as peers or friendly new connections. They then direct their victims to private chats and different platforms to solicit and share child sexual abuse material (CSAM) or coerce payments by threatening to share intimate images with others.

Because this activity spans platforms, any one company often sees only a fragment of the harm facing a victim. To uncover the full picture and take proper action, companies need to work together. Lantern is a groundbreaking initiative that brings together technology companies to securely and responsibly share signals about activity and accounts that violate their policies against child sexual exploitation and abuse (CSEA). Signals can include, for example, information tied to policy-violating accounts, such as email addresses, usernames, and CSAM hashes, as well as keywords used to groom children or to buy and sell CSAM. Signals are not definitive proof of abuse; they offer clues for further investigation and can be the crucial piece of the puzzle that enables a company to uncover a real-time threat to a child’s safety.
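To make this concrete, here is a minimal illustrative sketch, in Python, of what a single shared signal might look like as a structured record. Lantern’s actual data format is not described in this post, so every name below (Signal, signal_type, source_company, and so on) is hypothetical.

    from dataclasses import dataclass
    from datetime import datetime, timezone

    # Hypothetical signal types drawn from the examples above: email
    # addresses, usernames, CSAM hashes, and grooming/CSAM-trade keywords.
    SIGNAL_TYPES = {"email", "username", "csam_hash", "keyword"}

    @dataclass(frozen=True)
    class Signal:
        """Illustrative record for one shared signal (all field names hypothetical)."""
        signal_type: str       # one of SIGNAL_TYPES
        value: str             # e.g. a hash digest or a username
        source_company: str    # the participant that uploaded the signal
        uploaded_at: datetime  # when the signal was shared

        def __post_init__(self):
            if self.signal_type not in SIGNAL_TYPES:
                raise ValueError(f"unknown signal type: {self.signal_type}")

    # A signal is a clue, not proof: a receiving company still reviews any
    # matches against its own policies before taking enforcement action.
    example = Signal("username", "example_handle", "ExampleCo",
                     datetime.now(timezone.utc))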

Until now, no consistent procedure existed for companies to collaborate against predatory actors evading detection across services. Lantern fills this gap and shines a light on cross-platform attempts at online child sexual exploitation and abuse, helping to make the internet safer for kids. The program can expand prevention and detection capabilities, speed up the identification of threats, build situational awareness of new predatory tactics, and strengthen the reporting of criminal offenses to authorities.

Participating companies upload signals to Lantern about activity identified on their platforms that violates their policies against child sexual exploitation. Other participating companies can then select from the signals available in Lantern, run the selected signals against their own platforms, review any activity and content a signal surfaces against their respective platform policies and terms of service, and take action in line with their enforcement processes (e.g., removing an account and reporting criminal activity to the National Center for Missing & Exploited Children (NCMEC) and appropriate law enforcement).
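As a rough sketch of that flow, the Python below walks through the steps described above: select signals, run them against a platform, review what they surface against the platform’s own policies, and enforce. None of these functions correspond to a real Lantern API; they are hypothetical stand-ins for a participating company’s internal tooling.

    # Hypothetical sketch of the workflow described above. None of these
    # functions correspond to a real Lantern API; they are stand-ins for a
    # participating company's own internal systems.

    def act_on_signals(signals, platform):
        for signal in signals:                       # 1. select signals from Lantern
            matches = platform.search(signal)        # 2. run the signal against the platform
            for match in matches:
                # 3. review surfaced activity against the platform's own
                #    policies and terms of service
                if platform.violates_policy(match):
                    platform.remove_account(match)   # 4a. enforce per internal processes
                    platform.report_to_ncmec(match)  # 4b. report criminal activity to NCMEC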

[Lantern infographic]

CASE STUDY: During the program’s pilot phase, MEGA shared URLs that Meta used to investigate potentially violating behavior related to those URLs on its platforms, resulting in the removal of more than 10,000 Facebook Profiles, Pages, and Instagram accounts. Meta reported the violating Profiles, Pages, and accounts to NCMEC, in accordance with its legal obligations. Meta also shared details of its investigation back to the signal sharing pilot program, enabling other participating companies to use those signals to conduct their own investigations.

Building a thoughtful cross-platform procedure is hard, and it takes time. Over the last two years, we have developed this program with several Tech Coalition members, making a concerted effort to design a program that is both effective at addressing OCSEA and compliant with legal, regulatory, and ethical requirements.

The Tech Coalition is responsible for facilitating access to the program and overseeing compliance with it through eligibility vetting, legal agreements, data audits, and more. We understand that cross-platform signal sharing raises concerns and requires diligent oversight, which is why we are committed to responsible management through:

Safety and privacy by design: We developed Lantern with safety and privacy by design, including through:

  • Establishing clear guidelines and rules for appropriate data sharing among participating companies,

  • Ongoing review of Lantern’s policies and practices, and

  • Mandatory trainings and routine check-ins with participants.

Respecting human rights: We commissioned Business for Social Responsibility (BSR) to conduct a Human Rights Impact Assessment (HRIA) to inform the development of Lantern and provide ongoing guidance as we iterate and enhance the program. 

Stakeholder engagement: Earlier this year, we engaged more than 25 experts and organizations focused on child safety, digital rights, advocacy for marginalized communities, government, and law enforcement to solicit feedback and invite participation in the HRIA. Our stakeholder engagement work is ongoing.

Transparency: We are committed to including Lantern in the Tech Coalition’s annual transparency report and providing participating companies with recommendations on how to incorporate their participation in the program into their own transparency reporting. 

Lantern is launching with an initial group of participating companies: Discord, Google, Mega, Meta, Quora, Roblox, Snap, and Twitch. Their experiences will help evaluate and strengthen this necessary new initiative to collaborate against a profoundly consequential threat. The Tech Coalition is committed to working with companies able to fulfill the agreements and requirements of joining Lantern, and it will continue to welcome additional participants as they learn more about the program and how it may help them enforce their respective policies.

By addressing urgent threats with safeguards against unintended consequences, Lantern exemplifies the thoughtful, nuanced approach needed for online child safety. We will continue to refine the program and uphold privacy and human rights alongside our mission of protection.

Words from Lantern Participating Companies:

Discord

"Child-harm content is appalling, unacceptable, and has no place on Discord or in society. We work relentlessly to keep this content off our service and take immediate action when we become aware of it.

"Our participation in the Lantern Program has enabled Discord to have a much wider and more nuanced approach to combating harmful behavior on our platform. Crucially, we’ve been able to scale the actioning of offending accounts and the sharing of bad actor data points relating to the highest-harm abuses of our platform with other participating companies. Discord has also acted on data points shared with us through the program, which has assisted in many internal investigations. The Tech Coalition plays a pivotal role in sharing analysis and actionable threat information with its members to mitigate risks and enhance platform resiliency." - John Redgrave, Vice President of Trust & Safety at Discord

Google

“We commend the Tech Coalition’s leadership in bringing these companies together to further our collective fight against child sexual abuse and exploitation online. This ongoing work and the industry-wide collaboration are incredibly important steps in helping to keep children safe online and the partnership speaks to Google’s longstanding commitment to preventing CSAE on our platforms. We look forward to exploring how we can best contribute to the program moving forward.” - Laurie Richardson, VP of Trust and Safety at Google

MEGA

"MEGA values the Lantern initiative as we have zero tolerance for sharing of objectionable material such as CSAM. Sharing information across platforms helps us to be more effective in our process to remove objectionable content and to close accounts of anyone sharing illegal content." - Stephen Hall, Chief Compliance Officer, MEGA

Meta

“Predators don’t limit their attempts to harm children to individual platforms, and the technology industry needs to work together to stop predators and protect children on the many apps and websites they use. We’ve spent over a decade fighting to keep young people safe online, and we’re glad to partner with the Tech Coalition and our peers in the Lantern program on this important work.” - Antigone Davis, Global Head of Safety, Meta

Snap

"Nothing is more important than the safety and well-being of our Snapchat community. The exploitation of anyone, especially young people and minors, is illegal, unacceptable, and explicitly prohibited by our Community Guidelines. Preventing, disrupting, detecting, and reporting child sexual exploitation and abuse (CSEA) is a priority for us - which is why we have been working with the Tech Coalition and other companies on innovative industry-wide approaches, like Lantern. Lantern will strengthen our existing capabilities and technologies to identify and combat CSEA and predatory actors. To advance our collective mission of keeping young people safe across platforms and services, we look forward to continuing this collaboration with the Tech Coalition." - Jacqueline Beauchere, Global Head of Platform Safety, Snap

Twitch

"Child sexual exploitation is abhorrent and has no place on Twitch, or anywhere. We’re deeply committed to protecting our community, and have been aggressive here – ramping up detection tools, Community Guidelines, and community education. This urgent work shouldn’t be done in a silo. Industry collaboration is crucial and leads to better outcomes for all. Lantern, critically, speaks to that need. We look forward to this continued partnership." - Angela Hession, Vice President of Customer Trust, Twitch


Words from Child Safety Experts:

ECPAT International

“Tackling child sexual exploitation in the digital realm demands not only the willingness but also the legal and technical capacity for platforms to unite in action. Lantern is a crucial step forward. It ignites the path to collaboration—fostering a safe, secure, and privacy-conscious environment where the exchange of crucial signals could pivot a child's fate from exploitation to protection.” - Amy Crocker, Head of Child Protection and Technology, ECPAT International

The Lucy Faithfull Foundation

“Child sexual abuse happens on so many different apps, websites and platforms. Right now, people who want to harm children can move freely between them. We congratulate the Tech Coalition on launching project Lantern, an innovative way to try to block this. We encourage tech companies to follow the lead taken by Discord, Google, Mega, Meta, Quora, Roblox, Snap, and Twitch and participate in Lantern to work together to detect harm, prevent abuse and protect children. The more partners that get involved, the further the reach and wider the impact. It is only by working together that we can place a comprehensive online shield around children.” - Deborah Denis, Chief Executive of The Lucy Faithfull Foundation

The National Center for Missing & Exploited Children (NCMEC)

“The National Center for Missing & Exploited Children (NCMEC) applauds the Tech Coalition for the creation of Lantern. We hope that by sharing signals across platforms, the online sexual exploitation of children can be disrupted and more thorough and actionable CyberTipline reports can be made.” - Gavin Portnoy, VP of Communications & Brand, NCMEC