The Technology Coalition Annual Report

Dear Friends,

In an increasingly digital world, the technology industry bears a special responsibility to ensure that its platforms are not used to facilitate the sexual abuse and exploitation of children. Thankfully, as you will see in the content that follows, industry takes this responsibility seriously.

The Technology Coalition was founded by industry to provide member companies with a forum for the collaboration necessary to keep their platforms free of such abuse. Last year, with the launch of Project Protect, the Coalition committed to a bold plan to accelerate measures to keep children safe online. In this report, I am proud to share our progress to date.

Whether you are a policy maker, a civil society organization, a survivor, or another member of industry, I hope that this report will provide you with a sense of the momentum that is building within the Technology Coalition. Over the last 12 months, the member companies have completely transformed the governance, vision, strategy, structure and capabilities of the Coalition. Much good work has been done, and still we know that there remains more to do.

We are resolved to continue to work together with all stakeholders to keep children safe. We are resolved to drive forward the improvements in technology and systems that will ultimately eradicate the online sexual abuse and exploitation of children on our platforms.

I am grateful for the hard work of our Members that has brought us to where we are, and I look forward to all the progress that is yet to come.

Forward,

Sean Litton
Executive Director

Who We Are


The Technology Coalition is a global alliance of leading technology firms that have come together to combat online sexual abuse and exploitation of children. Because member companies have the same goals and face many of the same challenges, we know that collaborating to develop and scale solutions offers the most promising path towards a global solution to the problem.

A Growing, Diverse Membership

Year-over-Year Membership Increase: 33%

Member Services Offered

  • Game Hosting: 21%
  • Live Streaming: 42%
  • Private File Storage: 25%
  • Social Media Offerings: 50%
  • Text-Based Content: 21%
  • Web Hosting Services: 16%
  • Education: 13%
  • Infrastructure and Cloud Services: 38%

We are proud to have added new members in the first half of 2021.

What We Do


The Technology Coalition is the place where the global tech industry comes together to build the tools and advance the programs that protect children from online sexual exploitation and abuse. Each member directly contributes to the operations of the Coalition through collaborative working groups that are focused on solving the biggest challenges industry faces with respect to this issue.

One year ago, we launched Project Protect, an ambitious five-pillar plan to combat online child sexual exploitation and abuse. Whether it is funding independent research to inform industry priorities, developing new technology, or sharing hard-won knowledge and effective practice, each pillar of Project Protect is designed to generate objective outcomes that will accelerate industry progress in the fight against online child sexual exploitation and abuse.

Thanks to the hard work of our Member Working Groups, we are pleased to share our progress to date:

Independent Research: Fund high-impact research that informs improvements in technology, policy, & practice


  • Invested $1M to establish the Safe Online Research Fund in partnership with End Violence Against Children (EVAC)
  • Received 128 proposals from 55 countries via a request for proposals that closed in April
  • Currently working with our independent Advisory Group of experts to select the grantees and will announce them in the coming months

Tech Innovation: Accelerate the development and adoption of new technology


  • Committed $1.25 million to a Tech Innovation partnership with THORN
  • Currently pursuing two projects that will enable industry to more quickly identify, report and take down abusive content:
    • Universal Video Hashing
    • Improving Cross-Industry Rapid Response Capabilities

Information and Knowledge-Sharing: Expedite the adoption of current technologies & effective practices across industry


  • Leverage the resources and experience of our Members to help one another wherever needed
  • Routinely host Member webinars where leading experts educate our Members on proven technology, successful practices, and operations, including:
    • New Tech
    • Effective Practice
    • Research Results
    • Policy Briefings
    • Global Experts
    • Current Issues
  • Built a resource library, piloted training programs, and are compiling common engineering resources, including:
    • Starter Kit
    • Transparency Guide
    • Sample Child Safety Policies
    • Safety By Design Considerations
    • Successful Practices for CyberTip Reporting
    • Existing Technology Resources and Engineering Approaches
“Cloudflare has benefited from our membership with the Technology Coalition by having access to an invaluable open environment where we can learn and contribute in our efforts to fight the spread of CSAM.”
Cloudflare

Collective Action: Facilitate aligned efforts to tackle the problem


  • Because combatting Child Sexual Abuse Material (CSAM) requires a whole-of-society approach, the Technology Coalition hosts authentic dialogues between industry and external stakeholders to identify practical steps we can take together on the most pressing challenges
  • In 2021, we hosted 3 forums with 238 participants from 29 countries representing industry, policymakers, regulators, law enforcement, civil society, and survivor advocacy groups
“At Microsoft, we believe digital safety is a shared responsibility requiring a whole-of-society approach. This means that the private sector, academic researchers, civil society, and governmental and intergovernmental actors all work together to address challenges that are too complex – and too important – for any one group to tackle alone. The Technology Coalition has played a key role in enabling this approach through industry collaboration and multi-stakeholder forums focused on challenging and novel topics such as youth-produced imagery.”
Microsoft

Transparency and Accountability: Drive greater transparency and accountability across industry with respect to Child Sexual Abuse Material (CSAM)


  • Developed a Transparency Guide to support member companies in their transparency journey
  • Earlier this year, we surveyed all of our Members about their policies and practices for combatting CSAM
  • We are publishing the results of that survey in this report and they will also be included in WeProtect’s upcoming Global Threat Assessment
  • 71% of Technology Coalition Members either already publish transparency reports regularly or will begin doing so this year. Members report on a variety of metrics, such as the number of accounts terminated, reports to national authorities, law enforcement requests, source of first detection, and the number of pieces of CSAM removed. Links to a sample of these reports are included below:

Our Partners

  • WeProtect Global Alliance

Where Our Members Stand Today


As part of Project Protect, the Technology Coalition Members pledged to increase transparency and accountability in their efforts to combat online sexual exploitation and abuse of children. The following section is our first step towards providing that transparency, and is based on the National Center for Missing & Exploited Children’s (NCMEC) data on the sources of CyberTipline® Reports, as well as the Technology Coalition’s Member Survey.

In 2020, Member companies provided 98% of all reports to the NCMEC CyberTipline®
From 2019 to 2020, the number of these reports submitted by Members increased by 29%

Technologies such as PhotoDNA enable tech companies and organizations like NCMEC to assign unique alphanumeric identifiers (“hashes”) to images of known Child Sexual Abuse Material (CSAM). These hashes of known CSAM images are compiled and used by industry to detect further attempts to upload known CSAM to their platforms and to report, remove, or block upload of those images. The hash-based technology for detecting still images is mature and operates on a common hash format used across industry. Hash-based video detection is less developed, and there is not yet an industry-standard hash format.
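As a high-level illustration of how this kind of detection works, the sketch below (in Python) computes the MD5 hash of an uploaded file and checks it against a compiled list of known hashes. MD5 is shown only because it is one of the exact-match hash types named in this report; perceptual hashes such as PhotoDNA are proprietary and far more robust to image alterations, and every function name and file format here is a hypothetical placeholder rather than any Member's actual system.

    # Minimal sketch of exact-match hash detection. Assumptions: MD5 hashes
    # and a newline-delimited text file of known hash values; all names are
    # hypothetical placeholders.
    import hashlib

    def load_known_hashes(path: str) -> set:
        # Load a newline-delimited list of known hash values into a set.
        with open(path, "r", encoding="utf-8") as f:
            return {line.strip().lower() for line in f if line.strip()}

    def matches_known_hash(file_bytes: bytes, known_hashes: set) -> bool:
        # Compute the exact MD5 of the uploaded bytes and look it up.
        return hashlib.md5(file_bytes).hexdigest() in known_hashes

    def handle_upload(file_bytes: bytes, known_hashes: set) -> str:
        # Return an action string: block (and report) a match, otherwise allow.
        if matches_known_hash(file_bytes, known_hashes):
            return "block_and_report"
        return "allow"

Checking uploads against compiled hash lists in this way is what allows platforms to report, remove, or block known images, and it is why a common hash format across industry matters.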

All Technology Coalition Members currently deploy or are in the process of implementing hash-based technology that detects known CSAM images. PhotoDNA remains the most popular image hash-based detection tool for members, used by 75% of companies, with MD5 second most prevalent (38%). 54% of Members currently deploy video hash-based detection technology, with CSAI Match (33%), PhotoDNA for Video (25%) and MD5 (25%) being the most commonly used video hash-based detection tools.

38% of Technology Coalition member companies contribute hashes or keywords to NCMEC’s industry hash repository, and 8% contribute to the Canadian Centre for Child Protection’s Project Arachnid.

Implementation of Hash-Based Detection
“We want Pinterest to be a place for inspiration and that means we need to be deliberate about creating a safe and positive space for our users. In 2020, we continued to focus on our internal tooling detection mechanisms -- including PhotoDNA and machine learning tools -- to ensure consistent and effective enforcement with new feature launches.”
Pinterest
Additional Means of Detection, Reporting & Deterrence

While not all tools are relevant to all platforms (for example, video detection tools are not used on platforms that do not host video), Technology Coalition Members deploy a variety of additional tools and strategies to detect online sexual exploitation and abuse of children, to facilitate reporting, and to deter potential offenders.

Classifiers are algorithms that attempt to flag potential (new or unhashed) CSAM for categorization and/or human review.
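As a simplified illustration of that flagging flow, the sketch below (again in Python) routes content whose classifier score exceeds a threshold into a human-review queue. The classifier itself, the 0.8 threshold, and every name here are hypothetical placeholders, not any Member's actual system.

    # Illustrative only: route content to human review based on a classifier
    # score. The score source and the threshold are assumed placeholders.
    from dataclasses import dataclass

    @dataclass
    class ReviewDecision:
        needs_human_review: bool
        score: float

    def route_for_review(score: float, threshold: float = 0.8) -> ReviewDecision:
        # `score` is the probability-like output of a text, image, video, or
        # livestream classifier; high-scoring content is queued for trained
        # reviewers rather than being actioned automatically.
        return ReviewDecision(needs_human_review=score >= threshold, score=score)

    # Hypothetical usage with a placeholder classifier output:
    decision = route_for_review(score=0.93)
    if decision.needs_human_review:
        print("Queued for human review, score = %.2f" % decision.score)

The threshold reflects the fact that classifier output is probabilistic, which is why the definition above says classifiers “attempt to flag” content for categorization and/or human review rather than making final determinations.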

Detection
Members deploy classifiers:
  • Text-Based Classifiers
  • Image-Based Classifiers
  • Video-Based Classifiers
  • Classifiers in Livestream Contexts
  • Classifiers to Identify Grooming or Predatory Behavior
“In 2020, the Content Safety API, a tool that helps organizations better prioritize potentially abusive content for human review, was used by our trusted partners to classify more than 2 billion images. In the last year, we ramped up adoption of this tool and others, like the YouTube CSAI Match API and its video hash matching capabilities, which we offer to qualifying organizations and companies, including interested Technology Coalition members.”
Google

Reporting
In addition to standard reporting measures, Members facilitate:
  • Direct User Reports of abusive content or behavior on their platform
  • Specialist Reporting Path for Law Enforcement
  • Specialist Reporting Path for NGOs
“Being a part of the Tech Coalition is incredibly valuable. It has given Discord the ability to exchange knowledge, connect with members going through the same struggles as us, and brainstorm strategies to address abuse with working groups and new partnerships.”
Discord

Deterrence and Prevention
Members provide:
  • Deterrence Messaging for Potential Offenders
  • Online Safety Resources
  • Parental Control Tools
“As a startup committed to preventing child exploitation on our platform, the resources and connections provided by the Tech Coalition are invaluable. Whether it is access to providers of hash matching databases or best practices for detecting abusive accounts, our membership means that we can collectively learn from the challenges that other companies are facing. This means more effective trust and safety procedures that perpetually evolve and improve and a better online experience for everyone.”
Clubhouse

Our Members’ focused investment in identification and reporting capabilities is driving measurable results. In 2020, the National Center for Missing & Exploited Children (NCMEC) received 21.7 million CyberTipline reports of suspected child sexual exploitation. Technology Coalition member companies accounted for 98% of those reports. From 2019 to 2020, the number of reports submitted by Members increased by 29%, and overall CyberTipline reporting increased by 28%. NCMEC stated that the increased volume of reports could be “indicative of a variety of things, including larger numbers of users on a platform or how robust an Electronic Service Provider’s (ESP) efforts are to identify and remove abusive content from their platforms.”

Source: 2020 NCMEC CyberTipline Data

Disclaimer: Transparency data is based on reports confirmed through the US-based National Center for Missing & Exploited Children, which provides the single most significant representation of global CSAM reporting. In the future, we hope to expand our analysis to include reporting to additional organizations to better represent global trends.


Looking Ahead


In the coming year, the Technology Coalition will continue to drive forward the ambitious goals of Project Protect and increase the tech industry's collective capacity to protect children from online sexual exploitation and abuse. We understand that no single company, organization, or institution can solve this problem alone, and we look forward to adding new members and building new partnerships in the months ahead. Thank you for working with us to keep children safe online.

Testimonials


“Facebook conducted an in-depth analysis of the illegal child exploitative content we reported to the NCMEC in October and November of 2020. We found that more than 90% of this content was the same as or visually similar to previously reported content. And copies of just six videos were responsible for more than half of the child exploitative content we reported in that time period.”
Facebook
“At Adobe, we are deeply committed to combating child sex abuse online and collaborating with others in the industry through the Technology Coalition to fight this issue at scale. We leverage industry research and insights from the Technology Coalition to further advance our online safety program and are proud to drive innovation in this area by partnering with the safety community.”
Adobe
“Mega has zero tolerance for CSAM but faces increased prevalence involving multiple platforms. We appreciate the support provided by the Technology Coalition to face this global problem.”
Mega
“We want Pinterest to be a place for inspiration and that means we need to be deliberate about creating a safe and positive space for our users. In 2020, we continued to focus on our internal tooling detection mechanisms -- including PhotoDNA and machine learning tools -- to ensure consistent and effective enforcement with new feature launches.”
Pinterest
“At TikTok, we understand we're at our strongest when we work together, and the Technology Coalition has been an invaluable partner in the fight against child sexual abuse as it provides a much needed forum to collaborate with companies of all sizes to share resources, technology, and best practices. The Technology Coalition has provided a necessary, dedicated space to design strategies against an ever-evolving threat across the industry.”
TikTok
“We believe that information sharing, learning and discussion is an essential part of creating safe online environments. Being able to participate in panels, discussions, and learning opportunities presented by the Technology Coalition alongside other tech companies is invaluable to all of our efforts to combat OSEAC as effectively as possible.”
Twitch
“As the smallest company member of the Technology Coalition, we benefit from the experience of the other members and learn how the biggest platforms structure their moderation system. We also are eager to propose our innovative solutions that could be used in the sector to continue to fight together OSEAC.”
Yubo