In March 2022, the Tech Coalition launched a Video Hash Interoperability Alpha project to better enable companies to match and detect known video Child Sexual Abuse Material (CSAM). 

Background

As more of our online lives center around video consumption, the opportunity for bad actors to share illegal and harmful content has also increased. In 2021, the National Center for Missing & Exploited Children (NCMEC) received over 85 million files of CSAM from around the web, and of those, 44.8 million contained video content. Only 5.1 million of those video files were new content, leaving 39.7 million videos that were likely already known illegal content. Rather than having every platform manually review videos, many companies rely on hash matching to identify video content that contains previously discovered CSAM.

Importance of Hash Matching

Hash matching works by taking a known piece of CSAM content (such as a video) and creating a digital hash - or “fingerprint” - that is unique to that content. These hashes are then stored in secure databases that companies can match new content against. For example, if Company A finds a video with a hash of 123, it can quickly check whether the hash 123 is present in databases of known CSAM. If the hash matches, the company can more swiftly remove and report the content.
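
To make that flow concrete, here is a minimal sketch in Python. SHA-256 stands in for the hashing step purely for illustration - it only matches byte-identical files, whereas production systems use perceptual video hashes that survive re-encoding and minor edits - and the database value below is hypothetical.

```python
import hashlib

# Hypothetical database of hashes of previously identified CSAM.
known_csam_hashes = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(video_bytes: bytes) -> str:
    """Compute a digital "fingerprint" of the content.

    SHA-256 is a stand-in here; real deployments use perceptual video
    hash algorithms rather than a cryptographic hash.
    """
    return hashlib.sha256(video_bytes).hexdigest()

def is_known_csam(video_bytes: bytes) -> bool:
    # A match means the content was previously identified, so the
    # platform can remove and report it without a full manual review.
    return fingerprint(video_bytes) in known_csam_hashes
```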

Though many organizations use hash matching to detect known CSAM in images and videos, there is no universal hashing standard, and tech companies vary widely in the video hash algorithms they use on their platforms. This means that if Company A hashes a video in format 1 and Company B hashes the same video in format 2, databases cannot automatically translate between the two formats.
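
The gap can be illustrated with a short sketch: two different algorithms produce unrelated fingerprints for the exact same bytes, so a hash in one company’s format finds no match in a database keyed by the other’s. SHA-256 and MD5 stand in here for the two proprietary video hash formats.

```python
import hashlib

# The same video bytes exist on both platforms.
video = b"identical video bytes on both platforms"

# Each company fingerprints the video with its own algorithm.
company_a_hash = hashlib.sha256(video).hexdigest()  # "format 1"
company_b_hash = hashlib.md5(video).hexdigest()     # "format 2"

# Company B's database is keyed by format 2 hashes, so Company A's
# format 1 hash finds no match - even though both companies hold a
# fingerprint of the exact same video.
company_b_database = {company_b_hash}
assert company_a_hash not in company_b_database
```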

This is why the Tech Coalition launched the Video Hash Interoperability Alpha project (Alpha project) to enable companies to match and detect known video CSAM, regardless of the hash algorithm used.  

Pilot Overview

The Alpha project was created to enable alternative hash formats to match against NCMEC’s database of known CSAM videos. Two Tech Coalition member companies - Google and Meta - volunteered to collaborate with NCMEC to create an initial solution. The Tech Coalition also partnered with Thorn to aid in the development of the project. 

NCMEC worked with Thorn to rehash over 220,000 known CSAM videos in two formats: YouTube’s CSAI Match format for Google, and a proprietary format used across all Meta products, including Facebook and Instagram. Google and Meta then used these hashes to check their platforms for previously undiscovered videos of known CSAM, and the same hashes can be used to detect known CSAM as it is uploaded in the future.
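
Conceptually, the rehashing step looks like the sketch below: each known video is hashed once per target format, yielding one matchable hash list per platform. Both hash functions are hypothetical placeholders, since the real CSAI Match and Meta algorithms are proprietary.

```python
import hashlib

def hash_csai_match(video_bytes: bytes) -> str:
    # Hypothetical placeholder: the real CSAI Match algorithm is
    # proprietary to Google/YouTube.
    return "csai:" + hashlib.sha256(video_bytes).hexdigest()

def hash_meta_format(video_bytes: bytes) -> str:
    # Hypothetical placeholder: Meta's video hash format is likewise
    # proprietary.
    return "meta:" + hashlib.md5(video_bytes).hexdigest()

def rehash_corpus(videos):
    """Hash each known video once per target format, producing one
    matchable hash list per platform."""
    csai_hashes = [hash_csai_match(v) for v in videos]
    meta_hashes = [hash_meta_format(v) for v in videos]
    return csai_hashes, meta_hashes
```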

During the first six months of the pilot, Google and Meta discovered and reported previously undetected videos of known CSAM on their platforms. Following the pilot, Google added the new hashes to CSAI Match, an offering in the Child Safety toolkit that Google makes available to other companies, such as Adobe and Snap, as well as to NGOs, to increase industry-wide detection of known video CSAM. Meta has also updated its matching algorithms with the new hashes to strengthen its already robust coverage across Facebook, Instagram, and more.

In addition to improving content discovery and enabling more industry members to participate in NCMEC’s voluntary initiatives to detect online CSAM, this project demonstrates the importance of multi-stakeholder collaboration to combat CSAM and related offenses. Through a first-of-its-kind project, the Tech Coalition worked directly with NCMEC to address a substantial gap in known CSAM detection and solidified its partnership with Thorn to develop and scale child safety tools across the industry. The Tech Coalition is grateful for the continued partnership of these organizations.

Next Steps

The Tech Coalition will continue to partner with Thorn to operate the Video Interoperability Program throughout 2023. This program enables Google products, Meta products (Facebook, Instagram), and any organization that uses CSAI Match to access NCMEC’s hashes of known video CSAM.

For companies that do not have a proprietary video hash format, the Tech Coalition is focused on increasing the adoption of shared video hash technologies across its member companies. Our ultimate goal is for every company that supports video content to implement hash matching of known CSAM. Together, we can prevent the resharing of this egregious abuse.

For more information, please contact Kay Chau at the Tech Coalition.