What is online child sexual exploitation and abuse (OCSEA)?
OCSEA refers to the use of the internet or communication technologies to facilitate the sexual abuse of children and adolescents. Child sexual exploitation and abuse can take many forms, including grooming, sexual extortion, live-streamed child abuse, perceived first-person (often called "self-generated") content, and child sexual abuse material (CSAM).
While there is no universally agreed-upon legal definition of CSAM, the term generally refers to sexually explicit imagery involving children (commonly known as child pornography). Laws regarding CSAM vary across jurisdictions, so be sure to consult your legal teams for guidance on regulation and compliance.
It is a misconception that children must be users of a website or app for an OCSEA incident to occur there. Many OCSEA harms occur between adults: trading CSAM amongst themselves, hosting chat forums that sexualize children, using CSAM as profile photos, or grooming other adults with the goal of exploiting their children. In addition, children can often evade age assurance tools and tactics and gain access to online spaces meant only for adults.
Examples of OCSEA Harm Types
Child Sexual Abuse Material (CSAM)
CSAM includes still images, videos, and illustrated, computer-generated, or other realistic depictions of a child in a sexually explicit context or engaging in sexually explicit acts, as well as live-streamed broadcasts of such abuse.
Online Grooming
Grooming broadly describes the tactics abusers use to build trust and rapport with a child in order to gain access to that child for the purpose of sexual activity or exploitation. This type of victimization takes place across every kind of platform: social media, messaging apps, gaming platforms, and more. It may include instances in which a child is groomed to take sexually explicit images, to ultimately meet face-to-face with someone for sexual purposes, to engage in a sexual conversation online, or, in some cases, to sell or trade the child's sexual images.
Minor Sexualization
Minor sexualization is the creation or sharing of content (including photos, videos, real-world art, digital content, and verbal depictions) that sexualizes real or non-real children.
Sextortion
Sextortion is a form of child sexual exploitation in which children are threatened or blackmailed, most often with the prospect of publicly sharing nude or sexual images of them, by a person who demands additional sexual content, sexual activity, or money from the child.
Live-streamed child abuse
Live-streamed child abuse, also known as online streaming of child sexual abuse, allows abusers to create child sexual abuse content in real time. It may involve adult offenders who direct the abuse while the acts are streamed live to an audience, or who coerce children into using livestreaming platforms to produce child sexual abuse material.
Trafficking
Child sex trafficking is a form of child abuse that occurs when a child under 18 is advertised, solicited or exploited through a commercial sex act. A commercial sex act is any sex act where something of value – such as money, food, drugs or a place to stay – is given to or received by any person for sexual activity.
Establish operations in order to identify OCSEA
Create external standards that prohibit OCSEA
- An important practice for fighting OCSEA is to incorporate public-facing language that prohibits this behavior into a company's external standards, for example its Terms of Service, Acceptable Use Policies, and/or Community Guidelines.
- For more information, companies can sign up for the Tech Coalition’s Pathways program and check out the External Standards that Prohibit OCSEA resource.
Create internal child safety guidelines that outline how to identify and take action on OCSEA
- To support the fight against OCSEA, companies can create internal guidelines that consolidate definitions and document enforcement procedures. This helps ensure alignment and consistency across teams, and the guidelines can be used by reviewers during content moderation (see the sketch below).
- For more information, companies can sign up for the Tech Coalition’s Pathways program and check out the Writing a Child Safety Content Policy resource.
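To make the idea of internal guidelines concrete, below is a minimal sketch of how a child safety policy taxonomy might be encoded for use by moderation tooling. The harm types echo the definitions above, but the structure, field names, and enforcement actions are illustrative assumptions, not a Tech Coalition template.

```python
# Hypothetical sketch: a machine-readable child safety policy taxonomy that
# pairs each harm type with an internal definition and enforcement guideline.
# Field names and actions are illustrative assumptions, not a standard schema.
POLICY_TAXONOMY = {
    "csam": {
        "definition": (
            "Still images, videos, or realistic depictions of a child in a "
            "sexually explicit context or engaging in sexually explicit acts."
        ),
        "enforcement": ["remove_content", "ban_account", "report_to_authorities"],
    },
    "grooming": {
        "definition": (
            "Tactics used to build trust and rapport with a child in order "
            "to gain access for sexual activity or exploitation."
        ),
        "enforcement": ["escalate_to_specialist_review", "ban_account"],
    },
    "sextortion": {
        "definition": (
            "Threatening or blackmailing a child, most often with nude or "
            "sexual images, to demand more content, sexual activity, or money."
        ),
        "enforcement": ["escalate_to_specialist_review", "report_to_authorities"],
    },
}


def enforcement_actions(harm_type: str) -> list[str]:
    """Look up the documented enforcement actions for a harm type so that
    reviewers and automated tooling apply the same, consistent response."""
    return POLICY_TAXONOMY[harm_type]["enforcement"]


if __name__ == "__main__":
    print(enforcement_actions("sextortion"))
```

Keeping definitions and enforcement actions in one shared structure means human reviewers and automated tooling draw from the same documented source, which supports the consistency goal described above.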
Method for surfacing cases
- User reporting
- Reporting mechanisms must comply with local legal requirements and be made available to users of any company's product(s) to enable reporting or flagging of illegal or harmful content and/or behavior. These reports/flags can then be reviewed against company policies and guidelines. Add child safety reporting options to every surface/page, and consider a Help Center/Support article that provides a means for users to report potential child abuse (see the first sketch after this list).
- Detection
- Current technology solutions, both open-source and commercial, enable companies to detect OCSEA on their platforms, including CSAM distribution, online grooming, and more. The most common industry solutions include hash-matching (such as PhotoDNA), image/video classifiers, text classifiers, keyword lists, and URL blocking (see the second sketch after this list). Implementing detection capabilities may also increase the need for human review, so plan for moderation team capacity where necessary.
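As a concrete illustration of a user reporting flow, here is a minimal sketch of report intake and routing. All field names, categories, and queue names are hypothetical assumptions for illustration; real implementations must also satisfy local legal requirements around reporting and data handling.

```python
# Minimal sketch of a user-report intake flow. Categories, field names, and
# queue names are hypothetical illustrations, not an industry standard.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum


class ReportCategory(Enum):
    SPAM = "spam"
    HARASSMENT = "harassment"
    CHILD_SAFETY = "child_safety"  # dedicated child safety reporting option


@dataclass
class UserReport:
    reporter_id: str
    reported_content_id: str
    category: ReportCategory
    details: str = ""
    created_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )


def route_report(report: UserReport) -> str:
    """Route an incoming report to a review queue.

    Child safety reports go to a dedicated, higher-priority queue so they
    can be reviewed promptly against the company's child safety guidelines
    (and escalated to authorities where legally required).
    """
    if report.category is ReportCategory.CHILD_SAFETY:
        return "queue:child_safety_priority"
    return "queue:general_review"


if __name__ == "__main__":
    report = UserReport(
        reporter_id="user_123",
        reported_content_id="content_456",
        category=ReportCategory.CHILD_SAFETY,
        details="Profile photo appears to depict a minor.",
    )
    print(route_report(report))  # -> queue:child_safety_priority
```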
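And here is a hedged sketch of the hash-matching pattern mentioned above. PhotoDNA itself is licensed through Microsoft and its API is not shown here; this example substitutes the open-source imagehash perceptual hash purely to illustrate matching uploads against a list of known hashes. The hash list and threshold are placeholders: in practice, verified hash lists come from vetted organizations (e.g., NCMEC) under strict access programs.

```python
# Hedged sketch of hash-matching. PhotoDNA is licensed and its API is not
# shown; the open-source `imagehash` perceptual hash stands in purely to
# illustrate the compare-against-a-known-list pattern.
# Requires: pip install Pillow imagehash
from PIL import Image
import imagehash

# Placeholder stand-in for a vetted hash list. Real known-CSAM hash lists are
# obtained from organizations such as NCMEC under strict access programs,
# never hard-coded like this.
KNOWN_HASHES = [
    imagehash.hex_to_hash("d1c1b2a2e4f0c8c8"),  # placeholder entry
]

# Illustrative distance threshold; production systems tune this carefully to
# balance false positives against missed matches.
HAMMING_THRESHOLD = 4


def matches_known_hash(image_path: str) -> bool:
    """Return True if the image's perceptual hash falls within the Hamming
    distance threshold of any entry on the hash list."""
    candidate = imagehash.phash(Image.open(image_path))
    return any(candidate - known <= HAMMING_THRESHOLD for known in KNOWN_HASHES)


if __name__ == "__main__":
    if matches_known_hash("upload.jpg"):
        # A hit would typically trigger removal, human review, and (where
        # required by law) a report to the relevant authority.
        print("Potential match: escalate to child safety review queue")
```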