New Report: Child Sexual Abuse Content and Online Risks to Children on the Rise


Certain online risks to children are on the rise, according to a recent report from Thorn, a technology nonprofit whose mission is to build technology to defend children from sexual abuse. Research shared in the Emerging Online Trends in Child Sexual Abuse 2023 report indicates that minors are increasingly taking and sharing sexual images of themselves. This activity may occur consensually or coercively, as youth also report an increase in risky online interactions with adults.

“In our digitally connected world, child sexual abuse material is easily and increasingly shared on the platforms we use in our daily lives,” said John Starr, VP of Strategic Impact at Thorn. “Harmful interactions between youth and adults are not isolated to the dark corners of the web. As fast as the digital community builds innovative platforms, predators are co-opting these spaces to exploit children and share this egregious content.”

These trends and others shared in the Emerging Online Trends report align with what other child safety organizations are reporting. The National Center for Missing and Exploited Children (NCMEC)'s CyberTipline has seen a 329% increase in child sexual abuse material (CSAM) files reported in the last five years. In 2022 alone, NCMEC received more than 88.3 million CSAM files.

Several factors may be contributing to the increase in reports:

  1. More platforms are deploying tools, such as Thorn’s Safer product, to detect known CSAM using hashing and matching.
  2. Online predators are growing more brazen and are deploying novel technologies, such as chatbots, to scale their enticement efforts. From 2021 to 2022, NCMEC saw an 82% increase in reports of online enticement of children for sexual acts.
  3. Self-generated CSAM (SG-CSAM) is on the rise. From 2021 to 2022 alone, the Internet Watch Foundation noted a 9% rise in SG-CSAM.

This content is a potential risk for every platform that hosts user-generated content, whether that is a single profile picture or an expansive cloud storage space.

Only technology can tackle the scale of this issue

Hashing and matching is one of the most important technologies that tech companies can use to protect their users and platforms from the risks of hosting this content, while also helping to disrupt the viral spread of CSAM and the cycles of revictimization.

Millions of CSAM files are shared online every year. A large portion of these files are copies of previously reported and verified CSAM. Because that content is known and has already been added to an NGO hash list, it can be detected using hashing and matching.

What is hashing and matching?

Put simply, hashing and matching is a programmatic way to detect CSAM and disrupt its spread online. Two types of hashing are commonly used: perceptual and cryptographic. Both convert a file into a string of numbers called a hash value, which acts like a digital fingerprint for each piece of content. A cryptographic hash matches only exact copies of a file, while a perceptual hash can also match copies that are visually similar, such as resized or lightly edited versions.

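To make the distinction concrete, here is a minimal Python sketch of both hashing approaches. It is an illustration only, not Thorn's implementation: the cryptographic hash uses the standard library, while the perceptual hash assumes the third-party Pillow and ImageHash packages, and the file path is hypothetical.

```python
# A minimal sketch of the two hashing approaches described above; this
# is an illustration, not Thorn's implementation. The cryptographic hash
# uses only the standard library; the perceptual hash assumes the
# third-party Pillow and ImageHash packages (pip install Pillow ImageHash).
import hashlib

import imagehash
from PIL import Image


def cryptographic_hash(path: str) -> str:
    """Exact fingerprint: changes completely if even one byte differs."""
    sha256 = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            sha256.update(chunk)
    return sha256.hexdigest()


def perceptual_hash(path: str) -> str:
    """Visual fingerprint: similar images yield similar hash values,
    so resized or lightly edited copies can still be matched."""
    return str(imagehash.phash(Image.open(path)))


if __name__ == "__main__":
    print(cryptographic_hash("example.jpg"))  # hypothetical file path
    print(perceptual_hash("example.jpg"))
```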

To detect CSAM, content is hashed, and the resulting hash values are compared against hash lists of known CSAM. This methodology allows tech companies to identify, block, or remove this illicit content from their platforms.
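
As an example, the matching step might look like the sketch below, which checks a file's cryptographic hash against a set loaded from a hypothetical local hash list; in practice, platforms match against databases maintained by NGOs and services such as Safer.

```python
# A minimal sketch of the matching step, assuming a hypothetical text
# file with one known hash value per line. Real deployments match
# against NGO-maintained databases rather than local files.
import hashlib


def load_hash_list(path: str) -> set[str]:
    """Load known hash values into a set for constant-time lookups."""
    with open(path) as f:
        return {line.strip().lower() for line in f if line.strip()}


def is_known(file_path: str, known_hashes: set[str]) -> bool:
    """Hash an uploaded file and check it against the known list."""
    sha256 = hashlib.sha256()
    with open(file_path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            sha256.update(chunk)
    return sha256.hexdigest() in known_hashes


known = load_hash_list("known_csam_hashes.txt")  # hypothetical file
if is_known("upload.jpg", known):                # hypothetical upload
    print("Match found: block the file and report it (e.g., to NCMEC)")
```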

Expanding the corpus of known CSAM

Hashing and matching is the foundation of CSAM detection. Because detection relies on matching against hash lists of previously reported and verified content, the number of known CSAM hash values a company can match against is critical.

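To illustrate why corpus size matters, the sketch below aggregates several hash lists into a single matching set; the file names are hypothetical, and real hash sharing happens through dedicated services rather than plain text files. The set union automatically deduplicates values that appear on more than one list, so every added list can only grow detection coverage.

```python
# A minimal sketch of aggregating several hash lists into one matching
# corpus. File names are hypothetical; in practice, hash sharing happens
# through services such as Safer and NGO programs, not plain text files.
sources = ["ngo_list_a.txt", "ngo_list_b.txt", "partner_shared.txt"]

corpus: set[str] = set()
for path in sources:
    with open(path) as f:
        # Set union deduplicates hashes that appear on more than one list.
        corpus |= {line.strip().lower() for line in f if line.strip()}

print(f"{len(corpus):,} unique known hash values to match against")
```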

Safer, a tool for proactive CSAM detection built by Thorn, offers access to a large database aggregating 29+ million known CSAM hash values. Safer also enables technology companies to share hash lists with each other (either by name or anonymously), further expanding the corpus of known CSAM, which helps to disrupt its viral spread.

Eliminating CSAM from the internet

To eliminate CSAM from the internet, tech companies and NGOs each have a role to play. “Content-hosting platforms are key partners, and Thorn is committed to empowering the tech industry with tools and resources to combat child sexual abuse at scale,” Starr added. “This is about safeguarding our children. It’s also about helping tech platforms protect their users and themselves from the risks of hosting this content. With the right tools, the internet can be safer.”


In 2022, Safer hashed more than 42.1 billion images and videos for its customers, resulting in 520,000 files of known CSAM being found on their platforms. To date, Safer has helped its customers identify more than two million pieces of CSAM.

The more platforms adopt CSAM detection tools, the better the chance of reversing the alarming rise of child sexual abuse material online.


