CSAM Detection for Any Platform with an Upload Button

Equip your Trust and Safety team with proactive child sexual abuse material (CSAM) detection solutions built by experts in child safety technology.


Safeguard Your Brand with Comprehensive CSAM Detection

Safer, built by Thorn, is a proactive CSAM detection solution that combines an expansive database of verified CSAM hash values, predictive AI technology, and flexible deployment options to deliver CSAM scanning at scale. Designed for platforms that host user-generated content, Safer helps mitigate risks to your platform and your users.

Our tool offers:

  • A CSAM hash database with 57+ million hash values

  • Advanced predictive AI models to identify new or previously unknown CSAM

  • The option to share hash values of known CSAM with other Safer customers to help minimize the spread of harmful content

  • Self-hosted or Thorn-hosted deployment options to suit your needs

  • A content moderation review module built with employee wellness in mind

  • Ability to connect to your NCMEC or RCMP account from the reporting module

Let's chat about putting Safer to work for your platform.

Get in touch.

The world’s most innovative companies protect their platforms using Safer

Shopify · Bumble · Niantic · OpenAI · GIPHY · Quora

Protect your platform and your users from abuse content at scale.


Hash Matching

Detect known CSAM by matching hashes against our database of 57+ million verified CSAM hashes, the largest such database available.
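For a rough sense of how exact-match hashing works under the hood, here is a minimal sketch (not Safer's actual API): hash each upload and look the digest up in a set of known CSAM hash values. The hash set and helper names below are hypothetical placeholders.

    # Minimal sketch of exact hash matching; known_csam_md5 stands in for a
    # verified hash list such as Safer's database (hypothetical placeholder).
    import hashlib

    known_csam_md5 = {"5d41402abc4b2a76b9719d911017c592"}  # example digests only

    def md5_of_file(path: str) -> str:
        h = hashlib.md5()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    def matches_known_csam(path: str) -> bool:
        """True if the file's MD5 digest appears in the known-hash set."""
        return md5_of_file(path) in known_csam_md5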


Predictive AI for CSAM detection

Leverage advanced AI classifiers to find and score potentially new, unknown image and video CSAM for review.
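As a rough sketch of how classifier scores typically feed a moderation workflow (the threshold, data shape, and queue below are hypothetical, not Safer's implementation), content scoring above a configurable threshold is routed to human review rather than actioned automatically:

    # Hypothetical sketch: route high-scoring uploads to human review.
    from dataclasses import dataclass

    REVIEW_THRESHOLD = 0.8  # assumed tunable per platform policy

    @dataclass
    class ScoredUpload:
        content_id: str
        score: float  # classifier probability that the content is CSAM

    def route(upload: ScoredUpload, review_queue: list) -> None:
        """Queue likely-CSAM uploads for human moderator review."""
        if upload.score >= REVIEW_THRESHOLD:
            review_queue.append(upload.content_id)

    queue: list = []
    route(ScoredUpload("img-123", 0.93), queue)  # queued for review
    route(ScoredUpload("img-456", 0.12), queue)  # below threshold, not queued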


Content Moderation Review Module

Our Review Module gives moderators a dedicated interface for reviewing abuse content, with wellness features built in.


Reporting Module

Report to NCMEC and RCMP by connecting to your account directly from Safer's reporting module. 


"Thorn makes it simple for businesses to set up and operate a robust child safety program. Their Safer tools are designed with flexibility in mind, and Thorn has provided excellent support to our product and engineering teams to ensure our implementation of these tools fits the unique context of our platform. Slack has long relied on Thorn to help keep our services safe in a responsible and privacy-protective way."

Risa Stein, Director of Product Management, Integrity at Slack

SAFER'S HASHING SERVICE INCLUDES:

  • Cryptographic (MD5) hashes for all file types, for exact hash matching

  • Perceptual hashes for images via SaferHash and PhotoDNA

  • Perceptual hashes for videos via scene-sensitive video hashing (SSVH)
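
Unlike cryptographic hashes, perceptual hashes change only slightly when an image is resized or re-encoded, so matches are found by distance rather than exact equality. The sketch below illustrates that idea with a hypothetical bit-length and threshold; it is not how SaferHash or PhotoDNA work internally.

    # Hypothetical sketch of perceptual-hash matching by Hamming distance.
    def hamming_distance(a: int, b: int) -> int:
        """Count differing bits between two same-length perceptual hashes."""
        return bin(a ^ b).count("1")

    MATCH_THRESHOLD = 10  # assumed: max differing bits to call two hashes a match

    def is_near_duplicate(upload_hash: int, known_hash: int) -> bool:
        return hamming_distance(upload_hash, known_hash) <= MATCH_THRESHOLD

    # A re-encoded copy flips a few bits but stays within the threshold.
    print(is_near_duplicate(0b1011011011010010, 0b1011011011010110))  # True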

Safer's self-hosted deployment gives you the features you need for CSAM identification, while providing the data privacy your users expect. You can enforce your community guidelines and CSAM policies while being a champion for child safety. 

If you have an upload button, chances are you have an urgent need to address CSAM at scale.


CSAM is on the rise.

In 2004, the National Center for Missing & Exploited Children (NCMEC) reviewed roughly 450,000 files of suspected child sexual abuse material (CSAM). Fast forward to 2023: NCMEC's CyberTipline received more than 104 million files of potential CSAM from tech companies.

Undetected CSAM is pervasive, and the problem is only growing. File sharing and user-generated content are foundational to the internet as it exists today: anyone can upload and share content widely.

One unintended consequence is how easy it has become for abusers to share child sexual abuse material within communities that flourish on the same platforms we all use every day. Another is that CSAM often spreads widely once posted, recirculating for years, perpetuating the abuse and re-traumatizing victims.

 

Content moderation is complex.

Scanning for this harmful content, identifying it, and enforcing the policies around it falls to content moderators. Given the scale of the issue, it's safe to assume they are overwhelmed.

Trust & Safety teams juggle many priorities: enforcing community guidelines while upholding your privacy policy, scanning for CSAM while protecting user data. With Safer, you don't have to choose between them.

Safer can help you protect your platform with privacy in mind. Our CSAM technology empowers you to be proactive in CSAM identification while providing a secure and flexible tool that puts you in control of how it integrates into your infrastructure and content moderation workflow.

 

CSAM detection built by experts in child safety technology.

We're on a mission to create a safer internet for children, beginning with the elimination of CSAM from every platform with an upload button. Safer was created to help transform the internet by finding and removing child sexual abuse material, defending against revictimization, and diminishing the viral spread of new material.

With a relentless focus on CSAM elimination through advanced AI/ML models, proprietary research, and cutting-edge detection technology, Safer enables businesses to come together to protect children online.

Get in touch.