Flickr Uses Safer’s Image Classifier to Expand Their Detection of Child Sexual Abuse Material (CSAM)


Flickr's small Trust & Safety team used AI to make a big impact.

Millions of photos are uploaded to Flickr every day, and with those photos comes a responsibility to keep the platform safe. With a small Trust & Safety team, Flickr needed a tool that could quickly detect new and previously unknown CSAM.


IN THIS CASE STUDY:

  • How Flickr leveraged Safer’s CSAM Classifier, a machine learning classification model, to accelerate its detection of new or previously unknown CSAM
  • How Safer’s AI/ML helped Flickr’s small Trust & Safety team make a big impact
  • How Safer and Flickr are working together to improve cross-platform intelligence

Get the full case study to see how many classifier hits Flickr had in 2022 alone.


THE WORLD’S MOST INNOVATIVE COMPANIES PROTECT THEIR PLATFORMS USING SAFER

Shopify
Bumble
Niantic
OpenAI
Flickr
Quora

Protect your platform and your users from abuse content at scale.


Hash Matching

Detect known CSAM by hash matching against our database of 32+ million hashes, the largest database of CSAM hashes available.
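Hash matching works by computing a fingerprint of each uploaded file and checking it against a set of fingerprints of known material. As a minimal illustration only: the sketch below uses a cryptographic hash (SHA-256), which matches exact byte-for-byte copies, whereas production systems like Safer typically rely on perceptual hashes that also tolerate re-encoding and resizing. The function name and sample values are hypothetical, not part of Safer's API.

```python
import hashlib

# Stand-in for a known-hash database (real deployments match against
# databases of millions of hashes; these bytes are placeholders).
known_hashes = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def matches_known_hash(image_bytes: bytes) -> bool:
    """Return True if this file's hash appears in the known-hash set."""
    return hashlib.sha256(image_bytes).hexdigest() in known_hashes

print(matches_known_hash(b"example-known-image-bytes"))  # True
print(matches_known_hash(b"some-benign-upload"))         # False
```

Because set lookup is constant-time, this check scales to millions of uploads per day; the hard part in practice is maintaining and sharing the hash database itself.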


CSAM Image and Video Classifiers

Leverage advanced AI/ML classification models to find and flag potentially new, unknown CSAM for review.


Review Tool

Moderate content using Safer’s Review Tool, a user interface built with employee wellness in mind.


Reporting API

Report to NCMEC and the RCMP quickly and accurately. Our Reporting Service supports you in sending quality reports.


With Safer’s CSAM Classifier, we’re finding content that we would not have known existed. What you don’t know is what’s going to hurt you in this line of work. 

We don’t have a million bodies to throw at this problem, so having the right tooling is really important for us. The Classifier enables us to be proactive. We’re not just keeping our platform safe, we’re protecting it and making it a safe, equitable platform.
  

—JACE POMALES, TRUST AND SAFETY MANAGER, FLICKR