Millions of photos are uploaded to Flickr every day, and with those photos comes a responsibility to keep the platform safe. With a small team, Flickr needed a tool that could help them quickly detect new, previously unknown CSAM.
IN THIS CASE STUDY:
Detect known CSAM by hash matching against our database of 32+ million hashes, the largest database of CSAM hashes available.
Leverage advanced AI/ML classification models to find and flag potentially new, unknown CSAM for review.
Moderate content using Safer’s Review Tool, a user interface built with employee wellness in mind.
Report to NCMEC and RCMP quickly and accurately. Our Reporting Service helps you send high-quality reports.
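At its core, the hash-matching step above is a set-membership check: hash the uploaded file and look it up in a database of known hashes. A minimal Python sketch of that idea, assuming a cryptographic hash for illustration (production systems also use perceptual hashes, which match visually similar copies rather than exact bytes); the function and variable names here are hypothetical:

```python
import hashlib

def file_hash(data: bytes) -> str:
    """Cryptographic hash of the file bytes (hex digest)."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical known-hash database; in practice this holds millions of entries.
known_hashes = {file_hash(b"example-known-image-bytes")}

def matches_known(data: bytes) -> bool:
    """Return True if the upload's hash appears in the known-hash set."""
    return file_hash(data) in known_hashes
```

An exact-byte match like this catches re-uploads of known files; detecting altered or re-encoded copies is why perceptual hashing and ML classification complement it.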