Reports of child sexual abuse material (CSAM) have skyrocketed by over 15,000% in the last 15 years, with over 64 million files of CSAM reported in 2020 alone.
With Safer on the AWS Marketplace, you now have flexible, easy access to best-in-class technology to proactively detect and remove CSAM from your platform.
Keep your platform safe and help eliminate CSAM from the internet
Now you can say yes to solving one of the world’s most difficult and pressing challenges, all while building the safe platform your community deserves.
Remove the threat of known CSAM on your platform with the power of Safer hash matching, fueled by our growing collaborative dataset of over 10 million hashes.
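To illustrate the idea behind hash matching in general terms: a file's digest is computed and checked against a set of known hashes. This is a minimal conceptual sketch only, not Safer's actual API; the function name and the sample hash set are invented for illustration, and the real dataset also includes perceptual hashes that tolerate small alterations to an image.

```python
import hashlib

# Hypothetical set of known hashes (Safer's real dataset holds
# over 10 million entries; this single value is for illustration
# and is simply the SHA-256 digest of the bytes b"test").
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def matches_known_hash(content: bytes) -> bool:
    """Return True if the content's SHA-256 digest appears in the known set."""
    digest = hashlib.sha256(content).hexdigest()
    return digest in KNOWN_HASHES
```

Cryptographic matching like this catches exact copies of known files; catching re-encoded or slightly edited copies is where perceptual matching comes in.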
With Thorn’s CSAM Classifier integrated into Safer, customers can now detect potentially new, unknown CSAM, as well as improve mitigation of false positives and streamline perceptual matches.
Safer detects abuse imagery as well as video content, keeping your platform ahead of the curve as the number of CSAM video files reported to NCMEC continues to grow.
Reporting API: Leverage intelligence on CSAM flagged by Safer’s detection services to streamline reporting and removal in line with company policy and internal processes.