VSCO Uses Safer to Protect Its Platform and Community of Creators from Child Sexual Abuse Material (CSAM)
In addition to helping VSCO prioritize user safety on its platform, Safer’s self-hosted deployment unlocked automated solutions and moderation efficiencies for its Trust & Safety and content moderation teams.
VSCO’s Safety by Design ethos drove the company to seek industry-leading CSAM detection tools.
Since its founding, VSCO has believed in proactively protecting the wellbeing of its global community of more than 250 million registered users, whom it calls creators, who upload photos and videos to its platform every day.
VSCO wanted to enhance its tooling to prevent the distribution of CSAM, and for that it needed an industry-leading detection tool.
Inside the case study:
- How Safer's self-hosted deployment provided efficiencies for VSCO’s team
- How proactive tooling helps VSCO prioritize safety on its platform
- How Safer’s AI/ML helped VSCO’s small Trust & Safety team make a big impact

“Our partnership with Thorn complements the expertise of our internal team by giving us access to their researchers, data scientists and more. Using tools built by Thorn gives us confidence—and credibility—in what we’re doing and really reinforces our commitment to Safety by Design.”