VSCO Uses Safer to Protect Its Platform and Community of Creators from Child Sexual Abuse Material (CSAM)

In addition to helping VSCO prioritize user safety on its platform, Safer’s on-premises deployment unlocked automation and moderation efficiencies for VSCO’s Trust and Safety and content moderation teams.


VSCO’s safety-by-design ethos drove it to seek industry-leading CSAM detection tools.

Since its founding, VSCO has believed in proactively protecting the wellbeing of its global community of 250+ million registered users (or creators), who upload photos and videos to its platform every day.

VSCO wanted to enhance its tooling to prevent the distribution of CSAM. For this, it needed an industry-leading detection tool.

Inside the case study:

  • How Safer's self-hosted deployment provided efficiencies for VSCO’s team
  • How proactive tooling helps VSCO prioritize safety on its platform
  • How Safer’s AI/ML helped VSCO’s small Trust & Safety team make a big impact

Thorn is committed to providing resources and tools to digital platforms to help them combat child sexual abuse and exploitation at scale.

Where can we send your case study?