GetSaferio


Safer is a complete solution to help stop child sexual abuse material from spreading across your platform, keeping you, your company, and your users safer.

Social Audience (getsafer.io, last month): 223
  • Moz DA 27
Categories
  • Education
  • Technology & Computing
  • Computing
  • Video Gaming
Highlights
The challenge of detecting CSAM videos and what we can do about it today

2019 was the first year that the number of videos reported to the NCMEC CyberTipline by technology companies exceeded the number of images, indicating that the rising popularity of video among internet users has also carried over to the distribution of child abuse content. While cryptographic hashing can be used to find and de-duplicate these videos, there have been very few perceptual hashing options for identifying similar videos. At Thorn, we spent part of last year designing, testing, and deploying algorithms to reliably detect CSAM at scale in order to stop re-victimization and help mitigate content reviewers' continued exposure to traumatic content. As a non-profit that also builds technology, we understand the technical challenges of effective perceptual hashing for video, one of which is the sheer number of ways to approach it, including:

  • Framewise video hashes (e.g., computing image hashes from video frames at some regular interval; sketched below)

In this blog, we share our thought process for evaluating the various methods, open a dialogue in the child safety community about how to tackle this problem, and build a community around growing our capabilities together.
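To make the framewise idea concrete, here is a minimal sketch, not Thorn's production system, of sampling frames from a video at a fixed interval and computing a perceptual hash per frame. It assumes OpenCV, Pillow, and the imagehash library are available, and the one-second sampling interval is an arbitrary choice for illustration.

```python
# Minimal, illustrative framewise video hashing: sample roughly one frame per second
# and compute a perceptual hash (pHash) for each sampled frame.
import cv2
import imagehash
from PIL import Image

def framewise_phashes(video_path, interval_seconds=1.0):
    """Return a list of (timestamp_seconds, perceptual_hash) pairs for sampled frames."""
    capture = cv2.VideoCapture(video_path)
    fps = capture.get(cv2.CAP_PROP_FPS) or 30.0   # fall back if FPS metadata is missing
    step = max(int(round(fps * interval_seconds)), 1)

    hashes = []
    frame_index = 0
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        if frame_index % step == 0:
            # OpenCV yields BGR arrays; convert to RGB for PIL before hashing.
            rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
            hashes.append((frame_index / fps, imagehash.phash(Image.fromarray(rgb))))
        frame_index += 1

    capture.release()
    return hashes
```

Two videos could then be compared by aligning their hash sequences and counting frames whose hashes fall within a small Hamming distance, one of the many design decisions such a system has to make.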

Safer for AWS Marketplace


Common Terms and Definitions

Child sexual abuse material is material that depicts a child engaged in real or simulated sexual activity, or the sexual parts of a child, primarily for sexual purposes and often for distribution within communities that are built specifically to normalize and request the explicit sexual abuse of children. CSAM may also be referred to as child sexual abuse imagery (CSAI) or child sexual exploitation (CSE). In the context of hashing and identifying CSAM, false positives are images that were incorrectly identified as CSAM because they happened to match the perceptual hash value of a known CSAM image. Hashes in SaferList may represent content that is CSAM, content that is otherwise sexually exploitative (SE) of children, or false positives that are known to trigger false matches against actual SE and CSAM hashes.
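As an illustration of why such false positives arise, here is a minimal sketch under the assumption that perceptual hashes are compared by Hamming distance against a threshold; the 64-bit value, the threshold of 6, and the list contents below are all hypothetical.

```python
# Illustrative only: perceptual hashes are typically compared by Hamming distance,
# so an unrelated image whose hash lands within the threshold becomes a false positive.
KNOWN_HASHES = {0xA3F00C91D2E47B16}   # hypothetical 64-bit perceptual hash list
MATCH_THRESHOLD = 6                   # hypothetical tolerance in differing bits

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit hash values."""
    return bin(a ^ b).count("1")

def matches_known_hash(candidate: int) -> bool:
    """True if the candidate hash is within the threshold of any known hash."""
    return any(hamming_distance(candidate, known) <= MATCH_THRESHOLD
               for known in KNOWN_HASHES)

# A benign image can still hash to something within the threshold of a known hash;
# that near-collision is the kind of false positive SaferList records.
```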

CSAM Keyword Hub

In partnership with the Technology Coalition, Thorn has developed an API containing child sexual abuse material (CSAM) terms and phrases in multiple languages to improve your content moderation process. Technology companies with a chat function that are seeking to protect children from online predation on their platforms can apply for cost-free use of the Keyword Hub. Rather than using the Keyword Hub to strictly block specific keywords that match the list, the strong preference is to use the list to kickstart the training of machine learning models. Anyone interested in accessing the CSAM Keyword Hub must complete the application on this page, share their intent of use, and agree to the terms and conditions.
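To illustrate the preferred use, here is a minimal sketch of turning a keyword list into weak labels that seed a text classifier rather than a hard blocklist; the placeholder strings stand in for actual Keyword Hub terms, and scikit-learn stands in for whatever modeling stack a platform actually uses.

```python
# Illustrative only: use a keyword list to weakly label chat messages,
# then train a classifier that can generalize beyond exact keyword matches.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

KEYWORDS = {"placeholder-term-1", "placeholder-term-2"}  # stand-ins for Keyword Hub entries

def weak_label(message: str) -> int:
    """1 if any listed keyword appears in the message, else 0 (noisy seed labels)."""
    text = message.lower()
    return int(any(keyword in text for keyword in KEYWORDS))

def train_seed_classifier(messages):
    """Train a simple TF-IDF + logistic regression model on weakly labeled messages."""
    labels = [weak_label(m) for m in messages]
    model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                          LogisticRegression(max_iter=1000))
    model.fit(messages, labels)
    return model

# In practice the weak labels would be reviewed and refined by human moderators
# before any model is trusted in a moderation queue.
```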
