How to prevent child sexual abuse material from spreading online

Many of the websites and platforms that have done so much to democratize free expression around the globe have also, sadly, spurred a rise in harmful and illegal content online, including child sexual abuse material (CSAM).

The internet didn’t create CSAM, but it has given offenders increased opportunity to access, possess, and trade child sexual abuse images and videos, often anonymously and at scale. The National Center for Missing & Exploited Children (NCMEC) has seen a 15,000% increase in abuse files reported in the last 15 years. At the same time, a report from Facebook to NCMEC in fall 2020 found that only six videos accounted for more than half of the reported CSAM content across Facebook and Instagram, indicating that vast networks of users are relentlessly sharing pre-existing CSAM.

While these criminal exchanges were once confined to the darkest reaches of the web, the arrival of social media platforms has unwittingly provided an efficient distribution pipeline. As a result, platforms and law enforcement agencies have struggled to contain the seemingly endless streams of CSAM. Google, Dropbox, Microsoft, Snapchat, TikTok, Twitter, and Verizon Media reported over 900,000 instances on their platforms, while Facebook reported that it removed nearly 5.4 million pieces of content related to child sexual abuse in the fourth quarter of 2020.

Facebook noted that more than 90% of the reported CSAM content on its platforms was the “same as or visually similar to previously reported content,” which is the crux of the problem. Once a piece of CSAM content is uploaded, it spreads like wildfire, with each subsequent incident requiring its own report and its own individual action by authorities and platforms. It’s akin to an endless, unwinnable game of Whac-A-Mole, further complicated by criminal users modifying and distorting images and videos to evade law enforcement tags.
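One reason each altered copy slips through is that exact-match (cryptographic) hashing, the basis of many detection databases, breaks on even trivial edits. The Python sketch below is purely illustrative, with made-up bytes standing in for a media file, and is not any platform’s actual pipeline: flipping a single byte produces a completely different digest, so the modified copy no longer matches the known-bad hash.

```python
import hashlib

# Stand-in bytes for a previously reported file (hypothetical data).
known_file = b"example media bytes" * 1000
known_digest = hashlib.sha256(known_file).hexdigest()

# The same file with a single byte flipped, as an evading uploader might do.
modified = bytearray(known_file)
modified[0] ^= 0x01
modified_digest = hashlib.sha256(bytes(modified)).hexdigest()

print(known_digest == modified_digest)  # False: the copy no longer matches
```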

For victims of abuse, the impact is devastating. While these harmful images and videos are often the only inculpatory evidence of victims’ exploitation and abuse, rampant sharing causes revictimization every time the image of their abuse is viewed. In a 2017 survey led by the Canadian Centre for Child Protection, 67% of abuse survivors said the distribution of their images affects them differently than the hands-on abuse they suffered; the distribution never ends, and the images are permanent.

Some offenders might never “spiral” into consuming more child abuse material in unregulated online spaces if they never encounter it on the major social platforms in the first place. Earlier this year, Facebook launched a feature to deter users from searching for CSAM content on its platform. Surely, if users cannot find this illegal content, it will be harder to spread. But a more surefire way to solve the problem is to prevent the uploading of CSAM in the first place. Digital platforms and law enforcement need technology that can identify all versions of a video, regardless of distortions, and do it in seconds, before the content is published online.
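Perceptual hashing is one widely used family of techniques for this kind of matching: it fingerprints what an image looks like rather than its exact bytes, so near-duplicate copies land within a small Hamming distance of one another even after recompression or mild distortion. The sketch below implements a simple difference hash (dHash) using Pillow; it is a minimal illustration of the general idea, not Pex’s or any platform’s actual system, and the database contents and match threshold are placeholder assumptions.

```python
from PIL import Image

def dhash(image: Image.Image, hash_size: int = 8) -> int:
    """Difference hash: encodes brightness gradients, so small edits
    (resizing, recompression, mild distortion) barely change the bits."""
    # Shrink to (hash_size + 1) x hash_size grayscale pixels.
    small = image.convert("L").resize((hash_size + 1, hash_size), Image.LANCZOS)
    pixels = list(small.getdata())
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | int(left < right)
    return bits

def hamming_distance(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

# Hypothetical upload-time check against a database of known hashes.
KNOWN_HASHES = {0x3C3E1E1E0E0E0606}  # placeholder value, not real data
MATCH_THRESHOLD = 10                 # assumed tolerance for distortions

def is_known_material(upload: Image.Image) -> bool:
    h = dhash(upload)
    return any(hamming_distance(h, known) <= MATCH_THRESHOLD
               for known in KNOWN_HASHES)
```

Production systems typically extend the same principle to video frames and audio, and tune the threshold to balance false positives against deliberate evasion.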

Luckily, solutions exist today to help tackle this problem and related issues. Our organizations, Pex and the Child Rescue Coalition, partnered earlier this year to successfully test Pex’s technology, typically used for copyright management and licensing, to identify and flag CSAM content at the point of upload. Other companies, including Kinzen, which is using machine learning to protect online communities from disinformation and harmful content, and Crisp, which offers a solution to protect children and teenagers from child exploitation groups online, are also aiding in the fight to create a safer internet.

Solutions like these help ensure that victims of child abuse find care, community, and closure on social media and break the cycle of trauma. But as we continue building technologies that connect us to others, we must also prioritize keeping those communities safe. That’s the internet we all deserve.


Glen Pounder serves as the Director of Programs at the Child Rescue Coalition (CRC), a nonprofit organization dedicated to curbing abusive material online. Rasty Turek is the founder and CEO of Pex, the trusted global leader in digital rights technology, enabling the first real-time marketplace for copyrighted content.