Pornhub is trialing a new automated tool that pushes CSAM-searchers to seek help for their online behavior. Will it work?
Vast volumes of child sexual abuse photos and videos exist online—millions of pieces are removed from the web every year. These illegal images are found on social media websites, image hosting services, dark web forums, and legal pornography websites. Now a new tool on one of the biggest pornography websites is trying to interrupt people as they search for child sexual abuse material and redirect them to a service where they can get help.
“The scale of the problem is so huge that we really need to try and prevent it happening in the first place,” says Susie Hargreaves, the chief executive of the Internet Watch Foundation, a UK-based nonprofit that removes child sexual abuse content from the web. The IWF is one of two organizations that developed the chatbot being used on Pornhub. “We want the results to be that people don’t look for child sexual abuse. They stop and check their own behavior,” Hargreaves says.
“We realized this needs to be as simple a user journey as possible,” says Dan Sexton, the chief technology officer at the IWF. Sexton explains that the chatbot has been in development for more than 18 months and that multiple groups were involved in its design. The aim is to “divert” or “disrupt” someone who may be looking for child sexual abuse material, and to do so using just a few clicks.