
Pornhub, OnlyFans, and Meta Join New Sextortion Prevention Platform

'Take It Down,' a new initiative by the National Center for Missing and Exploited Children, generates a hash from non-consensual images of minors without requiring victims to upload anything.
Photo by Tatiana Syrikova on Pexels

Two of the biggest adult content platforms on the internet have partnered with the National Center for Missing and Exploited Children on a new service that helps minors report their own non-consensual imagery—and get it taken offline.

Take It Down was developed by NCMEC with technical and financial assistance from Meta. Participating companies at launch include Pornhub, OnlyFans, and Yubo, a French social networking app popular with teenagers.

“Having a personal intimate image shared with others can be scary and overwhelming, especially for young people,” Antigone Davis, Meta’s global head of safety, wrote in a Meta newsroom announcement on Monday. “It can feel even worse when someone tries to use those images as a threat for additional images, sexual contact or money—a crime known as sextortion.”

Gavin Portnoy, vice president of communications at NCMEC, told Motherboard that since 2015, NCMEC has seen “an explosion” in online enticement reports to the CyberTipline, the organization’s national centralized reporting system for child sexual abuse material. “Within this subset of reports has been an alarming growth of cases of sextortion,” Portnoy said. “Sextortion is a form of child sexual exploitation where children are threatened or blackmailed, most often with the possibility of sharing with the public nude or sexual images of them, by a person who demands additional sexual content, sexual activity or money from the child. This crime may happen when a child has shared an image with someone they thought they knew or trusted, but in many cases they are targeted by an individual they met online who obtained a sexual image from the child through deceit, coercion, or some other method.”

The Take It Down service is meant to empower minors, specifically, to start the process of removing abusive imagery from the internet. It assigns a unique hash value (an alphanumeric string that serves as an image file’s “fingerprint”) to nude, partially nude, or sexually explicit images or videos of people under the age of 18. To use the service, a victim, even if they’re a minor themselves, can select an image on their device that they want reported: for example, a nude they shared with a friend who then posted it to Instagram without their consent. Without the image ever leaving their device, the victim can use Take It Down to generate a hash that’s then added to a new list maintained by NCMEC and shared with participating online platforms. Those platforms then voluntarily scan their own services for hashes that match, and review the matches “in order to reduce the circulation or upload of sexually explicit or exploitative content depicting minors,” Portnoy said.
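NCMEC hasn’t published the full technical details of its hashing scheme here, but the general pattern is straightforward: the fingerprint is computed locally, and only that fingerprint is ever transmitted. As a rough illustration only, this minimal Python sketch (with a hypothetical file name) computes an exact-match SHA-256 fingerprint on the device:

    import hashlib

    def fingerprint_file(path: str) -> str:
        """Hash a file locally; only the digest, never the image, is shared."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            # Stream the file in chunks so large videos don't need to fit in memory.
            for chunk in iter(lambda: f.read(8192), b""):
                h.update(chunk)
        return h.hexdigest()

    # Hypothetical usage: only this hex string would ever leave the device.
    print(fingerprint_file("image.jpg"))

One caveat with an exact cryptographic hash like SHA-256 is that the digest changes completely if an image is resized or re-encoded, which is why matching systems in this space often also rely on perceptual hashes, such as Meta’s open-source PDQ, that tolerate such alterations.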

In 2017, Meta (then still Facebook) launched a similar initiative: targets of non-consensual image sharing could send one of Facebook’s partner organizations the very images they dreaded seeing on Facebook, Instagram, or Messenger, and Facebook would then create a hash and remove copies of the image anywhere it appeared on the platforms it owns. Facebook promised that the process was secure, and added that humans would review the content. But understandably, asking victims of abuse and harassment to share their own nudes in order to prevent more nudes from being shared was an absurd proposition.

The new Take It Down platform is less invasive, but still requires victims to have the images on their own devices. If someone recorded them without their consent, for example, they’d have to seek out and download that content to their own device before creating a hash on Take It Down—something that can be retraumatizing for victims.

The proliferation of underage sexual abuse material has plagued Meta for years. In 2021, the company disclosed that Meta’s platforms, including Instagram, WhatsApp, and Facebook, made 20,307,216 reports of child exploitative content to NCMEC in 2020, more than any other company by a large margin.

Pornhub and OnlyFans have come under attack in recent years from anti-porn organizations over allegations of underage imagery on their sites. Most recently, 11 states, including Louisiana and Arkansas, have either introduced or passed legislation that requires adult sites to collect identifying information from users.

In late 2020, following a New York Times report alleging child sexual abuse material on the site, Pornhub overhauled its moderation practices, purging millions of videos uploaded by unverified users and banning all future unverified uploads, a move supported by advocacy groups working to prevent non-consensual intimate imagery.

“NCMEC plays a pivotal role in the protection of children online, and the Take It Down project is yet another positive step in preventing the circulation of abhorrent images on the internet,” a spokesperson for MindGeek, Pornhub’s parent company, told Motherboard. “As leaders in online trust and safety, MindGeek is pleased to participate in this initiative, raising the bar for online safety standards. Take it Down is an important measure to add to our industry-leading list of policies and protections, which include mandatory age and identity verification in order to upload content, extensive moderation practices, and partnerships with dozens of non-profit organizations around the world. We encourage all image-sharing platforms to follow our lead and participate in Take it Down.”

Similarly, OnlyFans and other platforms that host adult content creators have long been discriminated against by financial institutions. In 2021, OnlyFans claimed that pressure from banks forced the company to consider banning explicit content altogether—a decision quickly reversed after outrage from the adult creator community and their fans and allies.

OnlyFans says that it already shares hashing data with leading NGOs and regulators, but that the Take It Down initiative will build on those efforts. “OnlyFans is an 18s and over platform and we have a zero tolerance policy for the sharing of intimate images of minors,” Keily Blair, chief strategy and operations officer at OnlyFans, told Motherboard. “We believe that platforms have a responsibility to protect children online and the Take it Down initiative builds on the work OnlyFans is already doing in this area. We are proud to continue our ongoing work with NCMEC on this important issue.”