- The popular pornography website Pornhub is deleting all unverified content on its platform, the company announced on Monday.
- “As part of our policy to ban unverified uploaders, we have now also suspended all previously uploaded content that was not created by content partners or members of the Model Program,” the company said.
- It’s the latest response from Pornhub following a New York Times column that accused the company of hosting child pornography and other illegal content, like videos filmed without the consent of those featured.
- Both Visa and Mastercard have stopped allowing their cards to be used on Pornhub, and Pornhub has announced plans to verify all the content on its platform.
- Visit Business Insider’s homepage for more stories.
Pornhub is purging all unverified videos from its platform – the latest move in an ongoing response to accusations that the popular pornography website hosts child pornography.
“As part of our policy to ban unverified uploaders, we have now also suspended all previously uploaded content that was not created by content partners or members of the Model Program,” the company said in a blog post on Monday morning. “This means every piece of Pornhub content is from verified uploaders, a requirement that platforms like Facebook, Instagram, TikTok, YouTube, Snapchat and Twitter have yet to institute.”
The company did not confirm how many videos were removed from the site, but Motherboard, which first reported the news, notes that the number of videos visible on Pornhub’s search function went from 13.5 million to 4.7 million on Monday morning.
Pornhub previously operated much like YouTube, but with a focus on pornography: anyone could upload a video to the service.
In a New York Times column, Nicholas Kristof described videos on Pornhub that he said were recordings of assaults on unconscious women and girls.
“The issue is not pornography but rape. Let’s agree that promoting assaults on children or on anyone without consent is unconscionable,” Kristof wrote on December 4.
The column called on Visa and Mastercard, two credit card companies whose cards could be used on Pornhub, to stop working with the company. One week later, both companies officially ended their relationships with Pornhub.
Pornhub and its parent company Mindgeek have denied the allegations in the Times. The company told Business Insider it employs a “vast team of human moderators” who manually review “every single upload,” as well as automated detection technologies. It did not say how many people were part of its review team.
“Pornhub has actively worked to employ extensive measures to protect the platform from such content,” a Pornhub representative told Business Insider. “These measures include a vast team of human moderators dedicated to manually reviewing every single upload, a thorough system for flagging, reviewing and removing illegal material, robust parental controls, and a variety of automated detection technologies.” Those technologies, it said, include tools created by YouTube, Google, and Microsoft that are intended to combat child pornography and sexual abuse imagery.
Following the Times report, Pornhub announced stricter guidelines on who can publish videos and what videos are allowed to be published: Only accounts that Pornhub verifies are allowed to publish content. Monday’s announcement takes that one step further, purging Pornhub of all previously uploaded unverified content.
It’s unclear exactly how many videos are being deleted from the service, and Pornhub representatives did not respond to a request for comment as of publication.
Got a tip? Contact Business Insider senior correspondent Ben Gilbert via email (email@example.com), or Twitter DM (@realbengilbert). We can keep sources anonymous. Use a non-work device to reach out. PR pitches by email only, please.