Pornhub’s first transparency report details how it addresses illegal content
It said it deleted 653,465 pieces of content that violated its guidelines.
Pornhub removed a massive amount of content and made major changes last December after The New York Times reported that its lax policy enforcement allowed it to monetize rape and child exploitation videos. Now, the website has published its first-ever transparency report, which sheds light on its moderation practices and on the reports it received from January 2020 to December 2020. In all, Pornhub removed 653,465 pieces of content that violated its guidelines. Those include videos depicting a minor and anything non-consensual, such as revenge pornography and doxxing attempts. It also removed videos containing animal harm, violence and prohibited bodily fluids.
The website also explained how it deals with child sexual abuse material (CSAM). Pornhub detects CSAM through its own moderation efforts and through reports submitted by the National Center for Missing and Exploited Children. The center submitted over 13,000 reports of potential CSAM last year, 4,171 of which were unique; the rest were duplicates.
As for how it moderates content before publishing, Pornhub said it uses several detection technologies. In 2020, it scanned all previously uploaded videos against YouTube's CSAI Match, the video platform's proprietary technology for identifying child sexual abuse imagery, and scanned all previously submitted photos against Microsoft's PhotoDNA, which was designed for the same purpose. Pornhub will continue using both technologies to scan all videos submitted to its platform. In addition, the website uses Google's Content Safety API, MediaWise cyber fingerprinting software (which scans all new user uploads against previously identified offending content) and Safeguard, its own image recognition technology meant to combat both CSAM and non-consensual videos.
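All of these tools share the same basic pattern: an upload is reduced to a fingerprint and checked against a database of fingerprints from previously identified offending material. The sketch below is purely illustrative and is not Pornhub's implementation or the API of PhotoDNA, CSAI Match or MediaWise; real systems use proprietary perceptual hashes that survive re-encoding and cropping, whereas this example uses a plain cryptographic hash and a hypothetical `known_fingerprints` set as stand-ins.

```python
# Minimal sketch of the hash-and-match pattern, assuming a vendor-maintained
# database of fingerprints for previously identified offending content.
import hashlib
from pathlib import Path

# Hypothetical fingerprint database (real systems use perceptual hashes,
# not SHA-256, so matches survive re-encoding and minor edits).
known_fingerprints = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(path: Path) -> str:
    """Compute a fingerprint of an uploaded file (cryptographic hash as a stand-in)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def should_block(path: Path) -> bool:
    """Return True if the upload matches a known offending fingerprint."""
    return fingerprint(path) in known_fingerprints
```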
Back in February, the company also announced that it's using a third-party firm to verify the identities of creators. It ended all unverified uploads and banned downloads following the NYT article, shortly after Mastercard and Visa cut off payments to Pornhub. Around Christmastime, Visa resumed accepting payments for some adult sites owned by MindGeek, Pornhub's parent company, that feature professionally produced videos, but Pornhub itself remained banned.