Pornhub tightened its rules around violent and underage content this week. Those changes are a good start, experts say, but they won’t be sufficient to combat a growing problem of non-consensual videos.
Why it matters: The New York Times story, by Nick Kristof, reported that Pornhub’s vast user-generated content library contains plenty of revenge porn and videos with underage participants. It also details the harm that being on Pornhub can cause for people whose videos were posted without their consent.
Driving the news: Pornhub this week announced a series of changes, including stepped-up moderation, temporarily limiting uploads to known content producers, and eliminating the ability to download videos.
Yes, but: “I don’t think it’s going to come anywhere close to fixing the whole problem,” Rape, Abuse & Incest National Network CEO Scott Berkowitz told Axios.
- For example, he noted that verifying posters is an important step, but doesn’t go far enough to ensure that everyone depicted is a willing, consensual adult.
- “There’s been a staggering increase in the amount of child sexual abuse material that’s available,” Berkowitz said, alongside revenge porn and other videos posted without the consent of all participants.
Pornhub also said it would increase the resources it puts toward moderation.
- “The key question is, is [Pornhub] going to implement these changes fully?” said Yiota Souras, senior vice president and general counsel for the National Center for Missing and Exploited Children. “On paper, it’s great. But there must be investment and follow-through.”
Between the lines: There’s a big difference between videos with consensual adults, often professional, enacting all manner of sex scenes and the minefield that is user-generated content. In the latter, Berkowitz says, it is impossible to know if users consented to the act shown, if users consented to broad distribution of the video, and if everyone was of age to consent.
The big picture: Pornhub’s changes come at a moment when legislators and activists are looking to solve a wide range of problems online by proposing limits to the tech industry’s liability protection. But that approach has had unintended consequences in the past.
- SESTA/FOSTA, a law passed in 2018, aimed to curtail sex trafficking online by narrowing the tech industry’s liability protection, known as Section 230.
- But the controversial law resulted in massive policing of sex-related content across online platforms, inadvertently hurting sex workers and eliminating their comparatively safe online spaces, said Kendra Albert of the Cyberlaw Clinic at Harvard Law School.
Pornhub’s changes came just days after the New York Times report, faster than any legislation could have, and, while perhaps not going far enough, are more squarely aimed at problem areas.
- The Times column’s effectiveness at prompting quick change shows the power of investigative work and putting sexual assault survivors’ voices first, which sways public opinion and clearly illustrates harm, said Souras.
- But sometimes columns like Kristof’s can leave sex workers and other affected groups out of the conversation, Albert said.
Of note: Part of the pressure on Pornhub came from its payment processors, companies like Mastercard and Visa.
- Sex worker advocates are concerned that if those companies withdraw completely from the porn industry, workers will end up in more dangerous situations.
Our thought bubble: Before tinkering with Section 230 again, lawmakers should look at SESTA/FOSTA’s record of effectiveness as well as the collateral damage it inflicted. As the Pornhub example shows, public and media pressure might change sites’ behavior faster than legislation.
What’s next: A bipartisan Senate bill introduced Wednesday would allow victims of rape or sex trafficking to sue porn sites that profit from their images, an approach RAINN has endorsed.