
What’s Going On With Pornhub?

Episode 42

April 28, 2021

Disclaimer: Fight the New Drug is a non-religious and non-legislative awareness and education organization. Some of the issues discussed in this episode may be legislatively-affiliated. Though our organization is non-legislative, we fully support the fights against already illegal forms of pornography and sexual exploitation, and against sex trafficking.

This week we’re bringing you a different, bonus episode discussing what’s been going on with one of the world’s largest porn sites, Pornhub, and its parent company, MindGeek. In December 2020, Pulitzer Prize-winning journalist Nicholas Kristof published an investigative column in the New York Times giving visibility to Pornhub’s questionable business practices, specifically highlighting how the porn tube site reportedly hosts and profits off of nonconsensual content, image-based sexual abuse (IBSA) and child sexual abuse material (CSAM). Since the New York Times exposé, a lot has happened in response, including Pornhub announcing changes to their platform and removing over 10 million videos from the site, Mastercard, Visa, and Discover suspending their payment processing services on Pornhub, and the Canadian House of Commons Ethics Committee (ETHI) launching an investigation into MindGeek for reportedly hosting videos of child sexual abuse, rape, sex trafficking, and nonconsensually-distributed content.

Please note that this is a developing story and ongoing investigation. For a recent, simplified timeline of events, please visit ftnd.org/phtimeline.

EPISODE TRANSCRIPT

Fight the New Drug Ad: Hey listeners, did you know that Consider Before Consuming is a podcast by Fight the New Drug? Fight the New Drug is a non-religious, non-legislative 501(c)(3) nonprofit that exists to provide the opportunity to make an informed decision regarding pornography by raising awareness on the harmful effects using only science, facts, and personal accounts. Fight the New Drug is research-based, education-focused, sex-positive, and anti-shame. To learn more about Fight the New Drug, and to see the additional free resources that we offer, like our three-part documentary series and our interactive conversation guide, visit FTND.org. That’s FTND.org.

Garrett Jonsson: My name is Garrett Jonsson. During this conversation, we discuss child sexual abuse material, sex trafficking, and other forms of nonconsensual content. Listener discretion is advised. With that being said, let’s jump into the conversation. We hope you enjoy this episode of Consider Before Consuming.

We want to welcome to the podcast Keri. Keri is our editorial director. So thanks for joining us today, Keri.

Keri: Thanks for having me Garrett.

Garrett Jonsson: This is a different type of episode. The reason why we decided to have this conversation is because there is a lot happening with PornHub and its parent company MindGeek. The 30,000-foot view of what has happened is this: there was a New York Times exposé published in December of 2020 about nonconsensual content and child sexual abuse material being hosted on one of the world’s most popular porn sites, PornHub. And this piece has acted as a spark and has led to changes that significantly impact PornHub specifically, and the porn industry as a whole.

Keri: Right. And there was kind of a domino effect after this huge exposé that the New York Times published by Pulitzer Prize-winning journalist Nicholas Kristof, who was previously on this podcast, actually. And it’s important to note that there was the domino effect after the article, but the article exists because of the work that advocates and survivors have been doing for years. So it’s really exciting to see how the culmination of all of that hard work has paid off in some significant ways. There’s still a long way to go. But as we’re going to talk about today, there have been a lot of changes that have happened in the porn industry because of this article in December.

Garrett Jonsson: Right. To get on the same page with our listeners, I think it’s going to be important to go over some definitions of things that we’re going to talk about during this conversation. Can you go over some of those definitions that you think are important for the listener to understand?

Keri: Yeah.

So the first thing I want to dive into is: who is PornHub and who is MindGeek? MindGeek is the parent company of PornHub. MindGeek has their main offices in Montreal, Canada, and their headquarters are in Luxembourg. And MindGeek is not only the parent company to PornHub; they also own 100 to 160 different websites, production companies, and brands. They were founded in 2004. PornHub is a porn tube site that was founded in 2007. PornHub is the third most visited adult site, with an estimated 42 billion site visits per year, and that’s just one of the many sites that MindGeek owns. It’s reasonable to say that MindGeek owns and operates the majority of what most people understand to be mainstream porn.

Garrett Jonsson: Yeah.

And that was actually really surprising to me to learn that they own that many sites.

Keri: Yeah. And I think that the general public would absolutely be in the same place. It’s a huge surprise how big MindGeek is and how much they really own and control in the porn industry. So I want to define something that I just said: “porn tube site.” PornHub is a porn tube site, and they’re modeled after YouTube. Porn tube sites, including the ones that MindGeek owns, rely on user-uploaded content from both professional porn production companies as well as general site users. Many porn tube sites allow free downloads of videos as well. That has recently changed with PornHub, as we’re going to discuss, but most have a free upload and free download option. They make their money through advertisements, subscriptions, and selling user data. So that’s a porn tube site, and they are probably the most popular form of porn consumption as the mainstream porn industry operates today.

And there are many different porn tube sites, but we’re going to specifically be talking about PornHub and a couple of other rival porn sites later in the conversation today. Something we’re going to mention today, Garrett, is CSAM. That is an acronym for child sexual abuse material, commonly known as child porn: explicit or exploitative imagery of someone who is underage. These can be videos or images, but when we say CSAM, that’s what we’re going to be talking about.

Something that I think is important to this conversation as well is talking about how sex trafficking is defined. The Trafficking Victims Protection Act, which was a bipartisan effort passed in 2000 by the U.S. Congress, legally defines sex trafficking as a situation in which a commercial sex act is induced by force, fraud, or coercion, or in which the person induced to perform such an act has not attained 18 years of age.

Garrett Jonsson: And oftentimes a person is anti-sex trafficking but pro-pornography. And there’s more nuance to it than that.

Keri: Yeah. Yeah. I think that that’s a good way to put it. I think it’s not very intuitive to grasp that pornography and sex trafficking are sometimes one and the same. Now, there’s so much nuance here, right? Because it’s not every time; not every single video that exists is of a sex trafficking victim. However, there are many images and videos of sex trafficking victims, as well as victims of exploitation, that have existed on free porn sites. And the complicated thing is that there’s not really a clean way to tell the difference between the two. So there’s not some guide you can follow to say, “Oh, well, she was a sex trafficking victim, or he was a victim of exploitation.” So there’s just so much nuance. There are so many layers here, and consent is a rather blurry, complicated, evolving issue.

It’s not like you sign a contract and suddenly everything that you will ever make in the porn industry is consensual, because consent, by definition, needs to be able to be rescinded at any time. So somebody who consents now can withdraw their consent if they’re in the middle of something that they’re not comfortable with. But that’s not the way that the porn industry operates. You sign your contract, you say your do’s and don’ts. And if something happens during a shoot that you did not necessarily consent to, porn performers have told us that they didn’t want to be difficult and that their agents and managers said, “Hey, don’t be difficult. You know, we know that that thing happened during that shoot, but you’re not going to get booked again if you speak up about it,” or “You’re going to be blacklisted as a difficult performer if you try to say something.”

Garrett Jonsson: Right. Then it would fall under coercion. Because if you define coercion, it’s to persuade using threats or force, I think, is the actual definition.

Keri: Right.

Garrett Jonsson: And so then it’s like, you’re talking about threats where you’re going to be blacklisted or…

Keri: Yes. That’s coercion. And people don’t understand that you can be exploited and sex trafficked and still pick up a paycheck at the end of the day. It’s not only the scenario that Hollywood portrays in many movies, which absolutely happens, of people getting kidnapped and trafficked in faraway countries. You can be exploited or trafficked in a scenario and still sleep in your own bed at the end of the day. You know?

Garrett Jonsson: Right.

Keri: So it is really complicated, and there’s a lot of nuance to this conversation. But yeah, I do think that it’s important that people understand that it’s not black and white.

Garrett Jonsson: Right.

Keri: One more term for ya is IBSA, or image-based sexual abuse. That term is also used somewhat interchangeably with “nonconsensual content.” So when we talk about nonconsensual content or IBSA, image-based sexual abuse, we’re talking about the creation, distribution, and/or threat to distribute sexual content, whether that content was made consensually or not. This is also known as technology-facilitated sexual violence. That’s what we’re going to be talking about when we say “nonconsensual content.”

Garrett Jonsson: And we want you, the listener, to understand that we’re going to be going through a timeline, and that timeline begins on December 4th, 2020, and ends on April 16th of 2021.

Keri: Yeah, that’s right, Garrett. There are several months that we’re going to just, you know, zoom through. So if you’re listening to this, buckle up, because we’re about to go through a ton of events that have happened since December 4th. It’s a lot of information, but our hope and our goal is that you, as the listener, are going to be more equipped to talk about the exploitation in the porn industry and how nonconsensual content is not uncommon in the porn industry.

Garrett Jonsson: Right? So let’s jump into the timeline then.

Keri: Let’s do it.

Garrett Jonsson: So starting off in this timeline: December 4th, 2020, Pulitzer Prize-winning journalist Nicholas Kristof publishes an investigative opinion column in the New York Times giving visibility to PornHub’s questionable business practices. This piece was titled “The Children of Pornhub.” It specifically investigated claims of nonconsensual content and CSAM on the site and shone a light on what has been reported for years, unofficially and officially, by anti-exploitation advocates and victims and survivors. And for the listeners that aren’t as familiar or didn’t get a chance to read Kristof’s article published on December 4th, 2020, can you jump into some of the main points made in that article?

Keri: Yeah, absolutely. It was pretty groundbreaking for this movement, especially to have such a large platform discuss the issues that survivors and advocates have been blowing the whistle on for years and years. Kristof sat down and interviewed some CSAM survivors, child trafficking survivors, and other individuals who have had content of them nonconsensually uploaded to PornHub. A quote from the article is, quote, “‘They’re making money off the worst moment of my life, off my body,’ a Colombian teenager who asked to be called Zilla, a nickname, told me. Two American men paid her when she was 16 for a sexual encounter that they filmed and then posted on PornHub.” End quote. So that’s just a tiny, tiny snippet of many of the stories that he was able to gather and give visibility to. Some important things the article also reported: PornHub is the third most visited porn site in the world.

Like we said, and it only reportedly has 80 content moderators. Now, these are 80 content moderators for the 1.36 million new hours of video that were uploaded in 2019 alone, as an example. PornHub receives over 3 billion ad impressions a day; that’s what Kristof found and reported. Kristof also found that “a search for ‘girls under 18’ or ‘14-year-old’ leads in each case to more than a hundred thousand videos. Most are not of children being assaulted, but many are,” is what he wrote. And quote, “I came across many videos on Pornhub that were recordings of assaults on unconscious women and girls. The rapists would open the eyelids of the victims and touch their eyeballs to show that they were nonresponsive.” End quote. Heavy stuff, right? But this is the reality of what many victims have had to deal with: the worst moments of their life being uploaded to the internet for entertainment.

PornHub did share a comment in the piece, and Kristof reported that PornHub added that any assertion that the company allows child videos on the site is, quote, “irresponsible and flagrantly untrue.” Feras Antoon and David Tassillo, who are the individuals who run MindGeek but do not own it, and whom we’ll talk about more later, declined to be interviewed for the article. Kristof specifically called out Visa and MasterCard for continuing to offer services to the site after PayPal had already stopped a couple of years ago due to these exact issues. The three steps that Kristof said would help curb CSAM and nonconsensual content being hosted on the site were these: number one, PornHub should only allow verified users to post videos, instead of allowing anybody to upload anything like they have been. Number two, Kristof said that they should prohibit the download of content so that it couldn’t be re-uploaded again and again and revictimize survivors.

And number three, Kristof said that PornHub’s moderation efforts needed to increase. So that was really how he ended his piece: with a call to action, kind of calling PornHub to the mat to say, “Hey, these are the things that are happening on your site, and this is what I think you need to do about it.” Kristof published a follow-up piece on December 9th about the lives of some of the survivors he interviewed. And it was a rather uplifting piece, because so many of these young women’s and girls’ lives had been improved because of his reporting, because they’d received so much public support. So that was really heartening to see. You can check those out in the New York Times for yourself if you’d like. But two days after the article was released, Visa and MasterCard announced that they were going to investigate the claims Kristof made in his article to see whether they wanted to cut ties with the porn site. So that was just two days after that initial article.

Garrett Jonsson: Yeah. That’s a quick response.

Keri: Absolutely.

Garrett Jonsson: But I shouldn’t say a quick response. It’s a quick response to the article, but like we talked about, it’s been years and years that anti-exploitation advocates and survivors and victims have been fighting for these changes. So it was encouraging to see that this article did spark some significant change, which we’re going to talk about. Fast forward to December 8th, 2020: after Visa and MasterCard announced their investigations, PornHub announced significant changes in security measures on their platform, including the suggestions that Kristof mentioned: only allowing videos uploaded by verified users, removing the download feature, and assembling a larger moderation team.

Keri: Right, exactly. So this was a pretty big deal, because advocates had been pushing for something like this for so long. And yet it just took two days after Visa and MasterCard announced they were going to look into what Kristof reported for PornHub to suddenly be like, “Oh, by the way, we got these big changes coming.” One other thing they announced was a partnership with the National Center for Missing and Exploited Children.

Garrett Jonsson: I think it’s important to note that the president of the National Center for Missing and Exploited Children made a statement that we’ll talk about later in the timeline. His statement revealed that PornHub and the National Center for Missing and Exploited Children were not partners.

Keri: Right. And they also announced that they were going to release their first transparency report in 2021, which, at the time of this conversation we’re having right now, has been released. We’ll talk about that later. They also announced that while the list of banned keywords on PornHub was already extensive, they were going to continue to identify additional keywords for removal on an ongoing basis. They also mentioned regular monitoring of search terms within the platform for increases in phrasings that attempt to bypass the safeguards in place. It’s also important to remember that PornHub announced that they would “continue to work with law enforcement globally to report and curb any issues of illegal content.” That’s a direct quote from the press release in which they said they partnered with NCMEC, the National Center for Missing and Exploited Children, and that they would continue working with law enforcement.

So that presumes that they’ve been working with law enforcement all along. But we’re going to dig into that a little bit later. And I do want to just make one last note here: just because these changes were made on PornHub does not mean that the same changes were made on every single one of MindGeek’s sites. The truth is that we don’t really know specifically what has happened on other MindGeek sites. PornHub is the focus of this specific conversation, and PornHub announced the changes to site security, but there are a lot more sites in play.

Garrett Jonsson: And it’s possible that other sites have issues with nonconsensual content. But again, this conversation is specifically about PornHub. Jumping to December 10th, 2020: MasterCard, Visa, and Discover announced that they had suspended their payment processing services with PornHub. And again, this was just days after the New York Times reported that the platform included videos of child abuse and rape.

Keri: Right. MasterCard, Visa, and Discover announced that their own investigations had found and corroborated what Kristof had reported. But here’s the kicker. This is a big deal, an absolutely monumental piece of news, this specific timeline event, but it’s important to note that the payment companies did not necessarily cut ties with TrafficJunky, which is MindGeek’s own advertising company that advertisers use to run ads on MindGeek’s porn sites. So the big payment processors suspended their services on the site itself, on PornHub itself, but they did not necessarily cut ties with a main source of revenue for these companies, which is ad spend.

Garrett Jonsson: Yeah. That’s an important thing to note. Just one day later, on December 11th, 2020, MindGeek executives Feras Antoon and David Tassillo were alerted that they would be called to testify before the Canadian parliamentary ethics committee regarding the allegations about child exploitation and nonconsensual content on the site, and that the proceedings were to start in 2021. So just, you know, a few weeks later.

Another big thing happened just a few days later, on December 14th, 2020: PornHub removed over 10 million uploaded videos from unverified users. And they removed this content from the site because it couldn’t be guaranteed that it was exploitation- and abuse-free. This content was an asset before, and now it seems as though it was a liability after Visa, MasterCard, and Discover cut ties.

Keri: Yeah. I remember waking up that morning and seeing the headlines about how PornHub had removed millions of videos. And it was just like another domino that fell in this, you know, chain of events. So first we had the Kristof article, and then we had the Visa and MasterCard investigations, and then PornHub announced the site security changes, and then Visa, MasterCard, and Discover severed ties, and then PornHub deleted all of this content. So it’s just this massive chain of events. It’s important to note that their standard practice was still not to require the identification of all individuals in content when videos are uploaded by a verified user. So secondary individuals in videos are not checked. So yes, all this content uploaded by unverified users was removed, but it’s also important to remember that verified content is still very much an issue on their site.

There are documented cases of verified users being abusive in nature. For example, the Girls Do Porn case, in which a very well-known, verified, popular porn production company called Girls Do Porn was implicated, and its company owner was actually federally charged with sex trafficking of dozens of women in this huge scheme where women were forced, tricked, and coerced to perform sex acts on camera. So that’s one of the many examples where verified users can also upload nonconsensual content. There was also an underage trafficking victim who was featured in almost 60 videos. Her mother identified her, and she was rescued from trafficking because of videos uploaded to PornHub. So clearly, only having verified content on the site is not going to solve their problem. These 10 million videos were not deleted because it was certain they were exploitative; they were deleted because it was not certain they were not. And that’s an important note.

Garrett Jonsson: Right.

Keri: So there could still be nonconsensual content on their site; it’s just from verified users now.

Garrett Jonsson: Yeah. I think the important thing to note here is what you said previously: exploitative content is being sold on the same shelf as consensual content, and you can’t tell the difference.

Keri: Yeah, exactly. And that’s why this conversation is so relevant to the average porn consumer as well as the average activist, because there’s no clean-cut way to tell what’s nonconsensual and what’s consensual. Jumping ahead on the timeline a little bit: on December 15th, those same women from the Girls Do Porn case filed suit against MindGeek, PornHub’s parent company, for reportedly failing to moderate the videos of them being sex trafficked by the amateur porn company Girls Do Porn, and for actively profiting from and promoting those videos. Remember, PornHub made its money off of advertisements on almost every single page where videos are played. So where there’s an advertisement on a page with a video, that video is being profited from, whether it’s consensual or not. So there were many, many reports of these girls being trafficked. I personally sat down with one of these victims and heard her story. She was trafficked by Girls Do Porn, and she had petitioned PornHub so many times to take down her content and was either met with silence, or they would take it down and it was just re-uploaded again by a different user, because the download feature was widely used before it was disabled in December.

Garrett Jonsson: That’s one reason why I’m glad that we defined the Trafficking Victims Protection Act, because I think it’s important for people to understand that, like you said, sex trafficking was being hosted and broadcast to the world by this tube site. And again, I think there’s a lot of misunderstanding about what sex trafficking is, right? So this, again, is a commercial sex act induced by force, fraud, or coercion, and these circumstances, these situations, were being broadcast to the world. So…

Keri: Yeah, exactly. And that lawsuit the Girls Do Porn survivors filed was built off of another lawsuit that went to trial in 2020 on behalf of 22 Jane Does who were implicated in the Girls Do Porn sex trafficking scheme. So it’s unclear whether all or some of those 22 Jane Does joined the newest lawsuit against MindGeek, but there were 40 women who filed the new suit specifically. So there’s definitely momentum growing to hold porn sites accountable, which is why, again, on January 8th, 2021, a CSAM and child sex trafficking survivor filed suit against MindGeek. So this is another lawsuit that we’re aware of. It’s a $600 million class action lawsuit from an Ontario woman who says that she experienced sexual abuse at the age of 12, which was filmed and then shared online. She alleges that she notified the company of the video, requesting its removal, but only ever received an automated response.

So that’s one of the many lawsuits now being filed against PornHub as a result of the momentum of holding them accountable because of Kristof’s article in December. February 1st is another event on this timeline that we’re chatting about. That was when the Standing Committee on Access to Information, Privacy and Ethics of Canada’s House of Commons, or the ethics committee, as we’re going to refer to them in the rest of this conversation, launched their investigation into MindGeek for reportedly hosting videos of child sexual abuse, rape, sex trafficking, and nonconsensually distributed content. The first to testify in these meetings that the ethics committee held were a CSAM survivor featured in the New York Times piece and her lawyer. So they sat down with the ethics committee and really talked about how her case happened and some more implications from what she experienced. And really, this CSAM survivor speaks for countless other survivors who have experienced this exact same thing: the worst moments of their life, their exploitation, their violation, being uploaded to PornHub for consumption by the masses.

Garrett Jonsson: Right. And, you know, it goes without saying, but we just have a lot of gratitude for those victims and survivors who have spoken out, because once again, these changes wouldn’t be possible without them speaking out.

Keri: Exactly.

Garrett Jonsson: Jumping to February 5th, 2021: Feras Antoon, MindGeek’s CEO, David Tassillo, MindGeek’s COO, and another PornHub representative testified before the committee and seemingly stated multiple mischaracterizations, falsehoods, and lies of omission about their business practices and history of hosting nonconsensual content. There’s a lot of info here, but for the listeners that don’t have a couple of hours to invest in listening to or watching the hearing in its entirety, let’s go through some of the important bullet points.

Keri: Yeah. So the CEO and COO claimed that MindGeek has a zero-tolerance policy for CSAM and nonconsensual content. They also stated that about 50% of their revenue comes from advertising on their sites through TrafficJunky, and that they consider themselves a leader in the adult entertainment world for preventing and removing nonconsensual content and CSAM. They also claimed that every single piece of content is reviewed before it’s uploaded to the site. We fact-checked all of the claims made in their hearing, and to see what we found you can visit our blog at ftnd.org; there’s a direct link to that blog at ftnd.org/phtimeline if you want to check it out. So, they claimed that they don’t profit from illegal or illicit material, including CSAM. Now, to not profit from illicit material, there would have to be no ads positioned on the same pages as these videos, but it’s been reported in the past that ads have been present next to abusive content. When the committee members pressed Antoon and Tassillo about this, they said that they didn’t know if MindGeek had received money from specific cases of nonconsensual content. Also at the hearing on February 5th, Feras Antoon said, quote, “MindGeek should have zero child sexual abuse material on our websites.” End quote. MindGeek so confidently asserts this claim of zero CSAM likely because of the drastic steps they’ve only just very recently taken, one being suspending content not uploaded by verified users. But by suspending all the content from unverified users, they can’t actually guarantee that they removed all nonconsensual content. Consider again the cases of verified PornHub accounts uploading videos of underage sex trafficking victims, and consider how experts have claimed it’s unlikely that sites that accept user-uploaded content, even from verified users, have been able to be completely and totally free from CSAM, or able to completely and totally prevent CSAM. So that’s another fact-check that we were able to do on a claim that the MindGeek executives made.

Garrett Jonsson: One of the claims that was surprising, and that I think we should talk about a little bit more, is when they claimed that every piece of new content is reviewed. Can you provide us with the quote where Tassillo mentions that every piece of new content is reviewed by a human moderator?

Keri: Yeah, absolutely. So when the committee members took them to task and said, “Okay, you say that every single piece of content is reviewed. Can you back that up?” Tassillo said, quote, “I can guarantee you that every piece of content before it’s actually made available on the website goes through several different filters. The way we do it, irrespective of the amount of content, is that the content will not go live unless a human moderator views it. I want to assure the panel of that.” End quote. This is a seemingly impossible feat, Garrett. Let’s just talk about the math. From the hearing, Tassillo and Antoon mentioned that MindGeek employs about 1,800 people, but we don’t actually know how many are human moderators. Kristof’s piece suggested 80; other advocates have suggested far fewer in their own findings. But let’s try to work this out in real time to even see if it’s possible, and let’s be generous and say that there are actually 80 moderators, or “compliance agents” as MindGeek calls them, reviewing content. So do you wanna work out the math to see if this is possible?

Garrett Jonsson: Okay. So assuming each moderator is working 40 hours a week for 52 weeks a year, that equals 2,080 estimated working hours for each full-time moderator in a given year. Assuming that Kristof’s estimate is accurate and PornHub has 80 moderators, 80 moderators working 40 hours a week for 52 weeks equals 166,400 total hours of moderation in a year. In 2019, PornHub reported 1.36 million hours of new content uploaded. And again, Tassillo claimed that every piece of new content is viewed by a human moderator before it’s published on PornHub. If we take the 1.36 million hours of new content from 2019 and divide it among the 80 moderators, that equals 17,000 hours of moderation per moderator.

Keri: Yeah, 17,000 hours of moderation per moderator, in a year. Remember, there are only 8,760 hours in a given year, let alone working hours. So PornHub’s moderation team would need to be much bigger than 80 moderators. At that rate, they’d actually need 653 full-time moderators if they reviewed 1.36 million hours of content at 1x speed, or 326 full-time moderators if they reviewed all of that content at 2x speed. So it’s…
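For anyone who wants to double-check this back-of-the-envelope math, here is a minimal sketch in Python. The 80-moderator figure, the 40-hour week over 52 weeks, and the 1.36 million hours of new 2019 uploads are the assumptions quoted in the conversation above; none of these numbers come from MindGeek’s own disclosures.

```python
# Back-of-the-envelope check of the moderation math discussed above.
# Assumptions (taken from the conversation, not from MindGeek):
#   80 human moderators (Kristof's reported estimate),
#   40-hour weeks for 52 weeks a year,
#   1.36 million hours of new content uploaded in 2019.

HOURS_PER_MODERATOR_PER_YEAR = 40 * 52   # 2,080 working hours
MODERATORS = 80
NEW_CONTENT_HOURS = 1_360_000            # 1.36 million hours (2019)

# Total review capacity vs. total new content:
capacity = MODERATORS * HOURS_PER_MODERATOR_PER_YEAR
print(capacity)                          # 166400 hours, far short of 1360000

# Workload per moderator if the claim were true:
per_moderator = NEW_CONTENT_HOURS / MODERATORS
print(per_moderator)                     # 17000.0 hours each; a year has only 8,760

# Moderators actually needed, as cited in the episode:
needed_1x = int(NEW_CONTENT_HOURS / HOURS_PER_MODERATOR_PER_YEAR)
needed_2x = int(NEW_CONTENT_HOURS / (HOURS_PER_MODERATOR_PER_YEAR * 2))
print(needed_1x, needed_2x)              # 653 at 1x speed, 326 at 2x speed
```

Even at double playback speed, the claimed review policy would require roughly four times the moderation staff Kristof reported.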

Garrett Jonsson: It makes me suspicious.

Keri: It’s unlikely that Tassillo’s claim is truly grounded in reality. It’s possible if they have many more moderators, but it’s unlikely given the information that we know.

Garrett Jonsson: Yeah.

And I think it’s important to note, too, that they have made a point of saying that every piece of content is reviewed by a human moderator, but to this day they still have not officially reported how many moderators they have.

Keri: Yeah, exactly. And also, Garrett, here’s one thing that I think some people are kind of missing in this conversation about PornHub’s moderation: either not every piece of content is manually reviewed, which is entirely possible given the math that we just broke down, or else, if every piece of content was reviewed, that illustrates culpability in allowing nonconsensual content and CSAM.

Garrett Jonsson: Right.

Keri: So either way you slice it, whether they have or have not reviewed every piece of content, they’re still liable for reportedly allowing so many cases of nonconsensual content and CSAM on the site. But ultimately, you know, stepping through more of the claims that they made during that hearing: significant differences were demonstrated between the reported experiences of survivors and what MindGeek executives claimed to the ethics committee. One quote from the CEO that kind of stood out to me was, quote, “MindGeek is a proud partner of NCMEC. We report every instance of CSAM when we are aware of it, so that this information can be disseminated and investigated by authorities across the globe.” End quote. And we’re going to break that down a little bit later, because those same child protection advocates were also interviewed by the panel, and they had some very illuminating things to say about what that MindGeek “partnership” was really like. MindGeek also revealed during that hearing that they were not aware of how many victims had submitted content removal requests for nonconsensual content in any given year, specifically 2019 or 2020. We hoped that they would be more prepared and able to give more information, seeing as they were told about a month and a half before they actually sat down before the ethics committee that they would have to do this, but they simply weren’t.

They simply didn’t have basic information to share about what kind of nonconsensual content there was on the site. So, again, MindGeek claimed that they were unsure if they had reported specific instances of abuse to the police, which is required by Canadian law. The executives said that they reported to NCMEC, but note that this policy of theirs only began in 2020. The whole, you know, nutshell version of what you should get from the hearing is that PornHub was the only subsidiary of MindGeek that was focused on in this specific hearing. The other approximately 100 to 160 sites and brands that MindGeek owns were not discussed, and it was not clarified whether the same safeguards would be put in place on those sites as well.

Garrett Jonsson: Right. Jumping to February 12th, 2021: a class action lawsuit was filed against MindGeek. Can you talk to that a little bit?

Keri: Yeah. Again, this was another class action lawsuit, where the representative class members were two survivors of child sex trafficking. They unfortunately had videos and images of their abuse posted on PornHub and other MindGeek-owned sites. So the lawsuits just keep coming. You know, after the December article from Kristof, the floodgates kind of opened for survivors to really have the platform to say, “Hey, this happened to me, and now I’m going to report it and file this lawsuit,” because there’s a lot more ground to stand on with credibility, with the public especially.

Garrett Jonsson: And for those listeners who want to do a deeper dive into this lawsuit, for example, we do have it linked in the episode notes, so you can go to the Fight the New Drug blog and check that out for more information. Jumping to February 19th, 2021: to better understand the effects of abusive content being available for consumption on one of the world’s largest porn sites, four survivors of sexual exploitation provided witness statements to the ethics committee over the course of a couple of meetings. One survivor shared during the meeting on February 1st, and three others shared on February 19th.

Keri: Right. And yeah, these stories are devastating, and it’s important to remember that these are just four real stories from survivors of image-based abuse and nonconsensual content. I mean, consider how many more there may be. It’s pretty heartbreaking when you think about it. But these women gave faces and humanizing elements to what countless individuals have experienced. So if you do have the time to review that hearing, we do have it linked at ftnd.org/phtimeline.

Garrett Jonsson: On February 22nd, 2021, in response to everything that has happened, Fight the New Drug joined a collective group of 104 survivors of sexual exploitation and 525 organizations from 65 countries that sent a letter to the Canadian parliamentary committee, praising the ethics committee’s actions thus far and urging a full criminal investigation into MindGeek. And once again, that’s because it appears that they have violated Canada’s child protection laws and laws regarding the sharing of intimate images without consent.

Keri: Yeah. And for the last 10 years that law has been active, but MindGeek has reportedly not complied. And we’re about to talk about that with the next meeting that just happened. And again, [inaudible], we’re not religiously or legislatively affiliated, but we absolutely support the fight against child exploitation and sex trafficking, which is why we jumped on board with this coalition to send this powerful letter to the committee, thanking them for, you know, holding MindGeek accountable and urging them toward a full criminal investigation.

Garrett Jonsson: Exactly. And on February 22nd, 2021, the ethics committee heard witness statements from leaders in child protection services that weakened the MindGeek executives’ testimony about their content moderation and reporting of illegal content. Here’s a big thing to note: it was revealed that PornHub reported zero instances of child sexual abuse material to U.S. or Canadian child protection agencies or law enforcement from 2008 to 2019, and for most of 2020. Again, this is a criminal offense in Canada and the U.S.

Keri: Exactly. So it appears as though PornHub did not actually fulfill the requirement to report child abuse images to law enforcement and child protection organizations until the public pressure increased after the New York Times piece. So all this talk about them being a partner of NCMEC, having a zero-tolerance policy for CSAM, and working with law enforcement agencies, that was all very recent compared to when they were founded. So, yeah, one important comment that we need to make is that the president of the National Center for Missing and Exploited Children, NCMEC, testified in that committee hearing, and he was clear to say that PornHub and NCMEC were not partners. Here’s the quote where he said that: quote, “NCMEC and Pornhub are not partners. Pornhub has registered to voluntarily report instances of child sexual abuse material on its website to NCMEC. But this does not create a partnership.” End quote.

In the last part of 2020 alone, PornHub made over 13,000 reports of child sexual abuse material, CSAM, to the CyberTipline organized by NCMEC, but only about 4,000 of those were unique reports; about 9,000 of them were duplicates. During this hearing, the National Center for Missing and Exploited Children, the Canadian Centre for Child Protection, and the Royal Canadian Mounted Police all testified that they only began receiving reports of child sexual abuse images from PornHub in late 2020. This appears to be a very clear violation of Canadian law, which requires internet providers to report CSAM to authorities. So all of these reports from the actual child protection advocates invalidate MindGeek’s claims that there, quote, “shouldn’t be” CSAM on their site and that they have not profited from CSAM, because, like we mentioned, there are advertisements on the vast, vast majority of pages where videos are viewed.

So in order for them to have never profited from CSAM, there would have to have been zero ads on those videos, and there’s very little evidence, if any, to suggest that was the case. It’s important to note that in the February 22nd hearing with the child protection advocates and the ethics committee, it was discussed that even when CSAM content on PornHub was removed, the tags indicating that the content featured someone who was underage were not removed, which means that anyone using Google or another search engine to look for underage content would still be led to PornHub and directed toward that content. So PornHub was reportedly profiting from searches for CSAM even after the CSAM itself, even after the video itself, was suspended and removed from the site. It was also revealed in the hearing that, over the years, many victims had reached out to NCMEC sharing that they had not received positive responses from PornHub.

So, on their behalf, NCMEC communicated directly with PornHub and directly requested the content removal, which was reportedly granted. But ultimately, it was revealed that PornHub was very, very reluctant to take down problematic content, even when it was reported. So that’s something that we heard from survivors, and it was then corroborated by these child protection advocates.

Garrett Jonsson: Yeah.

Keri: So just as a nutshell, Garrett: over the course of these hearings with the ethics committee, the scope of the situation was revealed in its true entirety. It’s becoming clear that the world’s largest porn company, which claims to care about victims of child sexual abuse material and nonconsensual content, reportedly only very recently put basic safeguards in place. And these new basic safeguards were put in place not because of the multitudes of victims of image-based abuse and trafficking and child exploitation. These safeguards were not put in place because victims begged for the videos and images of their exploitation to be removed. Reportedly, it was because PornHub wanted to protect their financial successes and preserve their bottom line.

Garrett Jonsson: Yeah.

Keri: That’s what these ethics committee meetings have really revealed at the heart of it.

Garrett Jonsson: Right.

Keri: And I don’t think that that’s a point that should be missed or skated over. That’s exactly what was revealed. It wasn’t enough for exploitation survivors to speak out. It wasn’t enough for advocates to demand change. It had to be about their bottom line.

Garrett Jonsson: And I think as you go through the hearings and the briefings that have happened, you come to the realization that this is not a one-off situation. There is a trend of negligence. On April 2nd, 2021, PornHub released its first-ever transparency report. In that report, PornHub said that it removed 653,465 pieces of content that violated its terms of service in 2020. This number of 653,465 pieces of content is separate from the 10 million videos that were removed from the site in December of 2020. And just to be clear, PornHub considers content depicting minors, nonconsensual content, hate speech, animal harm, incest, and bodily fluids like blood and feces as content that infringes upon their terms of service. This transparency report also detailed the moderation efforts the site employs, but still did not disclose how many moderators they have.

Keri: Yeah. And that’s something that we’ve been asking for for quite a while. I mean, we know Facebook reports their moderators, YouTube reports their moderators; they all have very public numbers of how many people are reviewing content. But all throughout these steps of accountability, these steps of revealing, you know, the inner workings of the sites, PornHub has never released its number of moderators, and MindGeek has never released the number of moderators that are looking at these sites that reportedly have millions of hours of content uploaded. So that’s something to be aware of. Within this transparency report, it was said that they had received over a thousand legal requests from governments, law enforcement, and private parties, and that was just in 2020. Those requests included nonconsensual content and child exploitation. And this was one quote, Garrett, that I thought was noteworthy in the transparency report, quote: “We report with law enforcement and readily provide all information available to us upon request and receipt of appropriate documentation and an authorization.” End quote.

But that’s not what many survivors have reportedly experienced. They haven’t experienced that eager cooperation. And as we just heard from the child protection advocates in the last ethics committee hearing, they did not reportedly experience that eager cooperation on behalf of survivors, either.

So this report seemed to be comprehensive when reviewing it. It did break down the moderation process with, you know, flow charts and things like that, as well as the take-down process and their quote-unquote zero-tolerance policy for CSAM and nonconsensual content. But here’s the thing: it remains to be seen whether they will live up to the standards they have set for themselves. They have not yet corroborated the evidence from survivors and advocates, the evidence presented in Kristof’s article. So that’s one missing piece here: however many advocates and survivors have spoken out, it remains to be seen whether PornHub will actually work with them, alongside them, and make sure that their content doesn’t appear on any MindGeek sites whatsoever, not just PornHub, but the 100 to 160 different subsidiaries, porn companies, and porn sites that MindGeek actually owns. This is an industry-wide issue. This is not just PornHub. This is industry-wide.

Garrett Jonsson: Right. April 12th, 2021: the ethics committee had another meeting. Some people within that hearing argued that self-regulation is ideal, and other people argued that MindGeek has shown that self-regulation isn’t an option, based on PornHub not reporting sex trafficking, rape, abuse, exploitation, and nonconsensual content on their platform for the past nine years.

Keri: Yeah, Garrett, that committee meeting was a little bit heated, and one of the quotes from the committee members is, quote, “Voluntary compliance isn’t an option based on their history of not reporting.” So this is an important note. I mentioned at the beginning of our conversation that MindGeek is based in Montreal, Canada, but their headquarters are in Luxembourg. There was a discrepancy during that meeting about where MindGeek is actually based, since they have their main offices in one place and their headquarters elsewhere. This means that victims have often had a lot of difficulty filing legal complaints against the company, because its corporate power structure is so complex, so inaccessible, and so unclear.

Garrett Jonsson: Right. One of the things discussed during the April 12th, 2021 hearing is that this can be a very complicated situation, because we have a company headquartered in one country, with a significant presence in another country, with users from around the world. I think it’s important to note that during this hearing, the Minister of Justice and Attorney General of Canada, David Lametti, gave an explanation as to why this can be such a complex problem, and I quote: “While the internet has provided many benefits to Canada and the world, it has also provided criminals with a medium that extends their reach and thus their victim base, and a medium that elevates the level of complexity of investigations. One complicating factor is that telecommunication networks and services transcend international borders, while the enforcement authority of police is generally limited to their domestic jurisdiction.”

Keri: Exactly. So the issue moves much faster than the law does. Even if MindGeek does get more oversight, how effective is it going to be even five years from now? And I think that that’s a lot of what advocates and lawmakers are wrestling with right now.

Garrett Jonsson: So moving forward in the timeline: on April 14th, 2021, something very significant happened. MasterCard changed its rules for adult sites, announcing that it will require “clear, unambiguous and documented consent” for the content on all of the platforms using it as a payment processor.

Keri: Yeah, this is actually a super significant event in the timeline that shouldn’t be overshadowed. One quote from the blog post on MasterCard’s site where they announced this: quote, “Chief among these rules and standards that govern the use of our payment network is that we do not and will not permit merchants to engage in unlawful activity on our network.” End quote. So this essentially takes their severed relationship with PornHub and applies it to all adult websites that might use MasterCard as a payment processor, and it sets the standard for other payment processing platforms to not allow their services to support nonconsensual content and exploitation. And when I say MasterCard is applying their severed relationship with PornHub to other adult sites, I don’t mean that MasterCard is actually severing its relationships with other adult sites. It’s that MasterCard now has stipulations and conditions for those partnerships. It’s not just a carte blanche partnership; it’s “Hey, you can’t use our payment processing services unless you follow these guidelines,” and these guidelines are pretty comprehensive.

Garrett Jonsson: Yeah, that’s big: “clear, unambiguous and documented consent.”

Keri: Exactly. And they’re quoted as saying “documented age and identity verification for all people depicted and those uploading the content.” So that’s even more than PornHub, because PornHub, as of right now, does not require the identification of secondary performers. If you appear in a verified account’s video on PornHub, they’re not going to necessarily check your identification. But according to MasterCard’s rules…

Garrett Jonsson: That would be required.

Keri: Exactly. That would be required for all people, exactly, all people depicted and those uploading the content. So I’m going to read another part of what they said in this new announcement, too: “requirements include content review process prior to publication, complaint resolution process that addresses illegal or nonconsensual content within seven business days, and appeals processing, allowing for any person depicted to request their content to be removed.” So these set very, very high standards for the payment processing company, and we’ll have to see if other payment processing companies follow suit.

Garrett Jonsson: Jumping to April 16th, 2021: Nicholas Kristof, again, the same person who wrote “The Children of Pornhub” for the New York Times, wrote another heartbreaking exposé, this time about PornHub’s direct competitor and the world’s largest porn site, Xvideos.com, which is reportedly hosting nonconsensual content and CSAM. Xvideos.com is the seventh most visited website in the world and the most visited porn site; they claim that they have an average of 2 billion daily impressions worldwide. Xvideos’ sister company Xnxx.com is also one of the most visited porn sites in the world and hosts almost exactly the same content. Together, they have about 6 billion impressions daily, according to Nicholas Kristof’s article.

Keri: Yeah. And these sites are specifically owned by French nationals and based in Prague. Here’s a quote from the article, just in case our listeners didn’t have a chance to read it. Quote: “Xvideos guides viewers to videos that purport to show children. Search for ‘young,’ and it helpfully suggests also searching for ‘tiny,’ ‘girl,’ ‘boy,’ ‘jovencita,’ and ‘youth porn.’ Many of those onscreen will be younger-looking adults, but some will be minors whose lives have been badly damaged.” As we’ve chatted about, Garrett, this issue of nonconsensual content and CSAM: PornHub and MindGeek have not cornered the market on it. Unfortunately, it’s over the entire porn industry. Here’s another quote from the article that really demonstrates that. Here’s Kristof, quote: “But as I noted at the time,” he’s speaking about the December article “The Children of Pornhub,” “the exploitation is rooted not in a single company, but in an industry that operates with impunity, and punishing one corporation may simply benefit its rivals. That’s happening here: when Pornhub deleted videos, millions of outraged customers fled to its nemesis, Xvideos, which has even fewer scruples.”

So this is something that Kristof warned about in December with his article: “Okay, we’re gonna, you know, put PornHub on blast, but ultimately this is not a PornHub problem. This is a porn industry problem.”

Garrett Jonsson: Yeah.

Keri: So that’s exactly what he was warning about: people fleeing, jumping ship from PornHub to another porn site. That’s exactly what happened. So we’re seeing this process start all over again. We’ve been talking about this PornHub domino effect; well, it seems there might be an Xvideos domino effect after the fact, kind of as part of this ripple effect of what’s been going on with everything with PornHub.

Garrett Jonsson: Right. In his most recent article, there was a study that stood out.
Keri: Yes.

Garrett Jonsson: And I think it’s very significant. We should mention that. Can you talk to that study a little bit?

Keri: Yeah, absolutely. This is one of the biggest studies of its kind to date. It was published in the British Journal of Criminology this year, 2021, and it found that one in eight videos on three major tube sites, Xvideos, PornHub, and XHamster, depicted sexual violence or nonconsensual conduct.

Garrett Jonsson: Once again, this study just shows that sexual exploitation is being sold on the same shelf as consensual material.

Keri: Exactly. And the consumer has no way to know which is which. An important point that Kristof made in the article, which we reiterated at the beginning of this conversation and is important to reiterate again, is that we can be sex-positive but exploitation-negative.

Garrett Jonsson: Yep.

Keri: Yeah. Some solutions that Kristof suggested in his article kind of expound on the previous suggestions he made in “The Children of Pornhub,” which were followed through on in time, so we’ll see if these are also followed through on. He suggested that credit card companies should stop working with companies that promote illegal content, that search engines like Google should stop leading people to rape videos, and that there should be accountability in criminal and civil law for porn companies, as well as for uploaders of nonconsensual content. Things are in limbo at this very moment, but it’s clear that there has been a huge ripple effect across the porn industry because of survivors’ and advocates’ work and Nicholas Kristof reporting on all of that.

Garrett Jonsson: People are starting to realize that nonconsensual content is not uncommon.

Keri: Yeah, exactly. I mean, just consider what’s happening with Xvideos and Xnxx. An investigation was actually opened on Xvideos in January of 2021 for nonconsensual content. That was in part due to there being so much momentum around porn tube sites and CSAM and nonconsensual content being reportedly so available. So ultimately, this conversation should end on an encouraging note, because ultimately the world is becoming more and more aware of how unacceptable nonconsensual content is and how actions need to be taken to hold people and sites accountable that host CSAM and nonconsensual content. So this conversation is happening because change is happening.

Garrett Jonsson: Right.

Keri: And that should not be underplayed whatsoever.

Garrett Jonsson: Right. One of the goals of Fight the New Drug is to change the conversation. And just like you said, the conversation is changing, and we’re happy about that.

Keri: Yeah, we really are. But this really is a lot of information. So if you want to follow along with the timeline that we discussed today, or if you want to listen to or watch the Canadian parliamentary hearings with the ethics committee, you can go to ftnd.org/phtimeline. We are going to continue to fight, and, yeah, let’s focus on the hope.

Garrett Jonsson: Yeah. Well, Keri, we want to thank you for joining us on this episode and the listeners. Thank you for tuning in.

Fight the New Drug Ad: How can pornography impact you, your loved ones, and the world around you? Discover the answer for yourself in our free three-part documentary series, Brain Heart World. In three thirty-minute episodes, this docuseries dives into how pornography impacts individuals, relationships, and society. With witty narration and colorful animation, this age-appropriate series shines a hopeful light on this heavy topic. In each episode you’ll hear from experts who share research on porn’s harms, as well as true stories from people who have been impacted personally by pornography. Stream the full series for free, or purchase an affordable screening license, at brainheartworld.org.

Garrett Jonsson: Thanks for joining us on this episode of Consider Before Consuming. Consider Before Consuming is brought to you by Fight the New Drug. Fight the New Drug is a non-religious and non-legislative organization that exists to provide individuals the opportunity to make an informed decision regarding pornography by raising awareness on its harmful effects, using only science, facts, and personal accounts.

Again, big thanks to you for listening to this conversation. As you go about your day, we invite you to increase your self-awareness, look both ways, check your blind spots, and consider before consuming.

Fight the New Drug collaborates with a variety of qualified organizations and individuals with varying personal beliefs, affiliations, and political persuasions. As FTND is a non-religious and non-legislative organization, the personal beliefs, affiliations, and persuasions of any of our team members or of those we collaborate with do not reflect or impact the mission of Fight the New Drug.

MORE RESOURCES FROM FTND

A three-part documentary about porn’s impacts on consumers, relationships, and society.

Fifteen research-based articles detailing porn’s negative impacts.

Tees to support the movement and change the conversation wherever you go.

Successfully navigate conversations about porn with your partner, child, or friend.

A database of the ever-growing body of research on the harmful effects of porn.

An interactive site with short videos highlighting porn’s proven negative effects.