
Dawn Hawkins on How AI Technology is Impacting Sexual Exploitation

Episode 117


Available wherever you get your podcasts

The following podcast episode contains discussions of child sexual abuse, sex trafficking, and image-based sexual abuse. Listener discretion is advised.

Dawn Hawkins is the CEO of the National Center on Sexual Exploitation, which exists to build a world without sexual exploitation and abuse. She joins us for this Consider Before Consuming Podcast episode to shed light on the pervasive influence of pornography, the normalization of sexual harm, and the alarming risks posed by AI-generated content. The episode underscores the urgent need for reform to protect vulnerable individuals from exploitation.

EPISODE TRANSCRIPT

Intro (00:00):
Thank you for joining us for this episode of Consider Before Consuming, brought to you by Fight the New Drug. Fight The New Drug is a non-religious and non-legislative organization that exists to provide individuals the opportunity to make an informed decision regarding pornography by raising awareness on its harmful effects, using only science, facts, and personal accounts. Though Fight The New Drug is non-legislative, [00:00:30] we fully support the regulation of already illegal forms of pornography and sexual exploitation, including the fight against sex trafficking and child sexual abuse material. Today's episode is with Dawn Hawkins, CEO of the National Center on Sexual Exploitation. This timely discussion coincides with our Stop the Demand Campaign and the upcoming Coalition to End Sexual Exploitation Global Summit. The conversation delves into the interconnected nature of porn and sexual exploitation, the role of AI [00:01:00] technology in both facilitating exploitation and helping to prevent it, and the efforts needed to create a safer environment for all individuals. With that, let's jump into the conversation. We hope you enjoy this episode of Consider Before Consuming.

Fight The New Drug (01:20):
Well, Dawn, I’m so delighted to have you here on Consider Before Consuming today. I can’t wait for our listeners to hear this episode. It’s coming out at a really special time during our Stop the Demand [00:01:30] campaign and right before the Coalition to End Sexual Exploitation Global Summit, which we are all working on together right now. So, Dawn, can you tell us a little bit about your role and your work at the National Center on Sexual Exploitation?

Dawn Hawkins (01:44):
Hello, Natalie. I love this podcast and I love you guys, so thank you for having me. I serve as CEO at the National Center on Sexual Exploitation, and we're focused on kind of the prevention side of these issues. We work on all issues [00:02:00] of sexual abuse and exploitation. We're trying to stop sexual harms, prevent them before they happen, and do so at mass scale. So we're really focused, I'd say, in three areas: we are advocating for policy or legislative solutions; we are advocating for corporate responsibility and accountability, 'cause so many companies are part of this problem; and then we've started a law center and started filing civil lawsuits against many of these bad, [00:02:30] bad actors who are profiting from sexual abuse and exploitation.

Fight The New Drug (02:35):
Yeah, we're so grateful for your work, and we will dive into all of these aspects of the work that you do. Tell us a little bit about what motivated you to become involved with NCOSE and advocate against sexual exploitation.

Dawn Hawkins (02:47):
Wow. Well, Clay Olsen is such a part of my story, I'd say. But, you know, initially I just saw exploitation in a lot of different places, and I saw it really as connected and overlapping. [00:03:00] I lived in Europe and I saw that prostitution and sex trafficking were rampant. I had been a victim of sexual assault a couple of times, just like on the street, and pornography was all over. And in trying to make sense of that, I just realized it's all connected, and if we want to make a difference, we have to address how they're fueling one another. And then I came back to the US and I started getting involved at about the same time that Fight the New Drug was starting. And it feels like we just kind of went [00:03:30] arm in arm, in separate lanes, but kind of learning about the issues together.

Fight The New Drug (03:36):
Yeah. Can you speak to what factors and issues in our society are primarily fueling sex trafficking today? Fueling the demand for sexually explicit content and material?

Dawn Hawkins (03:46):
Primarily, I would say it is pornography. There's, you know, there's a lot of complexities here, but the reality is that pornography has gone unchecked. It has become really like the wallpaper of our lives, especially of youth's lives. [00:04:00] And it's changing sexual templates. It's impacting what is seen as normal, what's seen as expected. It's leading to things like, you know, more victimization and more perpetration of sexual harm. It's also contributing to increased demand for sex trafficking and prostitution. When we commodify people and say that sex is for sale and a person is just an object, it becomes so much [00:04:30] less of a barrier to go out and treat somebody that way and to seek to act out what you're viewing. I would say, you know, pornography is a pervasive pattern in all of the areas of sexual abuse and exploitation, and it's one that has largely been ignored.

Fight The New Drug (04:44):
How does the normalization of or desensitization to some of this sexualized content, which is often perpetuated through pornography, impact societal attitudes towards the exploitation of vulnerable individuals?

Dawn Hawkins (05:00):
[00:05:00] You know, that's a complex answer, because there's so much to unpack in that question. But I wanna highlight a couple of examples. You know, when our young people are being exposed to hardcore, violent pornography as early as, you know, age 7, 8, 9, 10 years old, it is fueling what they think sexuality and intimacy is supposed to be. Suddenly it becomes violence, suddenly no becomes yes, because that's what they see in pornography. [00:05:30] This kind of violence becomes expected of them, and that's what they start to think sexual activity is. So it's impacting us in that way. But it also impacts us in another way. Let me take you to some conversations that I had recently. I was meeting with a member of Congress, and I was talking to them about AI-generated CSAM, a case where someone used an AI tool to create essentially deepfake [00:06:00] child sexual abuse material of a real-life child.

(06:04):
And this congressman, bless his heart, I would say, I love that Southern phrase, he said, well, he wasn't really abused in real life, so it's not that bad. And I was thinking, what has happened that caused this really prominent person to have this attitude? And in my assessment, I think it's largely the normalization of pornography. [00:06:30] This idea that what's happening on the other side of the screen doesn't have real-life consequences that persist beyond the screen, or that moment, or the pleasure that the user might be having in that moment. And that has enabled us to completely belittle and dismiss the harm and the ongoing trauma experienced by those depicted in pornography. Not to mention the trauma and the harm experienced by those who are using it, too, [00:07:00] who feel like their life is out of whack, who feel out of control, who have problems with their relationships. And they don't recognize that it's pornography that's fueling part of it, because it's become so normalized; they can't even get to that answer. I don't know. Natalie, what do you think? I feel like there's so much about how it impacts our attitudes that...

Fight The New Drug (07:21):
Yeah, it's so multifaceted, and I think you hit the nail on the head that this is something that impacts people on every side of the screen: [00:07:30] the consumers, those who are in this content, even if the content is AI-generated, which we'll be talking quite a bit about in just a minute. There are still impacts that a lot of people maybe don't realize. So I think it's so important to observe how the normalization of this content is affecting all of us, so that we can do something about it and kind of retake control, you know, become empowered and say, actually, that's not the world we wanna live in. Let's make it better. Let's make this the world we wanna live [00:08:00] in. And NCOSE is doing amazing work with that. One of the things you do is the Dirty Dozen List. Can you explain to our listeners what that is?

Dawn Hawkins (08:08):
The Dirty Dozen List. So every year we name 12 entities, usually companies, who are pretty mainstream and who are facilitating, and often profiting from, sexual abuse and exploitation. Yes. And together we've had just massive successes. You know, I don't wanna just always shame, but it makes a difference, and these companies do not [00:08:30] wanna be on the list. Like last year, Apple was calling us three days before, over the weekend, saying, please don't put us on here. But the reality is we'd been meeting with them for a year and a half, and they'd made really rather small changes that they had been promising. And when we name these companies, when we call out what they're doing and help the general public understand, we see very swift change in many cases. And so it's really a grassroots campaign where anyone... like, [00:09:00] my favorite part, Natalie, is that we get the executives' contact information, and for most of the campaigns, anyone can go send a private message, a direct message, right to the executives, and they hear from you and they see the impact it's having on your family and others. Or you can just use our canned message. But they listen and they change.

Fight The New Drug (09:22):
Yeah. And for any of our listeners who are interested in engaging, we'll be sure it's linked in the show notes, where you can find information about [00:09:30] the Dirty Dozen List and take action on those items. But with regard to the list for 2024, generally, are there trends you are seeing on those platforms in particular that you're addressing this year?

Dawn Hawkins (09:43):
Yeah, well, you know, the reality is in the last year and this year we focused primarily on the tech industry, because what's happened now is that all sexual abuse and exploitation has largely merged online, and it's allowed to proliferate because these [00:10:00] technology platforms are not responsible. There's no accountability, there's no oversight at all about their involvement. They stand behind Section 230 of the Communications Decency Act, largely, which gives them blatant immunity for what's happening. And people just have no idea. So the pervasive theme is that these technology companies have built platforms and systems that enable sexual abuse and exploitation, and they [00:10:30] know it, they know it, whether it's child sexual abuse or sex trafficking or image-based sexual abuse or others. They know it's happening, and in fact, they've built it to enable that, and they're profiting from it.

(10:43):
And so we're really highlighting that. One that I think... the one that gets me, everyone has the one that gets them. Yeah. For me it's Roblox. I think it might be because my kids are the age that Roblox [00:11:00] targets. They built this gaming platform, it's the most popular, and, you know, there's 12.8 million kids under age 12 who log onto Roblox every single day. And what's happened is Roblox built this platform without children's safety in mind, really at all. It's been an afterthought, and they've gone in and fixed some things. But even if you go in and you turn on the parental controls, you think that you've prevented adults from talking [00:11:30] to your child. No. It's so faulty that, you know, first you have to go to two different places, and they don't even work. And they know it and they don't tell you. So what we have is thousands of cases where adults are targeting children on the platform, and they're grooming them and they're tricking them, and often they lead them to other platforms where the abuse continues. But Roblox could stop this in an instant, and [00:12:00] they have not. Oh, are you seeing this on Roblox too? Are you guys hearing about it?

Fight The New Drug (12:06):
We are. We get messages from people asking about it, and we direct them to the Dirty Dozen List.

Dawn Hawkins (12:12):
We're actually representing two children and their families, in separate instances, where they were horrifically targeted and exploited, all starting on the Roblox platform. Yeah. But GitHub is the other. So most people don't really quite understand what GitHub is, but they host a ton of [00:12:30] code; it's like an Amazon, but for code. And Microsoft owns them. The majority of the AI tools that are used to create AI-generated child sexual abuse material or image-based sexual abuse are all hosted on GitHub. Like 90% of the content being created and [00:13:00] just causing massive real-life harm is coming from GitHub, this one place. Yeah.

Fight The New Drug (13:07):
And they have the power to create some change that could impact that

Dawn Hawkins (13:11):
So significantly. And you know, I've long praised Microsoft; they have long done way more good than other companies in this space. And I've always kind of thought of them as big partners, which is why I can't understand why, [00:13:30] a year and a half into this campaign, they're rather silent.

Fight The New Drug (13:34):
Listeners, she's calling on you. I wanna go back to a moment ago, you mentioned Section 230. For anyone who doesn't know what Section 230 is, can you explain a little bit about that, and how these tech companies are hiding behind it a bit?

Dawn Hawkins (13:49):
Yeah, so the Communications Decency Act was passed by Congress in 1996, primarily to protect kids from being exposed to pornography or obscenity [00:14:00] online. And in the bill there was one section kind of carved out, Section 230. It's just 26 words that basically say that if a third party is uploading content to a website, then that company's not responsible for it. And it's a critical provision; the entire internet basically hangs on this section. The law was intended [00:14:30] to invite tech to help be part of the solution, to prevent kids from exposure to this content. But what happened is it went all the way to the Supreme Court, and they gutted the whole bill except for Section 230. It's the only part that remains, and it's, you know, become kind of the foundation of the internet. And the result of it is that tech says, we're not responsible for anything at all that anyone else puts on.

(14:57):
And then the courts are confused, because what's happened [00:15:00] is, even though the technology platforms have participated, helping you upload, adding tags, running ads, whatever, they're actually part of the producers as well in many cases, and the courts are confused about their responsibility and liability. And, you know, there are lots of groups and issues that take offense with CDA 230. But when it comes to the issue of sexual abuse and exploitation, they cannot be allowed [00:15:30] to have this kind of immunity. There's no other industry that exists at all that has no regulations at all for the impact they have on public safety and public health. Yet the entire tech industry does. This is why, for example, on Instagram or Snapchat or X or Reddit or Discord, rampant sexual exploitation is happening there. And they know it.

(15:56):
In one of our lawsuits against Twitter, we sued them [00:16:00] for hosting sexual abuse content of two teen boys who were sex trafficked. Their abuse was filmed, and the images were uploaded to Twitter, where they ran ads alongside the content, where they allowed it to stay up, where the boys begged Twitter to take it down, saying, I'm a child. The boys even uploaded a photo of their ID. And Twitter said, it doesn't violate our community standards, so we're gonna leave it up. And the judge just ruled in favor of Twitter and said they're immune: even if [00:16:30] they partner with sex traffickers and child pornographers, they can't be touched. We're appealing that, and I'm so hopeful. But this is the bulk of why these tech companies are not partners in combating sexual exploitation on their platforms.

Fight The New Drug (16:45):
There are so many pieces of this puzzle, this sexual exploitation puzzle, and the role technology plays in it. Technology evolves so rapidly that it's difficult for people to keep up with and to know exactly what's happening and why it's happening. So I think having [00:17:00] a thorough explanation of things like this, that really are a huge part of the problem, is such an important piece of this conversation, especially as we begin to talk about AI a little bit and some of the intersection between AI and sexual exploitation. So with that, can you tell us a little bit about some of the potential risks associated with AI in this space that we're seeing?

Dawn Hawkins (17:26):
Well, I mean, I can just share a couple of stories that we've seen. I [00:17:30] actually just came from giving a briefing to Congress on a new bill that's being introduced by Senators Klobuchar and Cruz called the Take It Down Act. And it would require that those who create and upload image-based sexual abuse, including AI-generated content, would face criminal charges, which is key. Like, there are no laws about this now, Natalie. There's nothing. Yeah, there's nothing. And then the law would require that these technology [00:18:00] companies have to take it down, and they have to listen to survivors. And I think this is critical. In my preparation, I was just doing some preliminary searching on the news, and I came across a couple of news stories that really show what's ahead.

(18:18):
And that's like, there have been these nudifying apps that were put in the Apple App Store and Google Play. We put Apple on the Dirty Dozen List because of this. And guess what, in 24 hours they removed [00:18:30] all the nudifying apps. Yeah. Which is a victory there, but nevertheless they were still up elsewhere, and you could still get them in other ways. Yeah. And kids were downloading them, and teachers, and in some instances they were creating just hundreds and thousands of AI-generated, kind of forged or deepfake pornography of their classmates. Yeah. It looks real. You cannot really tell that this is fake. And then it's uploaded, it's shared on third-party websites, [00:19:00] it's uploaded to pornography websites, it's uploaded to social media. And this content is gonna haunt these young kids pretty much forever unless we do something to shift it. I have a whole host of ideas for where we could rein in some of these harms with AI, but initially, these technology companies can't allow just anyone to upload sexually explicit content of anyone. There needs to be some consent and [00:19:30] age verification. Yeah. I mean, that's how bad it's gonna be. And the thing is, Natalie, it could happen to anyone. There's nothing that you or I can do to stop this from happening to us or to people we love.

Fight The New Drug (19:44):
Yeah. In doing also just some preliminary searches, I noted several articles talking about AI-generated child sexual abuse material. If there is a photo of a child online, AI-generated sexual abuse material can be [00:20:00] made of them. So any photo of any child that's innocently uploaded, that can be created, and it is being created. And so it's something I think we have to acknowledge: we have to acknowledge where and how it's safe to upload photos of kids online. And also noting, what is this doing in schools? How is this impacting youth who do have access to these apps, who are using these apps in schools, who are then sharing that content of their classmates with [00:20:30] each other? It's a really complicated issue. Don't let me forget, because I'd love to come back to some of your ideas of where we can rein it in, 'cause I think that's an important part of this equation. But to still help spell out what's happening with AI, do you have any additional anecdotes or stories that kind of help depict this?

Dawn Hawkins (20:49):
Well, another... so that's kind of like creating sexually explicit content or child sexual abuse material, and not only of kids, but of us. Deepfake pornography has been made of me multiple times, you [00:21:00] know, to try to stop me from the work that I do. It's horrible. It affects me so deeply. And you know, of course, I'm someone who doesn't care that much, because I'm going to keep fighting. But I have talked with women, primarily women, who've had this happen to them. And I cannot overstate the trauma that they're experiencing, and the perpetual state of worry and anxiety that it's gonna surface again, that their [00:21:30] employer's gonna find out, that they're not gonna get into the school that they want to because of this kind of content. But another aspect that's critical to understand is these kind of chatbots that are being created to groom our youth.

(21:46):
And you know, man, I love AI. ChatGPT is my friend. I talk to it, you know, I picked out a name for it, I call her... I love it. And I feel [00:22:00] connected to her, right? Yeah, sure. But that's also what's happening to our young people and more vulnerable individuals who are looking for community, who are looking for connection. They're going to use these tools for relationships and connection, and I think that could be good and healthy in some ways. But what's happening is they're being exploited so badly. And with the chatbots, instead of having one human predator [00:22:30] who could, you know, maybe handle like 10 at a time, now you've got a chatbot that can handle a thousand at a time. So the scale of targeting has just exploded, and the victim pool has grown. And we are gonna see the trauma just skyrocket immensely. It's gonna have untold effects, the snowball effect of how it will impact the rest of their lives and their ability in the future to connect with others. And it's professionalized. It's going [00:23:00] to be massive. I'm not trying to be alarmist. We see it already. You're seeing it right now.

Fight The New Drug (23:06):
One of your team members, Victoria, gave a presentation on this topic at a conference that we had here a few months ago. And something that really stuck with me, and you mentioned this earlier as well, is that there are not laws in place to safeguard around this right now. So for the trauma that these primarily women, or children, are experiencing who are exploited in this, the burden right now for someone to [00:23:30] get this type of content taken down is on the person who's victimized: to prove that it's not real, to try to prove who created it in the first place. The burden of responsibility is on the victim, who's already experiencing this trauma. And unless we get some safeguards in place, that will only continue to be the case. What are your thoughts about that piece of this?

Dawn Hawkins (23:52):
Oh my gosh, I have so many examples, 'cause our law center is representing a number of survivors of this [00:24:00] kind of abuse, and we've sent probably 250 takedown requests over the last couple of years. And I cannot even begin to explain to you how difficult it is. In one case, to me the most frustrating of all, one of our clients was trafficked by her husband, and he filmed and recorded her abuse, and he was uploading it to pornography websites, to Pornhub. And he's in prison. There's so much proof that he trafficked [00:24:30] her. And amazingly, we got it down from Pornhub; it was a hard fight to get it down, but it was still showing up in Google Images and Google Video. And so our attorneys sent takedown requests as her legal representatives, and Google's legal counsel responded and said, we don't see signs of coercion, so we're gonna leave it up.

(24:53):
I mean, what? Like, first of all, there are hardly ever signs of coercion. And going back to [00:25:00] the beginning of our talk about the impact it has: even when you see the extreme violence, and you see women saying no, if it's in pornography, you just accept it, because that's what it is. Like, what the heck. But the second thing is, here we are with all this proof, with court orders, all of these things, and they still are not willing to listen. Another survivor, who I believe has been on your podcast before, I won't say her name [00:25:30] 'cause I don't know if she has said this before to you, but she has spent $3 million to try to get image-based sexual abuse of her taken down, some deepfake and others. And it still just keeps coming up and being uploaded again and again. $3 million, Natalie. Yeah. It cannot be this hard. We cannot have these companies siding with the abusers and saying, you have to prove that you didn't consent. No. The burden should be on the uploaders, [00:26:00] proving that there was consent, you know? Yes. And so we have to flip that dynamic.

Fight The New Drug (26:06):
Absolutely we do. Because, taking a step back, you mentioned some of that content for her was deepfakes, but as you know, as things with AI and deepfake content just continue to scale, proving that it's not real is increasingly difficult. Right? That burden of proof shouldn't be on someone who wants something taken down. It should be on the party who is uploading it to prove that it was [00:26:30] consensually made, or, well, we could dive into how it's not really possible still to ethically make it, but that's another conversation. The burden should not be on the victim to prove it.

Dawn Hawkins (26:44):
Absolutely. Yeah. And one of the things I'm kind of excited about with that bill I mentioned, the Take It Down Act, is that if the imagery, even if it's deepfake or AI-generated, reasonably looks like the person, that person would, in [00:27:00] this bill, have the power to request it be taken down. That's the kind of shift we need. Yeah, yeah. Absolutely. I'll say, with Google, I mean, oh, it was so infuriating, that meeting, hearing their general counsel say that. But they have changed. So after that, we brought panels of survivors to come to meetings and to teach Google executives and leaders what it looks like, what exploitation looks like, and how [00:27:30] the signs of coercion are not there. And they've listened and they've heard, and they have made some really significant changes: made it much easier to request content to be taken down, and they tell us they're trusting the survivors first, like the barrier is lower. So I do wanna say, thank you, Google, we appreciate that, and thanks for listening. But why is it this hard? It shouldn't have to be. Yeah,

Fight The New Drug (27:55):
Yeah. It does feel frustrating to have to bring a room full of survivors [00:28:00] to educate a very powerful entity on these issues. It is frustrating, but yes, thank you, and progress is progress, and it's also nice to acknowledge that. What are some challenges that arise in addressing issues related to AI-generated pornography, specifically with regard to how tech companies are required to report CSAM or other things like that?

Dawn Hawkins (28:24):
You know, I don't have all the answers, but one large problem that we see is our child [00:28:30] pornography laws. "Child pornography" is the current legal term; we prefer to use "child sexual abuse material," or CSAM. But basically, a long time ago, in the 1990s, people were creating cartoon and deepfake-type pornography of children, right? It was clear that it wasn't a real child. And that content, in a case, went all the way to the Supreme Court. And the Supreme Court said that it's only child pornography, essentially, [00:29:00] if a real child is included and if you can identify who the child is. So this is a massive problem anyway for going after CSAM, because you have to really know who the victim is. And if you don't know who the victim is... yeah. So the Department of Justice has done some workarounds, but we need to fix some of the laws around CSAM.

(29:23):
But one of the big fears we have, and we see, is: how can we legislate or [00:29:30] litigate and bring criminal charges against AI-generated CSAM, when a real child is not necessarily depicted and these images aren't hashed by the hashing technology? So these are real challenges in the existing law, which now is suddenly way out of date. The Supreme Court justices could never even imagine that we would be able to create content that looks so real. And now we also know so much more about [00:30:00] the impact of, you know, pornography, especially child pornography, on the user. So before, the Supreme Court said, well, an actual child wasn't used in the creation of that, so no real crime has happened. But what we know, and there are studies to back this up, I could go into them if you want, is that users of child sexual abuse material, just like users of adult, like regular, pornography, escalate; their tolerance level changes and they often seek to act [00:30:30] out what they're viewing.

(30:31):
Take voyeuristic pornography, the kind where you just turn on your phone and you make it, or you have a hidden camera. It's exploded; it's a very popular genre on all the big pornography websites. And what we're seeing as a result of that is massive amounts of image-based sexual abuse. So non-consensually recorded or non-consensually shared [00:31:00] sexual moments, or rape being filmed, sex trafficking, or deepfakes being made to look real, to take people that you know in real life and make them into pornography without their participation, knowledge, or consent. We're seeing this across the board. And I think it's helpful for us to understand that shift in sexual desires because of exposure to pornography, and where we've landed because of it.

(31:30):
[00:31:30] And so in the space of CSAM, what we're seeing is that people who are using CSAM often are acting out in real life. Yeah. So there really are victims, and you can't think it's victimless. Just, you know, oh, so many complications, Natalie. So that's a huge one that we see, and we think really we need the Department of Justice to re-litigate those laws up to the Supreme Court to help. Now, the Justice Department is still bringing cases; there have been a couple in the last couple of months [00:32:00] around AI-generated CSAM. But it's kind of scary to see how it's gonna play out in the courts.

Fight The New Drug (32:10):
Yeah. And something else to note as well is that, even if this content that's AI-generated isn't necessarily of a real child, AI does have to learn from real images somewhere along the line. Right? So to some degree, there will always be [00:32:30] some amount of real information that's plugged into this, or a real person who, at the beginning of this, had their photo used and turned into something, whether it's CSAM or otherwise.

Dawn Hawkins (32:42):
I think we need to talk about that. We talk about non-consensual use in pornography, and that's what's happening: your image is being used to train these tools to create pornography, whether it's CSAM or simulated adult [00:33:00] pornography. Yes. So do you consent? No. And should there be consent? I think so. What's happened with pornography is it's changed our attitudes and our tastes and our sexual templates. And people want the real thing. Like, the idea of pornography as a fantasy, we should unpack that at some point, Natalie, 'cause it's not a fantasy. There's real abuse happening.

Fight The New Drug (33:25):
Yeah. And just to take that one step further, going [00:33:30] back to the idea of unpacking this idea of fantasy: you're saying, you know, people are consuming pornography, selling themselves on the idea that it's quote-unquote just fantasy. It's no big deal, it's just fantasy, including things like categories of rape porn or something else that are made to look very much real, and sometimes, often, are real. But a consumer will often say, oh, well, if this is on a mainstream porn site, then it can't be real; it's quote-unquote just fantasy. And so [00:34:00] there's this idea as a consumer that we just justify the content that is there, when at the end of the day, there are real people and real victims. And also, a lot of the time it's not just something that's made to look like rape; it is rape. The content itself is actual sexual violence or actual sexual exploitation. And so there's a disconnect assumed by the consumer, but then also that's affecting the tastes of the consumer, [00:34:30] as you just said, to actually want that in real life. So it is, again, this multifaceted issue that is being fueled by these ideas we've held societally about pornography for decades that are untrue.

Dawn Hawkins (34:47):
Exactly. And, I mean, you can start to think about demand, I know this month you're really focused on demand, and pornography has done that: your tastes have escalated, your tolerance levels have [00:35:00] changed, you seek to act out what you're viewing, so many times. And, you know, maybe your partner doesn't want to do very often what is depicted in pornography, 'cause it's so degrading, because it's so abusive. And so some users will seek to act out elsewhere, and they go to the prostitution marketplace to do that. And pornography is such a huge fueler of that. Not only does it normalize commodifying sex, but people are also seeking to act out what [00:35:30] they view. I have talked to hundreds of women who are in prostitution and who have been sex trafficked, and I ask them all the same question if I have the chance.

(35:41):
And that is: was pornography ever brought to you by the sex buyers to show you what they want? And over and over, they just laugh and say, of course. It was the manual; it was the menu that they used to order what they wanted. They would come and show me: this is what I want you to do. Like, [00:36:00] people have to understand the role pornography is playing in all of this. You care about sex trafficking, but you're gonna allow pornography to just run rampant? You're not gonna make any difference in combating sex trafficking if you continue to allow that.

Fight The New Drug (36:15):
Yeah. Same thing with sexual violence and sexual abuse. Pornography has a role so often, if not always, in so many of these different circumstances. And we have to address the [00:36:30] ways in which these are all interlinked, and how pornography is often kind of a root.

Dawn Hawkins (36:34):
Oh, exactly. I think, if I could say, the one thing that I'm fighting for that I wanna see changed is: if you have a website and you allow sexually explicit content, there has to be a mechanism by which you're verifying the age and meaningful consent of all those depicted. It won't solve every problem, but so many of the things that we just talked about [00:37:00] would be massively cut down if there could be meaningful age and consent verification.

Fight The New Drug (37:04):
We've talked about some of the problems with AI. How do you see the balance between the potential benefits of AI in combating sexual exploitation, so, you know, content moderation and detection, and the risks it poses in terms of enabling new forms of sexual exploitation? What do you see that balance looking like?

Dawn Hawkins (37:24):
I mean, I am so excited about AI and emerging technologies. It's gonna just revolutionize our [00:37:30] movement. I see so much hope on the horizon. I can't even imagine what's coming, because every day I learn about some new tool that's been developed by some really amazing, creative person, and it's gonna do in days what we've been fighting for, for a decade. You know, it is that amazing. Which is why I'm so excited to be joining with you in bringing leaders together in just a few weeks for the CESE Summit, the Coalition to End Sexual Exploitation Global Summit, where we're gonna be talking about how [00:38:00] to use AI for good. And I mean, just a couple of examples I've seen. You know, right now, to combat a lot of CSAM, they use a hashing technology, and there are a ton of ways around the hashing technology.

(38:15):
And if the image hasn't been hashed, then it's gonna be allowed to proliferate. Well, I've seen a couple of AI tools that totally get around that. Even another level above [00:38:30] hashing is kind of facial recognition software that could be part of the solution, but, you know, there's a lot of legal issues around using facial recognition software. Sure. But AI gets right around it: with what's in the background, with what's on the bed, like the bedspread, you'll be able to match and identify images of CSAM immediately, and block and stop them from being uploaded at the source. I have seen the technology; I'm so excited. Yeah, there are significant [00:39:00] AI tools. Do you guys use Bark? Have you heard of Bark? Uh-huh. We love Bark.
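For listeners who want to picture the hash matching Dawn describes: platforms keep a list of digital fingerprints, or hashes, of known abuse images and compare each new upload against that list. Below is a minimal sketch in Python of one perceptual-hashing approach, a toy "average hash." It is purely illustrative, not PhotoDNA or any platform's actual system, and the filenames and match threshold are hypothetical.

```python
# Minimal sketch of perceptual-hash matching (a toy "average hash").
# Illustrative only: not PhotoDNA or any platform's real system.
# Requires Pillow (pip install Pillow); the filenames below are hypothetical.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Reduce an image to a 64-bit fingerprint that survives small edits."""
    img = Image.open(path).convert("L").resize((size, size))  # grayscale 8x8
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:  # one bit per pixel: brighter than the mean or not
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Count differing bits; a small distance suggests the same image."""
    return bin(a ^ b).count("1")

# Hypothetical usage: screen an upload against a set of known hashes.
# known_hashes = {average_hash("known_flagged_image.jpg")}
# upload_hash = average_hash("new_upload.jpg")
# if any(hamming_distance(upload_hash, h) <= 5 for h in known_hashes):
#     print("match found: block the upload")
```

Cropping, filtering, or AI-regenerating an image can push it past that distance threshold so it slips through, which is the loophole described above, and why newer AI tools that match on scene content (the background, the bedspread) rather than on a stored fingerprint are so significant.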

Fight The New Drug (39:06):
Will you tell our listeners a little bit about Bark?

Dawn Hawkins (39:09):
Yes. Well, right now I'd say it's the best tool on the market, you know, for parents who are trying to help protect their kids online and just kind of be there and know what their kids' experiences are. It has a filter too, but mostly it tracks their experiences on every platform you could imagine, and you can track 30-something different categories, [00:39:30] so sexually explicit or grooming-type stuff that we're talking about, but even things like eating disorders or self-harm. And it's not like you're really spying on your kids; they know about it, but you only get alerted when something meets these thresholds, and then you can talk to them about it. Yeah. Imagine if that happens on steroids with AI. Yes. Like, finally we can get around these tech companies having massively faulty parental controls that don't work at all. I can see it coming. [00:40:00] Have you seen Gracie, the Gracie bot from Street Grace?

Fight The New Drug (40:05):
No,

Dawn Hawkins (40:06):
Oh, Natalie. So Street Grace has a bot called Gracie. They built an AI bot that can talk to sex buyers. It creates ads on these websites as though it's a prostituted person, and then the sex buyers engage, and the bot just has conversations with them. And it does two things. One, it can help give them messages to deter buying and [00:40:30] talk about the harm, and keep that kind of conversation going. And there are incredible stories, tons, not just a few, of sex buyers changing. But then it also can do the job of law enforcement and prepare a case, ready for law enforcement to take it to the next level. And it's so exciting, Natalie. Yeah, Microsoft, I know we were dinging them a minute ago, but they've given like a hundred engineers to help [00:41:00] scale it.

(41:00):
And we're trying to help them get into all these different jurisdictions so police can use it for free. And what can happen is that instead of one police officer having to try to manage and be a decoy, and maybe they could get like 10 guys, now the bot can be having conversations with hundreds at the same time. And when it turns into a case for law enforcement, the police officers can take over and continue. It's just a massive [00:41:30] scaling effort that we couldn't see before, and it's gonna need significantly fewer resources.

Fight The New Drug (41:37):
Yeah, it really is. At the same time that these issues are being scaled because of this technology, it is reassuring to know that there are other amazing tools and resources to scale the good side of this work. And I think that's important for all of us to remember. And for anyone wanting to learn more, please join us. Again, Dawn mentioned the Coalition to End Sexual Exploitation Global [00:42:00] Summit happening in Washington, D.C. We will have info linked about that in the show notes, and we are so excited to be co-hosting that as part of the phase alliance with the National Center on Sexual Exploitation. So please learn more about that, and please join us. It's going to be a powerful and incredible almost-week filled with information and so many incredible leaders in this space, talking about not only what technology can be doing for good, but what each of us can be doing. And so for anyone [00:42:30] who maybe can't join us there, Dawn, can you leave us with any tips on what the average person can do in this space to help make a difference and combat sexual exploitation?

Dawn Hawkins (42:42):
I really urge you to join with us in calling out and demanding change at the corporate level. And, you know, to have this kind of mass-scale prevention, we need it to be institutionalized. And so we've made it really easy: go to DirtyDozenList.com, for example, and you can take [00:43:00] action in just seconds with all 12 of the companies. And when these executives hear you... the reality is that they're parents, they've had experiences in their life or in their family exactly like what we're talking about. And when they hear from you, they listen, and often make a change.

Fight The New Drug (43:18):
Yes. So each one of us can make a difference, and it really is easy. Again, DirtyDozenList.com. Dawn, thank you so much for your time today. For all of our listeners, if you wanna learn more about the work of the National Center on Sexual [00:43:30] Exploitation, Dawn is going to tell you where you can go to support their efforts.

Dawn Hawkins (43:35):
Please come join us in the advocacy. You can find out more at endsexualexploitation.org and join, especially if you sign up for our email. We send actions once a week that anyone can engage in; they just take a few minutes.

Fight The New Drug (43:50):
Thank you so much for your amazing work. And again, everyone, please come join us at the CESE Summit this year.

Dawn Hawkins (43:56):
Bye everyone.

Promo (44:00):
[00:44:00] Join us this July for our #StopTheDemand Campaign as we raise awareness to help stop the demand for pornography and sexual exploitation. We invite you to educate yourselves and others on how the porn industry fuels the demand for exploitation, sex trafficking, objectification, and more. Learn more and get involved in the campaign at ftnd.org/stop. That's ftnd.org/stop.

(44:30):
[00:44:30] Are you passionate about ending sexual exploitation? Join us at this year's Coalition to End Sexual Exploitation Global Summit, August 5th through 8th in Washington, D.C. This event brings together survivors, advocates, academics, and experts to tackle the intersections of emerging technology and sexual exploitation. Learn, network, and strategize with the best in the field. Don't miss out on this transformative experience. Use FTND10 for a 10% discount when you register at endexploitationsummit.org.

Outro (45:00):
[00:45:00] Thanks for joining us for this episode of Consider Before Consuming. Check out the episode notes for resources mentioned in this episode. If you find this podcast helpful, consider subscribing and leaving a review. Consider Before Consuming is made possible by listeners like you. If you'd like to support Consider Before Consuming, you can make a one-time or recurring donation of any amount at ftnd.org/support. That's F-T-N-D.org/support. [00:45:30] Thanks again for listening. We invite you to increase your self-awareness, look both ways, check your blind spots, and consider before consuming.

Fight the New Drug collaborates with a variety of qualified organizations and individuals with varying personal beliefs, affiliations, and political persuasions. As FTND is a non-religious and non-legislative organization, the personal beliefs, affiliations, and persuasions of any of our team members or of those we collaborate with do not reflect or impact the mission of Fight the New Drug.

MORE RESOURCES FROM FTND

A three-part documentary about porn’s impacts on consumers, relationships, and society.

Fifteen research-based articles detailing porn's negative impacts.

Tees to support the movement and change the conversation wherever you go.

Successfully navigate conversations about porn with your partner, child, or friend.

A database of the ever-growing body of research on the harmful effects of porn.

An interactive site with short videos highlighting porn’s proven negative effects.
