Speaker 1: Hey, welcome to Tech Stuff. Oz is out this week, so instead of a weekend tech episode, I wanted to bring in an expert to talk about the absolute hellscape unfolding on Elon Musk's social platform X. Heads up, we're going to be covering a sensitive topic, so if there are children with you, you might want to listen to this at another time. Since late December, Elon Musk's AI model Grok, which is built into X, has been generating nonconsensual sexual imagery when prompted. Now you may be thinking, I've heard of this happening before, and you'd be right. Congress even passed a law to curb the creation and distribution of sexually explicit deepfakes. So why is it still happening? 404 Media's Samantha Cole has been covering the adult industry and the online culture of sex for years. She recently wrote an article titled "Grok's AI Sexual Abuse Didn't Come Out of Nowhere," and while it's disturbing, we are thrilled to have her here on the podcast. So welcome, Sam. Thank you so much for joining us.

Speaker 2: Thrilled to be here to disturb.

Speaker 1: It is the thrilled disturbance that we are living in. Sam, can you talk a little bit about the early days of the Grok scandal? Like, how did this start?
Speaker 2: Grok has been doing this for a while. Like you said, the prompts that are like "make her blank," "make her wear a skimpy bikini," "make her wear a see-through shirt," "make her bend over and do such and such" have been a thing for a while. And I think the escalation in the last couple of weeks has been, I mean, it went very viral and became very popular with a lot more users on the platform, and it started generating what other outlets have reported is actual child sexual abuse material, AI generated. Which is, you know, nonconsensual intimate imagery, and now there are reports of it creating images of children using AI. So not real kids, that we know of, but still illegal in a lot of places, and obviously still extremely harmful content to be on one of the biggest and most mainstream platforms that we have right now.

Speaker 1: So how does this compare to other, like, commercially available AI models? How is it different?

Speaker 2: So Grok was made by Elon Musk as this alternative to OpenAI's ChatGPT, or even Claude, or some of the other big popular tech giant chatbots that also create images. And he made it because he wanted a quote unquote "based," unrestricted, air quotes, "free speech" chatbot that would not have a bunch of guardrails that these others have. And these others have guardrails such as: they won't generate sexual imagery in many of these cases, and they definitely won't generate images of children in sexual scenarios. So it wasn't his explicit intention when he first made this chatbot, or at least from what he said, what we know of publicly, to create a CSAM generator, a child sexual abuse imagery generator. But he wanted something that was going to be, like, quote unquote "non-woke," no censorship. You know, it had the cool little, like, rodent that would say a curse word, and, like, the anime waifu girl who would, you know, sext with you while you're in your Tesla, stuff like that. So I think that's what sets it apart in a lot of ways.
And it's also being run, like you said, natively on a platform where people are also just posting normal stuff: posting news articles, posting, like, their thoughts for the day, posting a joke, a meme. And then alongside that you have images of women being undressed and, you know, "nudified," which is what a lot of these apps that do it professionally call it, nudification or undressing, alongside just everyday life.

Speaker 1: So unfortunately, creating explicit, nonconsensual imagery of women and girls is not something new. Can you give our listeners a sort of brief rundown of when this started and when we started to see this online?

Speaker 2: Yeah. So the thing that pretty much struck me at first, when I first saw this happening with Grok and everyone talking about sexual abuse imagery on X, formerly known as Twitter, is that this has been happening on Twitter, when it used to be called Twitter, for a very long time. It's something that lots of people who are in the space of preventing sexual abuse imagery harms have been talking about for a long time: that Twitter was, and still is, full of abusive imagery of women. Real stuff, like photos and videos that women don't want on the Internet. We used to call it revenge porn. It's nonconsensual content that should not be spread like this, especially on a very mainstream platform. So that's been an issue for years and years and years. I would say it's kind of something baked into Twitter at this point, where they didn't solve that problem before Elon Musk showed up, and they're definitely not solving it now, after he showed up and fired thousands of moderators. So that's the context that we're in as far as the real stuff, like the actual recorded images and videos of women who don't want their images out there.
And then to add to this, we have generative AI and apps, which we've written about a bunch at 404 and which are being advertised in lots of spaces, on, you know, TikTok and Instagram, and are very accessible to lots of people, including teenagers, that say: would you like to see that girl from the gym nude? Do you want to make a chatbot out of her where she can never say no to you? Bring that over to X, where people are, like I said, just having normal conversations. Or, not anymore, nobody has a normal conversation on X anymore. But, you know, people were using it like they would any other social platform. And then at the same time you have built into it a nonconsensual imagery generator, this factory that is an escalation of what we're already seeing with the standalone apps that you can download and then swap someone's face onto someone else's body, put them in any situation you want.

Speaker 1: Can you talk a little bit about Gamergate, and just, like, the rise of deepfakes in 'seventeen, 'eighteen, you know, where people were not actually using generative AI, but they were using Photoshop?

Speaker 2: Yeah. I mean, that's another thing this entire situation has reminded me of, is the way that Gamergate, which was a harassment campaign primarily focused on women in gaming, and then kind of spread out to just be women on the Internet in general, a very misogynistic harassment campaign. It originated on places like 4chan and on these forums that were known to be toxic and full of incels, but it eventually migrated, and it got the big popularity that it eventually got, and had the big impact on people's actual lives, because all of that content landed on Twitter. And that's where you see people actually, you know, reporting on it in a serious way, people meaning, like, the media, taking it seriously. Because now it's on this mainstream platform, it's getting so much more exposure from lots of people.
It's radicalizing some people, being this harassment campaign that it is. A lot of it relied on shame, on sexual shame, so that definitely felt like a parallel to me. It's like we had these sort of, like, almost-but-not-quite underground communities, for lack of a better word, creating content, creating harassment campaigns away from the mainstream, that eventually make their way to a place like Twitter, or literally Twitter, and in this case X, where it has this explosive effect, and it has this farther reach into people's lives. Because, you know, if you're searching "Sam Cole" on Google, it might be my Twitter that comes up, and then you click on my Twitter, and it might be, like, tons of replies from people talking about my body and harassing me in that way. And this is something that, like, lots of women have to deal with on the Internet today. So that's just kind of a hypothetical example, but it was actually happening to lots of women during the time that we're talking about, twenty seventeen, twenty eighteen. It feels very much like a pattern, like a formula, at this point.

Speaker 1: So you talk about Twitter sort of being, for lack of a better word, the dumping ground for this kind of content, and you write about how the National Center for Missing and Exploited Children has consistently ranked Twitter, and, you know, subsequently X, as quote "one of the leading hosts of child sexual abuse material" every year for the last seven years. What do you think it is about Twitter and X as a platform that has allowed this type of content to proliferate for so long, before Elon bought it?

Speaker 2: I would have said moderation, and the unwillingness, and also, like, the hesitancy to overmoderate, or the fear of being seen as, like, stamping out free speech, and letting people just harass other people on your platform endlessly.
Twitter had a Nazi problem before Elon took it over, and now we have Elon, who is a white supremacist sympathizer at best, owning the platform that had a Nazi problem, and he is seemingly not at all interested in creating a healthy and productive social media platform. It's mostly just an engagement and rage bait and outrage bait farm, and part of that is the nonconsensual stuff. So you had an existing problem; it's just ten times worse now, because Elon is, you know, in "fuck it" mode and doesn't care what happens to the people that are using his platform, other than using it as a way to talk directly to people who are in power at this point. So I would say that's what makes it uniquely fraught. And also just the influence that it has on everyday life. I mean, tweets were on CNN, and still are. It's very much a place that people thought of as, like, the town square. Hate that phrase, but that's kind of how people saw Twitter. You know, things were getting hashed out on Twitter that were representative of real life; that was the attitude towards Twitter for a really long time. That's part of why Elon bought it. He wanted that sort of influence. He thought that he could make it the everything app, where everything that you wanted to do in life happened on X. So I think that effect, where it's like, it is somewhere where everyone is, or used to be. Now a lot of people are not. But the fact that it was this place where everyone was gathering, everyone's congregating. You get jobs off of being on Twitter, especially as a writer or a creative. You might get discovered off of Twitter, and...

Speaker 1: So many people got hired off of Twitter. Yeah. And comedy writers.

Speaker 2: Comedy writers, yeah. It's, like, where you'd go to stand out and end up working at, you know, like, BuzzFeed News or something.

Speaker 1: It was huge for that.
Speaker 2: So yeah, I know that I felt like I had to have a Twitter presence because I was a journalist in that era.

Speaker 1: Do you feel that way anymore?

Speaker 2: No, not at all. I mean, I am on Twitter. I'm on Twitter now because I'm reporting on Twitter, but I'm not using it in the way that I did then at all. Thank God I don't have to be. I'm not required to be.

Speaker 1: So you mentioned that the child sexual abuse material is being created, but we don't currently think the children depicted are real. You know, what would you say to someone who says that this is a victimless crime?

Speaker 2: So, just speaking about the child sexual abuse material stuff: we know from experts who work in fields where they're trying to find victims, who are trying to find perpetrators of sexual abuse material, that generative AI has made all of their jobs so much harder, and their job was already so incredibly hard. That's aside from this stuff often being trained on real children, and in some cases, we've found, trained on sexual abuse material, to make the generative AI stuff. So there is actually a victim behind this stuff. It's used to groom children, in cases where a perpetrator might send an AI-generated image to a child and say, look, this kid did this, can you do this? And it's like, you don't even need to have an existing victim to create more victims. It's just something you can generate with AI now. It also just makes it harder to find the real stuff, and the real victims who are being abused in real life, because it looks real and it's hard to tell the difference. So investigators are spending time analyzing AI to find out whether it's real or whether it's fake, and in the way of finding actual kids, it's wasting their time. So it's just polluting their occupation in a way that is unfathomable, considering what they have to contend with every day already.

Speaker 1: Has X tried to address this issue at all?
Speaker 2: I mean, I think there have been gestures at condoning, or condemning, sorry, Freudian slip, at making this problem go away. But I think anything that they try to do at this point, short of turning off Grok entirely and making the image generation not something that people can access anymore, is not enough. They're now being investigated by the Attorney General; Rob Bonta just announced that he's going to investigate Grok and X. I mean, I think the cat is just out of the bag for them. This is a problem that they are going to end up having to answer for. I think. I hope.

Speaker 1: I don't know.

Speaker 2: I mean, who knows, you know. It's like, Elon's slippery like that, so who knows if he'll get away with it. But it's definitely caught the eye of investigators. It's caught the eye of a lot of these groups that are working to prevent harms like this online. I think it's definitely crossed a line for a lot of people who maybe saw X as toxic and not a place they want to be, and now it's like, oh, this is actively creating really, really bad material at scale, and that's a difference. That's a line to be crossed for them.

Speaker 1: After the break: is AI-generated nonconsensual imagery the new normal? Stay with us.

You know, it's obvious that not everyone on X is generating and posting these explicit images. Some people just use X the way that they used Twitter. But I'm curious what effect you think these images have on the broader X community.

Speaker 2: Yeah, I mean, I think that's a good question, and something that we should actually be thinking about more: the effect that it has on people who are using X, who are probably used to this anyway, and then also people who are not using it. And it definitely normalizes it in a big way.
I'm sure it emboldens the people who are making apps that are for this purpose, that are monetizing deepfakes and nonconsensual pornography at scale. I'm sure that they see this as, like, okay, hell yeah, he's not facing any repercussions, so we're going to go full tilt at it, right? And I think that's a worst-case scenario, and I'm sure it's happening. But then it's also like, it just normalizes this content, which at the heart of it is about nonconsent. It's about using women's bodies in ways that they don't want. It's about ignoring bodily autonomy. It normalizes all of that to other people on the platform, who might be like, oh, well, if that guy's doing it...

Speaker 1: You know, maybe I should do it at the gym.

Speaker 2: Yeah, maybe I should take a picture of the girl at the gym and try to see what Grok can do with it.

Speaker 1: Which is just...

Speaker 2: Like, it's shocking. I think that's probably the most shocking thing about it, is that it's no longer "I'm using this app that pretends to be something else but is secretly a porn app that makes face-swap images of girls in my class." It's "we're doing this in the open, on Twitter, on X, on, like, one of the biggest social media platforms out there, and nothing bad is happening to me for it, and the platform says it's fine." I think, especially when there's no one else in your life saying it's not fine, you know, it's just kind of creating this snowball effect of people who are interested in it wanting to try it.

Speaker 1: So what are the numbers in terms of, like, proliferation of this? Like, what are we looking at in terms of how much of this is online?

Speaker 2: Anyone can search on X right now "@grok" and then, like, "make her," you know, do this and that, or, like, some version of the prompt that they're doing, and it's just multiple times a minute, every minute.
People are making a new one of those. I think conservatively it's in the, like, hundreds of thousands of images all together in the last week or so.

Speaker 1: That's so crazy.

Speaker 2: I mean, just based on every time I check on it, it's just, like, a running... You can't keep up with the feed of people trying to make her do this and that, make her wear a clear bikini or whatever it is.

Speaker 1: This is obviously a huge problem. Are other countries responding differently than the United States?

Speaker 2: Yeah, so I think the UK was talking about banning X entirely in the country.

Speaker 1: On account of this deepfake pornography and Grok?

Speaker 2: Yeah. I think Ofcom was getting involved and saying that it could happen if they don't figure it out and turn it off and somehow make it get better. Which is interesting, because they passed the Online Safety Act a while back, and it's requiring a lot more censorship and a lot more... it's the age verification stuff that we're seeing here in the States, where they're requiring a lot of sites and social media sites to verify ages before you can use them, because they want to kind of push adult content in that direction. And now they're kind of teasing the idea of banning X over something that is so blatantly criminal and so, so harmful. It's like, you know, where's the line for you guys? When are we going to hit the button? But I don't know. I'm sure it's more complicated than that.

Speaker 1: And is there anything that, like, the general public can do?

Speaker 2: Stay off X if you don't want to see some bad stuff. I would say that's a big one. Maybe call your reps if this is an issue you care about. It's definitely illegal in a lot of states to create, or at least to generate and spread and disseminate, AI child sexual abuse material. So, you know, what are those states going to do about it, is kind of my question.
Speaker 1: How does the Take It Down Act play into this, if at all?

Speaker 2: The Take It Down Act affords victims more recourse. So if you're in a deepfake, if you're in a sexually explicit deepfake and you want that taken down, you can message the websites and say, I need this removed, this is nonconsensual material, and those websites need to comply with those sorts of requests. And that's a federal law. So if someone's making images of you that you don't want on X, under that law, X should not be hosting them. I mean, it should require platforms to take some action when someone reaches out and says, hey, this is bad stuff, I don't want it on here. That's different than, like, actually putting the onus on the platforms themselves.

Speaker 1: Right. So, the Pentagon actually announced that Grok will be used in their network. This is kind of related to what we've been talking about, but, like, what will it actually take for the US government to regulate X and Grok? Will the US government regulate X and Grok?

Speaker 2: I mean, when I saw that, I was like, this is so perfect for the way that things are now. Yeah, I don't know what it'll take anymore, especially since they're all in bed together. I really don't know. It feels like we're in this extremely weird gray zone as far as what is and isn't law, and, like, what is... I mean, what is law?

Speaker 1: What is law?

Speaker 2: But, like, is there going to be any kind of action taken to abide by the law when it comes to these sites? And especially now that the Pentagon is like, hey, we love Grok and we're going to use it. Just unreal to announce that right in the middle of this huge scandal.

Speaker 1: Is there anything about this particular scandal that we didn't cover that you are concerned about?

Speaker 2: It's something that's kind of in the back of my mind, but I'm not even sure how to articulate.
It's how striking it is that there is such a demand for this. That there are so many people who really want to make this kind of content kind of tells me that it's already normalized and it's already happening, and this is just the next step up from that. So, you know, behind all these prompts, a lot of them are probably bots, and a lot of them are probably people. And we know for a fact that this sort of thing, making images of people in intimate situations when they do not want that, is a very popular industry. And I think that's something that we need to take some action on, do some educating on, intervene earlier in, especially, young men's lives when they start going down these routes. And I think it's just going to get worse and worse as it gets more and more industrialized and monetized and productized in the way that we've seen with Grok.

Speaker 1: Have you talked to victims of this content, and if so, what have you heard from them?

Speaker 2: Yeah, I have. It's always hard to really grasp what it feels like to be in a deepfake, and especially a sexual deepfake, until it happens to you. So I think it is important to keep talking to victims of this, because they have felt it firsthand and they can really describe it. But what they feel is: it's shocking, it stings, it follows you throughout your day, your week, whatever it is, your life. Some of the women that I have talked to who have been targets of deepfake porn talk about how they are afraid to go out in public, because their face is on this stuff and they didn't want it to be. And it's like, they have nothing against, like, the porn industry or adult content or sex work, but they didn't ask to be in this, and it's something that changes your life, and you have to be ready for that. And they're afraid to apply for jobs, they're afraid to date, in some of the extreme cases.
It's just something that really latches onto your name and your image in a way that is hard to imagine if it hasn't happened to you. And most of them just want it to stop. They just want the imagery to stop being spread. They want people to stop putting their name on this stuff and tagging them and replying to them on social media with it. They care less sometimes about the person who's making it, and more that they just stop doing it.

Speaker 1: Right, right. Do you get the sense that this is something that will slow down?

Speaker 2: No. No. I mean, the way that it's going now? No, no way. I think people's interest will wane, you know. It's like, this particular news cycle, and, like, hype cycle, of Grok making this abuse imagery will slow down, and as a result people will make fewer of those images on Grok, I'm sure. But all of this has just created more and more of a wildfire for the bigger issue of AI-generated abuse material. So no, I don't think so. I've been waiting for it to slow down for six years.

Speaker 1: And it just seems like there are new tools. It doesn't seem like there's less content.

Speaker 2: Yeah, yeah, for sure.

Speaker 1: I want to say thank you, Sam. Are we disturbed? I don't... I don't want to say thank you, but I appreciate your time and care in covering the story, and for taking the time.

Speaker 2: No, thank you for covering this. Thank you for shedding light on this, for sure.

Speaker 1: Thank you. That's it for this week for Tech Stuff. I'm Karah Preiss. This episode was produced by Eliza Dennis and Melissa Slaughter. It was executive produced by me, Oz Woloshyn, Julian Nutter, and Kate Osborne for Kaleidoscope, and Katrina Norvell for iHeart Podcasts. Jack Insley mixed this episode, and Kyle Murdoch wrote our theme song. Please rate, review, and reach out to us at tech stuff podcast at gmail dot com. We want to hear from you.