1 00:00:05,160 --> 00:00:07,480 Speaker 1: Hey, this is Annie and Samantha and welcome to Stuff Mom 2 00:00:07,520 --> 00:00:19,240 Speaker 1: Never Told You, a production of iHeartRadio, and today we 3 00:00:19,280 --> 00:00:21,680 Speaker 1: are once again so happy to be joined for the 4 00:00:21,680 --> 00:00:24,119 Speaker 1: first time in the new Year by the fabulous, the 5 00:00:24,200 --> 00:00:25,680 Speaker 1: fantastic Bridget Todd. 6 00:00:25,800 --> 00:00:29,880 Speaker 2: Welcome Bridget, Thanks for having me. Happy New Year. 7 00:00:30,000 --> 00:00:33,280 Speaker 3: This is my first, I think my first podcast recording 8 00:00:33,280 --> 00:00:35,600 Speaker 3: of twenty twenty six. So if I sound a little rusty, 9 00:00:36,000 --> 00:00:36,919 Speaker 3: that's what's going on. 10 00:00:37,960 --> 00:00:41,360 Speaker 1: You're in good company. You're in good company. And also, 11 00:00:41,520 --> 00:00:45,280 Speaker 1: as we were joking about in that sad way before 12 00:00:45,320 --> 00:00:47,680 Speaker 1: we started recording, you picked a doozy of a topic 13 00:00:47,760 --> 00:00:50,760 Speaker 1: to come out swinging with, which we'll get into more 14 00:00:50,760 --> 00:00:55,240 Speaker 1: in a second. But how have you been, Bridget, How 15 00:00:55,400 --> 00:00:57,280 Speaker 1: has the New Year been? Etc? 16 00:00:58,280 --> 00:00:59,560 Speaker 2: Yeah, the New Year has been. 17 00:00:59,720 --> 00:01:02,080 Speaker 3: I don't know if other folks are feeling this, it's 18 00:01:02,120 --> 00:01:05,000 Speaker 3: just still been a little bit of a lot. Like 19 00:01:05,120 --> 00:01:07,399 Speaker 3: I don't know, I am someone who I think I've 20 00:01:07,440 --> 00:01:10,120 Speaker 3: said this on this show before. I like New Year. 21 00:01:10,360 --> 00:01:14,280 Speaker 3: I like all the sort of rah rah this year 22 00:01:14,319 --> 00:01:16,640 Speaker 3: is gonna be different, fresh start. That stuff really works 23 00:01:16,680 --> 00:01:19,200 Speaker 3: on me. Even I didn't do that this year. I 24 00:01:19,200 --> 00:01:21,360 Speaker 3: didn't even buy a planner and a bunch of pens 25 00:01:21,680 --> 00:01:24,840 Speaker 3: to like sit and rot in my home, I know 26 00:01:24,920 --> 00:01:27,800 Speaker 3: you love pens. I love buying pens I won't use, 27 00:01:27,800 --> 00:01:29,720 Speaker 3: and buying a planner I'll use for a week and 28 00:01:29,760 --> 00:01:30,640 Speaker 3: then forget about. 29 00:01:30,880 --> 00:01:32,400 Speaker 2: I didn't even do that. 30 00:01:32,400 --> 00:01:33,160 Speaker 4: That makes me sad. 31 00:01:33,480 --> 00:01:36,680 Speaker 1: Yeah, it's we were. We had an episode on this recently. 32 00:01:36,680 --> 00:01:38,240 Speaker 1: I don't think I said this in the episode though, 33 00:01:38,240 --> 00:01:43,120 Speaker 1: but I I'm very meticulous about my calendar and planning 34 00:01:43,800 --> 00:01:47,960 Speaker 1: and I have not even updated my calendar to January. 35 00:01:48,040 --> 00:01:50,960 Speaker 1: I I have made a note to do it today. 36 00:01:52,640 --> 00:01:57,400 Speaker 1: We shall see if it happens. But yeah, it's just 37 00:01:57,520 --> 00:02:05,640 Speaker 1: felt something off or something strange about the whole thing. 38 00:02:05,680 --> 00:02:07,360 Speaker 1: It doesn't feel like a new year to me. It 39 00:02:07,520 --> 00:02:11,880 Speaker 1: just blurs all together. I'm not sure, but maybe the 40 00:02:11,919 --> 00:02:12,760 Speaker 1: calendar will help me.
41 00:02:13,280 --> 00:02:15,840 Speaker 3: Yeah, that's how I'm feeling too. About Where did you 42 00:02:15,880 --> 00:02:16,560 Speaker 3: put that note? 43 00:02:16,560 --> 00:02:20,520 Speaker 1: If not on a calendar, I have a little sticky 44 00:02:21,200 --> 00:02:22,959 Speaker 1: digital sticky note on my desktop. 45 00:02:23,240 --> 00:02:24,560 Speaker 2: Oh, you think that'll do it? 46 00:02:25,440 --> 00:02:27,680 Speaker 1: I mean I don't think so, but I do have 47 00:02:27,800 --> 00:02:33,520 Speaker 1: backup plans. What about you, Samantha. 48 00:02:33,840 --> 00:02:36,680 Speaker 2: That is amazing. Yeah, how's your new year? Sam? 49 00:02:37,120 --> 00:02:39,760 Speaker 4: Again, I think that's why I'm being silent because I'm like, I. 50 00:02:39,720 --> 00:02:41,280 Speaker 2: Don't, don't. 51 00:02:41,840 --> 00:02:45,080 Speaker 4: We're here, We've made it to this year. So you know, 52 00:02:45,240 --> 00:02:50,000 Speaker 4: as the older generation would say, blessed to be around 53 00:02:50,000 --> 00:02:54,240 Speaker 4: for another year. You know, the alternative is dark. But 54 00:02:54,400 --> 00:03:01,359 Speaker 4: these reports have gotten me. That's the words I think. 55 00:03:01,360 --> 00:03:04,200 Speaker 4: I like everything. You know, we talk about the millennial freeze, 56 00:03:04,320 --> 00:03:08,120 Speaker 4: the, uh, the reactions that everyone is having. Millennials slash 57 00:03:08,360 --> 00:03:11,680 Speaker 4: uh, for me, Xennial, who have gone through all of 58 00:03:11,680 --> 00:03:16,520 Speaker 4: the chaos in our lifetimes and then coming in having 59 00:03:16,560 --> 00:03:18,840 Speaker 4: that small sliver of hope, we'll be like, yeah, maybe 60 00:03:18,840 --> 00:03:21,480 Speaker 4: this will be a better year, and then the year happens, 61 00:03:21,520 --> 00:03:23,919 Speaker 4: you're like, maybe it's going to be the same. Maybe 62 00:03:24,120 --> 00:03:25,400 Speaker 4: it's just going to be the same. 63 00:03:25,639 --> 00:03:27,440 Speaker 2: So I think part of it for me, I felt 64 00:03:27,440 --> 00:03:28,640 Speaker 2: the exact same thing. 65 00:03:28,760 --> 00:03:30,720 Speaker 3: And I think part of it for me was that, 66 00:03:31,280 --> 00:03:34,400 Speaker 3: you know, over the over the holidays, you're sometimes with your families, 67 00:03:34,440 --> 00:03:37,640 Speaker 3: you're not checking the news as much. When I checked 68 00:03:37,680 --> 00:03:39,880 Speaker 3: back in, I was like, oh, let's see what's going 69 00:03:39,880 --> 00:03:40,520 Speaker 3: on in the world. 70 00:03:40,600 --> 00:03:41,520 Speaker 2: The world was on fire. 71 00:03:41,640 --> 00:03:44,720 Speaker 3: It felt like, oh, so it's sort of hard to 72 00:03:44,760 --> 00:03:47,720 Speaker 3: start the new year on a hopeful note, feeling like 73 00:03:48,560 --> 00:03:50,880 Speaker 3: we might have a chance for stability and feeling a 74 00:03:50,880 --> 00:03:53,400 Speaker 3: little bit better than being like, oh, just kidding, and 75 00:03:53,880 --> 00:03:55,520 Speaker 3: and it's gone right. 76 00:03:55,680 --> 00:03:58,120 Speaker 4: Like I feel like for a little while we had 77 00:03:58,280 --> 00:04:00,920 Speaker 4: the main hope of being like, we're the bad guys, 78 00:04:01,280 --> 00:04:04,160 Speaker 4: but we're not the absolute main villain.
And as we 79 00:04:04,200 --> 00:04:06,040 Speaker 4: see things are like, oh, we've become the main villain, 80 00:04:06,040 --> 00:04:10,480 Speaker 4: like international main villain. Oh that's interesting, that's a new one. 81 00:04:11,880 --> 00:04:12,560 Speaker 2: And we are. 82 00:04:14,680 --> 00:04:17,600 Speaker 4: Anyway, That's how I'm doing. And I laugh, and we 83 00:04:17,680 --> 00:04:20,480 Speaker 4: are in this episode and I'm so glad we get 84 00:04:20,480 --> 00:04:23,599 Speaker 4: to be educated and distracted by you today. 85 00:04:24,120 --> 00:04:28,080 Speaker 3: Yes, well I hope that's I don't know if it's 86 00:04:28,080 --> 00:04:31,039 Speaker 3: going to make anybody feel happier about our situation, but 87 00:04:31,920 --> 00:04:36,120 Speaker 3: you know, sometimes you just have to have a clear 88 00:04:36,160 --> 00:04:39,080 Speaker 3: eyed understanding of what's happening and just deal with it. 89 00:04:40,080 --> 00:04:40,920 Speaker 4: Just deal with it. 90 00:04:42,240 --> 00:04:44,560 Speaker 1: Yes, And it is really important that we talk about it. 91 00:04:45,160 --> 00:04:48,480 Speaker 1: And Samantha and I have been discussing how we've noticed 92 00:04:48,680 --> 00:04:57,240 Speaker 1: a real ramp up in very very destructive online behaviors 93 00:04:57,920 --> 00:05:02,360 Speaker 1: and so as difficult as a conversation like this is, 94 00:05:02,440 --> 00:05:06,240 Speaker 1: we do need to talk about it. So, yes, what 95 00:05:06,480 --> 00:05:07,840 Speaker 1: are we discussing today? 96 00:05:08,279 --> 00:05:10,680 Speaker 3: Yeah, so I just have to give a warning because 97 00:05:10,760 --> 00:05:12,880 Speaker 3: the whole conversation is going to be pretty grim and 98 00:05:12,920 --> 00:05:16,760 Speaker 3: pretty dark. So just a trigger warning upfront. But I'm sure, 99 00:05:16,920 --> 00:05:19,799 Speaker 3: as we all know by now, Renee Good, a poet, 100 00:05:20,040 --> 00:05:23,720 Speaker 3: mother, wife, she was shot and killed by ICE in Minnesota, 101 00:05:23,800 --> 00:05:27,520 Speaker 3: right, and I watched the video, it was horrifying. The response 102 00:05:27,600 --> 00:05:32,240 Speaker 3: from the administration has been also predictably horrifying. The conversation 103 00:05:32,360 --> 00:05:37,679 Speaker 3: online probably predictably has been just awash in lies 104 00:05:37,839 --> 00:05:41,560 Speaker 3: and distortions about both Good and then what happened to her. 105 00:05:42,000 --> 00:05:43,560 Speaker 2: All of these things, I feel, were. 106 00:05:43,400 --> 00:05:46,320 Speaker 3: Pretty at this point are pretty predictable when this kind 107 00:05:46,320 --> 00:05:50,479 Speaker 3: of thing happens. Unfortunately, however, something that I did not 108 00:05:51,240 --> 00:05:54,159 Speaker 3: and could not predict, and maybe I should have, is 109 00:05:54,160 --> 00:05:57,839 Speaker 3: that when I was scrolling X, I saw a screenshot 110 00:05:58,279 --> 00:06:02,600 Speaker 3: from the video of Good being shot and killed. So 111 00:06:02,640 --> 00:06:05,200 Speaker 3: it's a video of her, an image of her dead 112 00:06:05,240 --> 00:06:10,400 Speaker 3: body covered in blood, and somebody under the image asking 113 00:06:10,960 --> 00:06:14,440 Speaker 3: Grok to generate an AI version of that image that 114 00:06:14,480 --> 00:06:18,200 Speaker 3: puts her in a bikini.
And guess what, Grok complied 115 00:06:18,400 --> 00:06:23,600 Speaker 3: and generated an AI image of Good dead, covered in blood, 116 00:06:23,839 --> 00:06:27,880 Speaker 3: but wearing an AI generated bikini. And do you know 117 00:06:28,000 --> 00:06:30,880 Speaker 3: those things that you see where you're like, that's it. 118 00:06:31,560 --> 00:06:34,320 Speaker 3: I have hit my limit. This is that's enough Internet 119 00:06:34,400 --> 00:06:37,000 Speaker 3: for me. I genuinely had to log off and go 120 00:06:37,080 --> 00:06:42,239 Speaker 3: for a walk outside because something about that I think 121 00:06:42,800 --> 00:06:45,919 Speaker 3: just I don't have the words for that kind of 122 00:06:46,400 --> 00:06:52,200 Speaker 3: depravity that it showed, like it just really I'm not 123 00:06:52,279 --> 00:06:54,640 Speaker 3: bummed out by a lot on the Internet these days, 124 00:06:54,720 --> 00:06:56,120 Speaker 3: but this really bummed me out. 125 00:06:57,360 --> 00:07:02,360 Speaker 1: Yeah, that legitimately, understandably so, that is incredibly upsetting. 126 00:07:02,720 --> 00:07:06,560 Speaker 3: And I'm sorry to say that this is not even 127 00:07:07,000 --> 00:07:09,880 Speaker 3: an isolated thing because when you go to X and 128 00:07:10,520 --> 00:07:12,200 Speaker 3: I should say, up off the top, I'm not really 129 00:07:12,240 --> 00:07:13,960 Speaker 3: spending a lot of time on X. I still have 130 00:07:13,960 --> 00:07:15,840 Speaker 3: an account there, but when you go to my account, 131 00:07:16,200 --> 00:07:18,080 Speaker 3: it says please see pinned tweet, and then the pinned 132 00:07:18,160 --> 00:07:20,600 Speaker 3: tweet is like, I'm not on X, so please don't 133 00:07:20,640 --> 00:07:22,640 Speaker 3: send me anything here because I'm not gonna see it. 134 00:07:23,200 --> 00:07:24,800 Speaker 3: But if you were to go to X and you 135 00:07:24,960 --> 00:07:29,080 Speaker 3: type in put her or make her, you will see 136 00:07:29,120 --> 00:07:32,640 Speaker 3: how many creeps are posting under images on the platform 137 00:07:32,880 --> 00:07:37,320 Speaker 3: of women and children, in some cases asking Grok, X's 138 00:07:37,440 --> 00:07:41,200 Speaker 3: AI, to generate that person in a bikini or 139 00:07:41,600 --> 00:07:45,480 Speaker 3: some otherwise sexually suggestive thing or pose like sometimes they 140 00:07:45,480 --> 00:07:47,800 Speaker 3: will ask, oh, make this person look heavily pregnant, like 141 00:07:47,880 --> 00:07:49,960 Speaker 3: things like that. Some of the things I've seen in 142 00:07:49,960 --> 00:07:54,000 Speaker 3: the last few weeks are things like Grok being asked 143 00:07:54,080 --> 00:07:57,160 Speaker 3: to undress an image of a fourteen year old actress 144 00:07:57,200 --> 00:08:00,160 Speaker 3: on the platform, which Grok did. I think I think 145 00:08:00,200 --> 00:08:03,800 Speaker 3: it's important to say up top that we're talking 146 00:08:03,840 --> 00:08:10,560 Speaker 3: about non consensual AI generated sexualized images of women and girls. However, 147 00:08:11,280 --> 00:08:13,960 Speaker 3: like that Renee Good photo I talked about earlier, it's 148 00:08:14,000 --> 00:08:18,720 Speaker 3: also just like dark. It's not it would be bad 149 00:08:18,840 --> 00:08:22,360 Speaker 3: enough if we were talking about just sexualized images of 150 00:08:22,480 --> 00:08:24,880 Speaker 3: women and girls without their consent, that's bad enough.
151 00:08:25,120 --> 00:08:26,000 Speaker 2: We're talking about like. 152 00:08:26,120 --> 00:08:30,200 Speaker 3: Dark, hateful, sexualized images of women and kids. Right like 153 00:08:30,600 --> 00:08:33,400 Speaker 3: Grok accepted a prompt to add a swastika bikini to 154 00:08:33,480 --> 00:08:36,600 Speaker 3: a photo of a Holocaust survivor, it will, you know, 155 00:08:37,120 --> 00:08:40,200 Speaker 3: accept prompts to make women look like they've been beaten up, 156 00:08:40,320 --> 00:08:44,520 Speaker 3: things like that, Like, not just regular sexualized images which 157 00:08:44,559 --> 00:08:49,079 Speaker 3: are already bad, much darker, weirder, more depraved stuff. 158 00:08:50,240 --> 00:08:54,920 Speaker 4: I'm just the linking of this type of content and 159 00:08:54,960 --> 00:08:57,640 Speaker 4: the usage of AI. Once again, when we had our 160 00:08:57,720 --> 00:09:00,520 Speaker 4: last episode with you. We talked about all of these 161 00:09:00,559 --> 00:09:03,360 Speaker 4: things inevitably being a part of the incel red 162 00:09:03,400 --> 00:09:08,520 Speaker 4: pill community. It's just, this is in action as we 163 00:09:08,559 --> 00:09:12,480 Speaker 4: are watching it unfold, and people actually not having obviously 164 00:09:12,640 --> 00:09:15,959 Speaker 4: for those of us who know the consequences and what 165 00:09:16,000 --> 00:09:19,720 Speaker 4: this is showing evidence of, and any person of good 166 00:09:19,800 --> 00:09:24,080 Speaker 4: humanity understands that this is gross but not really 167 00:09:24,120 --> 00:09:27,640 Speaker 4: paying attention or understanding where this is coming from, and 168 00:09:27,760 --> 00:09:32,520 Speaker 4: the normalization of accepting things like this like this is alarming, 169 00:09:32,960 --> 00:09:35,400 Speaker 4: like nauseatingly alarming. 170 00:09:36,240 --> 00:09:38,560 Speaker 2: You really put it so well. 171 00:09:38,720 --> 00:09:41,200 Speaker 3: I think it's not just that this kind of thing exists, 172 00:09:41,280 --> 00:09:45,200 Speaker 3: because okay, it's the Internet. Of course, there's dark corners 173 00:09:45,200 --> 00:09:49,040 Speaker 3: where people are doing dark things. X is one of 174 00:09:49,080 --> 00:09:52,320 Speaker 3: our largest social media platforms. It's not you know, I'm 175 00:09:52,360 --> 00:09:53,720 Speaker 3: not there and a lot of people have left, but 176 00:09:53,760 --> 00:09:57,079 Speaker 3: it's not a niche platform. There's still lots and lots 177 00:09:57,080 --> 00:09:59,120 Speaker 3: of people who show up there. We're not talking about 178 00:09:59,440 --> 00:10:02,160 Speaker 3: an alternative space like a Telegram. This is one 179 00:10:02,160 --> 00:10:06,240 Speaker 3: of our biggest communications platforms, and this is happening openly 180 00:10:06,480 --> 00:10:08,600 Speaker 3: and publicly, and in a lot of cases, the person 181 00:10:08,640 --> 00:10:12,240 Speaker 3: who runs that platform, Elon Musk, is laughing about it. 182 00:10:12,320 --> 00:10:15,480 Speaker 3: Joking about it, seemingly celebrating it. And so I'm glad 183 00:10:15,480 --> 00:10:17,640 Speaker 3: that you brought that up. That it's not just that 184 00:10:17,720 --> 00:10:20,760 Speaker 3: this is occurring, it's the normalization and the way that 185 00:10:21,080 --> 00:10:25,240 Speaker 3: it is being tolerated in our mainstream online spaces.
And 186 00:10:25,679 --> 00:10:29,240 Speaker 3: really it's not just what's happening on X because Grok 187 00:10:29,600 --> 00:10:33,200 Speaker 3: is also available as a standalone app in addition to 188 00:10:33,240 --> 00:10:36,120 Speaker 3: being something that you could find on X, And as 189 00:10:36,240 --> 00:10:40,319 Speaker 3: the outlet Indicator points out, the standalone Grok app has 190 00:10:40,320 --> 00:10:43,360 Speaker 3: been used to generate sexual imagery of kids aged eleven 191 00:10:43,400 --> 00:10:43,920 Speaker 3: to thirteen. 192 00:10:44,040 --> 00:10:47,040 Speaker 2: This is according to the Internet Watch Foundation. So yeah, 193 00:10:47,160 --> 00:10:47,880 Speaker 2: it's just. 194 00:10:49,400 --> 00:10:55,960 Speaker 3: A gross situation all around. 195 00:11:02,000 --> 00:11:06,079 Speaker 1: And for any of our listeners who might be blissfully 196 00:11:06,480 --> 00:11:10,560 Speaker 1: unaware, what is Grok, Bridget? 197 00:11:10,320 --> 00:11:11,440 Speaker 2: I'm glad that you asked. 198 00:11:11,480 --> 00:11:14,440 Speaker 3: So for folks who don't know, or maybe you've left 199 00:11:14,480 --> 00:11:18,640 Speaker 3: the platform before Grok became a thing like me, Grok 200 00:11:18,840 --> 00:11:23,560 Speaker 3: is X's AI chatbot. But something to know about Grok 201 00:11:24,080 --> 00:11:28,120 Speaker 3: is that Grok is, with intention, different than other chatbots 202 00:11:28,160 --> 00:11:30,160 Speaker 3: that you might be familiar with, like ChatGPT 203 00:11:30,520 --> 00:11:34,000 Speaker 3: or Claude, because Grok has been designed to basically be 204 00:11:34,760 --> 00:11:38,160 Speaker 3: like an edgelord chatbot. There's no other real way 205 00:11:38,200 --> 00:11:41,240 Speaker 3: to put it. Because we know Elon Musk is a 206 00:11:41,320 --> 00:11:45,400 Speaker 3: loser creep. Grok has basically been designed for loser 207 00:11:45,480 --> 00:11:49,319 Speaker 3: creeps by loser creeps, right, and so it is absolutely 208 00:11:49,360 --> 00:11:52,959 Speaker 3: true that other chatbots like ChatGPT and Claude absolutely 209 00:11:52,960 --> 00:11:57,720 Speaker 3: have their issues. Grok, however, is distinct for being uniquely 210 00:11:57,800 --> 00:12:00,920 Speaker 3: awful in ways that are intentionally built in 211 00:12:01,920 --> 00:12:06,080 Speaker 3: for when people use it. And I guess I wanted 212 00:12:06,080 --> 00:12:10,960 Speaker 3: to start with Grok because folks might have seen headlines 213 00:12:11,040 --> 00:12:15,400 Speaker 3: that say some version of Grok has apologized for generating 214 00:12:15,480 --> 00:12:18,160 Speaker 3: sexualized images of young girls, and I wanted to start 215 00:12:18,200 --> 00:12:21,360 Speaker 3: there because I just want to make it very clear. 216 00:12:22,160 --> 00:12:24,480 Speaker 3: I didn't know that this was something that needed clarifying, 217 00:12:24,559 --> 00:12:27,240 Speaker 3: but just in case it does, Grok is not sentient. 218 00:12:27,760 --> 00:12:31,560 Speaker 3: Grok is not a human. Grok is technology. Grok's not 219 00:12:31,600 --> 00:12:35,680 Speaker 3: a person. So Grok doesn't do anything. Humans like Elon 220 00:12:35,800 --> 00:12:39,440 Speaker 3: Musk built Grok, They trained Grok, they run it as 221 00:12:39,480 --> 00:12:42,080 Speaker 3: a commercial service, and they make it available for other 222 00:12:42,160 --> 00:12:45,040 Speaker 3: humans to do things with.
And right now, what those 223 00:12:45,120 --> 00:12:48,520 Speaker 3: humans are doing with Grok is undressing women and children 224 00:12:48,720 --> 00:12:52,679 Speaker 3: without their consent to create sexualized images, also known as, 225 00:12:52,760 --> 00:12:55,839 Speaker 3: here in the United States, crimes. Criminal activity. 226 00:12:56,640 --> 00:13:01,840 Speaker 4: And as a reminder, when Musk, he literally kind of 227 00:13:01,840 --> 00:13:04,839 Speaker 4: went on a platform about making sure to get 228 00:13:05,040 --> 00:13:09,320 Speaker 4: CSAM, or child sexual exploitation content, off and protecting the kids. 229 00:13:09,840 --> 00:13:10,320 Speaker 2: By the way. 230 00:13:11,640 --> 00:13:13,880 Speaker 3: Yeah, so I don't remember if I talked about this 231 00:13:14,040 --> 00:13:16,200 Speaker 3: with you all or not, but that one of his 232 00:13:16,360 --> 00:13:19,199 Speaker 3: big but one of the big things that he did 233 00:13:19,240 --> 00:13:21,280 Speaker 3: a lot of grandstanding about was that we're gonna 234 00:13:21,360 --> 00:13:24,640 Speaker 3: wipe child sexual assault material off the platform. 235 00:13:24,960 --> 00:13:29,000 Speaker 2: And then he really didn't do that at all. 236 00:13:29,120 --> 00:13:31,640 Speaker 3: Right, Like, one of the first things that he did 237 00:13:31,720 --> 00:13:36,280 Speaker 3: when taking over Twitter was to fire I think eighty 238 00:13:36,280 --> 00:13:40,800 Speaker 3: percent of the staff responsible for building tools that look 239 00:13:40,840 --> 00:13:44,960 Speaker 3: for and combat and stamp out sexualized material involving children. 240 00:13:45,040 --> 00:13:48,240 Speaker 2: And so yeah, he sure didn't, but he sure loves to. 241 00:13:48,360 --> 00:13:50,800 Speaker 3: I mean, he doesn't really grandstand about it that much anymore, 242 00:13:50,800 --> 00:13:54,240 Speaker 3: because anymore, how could you? But yeah, that was his 243 00:13:54,440 --> 00:13:57,760 Speaker 3: tenure at Twitter began with a lot of grandstanding 244 00:13:57,840 --> 00:13:59,319 Speaker 3: and then doing nothing. 245 00:13:59,679 --> 00:14:03,040 Speaker 4: Right. He literally did this because of the election. And 246 00:14:03,920 --> 00:14:06,600 Speaker 4: as we know, the right wing love to scream about 247 00:14:06,840 --> 00:14:11,600 Speaker 4: protecting children, protecting women, protecting babies, but not in actuality. 248 00:14:11,640 --> 00:14:15,480 Speaker 4: They just love that sentiment and it works for them, 249 00:14:15,600 --> 00:14:17,480 Speaker 4: and I've talked about this, We've talked about this on 250 00:14:17,520 --> 00:14:21,440 Speaker 4: the show often how it does hone in on women, specifically, 251 00:14:21,800 --> 00:14:24,520 Speaker 4: making sure if you don't stand for these things, then 252 00:14:24,560 --> 00:14:27,520 Speaker 4: you are a monster that hates children, that wants to 253 00:14:27,560 --> 00:14:29,480 Speaker 4: kill all children and eat children, by the way. That 254 00:14:29,560 --> 00:14:31,400 Speaker 4: was also you know, we love the Pizzagate, which 255 00:14:31,440 --> 00:14:35,080 Speaker 4: still happens, we know, but that was the beginning.
And 256 00:14:35,160 --> 00:14:38,800 Speaker 4: then once Grok was released, if I remember correctly, it 257 00:14:38,880 --> 00:14:42,680 Speaker 4: actually worked okay in that it would call out misinformation, 258 00:14:43,080 --> 00:14:46,800 Speaker 4: but when the misinformation was correcting the far right, I mean, 259 00:14:47,040 --> 00:14:48,640 Speaker 4: they were like, oh no, no, I got to fix this. 260 00:14:49,040 --> 00:14:54,400 Speaker 3: Yes, that's exactly what happened. So well put. Folks 261 00:14:54,480 --> 00:14:58,200 Speaker 3: might remember that Grok was not being biased enough and 262 00:14:58,480 --> 00:15:01,800 Speaker 3: was saying things that we know to be true, 263 00:15:01,880 --> 00:15:04,160 Speaker 3: and then Elon Musk himself stepped in and said no, no, no, no, 264 00:15:04,360 --> 00:15:07,320 Speaker 3: I got it. We're working to make Grok less woke. And 265 00:15:07,400 --> 00:15:10,920 Speaker 3: at that point Grok started calling itself MechaHitler, went 266 00:15:11,040 --> 00:15:11,480 Speaker 3: down a. 267 00:15:11,400 --> 00:15:14,800 Speaker 2: real weird rabbit hole on that one. Uh yeah, but 268 00:15:14,800 --> 00:15:16,000 Speaker 2: that is exactly what happened. 269 00:15:17,720 --> 00:15:21,280 Speaker 4: I just remember this. And then, as we know, since then, 270 00:15:22,120 --> 00:15:26,040 Speaker 4: all of that grandstanding outside of making sure that 271 00:15:26,120 --> 00:15:29,440 Speaker 4: the right is not censored because you know, we don't 272 00:15:29,440 --> 00:15:33,200 Speaker 4: want that anything else. It seems like, yeah, they have 273 00:15:33,320 --> 00:15:36,560 Speaker 4: kind of been the beginning of all of this nastiness. 274 00:15:37,000 --> 00:15:39,240 Speaker 3: Yeah, and I'm glad that you said that, because I 275 00:15:39,240 --> 00:15:41,320 Speaker 3: want to back up a little bit, because you know, 276 00:15:41,400 --> 00:15:44,800 Speaker 3: we're talking today about Grok rightly so, but it didn't 277 00:15:44,800 --> 00:15:49,040 Speaker 3: really start with Grok. Like X has been a hotbed 278 00:15:49,400 --> 00:15:53,040 Speaker 3: of this kind of material for a while. Like folks 279 00:15:53,120 --> 00:15:57,000 Speaker 3: might recall that the teenage Marvel star Xochitl Gomez was 280 00:15:57,040 --> 00:16:00,880 Speaker 3: targeted with deepfake images that flooded X and this 281 00:16:00,960 --> 00:16:03,080 Speaker 3: happened when she was only a minor, she was seventeen, 282 00:16:03,120 --> 00:16:06,080 Speaker 3: and she's spoken out about this publicly that she basically 283 00:16:06,120 --> 00:16:09,000 Speaker 3: was just told there's nothing that can be done. You 284 00:16:09,160 --> 00:16:11,040 Speaker 3: just have to make peace with the fact that these 285 00:16:11,040 --> 00:16:14,240 Speaker 3: images are out there. Back in January of twenty twenty four, 286 00:16:15,000 --> 00:16:17,720 Speaker 3: Taylor Swift was a target of the same thing. AI 287 00:16:17,840 --> 00:16:22,600 Speaker 3: generated sexualized images of her flooded the platform. Those images 288 00:16:22,640 --> 00:16:26,800 Speaker 3: actually originated on an alternative social media platform called Telegram.
289 00:16:27,200 --> 00:16:29,760 Speaker 3: At the time, Telegram had this channel that was essentially 290 00:16:29,880 --> 00:16:33,280 Speaker 3: kind of a marketplace for celebrity deepfakes, where users 291 00:16:33,320 --> 00:16:36,480 Speaker 3: would like request and trade images that they had made 292 00:16:36,520 --> 00:16:37,160 Speaker 3: of celebrities. 293 00:16:37,560 --> 00:16:40,720 Speaker 2: Those images then made their way to X where they 294 00:16:40,760 --> 00:16:41,440 Speaker 2: really took off. 295 00:16:41,440 --> 00:16:43,880 Speaker 3: Like Telegram is still pretty niche, but once those images 296 00:16:43,880 --> 00:16:46,320 Speaker 3: went from Telegram to X, that's when they got a 297 00:16:46,360 --> 00:16:50,240 Speaker 3: lot more visibility. When this happened, in my opinion, I 298 00:16:50,240 --> 00:16:52,360 Speaker 3: don't think that X handled it very well, which I 299 00:16:52,360 --> 00:16:55,040 Speaker 3: think we can really see as a precursor to how 300 00:16:55,080 --> 00:16:58,160 Speaker 3: we got to where we are right now with it. Initially, 301 00:16:58,400 --> 00:17:00,720 Speaker 3: X's way to deal with it was to try 302 00:17:00,760 --> 00:17:03,920 Speaker 3: to block people from being able to search Taylor Swift's name, 303 00:17:04,280 --> 00:17:06,720 Speaker 3: which I think is not a real fix anyway. But 304 00:17:07,240 --> 00:17:10,159 Speaker 3: when you put her name in quotes, you still could 305 00:17:10,200 --> 00:17:12,960 Speaker 3: search her name, and so obviously that did not work 306 00:17:13,000 --> 00:17:16,280 Speaker 3: as a fix. But also that solution would only work 307 00:17:16,320 --> 00:17:18,480 Speaker 3: for women named Taylor Swift, right Like, It's not any 308 00:17:18,560 --> 00:17:23,920 Speaker 3: kind of like meaningful way to combat the actual problem 309 00:17:23,920 --> 00:17:27,480 Speaker 3: on the platform. It might just be preventing people from 310 00:17:27,560 --> 00:17:31,200 Speaker 3: searching Taylor Swift, but not even really at that. So, yeah, 311 00:17:31,400 --> 00:17:34,680 Speaker 3: did not handle it well. And so this was happening 312 00:17:35,080 --> 00:17:37,880 Speaker 3: before Grok was a thing. And then but back then, 313 00:17:38,600 --> 00:17:42,840 Speaker 3: these images would start on niche or alternative platforms and 314 00:17:42,880 --> 00:17:45,960 Speaker 3: then go to Twitter to get reach and visibility. Then 315 00:17:46,160 --> 00:17:50,680 Speaker 3: enter Grok, which essentially allows users to use natural language 316 00:17:50,720 --> 00:17:53,760 Speaker 3: to ask Grok to generate whatever they want to see 317 00:17:53,800 --> 00:17:55,600 Speaker 3: women and girls do. So you could just say, hey, 318 00:17:55,640 --> 00:17:59,720 Speaker 3: Grok, do XYZ, Grok's going to do it. Now again, Sam, 319 00:17:59,760 --> 00:18:02,480 Speaker 3: As you rightly pointed out, keep in mind that when 320 00:18:02,760 --> 00:18:05,760 Speaker 3: Elon Musk took over at Twitter, he very early on 321 00:18:05,840 --> 00:18:09,199 Speaker 3: dissolved Twitter's Trust and Safety Council and fired eighty percent 322 00:18:09,240 --> 00:18:12,520 Speaker 3: of the engineers working on child exploitation issues. This is 323 00:18:12,560 --> 00:18:16,320 Speaker 3: from a really talented journalist at Spitfire News, Kat Tenbarge, 324 00:18:16,320 --> 00:18:18,000 Speaker 3: who's been writing about this for a really long time.
325 00:18:18,840 --> 00:18:22,119 Speaker 3: So you know, Musk comes into Twitter, he fires the 326 00:18:22,119 --> 00:18:24,800 Speaker 3: people who are working on this. He knows that they 327 00:18:24,840 --> 00:18:29,159 Speaker 3: already have this issue with non consensual sexualized images on 328 00:18:29,200 --> 00:18:29,920 Speaker 3: the platform and. 329 00:18:29,880 --> 00:18:31,080 Speaker 2: are not really handling it well. 330 00:18:31,680 --> 00:18:35,520 Speaker 3: And yet over the summer, X rolls out what it 331 00:18:35,600 --> 00:18:39,000 Speaker 3: calls Spicy Mode for Grok, which is going to be 332 00:18:39,040 --> 00:18:42,320 Speaker 3: a way to let people generate images and video to 333 00:18:42,400 --> 00:18:45,679 Speaker 3: produce sexual content. And so I want to put it 334 00:18:45,720 --> 00:18:47,960 Speaker 3: that way because to me, none of this stuff is 335 00:18:47,960 --> 00:18:48,920 Speaker 3: happening in a vacuum. 336 00:18:48,960 --> 00:18:49,120 Speaker 2: Right. 337 00:18:49,160 --> 00:18:54,240 Speaker 3: These are very clear decisions that I would say, it's 338 00:18:54,640 --> 00:18:59,480 Speaker 3: very foreseeable. What kind of results decisions like these are 339 00:18:59,520 --> 00:18:59,880 Speaker 3: going to. 340 00:19:01,480 --> 00:19:05,760 Speaker 4: You know, with all of the KOSA stuff happening and 341 00:19:05,920 --> 00:19:11,399 Speaker 4: the banning or the exclusionary stuff, with things like Porn 342 00:19:11,440 --> 00:19:15,240 Speaker 4: Hub and all of that, this is interesting. I'm not 343 00:19:15,280 --> 00:19:19,520 Speaker 4: trying to be conspiratorial here, but on the far reaching 344 00:19:19,640 --> 00:19:23,480 Speaker 4: of the conspiracy side, this seems like a really interesting 345 00:19:23,600 --> 00:19:26,920 Speaker 4: monopoly for sexualized content. 346 00:19:27,800 --> 00:19:30,720 Speaker 3: I've been racking my brain because this just seems weird, right, 347 00:19:30,720 --> 00:19:31,160 Speaker 3: it's just. 348 00:19:31,040 --> 00:19:35,959 Speaker 2: a very yeah. I mean, and again, I just I 349 00:19:36,000 --> 00:19:37,640 Speaker 2: can't wrap my head. 350 00:19:37,480 --> 00:19:43,320 Speaker 3: Around living in a world where minors aren't allowed to 351 00:19:43,440 --> 00:19:46,800 Speaker 3: talk to other minors on social media because that's dangerous, 352 00:19:46,880 --> 00:19:50,919 Speaker 3: but they can be undressed and sexualized by Grok and 353 00:19:51,000 --> 00:19:51,719 Speaker 3: that's fine. 354 00:19:52,000 --> 00:19:56,560 Speaker 2: Like, I just I can't square that circle if you will. 355 00:19:56,960 --> 00:19:59,640 Speaker 4: But with all of the porn, child porn laws, which 356 00:19:59,640 --> 00:20:02,840 Speaker 4: is pretty explicit like about the laws and like children 357 00:20:02,880 --> 00:20:05,359 Speaker 4: can be charged with it just by taking pictures of 358 00:20:05,440 --> 00:20:08,600 Speaker 4: themselves and having it in their phone, that is child porn. 359 00:20:08,720 --> 00:20:11,600 Speaker 4: I have seen kids being charged with child porn. 360 00:20:11,359 --> 00:20:11,760 Speaker 1: By the way. 361 00:20:12,000 --> 00:20:17,520 Speaker 4: Wow, So it's interesting, uh huh, is interesting to see 362 00:20:17,560 --> 00:20:20,760 Speaker 4: this where they're like, but it's not real. We didn't 363 00:20:20,800 --> 00:20:23,600 Speaker 4: harm a kid. We just took their likeness and then 364 00:20:24,840 --> 00:20:26,640 Speaker 4: dot dot dot. Yes.
365 00:20:26,760 --> 00:20:29,840 Speaker 3: So I wasn't totally sure about this, so I looked 366 00:20:29,840 --> 00:20:32,119 Speaker 3: it up and according to Enough Abuse, which is an 367 00:20:32,119 --> 00:20:35,720 Speaker 3: anti child sexual abuse organization, forty five states in the 368 00:20:35,800 --> 00:20:39,760 Speaker 3: United States have enacted laws that criminalize AI generated or 369 00:20:39,800 --> 00:20:43,720 Speaker 3: computer edited child sexual assault material, while five states and 370 00:20:43,720 --> 00:20:46,280 Speaker 3: the District of Columbia, where I live, have not as 371 00:20:46,320 --> 00:20:47,480 Speaker 3: of August twenty twenty five. 372 00:20:47,560 --> 00:20:49,560 Speaker 2: Right, and so you're exactly right. 373 00:20:50,320 --> 00:20:54,879 Speaker 3: One of the I guess criticisms or pushback sometimes I 374 00:20:54,880 --> 00:20:57,920 Speaker 3: will hear is like, oh, well they're not actually harming 375 00:20:57,960 --> 00:20:59,919 Speaker 3: real kids, it is still a real crime. 376 00:21:00,119 --> 00:21:01,600 Speaker 2: So that's all I can say. 377 00:21:03,359 --> 00:21:05,520 Speaker 4: Yeah, I don't want to jump too far ahead, but 378 00:21:06,400 --> 00:21:10,440 Speaker 4: I'm like, the alarm bells are ringing over. 379 00:21:10,320 --> 00:21:12,040 Speaker 2: here, as they should be. 380 00:21:13,920 --> 00:21:18,399 Speaker 1: Yeah, And I think also calling something spicy mode is 381 00:21:18,400 --> 00:21:24,160 Speaker 1: so unserious, Like they're treating this so unseriously of like, oh, 382 00:21:24,160 --> 00:21:25,720 Speaker 1: this is just fun. And I think it's a 383 00:21:25,840 --> 00:21:29,000 Speaker 1: very convenient excuse of being like, well, it was the AI. 384 00:21:30,200 --> 00:21:33,479 Speaker 1: But like you said, there are people behind this that 385 00:21:33,560 --> 00:21:35,840 Speaker 1: created it this way. But I think they're using it 386 00:21:35,960 --> 00:21:38,959 Speaker 1: very conveniently for them to be like, I don't know, 387 00:21:39,080 --> 00:21:40,320 Speaker 1: it's just spicy mode. 388 00:21:40,720 --> 00:21:44,240 Speaker 3: Yeah, yeah, And I want to kind of underline that 389 00:21:44,960 --> 00:21:49,439 Speaker 3: because we have to. We can't not mention that X 390 00:21:49,560 --> 00:21:52,320 Speaker 3: is run by Elon Musk. And I'm sorry, but this 391 00:21:52,400 --> 00:21:55,480 Speaker 3: is just part and parcel of what Elon Musk is about. 392 00:21:56,080 --> 00:21:58,359 Speaker 3: Before I got on with you two, I was like, Oh, 393 00:21:58,400 --> 00:22:01,520 Speaker 3: what's Elon Musk talking about. He was posting on X 394 00:22:02,600 --> 00:22:05,800 Speaker 3: jokey pictures that showed a toaster in a bikini, So 395 00:22:06,000 --> 00:22:09,199 Speaker 3: not taking this seriously at all. And I think we 396 00:22:09,320 --> 00:22:12,400 Speaker 3: have to say Elon Musk has been a really toxic 397 00:22:12,440 --> 00:22:15,880 Speaker 3: decision maker for a long time, and I can't help 398 00:22:15,920 --> 00:22:18,879 Speaker 3: but wonder if if some of the people in power 399 00:22:18,920 --> 00:22:21,720 Speaker 3: and people in media, if they had not treated this 400 00:22:21,960 --> 00:22:26,040 Speaker 3: like oh, just the acceptable quirkiness of a brilliant genius 401 00:22:26,280 --> 00:22:31,560 Speaker 3: and rather as chaotic, volatile leadership decisions that are bad for business.
402 00:22:32,080 --> 00:22:34,000 Speaker 3: I wonder if we would not be in this situation 403 00:22:34,040 --> 00:22:38,320 Speaker 3: where somebody like Elon Musk feels totally comfortable talking about 404 00:22:38,359 --> 00:22:40,040 Speaker 3: this in a way that makes it clear he thinks 405 00:22:40,040 --> 00:22:41,879 Speaker 3: it's a joke and it's not serious to him. The 406 00:22:41,880 --> 00:22:44,240 Speaker 3: whole thing is like very unserious. 407 00:22:44,920 --> 00:22:48,480 Speaker 4: I mean, let's this is gonna be if one day 408 00:22:48,480 --> 00:22:51,679 Speaker 4: someone gets a hold of our episodes that don't like us, 409 00:22:51,680 --> 00:22:54,879 Speaker 4: this is gonna be a bad take. But let's just 410 00:22:54,920 --> 00:22:58,600 Speaker 4: for a minute, imagine what would have happened if someone 411 00:22:58,600 --> 00:23:01,600 Speaker 4: did those with a Charlie Kirk picture of his death? 412 00:23:01,680 --> 00:23:07,400 Speaker 2: Are you? I mean, like yeah, it would be I mean, 413 00:23:07,680 --> 00:23:08,800 Speaker 2: don't even get me started. 414 00:23:08,760 --> 00:23:12,120 Speaker 4: The people went off on just quoting his words as 415 00:23:12,200 --> 00:23:15,400 Speaker 4: being offensive? Yeah, could you imagine. 416 00:23:15,359 --> 00:23:18,040 Speaker 3: I could not, right? And I don't think we would 417 00:23:18,040 --> 00:23:22,600 Speaker 3: ever see that, like like it just it's yeah, And 418 00:23:23,320 --> 00:23:27,720 Speaker 3: you know, it's the scale, the scale of the issue 419 00:23:28,080 --> 00:23:33,479 Speaker 3: on X. It's it's really kind of mind boggling how 420 00:23:33,560 --> 00:23:37,280 Speaker 3: much of this content is flourishing there right now. Copyleaks, 421 00:23:37,280 --> 00:23:40,280 Speaker 3: which is like a content analysis firm, they reported on 422 00:23:40,320 --> 00:23:44,200 Speaker 3: December thirty first that X users were generating roughly one 423 00:23:44,560 --> 00:23:48,440 Speaker 3: non consensual sexualized image per minute. 424 00:23:48,800 --> 00:23:51,520 Speaker 2: That is so many images, that's so. 425 00:23:51,560 --> 00:23:54,760 Speaker 3: much and not to like even think about the environmental 426 00:23:54,800 --> 00:23:57,359 Speaker 3: impact of this, like I had, like I need to 427 00:23:57,440 --> 00:24:01,240 Speaker 3: see this minor in a bikini is worth the environmental 428 00:24:01,280 --> 00:24:05,440 Speaker 3: impact of doing this one image per minute? That something 429 00:24:05,480 --> 00:24:07,040 Speaker 3: about that I just it's hard for me to wrap 430 00:24:07,080 --> 00:24:08,040 Speaker 3: my head around. 431 00:24:08,520 --> 00:24:12,000 Speaker 4: That people have the gall to do it. This is 432 00:24:12,040 --> 00:24:17,439 Speaker 4: a forum that used to be used for social and 433 00:24:17,480 --> 00:24:18,840 Speaker 4: political conversation. 434 00:24:19,520 --> 00:24:23,119 Speaker 3: Yeah, now this, and you know, people like Elon Musk 435 00:24:23,280 --> 00:24:27,040 Speaker 3: love to talk about X or Twitter as like, oh, 436 00:24:27,080 --> 00:24:29,760 Speaker 3: it's the digital town square where people can enter the 437 00:24:29,760 --> 00:24:33,240 Speaker 3: marketplace of ideas. Okay, Well, as a woman, I'm not 438 00:24:33,280 --> 00:24:35,119 Speaker 3: going to enter a marketplace where my top, my kit, 439 00:24:35,200 --> 00:24:36,359 Speaker 3: gets yanked off when I go in there.
440 00:24:36,400 --> 00:24:40,160 Speaker 2: So I don't believe that X is a marketplace of ideas. 441 00:24:40,200 --> 00:24:42,760 Speaker 3: But even if it were, if it's not safe for 442 00:24:42,800 --> 00:24:44,480 Speaker 3: women to show up, women are not going to be 443 00:24:44,520 --> 00:24:46,639 Speaker 3: able to have a chance to engage in this so 444 00:24:46,760 --> 00:24:50,240 Speaker 3: called marketplace of ideas, right, It's just ridiculous and so 445 00:24:50,400 --> 00:24:53,439 Speaker 3: according to Bloomberg, during a twenty four hour analysis of 446 00:24:53,480 --> 00:24:56,880 Speaker 3: images the Grok account posted to X, the chatbot generated 447 00:24:56,920 --> 00:25:00,560 Speaker 3: about six, seven hundred every hour that were identified as 448 00:25:00,600 --> 00:25:04,480 Speaker 3: sexually suggestive or nudifying. According to Genevieve Oh, a social 449 00:25:04,560 --> 00:25:08,000 Speaker 3: media and deepfake researcher, the other top five websites 450 00:25:08,040 --> 00:25:11,879 Speaker 3: for such content average seventy nine new AI undressing images 451 00:25:11,880 --> 00:25:14,200 Speaker 3: per hour in the twenty four hour period from January 452 00:25:14,240 --> 00:25:17,760 Speaker 3: fifth to January sixth, Oh found. So again, it's not 453 00:25:17,880 --> 00:25:21,480 Speaker 3: like X is the only place where this kind of 454 00:25:21,520 --> 00:25:25,479 Speaker 3: content shows up. It's on Facebook. It's a problem across 455 00:25:25,520 --> 00:25:29,560 Speaker 3: the internet. However, it is very clearly a much bigger 456 00:25:29,600 --> 00:25:32,239 Speaker 3: problem and happening on a much bigger scale on X 457 00:25:32,280 --> 00:25:36,159 Speaker 3: as compared to other social media platforms. And just to 458 00:25:36,160 --> 00:25:39,199 Speaker 3: make something else clear, we've been talking about celebrities and 459 00:25:39,240 --> 00:25:42,560 Speaker 3: things like that. To be clear, it is not just celebrities. 460 00:25:42,560 --> 00:25:46,760 Speaker 3: It's also just regular, non famous, non public figures who 461 00:25:46,840 --> 00:25:49,000 Speaker 3: post their images on X that this is happening to. 462 00:25:49,640 --> 00:25:50,680 Speaker 2: And I feel like. 463 00:25:50,640 --> 00:25:52,800 Speaker 3: This is the part of the podcast where I should 464 00:25:52,840 --> 00:25:56,560 Speaker 3: say women should be careful, like don't post your picture 465 00:25:56,800 --> 00:26:00,159 Speaker 3: on social media, yada, yada, yada, And I guess I 466 00:26:00,160 --> 00:26:03,720 Speaker 3: feel some responsibility to say that. But on the other hand, 467 00:26:03,880 --> 00:26:07,560 Speaker 3: women should be allowed to post normal pictures on social media. 468 00:26:07,600 --> 00:26:10,560 Speaker 3: I kind of hate this advice that says, like, oh, women, 469 00:26:10,880 --> 00:26:12,520 Speaker 3: it's not safe for you to show up here, So 470 00:26:12,560 --> 00:26:15,600 Speaker 3: I wouldn't show up in these places. Women are not doing 471 00:26:15,680 --> 00:26:18,320 Speaker 3: anything wrong, right, We're not the ones who are generating 472 00:26:18,359 --> 00:26:21,800 Speaker 3: this kind of creepy imagery. And I just hate that 473 00:26:22,240 --> 00:26:26,359 Speaker 3: guidance because it creates a climate where it's just normalized 474 00:26:26,400 --> 00:26:28,960 Speaker 3: that women can no longer show up on these platforms 475 00:26:29,000 --> 00:26:29,960 Speaker 3: in ways that are safe.
476 00:26:30,000 --> 00:26:32,200 Speaker 2: But of course I have to imagine that's the point. 477 00:26:32,760 --> 00:26:35,880 Speaker 1: Yeah, And it feels very like I used to think 478 00:26:35,920 --> 00:26:37,720 Speaker 1: this a lot when I was running, and I would 479 00:26:37,760 --> 00:26:41,280 Speaker 1: see advice given towards male runners and female runners, and 480 00:26:41,320 --> 00:26:44,320 Speaker 1: it's very it would be very much like blaming you 481 00:26:44,680 --> 00:26:48,520 Speaker 1: already for something that could go wrong, like don't put 482 00:26:48,560 --> 00:26:51,000 Speaker 1: your hair in a ponytail because a man could grab it, 483 00:26:51,119 --> 00:26:54,199 Speaker 1: don't wear headphones because you won't hear something like 484 00:26:54,480 --> 00:26:56,760 Speaker 1: it was making it feel like it was your fault 485 00:26:57,840 --> 00:27:02,480 Speaker 1: for whatever might happen to you. And that's it's so 486 00:27:02,640 --> 00:27:04,520 Speaker 1: unfortunate because I agree with you, Bridget. It's one of 487 00:27:04,560 --> 00:27:09,439 Speaker 1: those things where I'm like, yes, I sometimes feel like 488 00:27:09,480 --> 00:27:12,040 Speaker 1: I have to give this advice, but it feels so 489 00:27:13,080 --> 00:27:16,480 Speaker 1: horrible to give that advice because it's not your fault 490 00:27:16,480 --> 00:27:18,280 Speaker 1: and it shouldn't be this way. 491 00:27:18,400 --> 00:27:21,440 Speaker 4: Right, Yes, it's that cautionary tale that you have to tell, 492 00:27:21,480 --> 00:27:23,720 Speaker 4: which you're like, but it's not your fault. You didn't 493 00:27:23,720 --> 00:27:25,879 Speaker 4: do anything wrong. But because this is the state of 494 00:27:25,920 --> 00:27:28,560 Speaker 4: the world and everything is awful. Let me tell you this, 495 00:27:28,760 --> 00:27:31,159 Speaker 4: and if something does go wrong, you're going to remember 496 00:27:31,160 --> 00:27:33,560 Speaker 4: that in your head and say what could I have done? 497 00:27:33,760 --> 00:27:37,280 Speaker 4: Which is also another layer that again is not your fault, 498 00:27:37,400 --> 00:27:39,959 Speaker 4: is not our fault, but because that's how it is 499 00:27:40,160 --> 00:27:44,000 Speaker 4: laid out, it's almost a trap for you to blame yourself. 500 00:27:46,280 --> 00:27:46,520 Speaker 2: Yeah. 501 00:27:46,560 --> 00:27:49,040 Speaker 3: So I was scrolling around social media just trying to 502 00:27:49,040 --> 00:27:50,800 Speaker 3: get a sense of like what people are saying, and 503 00:27:51,320 --> 00:27:53,760 Speaker 3: there was an image that somebody had taken off of 504 00:27:53,840 --> 00:27:58,080 Speaker 3: a woman's LinkedIn, and then somebody asked Grok like, oh, 505 00:27:58,240 --> 00:28:00,720 Speaker 3: put it in a bikini, do this, do that? And 506 00:28:00,760 --> 00:28:04,320 Speaker 3: then someone was like, well, if she didn't want this 507 00:28:04,400 --> 00:28:06,439 Speaker 3: to happen, she shouldn't, like, like, if you put your 508 00:28:06,440 --> 00:28:08,320 Speaker 3: picture on the internet, you get what you get.
And 509 00:28:08,359 --> 00:28:11,040 Speaker 3: I thought, you can't even have your headshot on 510 00:28:11,080 --> 00:28:15,480 Speaker 3: LinkedIn, a professional platform, without it being like that, 511 00:28:15,560 --> 00:28:19,119 Speaker 3: like that, that's not consenting to have your image be 512 00:28:19,200 --> 00:28:22,080 Speaker 3: manipulated in ways that are sexual, And I was just really, 513 00:28:22,240 --> 00:28:24,880 Speaker 3: I think it really shows how much we have normalized 514 00:28:25,320 --> 00:28:30,360 Speaker 3: that women bring this on themselves via visibility, even when 515 00:28:30,359 --> 00:28:33,520 Speaker 3: the visibility is totally normal, right, Like a totally normal 516 00:28:33,800 --> 00:28:38,760 Speaker 3: picture of you know, a woman's face. If you are 517 00:28:38,800 --> 00:28:40,600 Speaker 3: not allowed to post that, then women can't safely show 518 00:28:40,640 --> 00:28:43,040 Speaker 3: up on the internet. It's just very frustrating. And Annie, 519 00:28:43,040 --> 00:28:47,000 Speaker 3: your point about being a woman who runs, really, 520 00:28:48,040 --> 00:28:50,000 Speaker 3: I think that's a really good analogy because it is 521 00:28:50,120 --> 00:28:52,600 Speaker 3: very frustrating, Like you want to give advice, but why 522 00:28:52,640 --> 00:28:54,200 Speaker 3: should you have to get this kind of advice? 523 00:28:55,280 --> 00:28:59,720 Speaker 1: Yeah, and you know, to your point about LinkedIn. Samantha 524 00:28:59,720 --> 00:29:02,520 Speaker 1: and I I think last year around this time we 525 00:29:02,520 --> 00:29:04,320 Speaker 1: were talking about what was going on on Nextdoor 526 00:29:04,360 --> 00:29:07,320 Speaker 1: because I was like, what in the world is this? 527 00:29:07,400 --> 00:29:14,440 Speaker 1: Because I was seeing people men being like, well, this 528 00:29:14,720 --> 00:29:17,600 Speaker 1: woman was asking for me to give her my number 529 00:29:17,640 --> 00:29:19,440 Speaker 1: and now she's mad about it. But look at what 530 00:29:19,480 --> 00:29:22,000 Speaker 1: she was wearing in this picture, And. 531 00:29:21,920 --> 00:29:24,560 Speaker 2: I was like, what. 532 00:29:26,600 --> 00:29:30,760 Speaker 1: This is Like, it's everywhere. It is so pervasive and 533 00:29:30,880 --> 00:29:35,360 Speaker 1: so insidious that even yeah, just going online, I'll be like, oh, 534 00:29:35,400 --> 00:29:39,360 Speaker 1: this must be a safe space. No, never, not that 535 00:29:39,400 --> 00:29:41,239 Speaker 1: Nextdoor I ever thought was a safe space. But 536 00:29:41,280 --> 00:29:44,480 Speaker 1: I was surprised that that was coming up on there. 537 00:29:45,200 --> 00:29:47,560 Speaker 4: We know Nextdoor is a hotbed of many things 538 00:29:47,600 --> 00:29:51,200 Speaker 4: and messes, including racial profiling, but a lot of drama. 539 00:30:02,080 --> 00:30:03,760 Speaker 2: But yeah, I also again. 540 00:30:03,920 --> 00:30:07,600 Speaker 4: going with my ringing alarms here because I've become real 541 00:30:07,760 --> 00:30:11,360 Speaker 4: conspiratorial today apparently. But this just feels like a ploy, 542 00:30:12,160 --> 00:30:16,120 Speaker 4: maybe a part of the Esther Project, maybe Project 543 00:30:17,000 --> 00:30:19,840 Speaker 4: Twenty Twenty Five.
I could be just blowing this out of 544 00:30:19,840 --> 00:30:22,760 Speaker 4: proportion in that they are trying to eliminate women from 545 00:30:22,800 --> 00:30:26,280 Speaker 4: these spaces because we know that being able to access 546 00:30:26,360 --> 00:30:30,080 Speaker 4: like LinkedIn and the Internet gives you accessibility to jobs, to 547 00:30:30,120 --> 00:30:32,680 Speaker 4: possibly being able to do content, to be able to 548 00:30:32,680 --> 00:30:34,600 Speaker 4: grow your business, to be able to grow your name, 549 00:30:34,680 --> 00:30:39,040 Speaker 4: whatever whatnot. And if you shame them and victimize them 550 00:30:39,080 --> 00:30:41,600 Speaker 4: to a point, or at least like bully them off 551 00:30:41,600 --> 00:30:45,840 Speaker 4: of these spaces, then we cannot connect. Marginalized communities 552 00:30:45,880 --> 00:30:48,080 Speaker 4: cannot connect as we know it. That's how kind of 553 00:30:48,080 --> 00:30:50,920 Speaker 4: Twitter blew up is being able to connect with each other. 554 00:30:50,960 --> 00:30:53,520 Speaker 4: And it was a space for black women and women 555 00:30:53,560 --> 00:30:57,040 Speaker 4: of color to come together and be able to lay 556 00:30:57,080 --> 00:31:00,440 Speaker 4: down the grounds for changes that folks up top just 557 00:31:00,520 --> 00:31:03,120 Speaker 4: do not want. So it does feel like maybe they 558 00:31:03,120 --> 00:31:06,080 Speaker 4: are doing this purposely. I could absolutely. Even though I 559 00:31:06,120 --> 00:31:09,600 Speaker 4: don't think he's smart, I think he's smart enough to 560 00:31:09,800 --> 00:31:13,040 Speaker 4: use his evil to being like if we show them 561 00:31:13,240 --> 00:31:15,880 Speaker 4: that you're going to be demeaned to nothing but sexual 562 00:31:15,880 --> 00:31:20,920 Speaker 4: objects by, you know, like nothing from you, not because you 563 00:31:20,960 --> 00:31:23,960 Speaker 4: posted something, but because we used your post against you, 564 00:31:24,240 --> 00:31:26,360 Speaker 4: then we can get rid of you. And therefore you 565 00:31:26,400 --> 00:31:28,960 Speaker 4: do not have the same leg up as those who 566 00:31:28,960 --> 00:31:30,880 Speaker 4: are able to access this area. 567 00:31:30,960 --> 00:31:34,160 Speaker 3: Oh, I mean this doesn't sound conspiratorial to me at all, 568 00:31:34,280 --> 00:31:37,520 Speaker 3: because one only needs to read Project twenty twenty five 569 00:31:37,720 --> 00:31:43,640 Speaker 3: to see that pushing women and other marginalized people out 570 00:31:43,720 --> 00:31:46,920 Speaker 3: of public and civic life is part of the point. Like, 571 00:31:46,960 --> 00:31:50,080 Speaker 3: that's what they're trying to do. And so if women 572 00:31:50,480 --> 00:31:54,000 Speaker 3: and other marginalized people are not safe showing up online, 573 00:31:54,040 --> 00:31:55,960 Speaker 3: and if they show up online, they either have to 574 00:31:55,960 --> 00:31:58,360 Speaker 3: say like, oh, yeah, you might somebody might take your 575 00:31:58,400 --> 00:32:02,440 Speaker 3: image and put it in a horrifying AI generated sexualized scenario, 576 00:32:02,720 --> 00:32:04,160 Speaker 3: and everybody will be like, well that's what you get 577 00:32:04,160 --> 00:32:06,840 Speaker 3: for coming onto the internet, even though it's not happening equitably 578 00:32:07,200 --> 00:32:11,640 Speaker 3: across all people.
Yeah, that does have the effect of 579 00:32:11,680 --> 00:32:14,880 Speaker 3: pushing folks out of these spaces. And in twenty twenty six, 580 00:32:14,920 --> 00:32:18,920 Speaker 3: you're exactly right. Part of being civically engaged, part of 581 00:32:18,920 --> 00:32:22,360 Speaker 3: being economically engaged, is being online. You know, that's not 582 00:32:22,520 --> 00:32:25,400 Speaker 3: me, that's the UN saying that, right. And so if 583 00:32:25,960 --> 00:32:31,200 Speaker 3: we create the conditions where everybody cannot equitably and equally 584 00:32:31,320 --> 00:32:35,000 Speaker 3: and safely show up online, that means that people who 585 00:32:35,000 --> 00:32:37,200 Speaker 3: can't show up are not going to be full participants 586 00:32:37,200 --> 00:32:40,200 Speaker 3: in their democracy, because that's more and more how 587 00:32:40,240 --> 00:32:42,760 Speaker 3: democracy is unfolding, is online in twenty twenty six. 588 00:32:42,800 --> 00:32:45,280 Speaker 2: And so you're exactly right. It doesn't sound conspiratorial to 589 00:32:45,320 --> 00:32:45,760 Speaker 2: me at all. 590 00:32:46,840 --> 00:32:49,080 Speaker 1: Yeah, and I know I bring this up a lot, 591 00:32:49,160 --> 00:32:53,160 Speaker 1: but that episode Bridget we did together about journalism in India, 592 00:32:53,960 --> 00:32:58,200 Speaker 1: I think about that all the time, where women journalists 593 00:32:58,480 --> 00:33:03,520 Speaker 1: were getting sent these deepfakes, these nudes of them 594 00:33:04,000 --> 00:33:07,120 Speaker 1: being threatened, like if you print the story, we'll send 595 00:33:07,120 --> 00:33:11,200 Speaker 1: it to your family, and that's like the news. That's journalism, 596 00:33:11,240 --> 00:33:14,480 Speaker 1: and they're trying to scare women away. And I've been 597 00:33:14,880 --> 00:33:16,760 Speaker 1: I don't want to do this episode, but I've been 598 00:33:16,800 --> 00:33:19,680 Speaker 1: thinking about doing this episode about how Donald Trump treats 599 00:33:20,520 --> 00:33:24,000 Speaker 1: women who are journalists and how he speaks to women 600 00:33:24,040 --> 00:33:27,480 Speaker 1: who are journalists. And it feels very similar to what 601 00:33:27,520 --> 00:33:31,920 Speaker 1: you're talking about, Samantha. It feels very like, let's just 602 00:33:32,080 --> 00:33:35,560 Speaker 1: demean these women, Let's make it so miserable for them 603 00:33:36,000 --> 00:33:38,720 Speaker 1: that they go away and we don't have to answer 604 00:33:38,800 --> 00:33:41,080 Speaker 1: these questions and we don't have to change anything. 605 00:33:41,920 --> 00:33:44,880 Speaker 3: Yes, I think that's exactly it's exactly the same thing. 606 00:33:44,920 --> 00:33:49,560 Speaker 3: And like I it's been you know, as women in media, 607 00:33:49,800 --> 00:33:52,440 Speaker 3: you kind of have to have a thick skin, but 608 00:33:52,880 --> 00:33:56,360 Speaker 3: nobody should be calling you piggy, right, Like, there's a line. 609 00:33:57,240 --> 00:34:00,320 Speaker 3: And I think the fact that we've normalized this. 610 00:34:01,120 --> 00:34:04,400 Speaker 3: I watched Karoline Leavitt, when Trump called that female 611 00:34:04,400 --> 00:34:07,200 Speaker 3: reporter piggy, Leavitt got a question about it, and 612 00:34:07,240 --> 00:34:09,759 Speaker 3: she was like, well, you know, we all appreciate that 613 00:34:09,800 --> 00:34:12,200 Speaker 3: Trump speaks his mind and says it like it is.
614 00:34:12,239 --> 00:34:16,600 Speaker 3: And it's like, the fact that that's normalized now, what 615 00:34:16,719 --> 00:34:20,279 Speaker 3: message does that send to little girls like me who 616 00:34:20,320 --> 00:34:23,440 Speaker 3: wanted to be journalists when they were little kids, right? Like, oh, 617 00:34:23,520 --> 00:34:29,640 Speaker 3: you'll have to withstand this level of attacks about your identity, 618 00:34:29,680 --> 00:34:33,640 Speaker 3: your appearance, who you are. It just really, again, it 619 00:34:33,719 --> 00:34:36,000 Speaker 3: just really makes me sick. But I think it is 620 00:34:36,560 --> 00:34:40,359 Speaker 3: intentional, because the signal is very clear of like, don't 621 00:34:40,400 --> 00:34:42,920 Speaker 3: ask questions, don't make waves, because this is what's going 622 00:34:42,960 --> 00:34:46,600 Speaker 3: to happen. And let me be real, no woman journalist 623 00:34:46,680 --> 00:34:49,839 Speaker 3: wants to become the story because the President called her 624 00:34:49,880 --> 00:34:52,839 Speaker 3: a pig. No, no. And so you have to decide, like, 625 00:34:53,560 --> 00:34:56,000 Speaker 3: do you want this to be part of your 626 00:34:56,040 --> 00:34:59,520 Speaker 3: career if you challenge Trump? It's just a really impossible 627 00:34:59,640 --> 00:35:02,279 Speaker 3: bargain that women should not be having to make 628 00:35:02,360 --> 00:35:03,360 Speaker 3: in twenty twenty six. 629 00:35:04,280 --> 00:35:07,240 Speaker 1: Yeah. Agree. I mean, it's one of those things where 630 00:35:08,320 --> 00:35:10,440 Speaker 1: I feel like we're in such a backslide. But 631 00:35:10,880 --> 00:35:13,360 Speaker 1: at one point I would have said, in any other job, 632 00:35:15,200 --> 00:35:19,560 Speaker 1: a boss calling you that, you would get fired. But 633 00:35:19,680 --> 00:35:23,279 Speaker 1: now it's like, no, no, just that was a joke. 634 00:35:23,360 --> 00:35:24,120 Speaker 1: Let's just move on. 635 00:35:24,320 --> 00:35:27,840 Speaker 2: That's okay, it was barely a blip. I'll tell you 636 00:35:27,880 --> 00:35:31,239 Speaker 2: something else. There's just no reality where I would let 637 00:35:31,280 --> 00:35:33,160 Speaker 2: a man that looks like Donald Trump call me, 638 00:35:33,239 --> 00:35:36,200 Speaker 2: say anything about my appearance. Shout out to the 639 00:35:36,200 --> 00:35:38,040 Speaker 3: women that just took it on the chin, because I 640 00:35:38,040 --> 00:35:39,640 Speaker 3: would never let a man that looks like Donald Trump 641 00:35:39,680 --> 00:35:40,759 Speaker 3: say anything to me about the way I 642 00:35:40,800 --> 00:35:44,880 Speaker 4: look. The pettiness that would have come out of my mouth. 643 00:35:46,000 --> 00:35:55,960 Speaker 5: Ah, oh, oh, oh, okay. Well, going back to Grok, 644 00:35:57,080 --> 00:36:01,640 Speaker 5: you have some numbers here that are pretty disturbing.
645 00:36:02,719 --> 00:36:05,560 Speaker 3: Yes. So The Guardian spoke to a PhD researcher at Dublin's 646 00:36:05,560 --> 00:36:10,920 Speaker 3: Trinity College AI Accountability Lab, Nana Nwachukwu, whose research investigated 647 00:36:10,920 --> 00:36:13,440 Speaker 3: the different types of requests that users were submitting to Grok, 648 00:36:13,640 --> 00:36:17,320 Speaker 3: and she found that nearly three quarters of all requests 649 00:36:17,560 --> 00:36:21,480 Speaker 3: were direct non consensual requests for Grok to remove or 650 00:36:21,560 --> 00:36:22,480 Speaker 3: replace clothing. 651 00:36:22,840 --> 00:36:24,920 Speaker 2: Three quarters. That is a lot. 652 00:36:25,280 --> 00:36:27,840 Speaker 3: She showed The Guardian some of the different Grok-created 653 00:36:27,840 --> 00:36:30,120 Speaker 3: photos that she was collecting as part of her research, 654 00:36:30,280 --> 00:36:32,800 Speaker 3: and The Guardian confirmed that dozens of them were pictures 655 00:36:32,800 --> 00:36:36,399 Speaker 3: of women, including celebrities, models, stock photos, as well as 656 00:36:36,440 --> 00:36:41,120 Speaker 3: just regular, ordinary, non-public-figure women posing in snapshots. 657 00:36:41,360 --> 00:36:44,920 Speaker 3: And so her research really paints a portrait that this 658 00:36:45,000 --> 00:36:48,520 Speaker 3: is like an ecosystem where users are not just making 659 00:36:48,560 --> 00:36:51,799 Speaker 3: these images, they are also interacting with each other and 660 00:36:51,840 --> 00:36:55,359 Speaker 3: like iterating on the different non consensual images that they've 661 00:36:55,360 --> 00:36:56,359 Speaker 3: made Grok 662 00:36:56,440 --> 00:36:57,080 Speaker 2: make, right. 663 00:36:57,120 --> 00:37:00,839 Speaker 3: And so something about that is new to me, that 664 00:37:01,880 --> 00:37:05,240 Speaker 3: it's not just, Grok, make this horrifying image. 665 00:37:05,280 --> 00:37:08,759 Speaker 2: It's, Grok, make this horrifying image. Oh, cool image, bro, 666 00:37:08,920 --> 00:37:09,880 Speaker 2: I'm gonna make it like this. 667 00:37:10,000 --> 00:37:14,080 Speaker 3: Like, it's like they're really building community around this, and 668 00:37:14,120 --> 00:37:15,880 Speaker 3: they're bold enough to do it 669 00:37:15,920 --> 00:37:16,440 Speaker 2: in public. 670 00:37:16,880 --> 00:37:18,399 Speaker 3: I mean, you know, this kind of thing has been 671 00:37:18,400 --> 00:37:21,719 Speaker 3: going on on alternative channels like Telegram, where people 672 00:37:21,760 --> 00:37:25,320 Speaker 3: are, you know, kind of engaging in community and conversation 673 00:37:25,360 --> 00:37:27,120 Speaker 3: about it. But the fact that this is happening in 674 00:37:27,160 --> 00:37:31,360 Speaker 3: public is so different to me. And also, of the 675 00:37:31,719 --> 00:37:34,680 Speaker 3: several posts that The Guardian saw, lots of them that 676 00:37:34,800 --> 00:37:38,600 Speaker 3: got tens of thousands of impressions are coming from premium users, 677 00:37:38,640 --> 00:37:42,400 Speaker 3: so blue check accounts, including accounts with tens of thousands 678 00:37:42,440 --> 00:37:45,080 Speaker 3: of followers. And so, just a reminder, premium accounts that 679 00:37:45,080 --> 00:37:47,719 Speaker 3: have more than five hundred followers and five million impressions 680 00:37:47,760 --> 00:37:52,120 Speaker 3: over three months are eligible for X's revenue sharing.
681 00:37:52,160 --> 00:37:56,439 Speaker 3: So yeah, just, it's not a marketplace of ideas. It's 682 00:37:56,440 --> 00:37:59,799 Speaker 3: a marketplace of non consensual images where there is a 683 00:38:00,040 --> 00:38:03,640 Speaker 3: financial incentive for people to post this kind of content, because 684 00:38:03,680 --> 00:38:05,960 Speaker 2: if they go viral, they could get paid for it 685 00:38:06,040 --> 00:38:06,680 Speaker 2: from X. 686 00:38:07,760 --> 00:38:11,040 Speaker 1: We're making marveled faces, listeners. You can't see it, 687 00:38:11,080 --> 00:38:14,480 Speaker 1: but oh my goodness. And you know, I know we're 688 00:38:14,480 --> 00:38:19,720 Speaker 1: gonna get into this later. It feels illegal. It feels 689 00:38:19,760 --> 00:38:20,840 Speaker 1: wrong and illegal to me. 690 00:38:21,480 --> 00:38:23,879 Speaker 2: Yes, to me too. 691 00:38:24,040 --> 00:38:27,200 Speaker 3: I mean, the animating question that I sort of started 692 00:38:27,200 --> 00:38:29,560 Speaker 3: this conversation out with, when I sat down to plan 693 00:38:29,600 --> 00:38:31,800 Speaker 3: out the episode, is how is this allowed? 694 00:38:32,000 --> 00:38:34,400 Speaker 2: How is this legal? Is anybody gonna do anything? Like, 695 00:38:34,440 --> 00:38:35,279 Speaker 2: what is going on? 696 00:38:35,840 --> 00:38:40,040 Speaker 3: And again, you know, we're talking about AI generated images, 697 00:38:40,080 --> 00:38:44,320 Speaker 3: but really this is nothing new, because X has also 698 00:38:44,760 --> 00:38:49,600 Speaker 3: just been a platform where AI and non-AI generated 699 00:38:49,960 --> 00:38:55,759 Speaker 3: illegal child sexual abuse material doesn't just exist but can flourish. 700 00:38:55,880 --> 00:38:57,239 Speaker 3: There's a great piece that I read by one of 701 00:38:57,239 --> 00:39:00,319 Speaker 3: my favorite journalists, Samantha Cole from 404 Media, called 702 00:39:00,440 --> 00:39:04,120 Speaker 3: Grok's AI Sexual Abuse Didn't Come Out of Nowhere. Cole writes, 703 00:39:04,680 --> 00:39:07,040 Speaker 3: this is the culmination of years and years of rampant 704 00:39:07,040 --> 00:39:10,160 Speaker 3: abuse on the platform. Reporting from the National Center for 705 00:39:10,200 --> 00:39:13,680 Speaker 3: Missing and Exploited Children, the official organization social media platforms 706 00:39:13,680 --> 00:39:16,880 Speaker 3: report to when they find instances of child sexual abuse material, 707 00:39:17,120 --> 00:39:19,920 Speaker 3: which then reports to the relevant authorities, shows that Twitter, 708 00:39:19,960 --> 00:39:23,040 Speaker 3: and eventually X, has been one of the leading hosts 709 00:39:23,040 --> 00:39:25,520 Speaker 3: of this kind of material every year for the last 710 00:39:25,560 --> 00:39:29,640 Speaker 3: seven years. In twenty nineteen, the platform reported forty-five thousand, 711 00:39:30,080 --> 00:39:33,760 Speaker 3: seven hundred and twenty-six instances of abuse. In twenty twenty, 712 00:39:33,800 --> 00:39:36,879 Speaker 3: it was sixty-five thousand, sixty-two. In twenty twenty four, 713 00:39:36,960 --> 00:39:39,760 Speaker 3: it was six hundred and eighty-six thousand, one hundred 714 00:39:39,800 --> 00:39:45,239 Speaker 3: and seventy-six.
So yeah, I mean, again, this is 715 00:39:45,239 --> 00:39:47,399 Speaker 3: a problem all across the Internet, but it does sort 716 00:39:47,440 --> 00:39:49,000 Speaker 3: of paint a picture of the fact that this is 717 00:39:49,080 --> 00:39:52,120 Speaker 3: much worse on Twitter. And Cole points out that these 718 00:39:52,200 --> 00:39:56,560 Speaker 3: numbers should be considered with the caveat that platforms voluntarily 719 00:39:56,680 --> 00:39:59,640 Speaker 3: report this kind of content, and more reports can also 720 00:39:59,680 --> 00:40:02,400 Speaker 3: mean stronger moderation systems that catch this kind of content 721 00:40:02,440 --> 00:40:05,560 Speaker 3: when it appears, but the scale of the problem is 722 00:40:05,600 --> 00:40:09,319 Speaker 3: still apparent. So Jack Dorsey, who was the CEO of 723 00:40:09,360 --> 00:40:12,520 Speaker 3: the earlier iteration of Twitter, he was not very good. 724 00:40:12,680 --> 00:40:14,319 Speaker 3: I don't, I'm not gonna, like, sit here and say 725 00:40:14,360 --> 00:40:16,799 Speaker 3: that he was great at moderation. As Cole points out, 726 00:40:17,520 --> 00:40:20,400 Speaker 3: Jack Dorsey's Twitter was a moderation clown show much of 727 00:40:20,440 --> 00:40:24,560 Speaker 3: the time. But moderation on Elon Musk's X, especially against 728 00:40:24,640 --> 00:40:29,640 Speaker 3: abusive imagery, is a total failure. So yeah, not going 729 00:40:29,680 --> 00:40:33,120 Speaker 3: well, or going very well, depending on, you 730 00:40:33,080 --> 00:40:34,279 Speaker 2: know, what perspective you take. 731 00:40:34,360 --> 00:40:37,560 Speaker 3: You know, if you're a child pornographer, you might be like, oh, 732 00:40:37,560 --> 00:40:38,399 Speaker 3: it's actually going great. 733 00:40:40,600 --> 00:40:47,560 Speaker 1: Oh yes, which does bring us back to the question of, yeah, 734 00:40:47,600 --> 00:40:51,879 Speaker 1: why is this allowed? Why is this legal? 735 00:40:53,360 --> 00:40:53,800 Speaker 2: Question. 736 00:40:54,280 --> 00:40:58,759 Speaker 3: So this kind of content definitely violates X's own policies, 737 00:40:58,800 --> 00:41:02,600 Speaker 3: which prohibit sharing illegal content like child sexual abuse material, 738 00:41:02,960 --> 00:41:05,279 Speaker 3: but as a piece for Wired points out, it could 739 00:41:05,320 --> 00:41:09,240 Speaker 3: also violate Google's Play, like, app store and the Apple 740 00:41:09,280 --> 00:41:13,600 Speaker 3: App Store's guidelines. Wired writes, Apple and Google both explicitly 741 00:41:13,640 --> 00:41:16,920 Speaker 3: ban apps containing CSAM, which is illegal to host and 742 00:41:16,920 --> 00:41:20,160 Speaker 3: distribute in many countries. The tech giants also forbid apps 743 00:41:20,160 --> 00:41:24,080 Speaker 3: that contain pornographic material or facilitate harassment. The Apple App 744 00:41:24,120 --> 00:41:27,400 Speaker 3: Store says it doesn't allow overtly sexual or pornographic material, 745 00:41:27,800 --> 00:41:32,040 Speaker 3: as well as defamatory, discriminatory, or mean-spirited content, especially 746 00:41:32,080 --> 00:41:34,840 Speaker 3: if the app is likely to humiliate, intimidate, or harm 747 00:41:34,880 --> 00:41:38,400 Speaker 3: a targeted individual or group.
The Google Play Store bans 748 00:41:38,440 --> 00:41:41,880 Speaker 3: apps that contain or promote content associated with sexually predatory 749 00:41:41,880 --> 00:41:45,719 Speaker 3: behavior or distribute non consensual sexual content, as well as 750 00:41:45,760 --> 00:41:50,000 Speaker 3: programs that contain or facilitate threats, harassment, or bullying. And 751 00:41:50,080 --> 00:41:53,160 Speaker 3: there is some precedent for this, because both Apple and 752 00:41:53,320 --> 00:41:57,520 Speaker 3: Google have removed other kinds of nudify apps from their 753 00:41:57,520 --> 00:42:02,120 Speaker 3: platforms because they're not allowed. However, the standalone Grok app 754 00:42:02,280 --> 00:42:05,600 Speaker 3: is still available on both Apple and Google. So I think, 755 00:42:06,600 --> 00:42:10,960 Speaker 3: you know, private companies, you just cannot always count on 756 00:42:11,080 --> 00:42:14,560 Speaker 3: them to take action against other private companies. 757 00:42:14,680 --> 00:42:16,000 Speaker 2: I think that just really shows 758 00:42:15,760 --> 00:42:20,880 Speaker 3: like the weakness in having the, you know, barrier of 759 00:42:20,920 --> 00:42:25,080 Speaker 3: accountability be private companies making decisions about other private companies, 760 00:42:25,560 --> 00:42:29,719 Speaker 3: which brings me to the law. So I am no 761 00:42:29,840 --> 00:42:33,720 Speaker 3: lawyer, just to give that warning upfront, but the framing 762 00:42:33,800 --> 00:42:36,640 Speaker 3: that I am taking is that, to me, this seems 763 00:42:36,640 --> 00:42:41,480 Speaker 3: like criminal behavior. I understand this as a criminal enterprise 764 00:42:41,560 --> 00:42:45,319 Speaker 3: that Elon Musk is personally financially profiting from. And so, 765 00:42:45,840 --> 00:42:48,319 Speaker 3: I am not a lawyer or a legal expert, so 766 00:42:48,400 --> 00:42:50,440 Speaker 3: like, I would love to have somebody explain to me 767 00:42:51,120 --> 00:42:54,279 Speaker 3: how it is not criminal, but I understand this to 768 00:42:54,280 --> 00:42:57,960 Speaker 3: be criminal activity. As I said, forty-five states have 769 00:42:58,080 --> 00:43:02,360 Speaker 3: laws that criminalize AI-generated or computer-edited child sexual 770 00:43:02,360 --> 00:43:07,520 Speaker 3: abuse material. If you or I used AI to create 771 00:43:07,560 --> 00:43:11,399 Speaker 3: such material, we would probably be having legal trouble. If 772 00:43:11,400 --> 00:43:14,680 Speaker 3: we were financially benefiting from the sale and trade of that 773 00:43:14,719 --> 00:43:16,280 Speaker 2: material, we would have legal trouble. 774 00:43:16,600 --> 00:43:18,400 Speaker 3: I do not understand why Elon Musk does not have 775 00:43:18,480 --> 00:43:20,360 Speaker 3: legal trouble in the United States over this, but he 776 00:43:20,400 --> 00:43:20,640 Speaker 3: does not. 777 00:43:22,040 --> 00:43:25,279 Speaker 1: Yeah, and I know every time we do these episodes. 778 00:43:26,200 --> 00:43:29,799 Speaker 1: Forever ago, we did one about Xbox, which, wow. Oh my 779 00:43:29,719 --> 00:43:31,000 Speaker 2: God, I forgot about that.
780 00:43:31,800 --> 00:43:36,400 Speaker 1: But you know, you have those, like, the rules, 781 00:43:36,440 --> 00:43:39,239 Speaker 1: the regulations of a platform, and they'll say something like, 782 00:43:39,280 --> 00:43:42,560 Speaker 1: you can't do this or you'll get in trouble, but 783 00:43:42,719 --> 00:43:46,640 Speaker 1: they're doing it and they're not. And so I remember 784 00:43:46,680 --> 00:43:48,600 Speaker 1: it was a big push on Twitter, before it became 785 00:43:48,840 --> 00:43:53,440 Speaker 1: X, to report Donald Trump or anybody that was inciting violence, 786 00:43:53,480 --> 00:43:57,560 Speaker 1: which was against their guidelines, and be like, this is it, 787 00:43:58,080 --> 00:43:59,719 Speaker 1: and you know you're never going to hear anything about 788 00:43:59,719 --> 00:44:02,120 Speaker 1: that, ever see anything happen about that. Recently, I tried 789 00:44:02,120 --> 00:44:05,960 Speaker 1: to report something on YouTube and it was almost impossible 790 00:44:06,000 --> 00:44:09,239 Speaker 1: to do. It's like, but this is against the guidelines 791 00:44:09,320 --> 00:44:14,440 Speaker 1: you said that you had, and now you're ignoring it, 792 00:44:14,520 --> 00:44:19,360 Speaker 1: giving me no avenue to really complain about 793 00:44:19,400 --> 00:44:23,200 Speaker 1: it, or anyone to complain to, like, hey, this is 794 00:44:23,280 --> 00:44:28,120 Speaker 1: violence, or this is illegal or sexual content that should 795 00:44:28,200 --> 00:44:30,239 Speaker 1: not be there, exactly. 796 00:44:30,360 --> 00:44:33,040 Speaker 3: And you know, speaking of Trump, people might remember that 797 00:44:33,400 --> 00:44:35,799 Speaker 3: I think around last year he signed into law 798 00:44:35,840 --> 00:44:38,279 Speaker 3: the Take It Down Act, which makes it illegal to 799 00:44:38,360 --> 00:44:42,040 Speaker 3: knowingly host or share non consensual sexual images, and so 800 00:44:42,080 --> 00:44:46,720 Speaker 3: people might be thinking, well, like, shouldn't this 801 00:44:46,800 --> 00:44:47,719 Speaker 2: law prevent this? 802 00:44:48,480 --> 00:44:50,160 Speaker 3: One main thing to know about that law is that 803 00:44:50,239 --> 00:44:53,600 Speaker 3: companies do not have to respond or do anything until 804 00:44:53,680 --> 00:44:56,760 Speaker 3: a victim reports it. And so if nobody is reporting 805 00:44:56,800 --> 00:44:59,680 Speaker 3: this to the police, then X doesn't have to do anything. 806 00:44:59,840 --> 00:45:02,680 Speaker 3: So I just wanted to note that, because 807 00:45:02,719 --> 00:45:05,040 Speaker 3: you would reasonably think, like, we have a law against 808 00:45:05,080 --> 00:45:06,440 Speaker 3: this now. Not really. 809 00:45:08,800 --> 00:45:16,319 Speaker 1: Well, well, what has the response from X been about all 810 00:45:16,360 --> 00:45:16,759 Speaker 1: of this? 811 00:45:17,640 --> 00:45:22,680 Speaker 3: Well, as I mentioned, people were really circulating a, I 812 00:45:22,719 --> 00:45:29,200 Speaker 3: believe, user-generated response from Grok, quote, I deeply regret 813 00:45:29,200 --> 00:45:31,839 Speaker 3: an incident on December twenty eight, twenty twenty five, where 814 00:45:31,840 --> 00:45:34,200 Speaker 3: I generated and shared an AI image of two young 815 00:45:34,280 --> 00:45:38,920 Speaker 3: girls, estimated ages twelve to sixteen, in sexualized attire, based 816 00:45:38,960 --> 00:45:41,960 Speaker 3: on a user's prompt.
It was a failure in safeguards 817 00:45:41,960 --> 00:45:45,680 Speaker 3: and I'm sorry for any harm caused. And I mentioned this, 818 00:45:45,920 --> 00:45:48,560 Speaker 3: I wanted to say this, and I mentioned earlier, how 819 00:45:48,680 --> 00:45:51,680 Speaker 3: all of these headlines are about how Grok is apologizing and 820 00:45:51,760 --> 00:45:55,719 Speaker 3: taking responsibility. Again, Grok is not sentient, and setting it 821 00:45:55,800 --> 00:45:59,279 Speaker 3: up like a non-sentient piece of technology could take 822 00:45:59,320 --> 00:46:02,839 Speaker 3: the blame for something that humans did, and humans, 823 00:46:02,920 --> 00:46:06,520 Speaker 3: you know, facilitated, lets the humans off the hook, 824 00:46:06,560 --> 00:46:10,600 Speaker 3: because humans like Elon Musk are really not doing anything. 825 00:46:11,080 --> 00:46:15,000 Speaker 3: As an indicator, Musk, while this was all going on, shared 826 00:46:15,040 --> 00:46:20,120 Speaker 3: at least thirty different posts celebrating Grok and talking about 827 00:46:20,160 --> 00:46:23,120 Speaker 3: how great Grok is while this was happening, between January 828 00:46:23,200 --> 00:46:26,959 Speaker 3: seventh and eighth. He has not expressed remorse for what's 829 00:46:26,960 --> 00:46:30,480 Speaker 3: happening on X, and in fact has been basically joking 830 00:46:30,560 --> 00:46:32,600 Speaker 3: about it. I will say that at one point he 831 00:46:32,680 --> 00:46:34,239 Speaker 3: might have been like, oh, I should probably say 832 00:46:34,320 --> 00:46:39,560 Speaker 3: something that's, you know, not a joke. He did 833 00:46:39,680 --> 00:46:43,000 Speaker 3: say that anybody who used Grok to create anything illegal 834 00:46:43,080 --> 00:46:46,479 Speaker 3: will face consequences. They haven't, but he's doing that while 835 00:46:46,520 --> 00:46:50,000 Speaker 3: also laughing about the fact that Grok is being used 836 00:46:50,000 --> 00:46:53,080 Speaker 3: in this way. As Kat Tenbarge put it, the 837 00:46:53,200 --> 00:46:55,600 Speaker 3: reality is that X has not taken this as seriously 838 00:46:55,640 --> 00:46:59,360 Speaker 3: as one of Grok's user-prompted posts might seem to suggest. Instead, 839 00:46:59,560 --> 00:47:03,400 Speaker 3: Musk has encouraged, laughed at, and praised Grok for its 840 00:47:03,440 --> 00:47:06,360 Speaker 3: ability to edit images of fully clothed people into bikinis. 841 00:47:06,760 --> 00:47:09,799 Speaker 3: Grok is awesome, he tweeted, while the AI was being 842 00:47:09,920 --> 00:47:12,400 Speaker 3: used to undress women and children, make it look like 843 00:47:12,440 --> 00:47:16,240 Speaker 3: they're crying, generate fake bruises and burn marks on their bodies, 844 00:47:16,600 --> 00:47:19,960 Speaker 3: and write things like property of Little Saint James Island, 845 00:47:20,000 --> 00:47:22,680 Speaker 3: which is a reference to Jeffrey Epstein's private island and 846 00:47:22,920 --> 00:47:28,520 Speaker 3: sex trafficking. So yeah, he is not taking it very seriously. 847 00:47:29,040 --> 00:47:30,480 Speaker 3: And in one of her pieces, Kat said that she 848 00:47:30,880 --> 00:47:33,719 Speaker 3: reached out to X for, like, an official comment on 849 00:47:33,800 --> 00:47:36,880 Speaker 3: the record, and she got back an automated email that 850 00:47:37,000 --> 00:47:40,160 Speaker 3: just says legacy media lies, which, by 851 00:47:40,080 --> 00:47:41,439 Speaker 2: the way, Kat's not legacy media.
852 00:47:41,520 --> 00:47:43,439 Speaker 3: She runs an independent news outlet. So that's not even 853 00:47:43,520 --> 00:47:45,840 Speaker 3: like a, it's a complete non sequitur. It's like 854 00:47:45,920 --> 00:47:48,759 Speaker 3: not even a relevant thing to say. As she puts it, 855 00:47:49,160 --> 00:47:52,000 Speaker 3: there's no reason to make X and Musk seem more 856 00:47:52,120 --> 00:47:54,919 Speaker 3: concerned about this than they really are. They've known about 857 00:47:54,960 --> 00:47:57,200 Speaker 3: this happening the entire time, and they've made it easier 858 00:47:57,280 --> 00:47:59,920 Speaker 3: to inflict on victims. They are not investing in solutions, 859 00:48:00,280 --> 00:48:02,800 Speaker 3: they are investing in making the problem worse. 860 00:48:03,640 --> 00:48:04,279 Speaker 4: Yep, a lot of 861 00:48:04,239 --> 00:48:17,640 Speaker 1: heavy sighs this episode. I keep going back to the 862 00:48:17,680 --> 00:48:21,960 Speaker 1: point you made at the beginning, Bridget, that it's hateful. 863 00:48:22,840 --> 00:48:30,600 Speaker 1: It's very, very hateful. It's not just like, oh, let's 864 00:48:30,640 --> 00:48:32,759 Speaker 1: put women in bikinis or children in bikinis, which is 865 00:48:32,800 --> 00:48:38,880 Speaker 1: horrible enough, but it's like actively these attacks that, I 866 00:48:39,000 --> 00:48:41,840 Speaker 1: mean, are damaging. If you're just going online and this 867 00:48:42,000 --> 00:48:46,359 Speaker 1: is what you face, that can impact your entire day, 868 00:48:47,120 --> 00:48:49,360 Speaker 1: that you weren't expecting to have to deal with this. 869 00:48:49,640 --> 00:48:54,000 Speaker 1: And I do think for sure that they are investing 870 00:48:54,080 --> 00:48:58,839 Speaker 1: in making the problem worse, which I guess does bring 871 00:48:58,960 --> 00:49:01,759 Speaker 1: us to the question of, well, where do we go 872 00:49:01,880 --> 00:49:02,239 Speaker 1: from here? 873 00:49:03,120 --> 00:49:06,200 Speaker 3: Yeah, so, I'm sorry to say, I don't think the 874 00:49:06,320 --> 00:49:08,719 Speaker 3: United States is going to do anything at all about this. 875 00:49:08,960 --> 00:49:12,640 Speaker 3: I think, the way that I've been hearing elected officials 876 00:49:13,040 --> 00:49:15,239 Speaker 3: talk about it, I don't think anything's going to happen 877 00:49:15,239 --> 00:49:17,759 Speaker 3: to Musk in the United States. Kat Tenbarge spoke 878 00:49:17,800 --> 00:49:20,480 Speaker 3: to Dr. Mary Anne Franks, who drafted the template for 879 00:49:20,760 --> 00:49:24,560 Speaker 3: several laws against non consensual distribution of intimate imagery. She 880 00:49:24,680 --> 00:49:27,040 Speaker 3: told Kat, the FTC has made it clear that they're 881 00:49:27,040 --> 00:49:29,439 Speaker 3: fighting for Trump. It's actually never going to be used 882 00:49:29,520 --> 00:49:31,680 Speaker 3: against the very players who are the worst in the system. 883 00:49:32,480 --> 00:49:33,880 Speaker 3: X is going to continue to be one of the 884 00:49:33,960 --> 00:49:36,800 Speaker 3: worst offenders, and probably one of the pioneers of horrible 885 00:49:36,880 --> 00:49:41,279 Speaker 3: ways to hurt women. And unfortunately, I do think that's 886 00:49:41,360 --> 00:49:45,440 Speaker 3: the case in the United States. However, Europe is not happy.
887 00:49:46,040 --> 00:49:50,000 Speaker 3: Earlier this week, a spokesperson for the European Commission criticized 888 00:49:50,120 --> 00:49:53,720 Speaker 3: the sexually explicit, non consensual images generated by Grok, calling 889 00:49:53,800 --> 00:49:58,239 Speaker 3: them illegal, which they are, and appalling, and saying they 890 00:49:58,280 --> 00:50:01,040 Speaker 3: have no place in Europe. Then, a few days later, 891 00:50:01,440 --> 00:50:04,520 Speaker 3: the EU ordered X to retain all internal documents and 892 00:50:04,640 --> 00:50:07,800 Speaker 3: data tied to Grok through twenty twenty six, which extended 893 00:50:07,840 --> 00:50:11,080 Speaker 3: their earlier directive to preserve evidence relevant to the Digital 894 00:50:11,120 --> 00:50:15,720 Speaker 3: Services Act, even though there was no new formal probe announced. Similarly, 895 00:50:15,880 --> 00:50:19,840 Speaker 3: regulators in the UK, India, and Malaysia have also signaled 896 00:50:19,880 --> 00:50:24,160 Speaker 3: investigations into X. Now, in response to this, X announced 897 00:50:24,160 --> 00:50:28,359 Speaker 3: that they were going to restrict the ability to create 898 00:50:28,440 --> 00:50:32,160 Speaker 3: images using Grok to users who pay for premium, who 899 00:50:32,440 --> 00:50:34,719 Speaker 3: pay for a blue check. This is not at all 900 00:50:34,760 --> 00:50:37,520 Speaker 3: a fix to the problem. All it means is that 901 00:50:37,600 --> 00:50:41,239 Speaker 3: the ability to make non consensual sexualized content will become 902 00:50:41,280 --> 00:50:44,040 Speaker 3: a premium service. So not only is it not addressing 903 00:50:44,080 --> 00:50:45,880 Speaker 3: the problem, it's just becoming a way for X to 904 00:50:45,960 --> 00:50:48,120 Speaker 3: make money off of it. Again, I would love for 905 00:50:48,160 --> 00:50:50,120 Speaker 3: somebody to explain to me how this is not a 906 00:50:51,120 --> 00:50:55,000 Speaker 3: financially beneficial criminal enterprise for Elon Musk. 907 00:50:55,520 --> 00:50:58,839 Speaker 4: Well, that also goes to show that if we as 908 00:50:58,880 --> 00:51:02,960 Speaker 4: a country feel like another country is harming us. So, 909 00:51:03,520 --> 00:51:06,200 Speaker 4: you know, child porn is pretty bad and seen as 910 00:51:06,400 --> 00:51:09,480 Speaker 4: like a whole international thing, right, trying to prevent trafficking 911 00:51:09,600 --> 00:51:14,480 Speaker 4: and such. Could these other countries then invade and try 912 00:51:14,520 --> 00:51:17,200 Speaker 4: to arrest Musk, like the US? 913 00:51:17,880 --> 00:51:21,720 Speaker 3: I mean, great question, right, since the United States 914 00:51:21,760 --> 00:51:27,640 Speaker 3: can just rush in and kidnap a corrupt elected official or 915 00:51:27,880 --> 00:51:30,160 Speaker 2: leader. I don't know, I would be curious to know 916 00:51:30,239 --> 00:51:33,680 Speaker 2: where that begins and ends. I mean, I'm just saying, if you'll 917 00:51:33,719 --> 00:51:35,000 Speaker 2: take requests. 918 00:51:37,080 --> 00:51:39,400 Speaker 4: CC is pretty high up on that, you know, no 919 00:51:39,600 --> 00:51:40,080 Speaker 4: -no list. 920 00:51:40,120 --> 00:51:45,920 Speaker 3: So yeah, so X trying to make it so that 921 00:51:46,040 --> 00:51:48,640 Speaker 3: you have to just pay for premium in order to 922 00:51:49,440 --> 00:51:52,480 Speaker 3: generate images with Grok.
The British government is really not 923 00:51:52,680 --> 00:51:56,160 Speaker 3: impressed by this, a spokesperson told The Guardian, quote, the 924 00:51:56,280 --> 00:51:58,960 Speaker 3: move simply turns an AI feature that allows the creation 925 00:51:59,040 --> 00:52:04,080 Speaker 3: of unlawful images into a premium service. So that's pretty 926 00:52:04,120 --> 00:52:07,359 Speaker 3: much what's going on, right? Like, I think we'll see 927 00:52:07,760 --> 00:52:13,399 Speaker 3: if other countries crack down on this. I mean, even 928 00:52:13,520 --> 00:52:17,960 Speaker 3: doing the research for this episode, it's like, I had 929 00:52:18,000 --> 00:52:20,000 Speaker 3: to go to The Guardian, I had to go to 930 00:52:20,600 --> 00:52:24,440 Speaker 3: UK or overseas papers to get real reporting, because it's 931 00:52:24,440 --> 00:52:26,360 Speaker 3: just, I don't think that people are really taking it 932 00:52:26,480 --> 00:52:30,360 Speaker 3: as seriously here in the States. Regulatory bodies definitely are 933 00:52:30,440 --> 00:52:31,960 Speaker 3: not, right, obviously. 934 00:52:32,280 --> 00:52:36,040 Speaker 4: I mean, by creating this premium X standard, 935 00:52:36,520 --> 00:52:39,359 Speaker 4: he's kind of become the Epstein of child porn. 936 00:52:40,040 --> 00:52:40,279 Speaker 2: Yeah. 937 00:52:40,400 --> 00:52:42,000 Speaker 4: I mean, if you can pay for it, then 938 00:52:41,920 --> 00:52:43,840 Speaker 3: go for it. I mean, I have to imagine Epstein 939 00:52:43,920 --> 00:52:48,640 Speaker 3: was probably financially benefiting from the trade and exchange of 940 00:52:49,320 --> 00:52:52,080 Speaker 3: minors for sex. Again, I would love to have somebody 941 00:52:52,120 --> 00:52:53,879 Speaker 3: who knows more about the law explain to me how 942 00:52:54,440 --> 00:52:59,680 Speaker 3: Elon Musk becoming personally financially enriched through the generation of 943 00:52:59,719 --> 00:53:02,640 Speaker 3: illegal material is not similar. It seems very similar 944 00:53:02,680 --> 00:53:05,319 Speaker 3: to me. And you know, to sort of wrap up, 945 00:53:06,360 --> 00:53:08,520 Speaker 3: I think it's important to note that this, you know, 946 00:53:08,560 --> 00:53:10,919 Speaker 3: we've been talking about this as a tech issue, which 947 00:53:10,960 --> 00:53:13,800 Speaker 3: it definitely is, but it's really a culture issue. And 948 00:53:13,880 --> 00:53:16,160 Speaker 3: I just, I kind of keep coming back to that 949 00:53:16,320 --> 00:53:19,200 Speaker 3: again and again and again, because, you know, when Google 950 00:53:19,320 --> 00:53:23,240 Speaker 3: released their, like, AI image generator Nano Banana 951 00:53:23,320 --> 00:53:27,360 Speaker 3: Pro, the example images were of, like, conventionally attractive women.
When you go 952 00:53:27,480 --> 00:53:30,800 Speaker 3: to the Nano Banana subreddit, it's basically like post 953 00:53:30,880 --> 00:53:33,680 Speaker 3: after post after post of, like, hot women. And 954 00:53:33,920 --> 00:53:37,799 Speaker 3: it just makes me realize how much technology is being 955 00:53:37,920 --> 00:53:41,040 Speaker 3: used to reinforce this, this worldview where women are just 956 00:53:41,560 --> 00:53:47,440 Speaker 3: meant to be consumed and controlled and exploited. And to me, 957 00:53:47,560 --> 00:53:51,200 Speaker 3: it really does express this desire to live in a 958 00:53:51,239 --> 00:53:54,160 Speaker 3: world where women just exist to be consumed and controlled 959 00:53:54,200 --> 00:53:56,759 Speaker 3: and stripped of whatever agency we have managed to 960 00:53:56,800 --> 00:53:59,040 Speaker 3: claim over our own bodies, our own lives. 961 00:53:59,080 --> 00:54:02,200 Speaker 3: And I think until we confront that reality, this 962 00:54:02,360 --> 00:54:06,160 Speaker 3: is not a problem that you can build technical safeguards 963 00:54:06,320 --> 00:54:08,840 Speaker 3: to fix. I do think that these platforms should be 964 00:54:08,960 --> 00:54:11,560 Speaker 3: building technical safeguards, and we should be advocating for that, 965 00:54:11,880 --> 00:54:13,520 Speaker 3: but I don't think it's a problem where you can, 966 00:54:13,640 --> 00:54:16,719 Speaker 3: like, guardrail your way out of it. People who have 967 00:54:16,800 --> 00:54:18,600 Speaker 3: been reporting on deep fakes since before we had a 968 00:54:18,640 --> 00:54:20,960 Speaker 3: word for it, folks like Samantha Cole, have really pointed 969 00:54:21,040 --> 00:54:24,480 Speaker 3: this out. Cole writes, in twenty eighteen, less than a 970 00:54:24,560 --> 00:54:27,200 Speaker 3: year after reporting the first story on deep fakes, I 971 00:54:27,320 --> 00:54:29,600 Speaker 3: wrote about how it's a serious mistake to ignore the 972 00:54:29,680 --> 00:54:33,160 Speaker 3: fact that non consensual imagery, synthetic or not, is a 973 00:54:33,239 --> 00:54:37,320 Speaker 3: societal sickness and not something companies can guardrail against into infinity. 974 00:54:37,680 --> 00:54:39,840 Speaker 3: Users feed off one another to create a sense that 975 00:54:39,880 --> 00:54:41,840 Speaker 3: they're the kings of the universe, that they answer to 976 00:54:41,960 --> 00:54:44,240 Speaker 3: no one. This logic is how you get incels 977 00:54:44,320 --> 00:54:46,400 Speaker 3: and pickup artists, and it's how you get deep fakes, 978 00:54:46,920 --> 00:54:49,040 Speaker 3: a group of men who see no harm in treating 979 00:54:49,080 --> 00:54:52,839 Speaker 3: women as mere images and view making and spreading algorithmically 980 00:54:52,840 --> 00:54:56,360 Speaker 3: weaponized revenge porn as a hobby as innocent and timeless 981 00:54:56,400 --> 00:54:59,279 Speaker 3: as trading baseball cards. I wrote at the time that that 982 00:54:59,480 --> 00:55:01,759 Speaker 3: is what's at the root of deep fakes, and the consequences 983 00:55:01,800 --> 00:55:04,560 Speaker 3: of forgetting that are more dire than we can predict. 984 00:55:05,239 --> 00:55:07,759 Speaker 2: And yeah, I just really, really agree with that.
985 00:55:07,920 --> 00:55:10,160 Speaker 3: I think that it's a tech problem and we should 986 00:55:10,200 --> 00:55:12,480 Speaker 3: be talking about it as a tech problem, but deeper 987 00:55:12,560 --> 00:55:15,759 Speaker 3: than that, it's like a societal and cultural rot at 988 00:55:15,800 --> 00:55:18,560 Speaker 3: the heart of our society that we really do have 989 00:55:18,680 --> 00:55:22,880 Speaker 3: to deeply address, because deep fakes are only but 990 00:55:23,160 --> 00:55:25,920 Speaker 3: one aspect of how this is showing up in increasingly 991 00:55:26,040 --> 00:55:26,840 Speaker 3: dangerous ways. 992 00:55:27,560 --> 00:55:31,640 Speaker 1: Absolutely, and I think you've put it so well. 993 00:55:31,680 --> 00:55:35,520 Speaker 1: There's just so many different threads to this. 994 00:55:36,280 --> 00:55:38,600 Speaker 1: I can think of so many different episodes we've done 995 00:55:39,200 --> 00:55:43,040 Speaker 1: where this rings true. Like, I can't even list all 996 00:55:43,080 --> 00:55:45,239 Speaker 1: of them, but of course a lot of video game 997 00:55:45,320 --> 00:55:46,680 Speaker 1: episodes come to mind for me. 998 00:55:47,239 --> 00:55:48,360 Speaker 4: Male loneliness epidemic. 999 00:55:48,800 --> 00:55:54,400 Speaker 1: Yeah, but it's true. It's like everywhere, and it's, I 1000 00:55:54,520 --> 00:55:57,560 Speaker 1: think, one thing. I know we've used the word alarming 1001 00:55:57,640 --> 00:56:00,279 Speaker 1: a lot in this episode, but one thing that really 1002 00:56:00,360 --> 00:56:06,800 Speaker 1: struck me in this was that huge rise in the 1003 00:56:06,960 --> 00:56:11,560 Speaker 1: requests for, like, non consensual images. Or just, like, it 1004 00:56:11,680 --> 00:56:16,880 Speaker 1: does feel like it's getting worse. It does feel like 1005 00:56:16,960 --> 00:56:21,520 Speaker 1: it's getting worse. And we, everybody who's listening, and everybody, 1006 00:56:22,480 --> 00:56:24,800 Speaker 1: all of us, we know that this has been a problem. 1007 00:56:24,840 --> 00:56:28,440 Speaker 1: We've been saying it for a long time, and now 1008 00:56:30,200 --> 00:56:30,720 Speaker 1: it's worse. 1009 00:56:31,800 --> 00:56:34,000 Speaker 3: I hate to leave it there. I hate to not 1010 00:56:34,320 --> 00:56:37,520 Speaker 3: have there be... I mean, I think the thing that 1011 00:56:38,320 --> 00:56:44,480 Speaker 3: makes me feel a bit hopeful is that I think 1012 00:56:44,560 --> 00:56:47,759 Speaker 3: that most people agree that this is despicable. I think 1013 00:56:47,840 --> 00:56:50,759 Speaker 3: that the people who are creating this and benefiting from 1014 00:56:50,800 --> 00:56:53,080 Speaker 3: it are a minority of people, and I think that 1015 00:56:53,719 --> 00:56:56,640 Speaker 3: there are forces that would like to normalize this behavior, 1016 00:56:57,000 --> 00:56:59,799 Speaker 3: but it's not really taking, right? Like, it's normal among some, 1017 00:57:00,560 --> 00:57:02,000 Speaker 3: but I don't think... I think that the way that 1018 00:57:02,040 --> 00:57:04,160 Speaker 3: people are responding to it, it's clear to me that 1019 00:57:04,160 --> 00:57:06,000 Speaker 3: there are more of us that think this kind of 1020 00:57:06,080 --> 00:57:08,520 Speaker 3: thing is unacceptable than there are people who are getting 1021 00:57:08,520 --> 00:57:10,359 Speaker 3: their rocks off to it. So maybe that's a little 1022 00:57:10,360 --> 00:57:12,480 Speaker 3: bit of hope.
I do agree it's getting worse, but 1023 00:57:13,160 --> 00:57:15,239 Speaker 3: you know, there are more of us than there are 1024 00:57:15,360 --> 00:57:15,640 Speaker 3: of them. 1025 00:57:18,680 --> 00:57:24,120 Speaker 1: Oh my goodness. Well, you know, I'm wondering, did you 1026 00:57:24,320 --> 00:57:26,920 Speaker 1: say The Social Network has a sequel? 1027 00:57:27,480 --> 00:57:32,920 Speaker 2: Yes! Oh my god, yes, it does have a sequel. 1028 00:57:33,640 --> 00:57:37,560 Speaker 1: I'm just curious what that would be now. So will 1029 00:57:37,640 --> 00:57:39,560 Speaker 1: we ever get an Elon Musk version? 1030 00:57:39,960 --> 00:57:40,600 Speaker 2: Oh my god. 1031 00:57:40,880 --> 00:57:44,840 Speaker 3: I want Aaron Sorkin to keep making sequels. I want 1032 00:57:44,840 --> 00:57:46,440 Speaker 3: it to be like a trilogy. So we've got The 1033 00:57:46,480 --> 00:57:50,200 Speaker 3: Social Network 2, officially titled The Social Reckoning, 1034 00:57:50,240 --> 00:57:53,920 Speaker 3: a confirmed sequel companion film written and directed by Aaron Sorkin, 1035 00:57:54,360 --> 00:57:56,560 Speaker 3: set for release on October ninth, twenty 1036 00:57:56,360 --> 00:57:58,720 Speaker 2: twenty six. So coming up, not too long. 1037 00:57:59,480 --> 00:58:03,280 Speaker 3: It's about Facebook's harmful social impacts, based on The 1038 00:58:03,360 --> 00:58:08,240 Speaker 3: Facebook Files, starring Jeremy Strong as Zuckerberg, Oh, Mikey Madison 1039 00:58:08,320 --> 00:58:11,800 Speaker 3: as whistleblower Frances Haugen, and Jeremy Allen White. 1040 00:58:12,600 --> 00:58:15,320 Speaker 2: You know, you go ahead and pencil it in for 1041 00:58:15,480 --> 00:58:17,959 Speaker 2: me coming back to talk about it. Go ahead 1042 00:58:17,960 --> 00:58:19,880 Speaker 2: and write that right when you get your calendar together. 1043 00:58:19,920 --> 00:58:24,200 Speaker 1: And, but I was thinking about it, because we still 1044 00:58:24,240 --> 00:58:26,720 Speaker 1: haven't done our watch, because Samantha's never seen. 1045 00:58:26,760 --> 00:58:31,040 Speaker 2: I've never seen it, The Social Network. So watch it 1046 00:58:31,120 --> 00:58:31,720 Speaker 2: this weekend. 1047 00:58:33,400 --> 00:58:36,040 Speaker 1: I'm wondering how it will hit differently now. That's what 1048 00:58:36,160 --> 00:58:37,920 Speaker 1: I'm curious about, because I haven't watched it in a 1049 00:58:37,960 --> 00:58:41,080 Speaker 1: long time. Knowing what we know now, I'm wondering how 1050 00:58:41,120 --> 00:58:44,640 Speaker 1: it will hit now. But yeah, I think we should. 1051 00:58:44,760 --> 00:58:46,960 Speaker 1: I think we should pencil it in. I think I'll 1052 00:58:46,960 --> 00:58:54,080 Speaker 1: get my calendar. See you in October. Well, thank you, thank you, 1053 00:58:54,160 --> 00:58:58,200 Speaker 1: thank you so much, Bridget. This was definitely a tough topic. 1054 00:58:58,440 --> 00:59:01,520 Speaker 1: So we appreciate you doing the work and doing the 1055 00:59:01,560 --> 00:59:04,280 Speaker 1: research and helping us go through it, because it is 1056 00:59:04,360 --> 00:59:10,120 Speaker 1: so important. And happy, happy New Year, and we're looking 1057 00:59:10,320 --> 00:59:13,720 Speaker 1: forward to continuing to work together. 1058 00:59:14,440 --> 00:59:17,360 Speaker 2: Happy New Year to you, Happy New Year, listeners. Thanks 1059 00:59:17,400 --> 00:59:17,880 Speaker 2: for having me.
1060 00:59:18,800 --> 00:59:22,480 Speaker 1: Yes, and where can the good listeners find you, Bridget? 1061 00:59:22,880 --> 00:59:24,840 Speaker 3: You can listen to my podcast, There Are No Girls 1062 00:59:24,840 --> 00:59:26,760 Speaker 3: on the Internet. You can find me on Instagram at 1063 00:59:26,760 --> 00:59:29,280 Speaker 3: bridgetmarieindc, and you can check me out 1064 00:59:29,280 --> 00:59:31,120 Speaker 3: on YouTube, There Are No Girls on the Internet. 1065 00:59:32,280 --> 00:59:34,400 Speaker 1: Yes, and definitely go do that, if you have not 1066 00:59:34,600 --> 00:59:39,120 Speaker 1: already. Listeners, if you would like to contact us, you can. 1067 00:59:39,280 --> 00:59:40,920 Speaker 1: You can email us at Hello at Stuff We Never 1068 00:59:41,000 --> 00:59:42,440 Speaker 1: Told You. You could find us on Bluesky at Mom 1069 00:59:42,520 --> 00:59:44,760 Speaker 1: Stuff Podcast, or on Instagram and TikTok at Stuff We 1070 00:59:44,840 --> 00:59:47,880 Speaker 1: Never Told You. We're also on YouTube. We have some 1071 00:59:48,040 --> 00:59:49,760 Speaker 1: merchandise at Cotton Bureau, and we have a book you 1072 00:59:49,760 --> 00:59:51,760 Speaker 1: can get wherever you get your books. Thanks as always 1073 00:59:51,800 --> 00:59:54,720 Speaker 1: to our super producer Christina, our executive producer Maya, and our contributor Joey. 1074 00:59:54,920 --> 00:59:56,680 Speaker 2: Thank you, and thanks to you for listening. 1075 00:59:56,920 --> 00:59:58,320 Speaker 1: Stuff Mom Never Told You is a production of iHeart 1076 00:59:58,400 --> 01:00:00,320 Speaker 1: Radio. For more podcasts from iHeart Radio, check out the 1077 01:00:00,360 --> 01:00:02,280 Speaker 1: iHeart Radio app, Apple Podcasts, or wherever you listen to 1078 01:00:02,320 --> 01:00:03,120 Speaker 1: your favorite shows.