1 00:00:05,280 --> 00:00:07,520 Speaker 1: Hey, this is Annie and Samantha, and welcome to Stuff Mom 2 00:00:07,560 --> 00:00:19,200 Speaker 2: Never Told You, a production of iHeartRadio, and today 3 00:00:19,239 --> 00:00:23,040 Speaker 2: we are once again thrilled to be joined by the brilliant, 4 00:00:23,280 --> 00:00:29,960 Speaker 2: the lovely, the loquacious Bridget Todd here. 5 00:00:29,240 --> 00:00:32,760 Speaker 3: I want to, like, tape that, I guess. 6 00:00:32,840 --> 00:00:34,880 Speaker 3: I guess it's a podcast, I could play it back. 7 00:00:34,800 --> 00:00:36,360 Speaker 4: And I want to make one of my, like, morning 8 00:00:36,400 --> 00:00:38,480 Speaker 4: affirmations the introductions y'all give. 9 00:00:39,360 --> 00:00:41,640 Speaker 1: I feel like she's now going through like a thesaurus 10 00:00:41,680 --> 00:00:42,879 Speaker 1: to pick out the word. 11 00:00:43,159 --> 00:00:45,960 Speaker 2: I try not to use the same words again, but 12 00:00:46,360 --> 00:00:48,920 Speaker 2: it's hard, so I do, but I try to add 13 00:00:48,920 --> 00:00:51,360 Speaker 2: in like a kind of like underdog word. 14 00:00:51,720 --> 00:00:54,840 Speaker 1: I love that because they're all true, like everything she 15 00:00:54,880 --> 00:00:56,760 Speaker 1: says is true about you. But then like the fact 16 00:00:56,760 --> 00:00:58,800 Speaker 1: that she's trying to, like, you know, mix it up 17 00:00:58,800 --> 00:01:00,200 Speaker 1: a bit, I'm like, what is she going to say 18 00:01:00,200 --> 00:01:01,560 Speaker 1: today? It's a surprise for me too. 19 00:01:02,880 --> 00:01:07,560 Speaker 2: We gotta mix things up. Well, Bridget, we do always 20 00:01:07,920 --> 00:01:10,319 Speaker 2: love having you. And the last time we spoke with 21 00:01:10,400 --> 00:01:13,000 Speaker 2: you, we were kind of talking about sort of 22 00:01:13,040 --> 00:01:16,360 Speaker 2: the New Year's blues. How have you been? Have 23 00:01:16,520 --> 00:01:19,200 Speaker 2: things changed? Anything going on? 24 00:01:20,280 --> 00:01:20,400 Speaker 1: Uh? 25 00:01:20,959 --> 00:01:21,720 Speaker 3: So that's right. 26 00:01:21,760 --> 00:01:25,160 Speaker 4: I think when we last talked, I was like deep 27 00:01:25,200 --> 00:01:27,880 Speaker 4: in the midst of that part of January where you're like, 28 00:01:28,160 --> 00:01:31,080 Speaker 4: the holidays are over, I hate everything. Things were kind of, 29 00:01:31,080 --> 00:01:33,400 Speaker 4: we're kind of turning a corner. I'm much more of 30 00:01:33,440 --> 00:01:35,280 Speaker 4: a February girl than a January girl. 31 00:01:35,280 --> 00:01:35,760 Speaker 3: I feel okay. 32 00:01:35,760 --> 00:01:38,000 Speaker 4: I feel like it's easier to sort of get the 33 00:01:38,040 --> 00:01:41,679 Speaker 4: swing of what's going on, get in the rhythm. The 34 00:01:41,720 --> 00:01:45,600 Speaker 4: weather is a tiny bit better. Yeah, how are you 35 00:01:45,640 --> 00:01:46,880 Speaker 4: all faring at this time of year? 36 00:01:51,160 --> 00:01:51,840 Speaker 1: I'm good. 37 00:01:52,280 --> 00:01:55,680 Speaker 2: I feel like I'm still in the strange "oh wow, 38 00:01:55,760 --> 00:01:58,280 Speaker 2: February is almost over," which I know is like not 39 00:01:58,360 --> 00:02:00,680 Speaker 2: a unique take. A lot of people have that, but 40 00:02:00,760 --> 00:02:05,000 Speaker 2: that's sort of where I am.
But I do do 41 00:02:05,080 --> 00:02:10,359 Speaker 2: a lot of like tent pole traditions so I can 42 00:02:10,400 --> 00:02:12,680 Speaker 2: mark the passage of time. So there was a lot 43 00:02:12,720 --> 00:02:16,760 Speaker 2: of them recently, like Lunar New Year, and then the 44 00:02:16,800 --> 00:02:21,280 Speaker 2: Super Bowl, Mardi Gras, like, so I have done a 45 00:02:21,320 --> 00:02:25,160 Speaker 2: lot of fun things around that that are mostly food related. 46 00:02:27,960 --> 00:02:31,680 Speaker 2: But it does feel, just feels kind of like a 47 00:02:31,840 --> 00:02:38,320 Speaker 2: slog, I guess, where I'm still... The New Year part 48 00:02:38,560 --> 00:02:41,440 Speaker 2: of like I want to do something different hasn't left me, 49 00:02:41,600 --> 00:02:44,799 Speaker 2: but the, like, tired part of the past year also 50 00:02:44,919 --> 00:02:47,320 Speaker 2: hasn't left me. So it's sort of like... 51 00:02:48,200 --> 00:02:52,320 Speaker 1: A battle you're winning. I feel like you're winning the 52 00:02:52,360 --> 00:02:54,440 Speaker 1: battle. You're trying. Well... 53 00:02:54,480 --> 00:02:56,640 Speaker 4: I have to ask, did you all watch, speaking of 54 00:02:56,680 --> 00:02:58,799 Speaker 4: like moments of tent poles, 55 00:02:59,000 --> 00:03:00,880 Speaker 3: did y'all watch that Super Bowl halftime show? 56 00:03:01,560 --> 00:03:03,240 Speaker 1: I mean, we're from Atlanta, we have to. 57 00:03:03,440 --> 00:03:06,720 Speaker 4: I was gonna say, oh, a couple of weeks for 58 00:03:06,880 --> 00:03:09,840 Speaker 4: Atlanta media, right, like a lot going on from Atlanta 59 00:03:09,880 --> 00:03:10,600 Speaker 4: happening right now. 60 00:03:11,000 --> 00:03:18,600 Speaker 2: Oh my gosh, yes, some good, some bad. But yeah, 61 00:03:18,240 --> 00:03:21,000 Speaker 2: I've, I've really enjoyed watching all the videos of like 62 00:03:21,160 --> 00:03:24,400 Speaker 2: older people watching the halftime show and being like... 63 00:03:24,400 --> 00:03:27,959 Speaker 1: When you say we were talking to older people, we 64 00:03:28,000 --> 00:03:29,560 Speaker 1: were talking about me, trying to be like, oh, we 65 00:03:29,639 --> 00:03:32,079 Speaker 1: know these moves, we got this. We're still, we're still 66 00:03:32,120 --> 00:03:32,919 Speaker 1: creaking out these moves. 67 00:03:32,919 --> 00:03:33,280 Speaker 3: Don't start. 68 00:03:33,320 --> 00:03:36,840 Speaker 1: Don't start with me, Annie, I still got it. 69 00:03:36,840 --> 00:03:39,440 Speaker 3: It was. It really was peak millennial, like, you know. 70 00:03:39,760 --> 00:03:43,400 Speaker 4: Yeah, if you were of a certain age, you know 71 00:03:43,480 --> 00:03:46,560 Speaker 4: those moves, you love those moves. Those moves have never 72 00:03:46,640 --> 00:03:48,160 Speaker 4: left your blood, you know. 73 00:03:48,240 --> 00:03:50,880 Speaker 1: Luda came out, yes, going down. 74 00:03:51,040 --> 00:03:53,360 Speaker 3: I was screaming. I got up off my couch and 75 00:03:53,440 --> 00:03:53,680 Speaker 3: was like... 76 00:03:57,960 --> 00:04:01,360 Speaker 1: When everybody thought Jermaine Dupri was somebody else, a little sad, 77 00:04:01,360 --> 00:04:03,960 Speaker 1: a little sad about that, that moment, but we love 78 00:04:04,040 --> 00:04:09,400 Speaker 1: Jermaine Dupri, come on. And so yes, I had to 79 00:04:09,400 --> 00:04:12,000 Speaker 1: explain, because my partner was like... so I was like, no, 80 00:04:12,560 --> 00:04:18,119 Speaker 1: that's Jermaine Dupri.
How do you... but understandable, because 81 00:04:18,120 --> 00:04:21,320 Speaker 1: then everybody was like, who is this? But yeah, essentially 82 00:04:21,400 --> 00:04:24,719 Speaker 1: like Atlanta celebrating. Also watching all the trial stuff with 83 00:04:25,120 --> 00:04:28,480 Speaker 1: Fani Willis, who, like, she's becoming... You know, I have 84 00:04:28,480 --> 00:04:30,160 Speaker 1: a lot of opinions when it comes to Fulton County 85 00:04:30,200 --> 00:04:32,159 Speaker 1: Superior Court and all of that. But 86 00:04:33,040 --> 00:04:35,920 Speaker 1: she's proving herself. I think she's making herself a kind 87 00:04:35,920 --> 00:04:39,119 Speaker 1: of an icon now at this point, whether she wanted 88 00:04:39,160 --> 00:04:41,760 Speaker 1: to or not. Bless her, but uh, yeah, there's a 89 00:04:41,760 --> 00:04:45,240 Speaker 1: lot of things happening here. So I think that's the 90 00:04:45,240 --> 00:04:47,240 Speaker 1: theme for all of us, is like, a lot 91 00:04:47,560 --> 00:04:49,880 Speaker 1: is happening, yeah, and we're just trying to keep up 92 00:04:49,880 --> 00:04:52,160 Speaker 1: and enjoy the good things and laugh along with it. 93 00:04:52,839 --> 00:04:53,760 Speaker 3: What else can you do? 94 00:04:55,720 --> 00:05:00,280 Speaker 2: Yeah, well, I guess on the flip side of that, though, 95 00:05:02,800 --> 00:05:05,920 Speaker 2: this topic that you're bringing today, Bridget, is something that 96 00:05:06,040 --> 00:05:10,679 Speaker 2: has for a long time been something I've thought about, 97 00:05:10,720 --> 00:05:13,640 Speaker 2: and the technology has only made it something I think 98 00:05:13,720 --> 00:05:16,760 Speaker 2: about more, and I know we're going to get into it, 99 00:05:16,800 --> 00:05:19,680 Speaker 2: and I have like episodes that we have done. That 100 00:05:19,760 --> 00:05:22,960 Speaker 2: episode we did about journalism in India sticks out in 101 00:05:23,040 --> 00:05:26,440 Speaker 2: my mind. But like on the lesser side, even just 102 00:05:26,560 --> 00:05:29,520 Speaker 2: like someone like Carrie Fisher being in a Star Wars 103 00:05:29,520 --> 00:05:32,880 Speaker 2: movie after she dies. She never gave... she 104 00:05:32,920 --> 00:05:35,000 Speaker 2: didn't know that was gonna be a thing, so she 105 00:05:35,080 --> 00:05:37,640 Speaker 2: never gave her permission for it. But it's like a 106 00:05:37,760 --> 00:05:42,280 Speaker 2: huge technology thing that is impacting a lot of our 107 00:05:42,320 --> 00:05:48,520 Speaker 2: lives and a lot of sectors. So with that, what 108 00:05:48,640 --> 00:05:51,200 Speaker 2: topic did you bring for us today, Bridget? 109 00:05:51,200 --> 00:05:53,799 Speaker 4: Yeah, I think it's really time to have a deep 110 00:05:53,880 --> 00:05:59,640 Speaker 4: dive into deep fakes, AI generated and not AI generated 111 00:05:59,640 --> 00:06:03,080 Speaker 4: as well. But you know, by now, folks have probably 112 00:06:03,080 --> 00:06:06,920 Speaker 4: heard about those awful Taylor Swift AI generated deep fake 113 00:06:06,960 --> 00:06:11,000 Speaker 4: images that circulated on social media. Unfortunately for all of us, 114 00:06:11,040 --> 00:06:13,760 Speaker 4: Taylor Swift was not the first person to be targeted 115 00:06:13,800 --> 00:06:16,400 Speaker 4: in that way and most certainly will not be the last.
116 00:06:16,520 --> 00:06:19,200 Speaker 4: And I know it's one of those issues where, like, 117 00:06:19,720 --> 00:06:21,960 Speaker 4: it's easy to think of it as a celebrity issue, 118 00:06:22,000 --> 00:06:26,080 Speaker 4: because when celebrities are targeted for non consensual deep fake images, 119 00:06:26,480 --> 00:06:29,839 Speaker 4: it gets big, splashy headlines, rightly so, but it's important 120 00:06:29,880 --> 00:06:34,320 Speaker 4: to remember that non celebrity women and girls, like children, 121 00:06:35,160 --> 00:06:37,719 Speaker 4: have also been targeted. And it's an issue that does 122 00:06:37,760 --> 00:06:40,960 Speaker 4: not just impact celebrities like Taylor Swift and in fact 123 00:06:41,240 --> 00:06:44,040 Speaker 4: impacts all of us, even those of us who 124 00:06:44,160 --> 00:06:46,320 Speaker 4: have not been targeted by this kind of thing. 125 00:06:47,080 --> 00:06:50,120 Speaker 4: And so, just as a bit of housekeeping, you 126 00:06:50,200 --> 00:06:53,000 Speaker 4: probably hear a lot of words around deep fakes, like 127 00:06:53,040 --> 00:06:55,640 Speaker 4: deep fakes and cheap fakes and AI, what does it 128 00:06:55,680 --> 00:06:58,599 Speaker 4: all mean? So the distinctions can be kind of confusing, 129 00:06:58,680 --> 00:07:02,000 Speaker 4: but I think that they are important. So when we're 130 00:07:02,000 --> 00:07:06,960 Speaker 4: talking about deep fakes, we're talking about non consensual images 131 00:07:07,040 --> 00:07:11,120 Speaker 4: that depict nudity or sexualization. Before the rise of AI, 132 00:07:11,920 --> 00:07:15,040 Speaker 4: people would make these, like, old school style, just 133 00:07:15,080 --> 00:07:19,160 Speaker 4: simply photoshopping somebody's head on somebody else's body. If 134 00:07:19,240 --> 00:07:21,040 Speaker 4: you've been on the Internet for as long as we have, 135 00:07:21,120 --> 00:07:23,200 Speaker 4: I'm sure y'all have seen some iteration of this where 136 00:07:23,200 --> 00:07:25,679 Speaker 4: you're like, something about this image 137 00:07:25,680 --> 00:07:28,840 Speaker 4: doesn't look quite right. I don't think that this did 138 00:07:28,880 --> 00:07:30,640 Speaker 4: happen the way they're trying to get me to think 139 00:07:30,680 --> 00:07:31,320 Speaker 4: it did happen. 140 00:07:31,920 --> 00:07:33,800 Speaker 3: Have you all ever encountered those? 141 00:07:35,440 --> 00:07:43,320 Speaker 1: Yes. Did you, did you make one? 142 00:07:43,360 --> 00:07:46,200 Speaker 2: I came across one and I thought it was real, and 143 00:07:46,240 --> 00:07:49,440 Speaker 2: then my friends were like, that's definitely... 144 00:07:48,760 --> 00:07:51,720 Speaker 1: Okay, okay, like did you do a Ryan Gosling 145 00:07:51,240 --> 00:07:52,640 Speaker 3: thing that we don't know about? 146 00:07:52,840 --> 00:07:56,600 Speaker 1: Like I don't, I'm just asking. The way you reacted 147 00:07:56,880 --> 00:08:01,720 Speaker 1: was so like shameful. I needed to know.
But you know, 148 00:08:02,480 --> 00:08:04,160 Speaker 1: I don't know why, but I always think about the 149 00:08:04,200 --> 00:08:07,680 Speaker 1: Parks and Rec episode, maybe because I'm Asian, but like the 150 00:08:07,800 --> 00:08:11,800 Speaker 1: councilman who was obsessed with Asian pictures would like put, uh, 151 00:08:12,200 --> 00:08:16,960 Speaker 1: like Kristi Yamaguchi's head on a random bathing suit picture, and 152 00:08:17,160 --> 00:08:19,760 Speaker 1: it was, it was almost like it was not even photoshop, 153 00:08:19,800 --> 00:08:21,640 Speaker 1: like it was cut out of a magazine, but 154 00:08:21,680 --> 00:08:24,239 Speaker 1: it was supposed to be photoshop. I think of those 155 00:08:24,280 --> 00:08:26,560 Speaker 1: for some reason, like that's what pops in my head. 156 00:08:26,960 --> 00:08:29,920 Speaker 4: Some of them are so bad like that where it's like, damn, 157 00:08:30,160 --> 00:08:35,439 Speaker 4: the skin tones don't match, like I think Scarlett Johansson 158 00:08:35,440 --> 00:08:36,680 Speaker 4: has a sleeve tat? 159 00:08:37,400 --> 00:08:39,520 Speaker 1: Here we are, yet here we are. 160 00:08:40,000 --> 00:08:43,720 Speaker 4: So, so even before the rise of AI and we 161 00:08:43,720 --> 00:08:46,480 Speaker 4: were all talking about AI, those kinds of images were 162 00:08:46,520 --> 00:08:49,640 Speaker 4: all over the Internet. Scarlett Johansson and Emma Watson were 163 00:08:49,640 --> 00:08:52,840 Speaker 4: two celebrities who've talked about being targeted by those specific 164 00:08:52,920 --> 00:08:55,280 Speaker 4: kinds of images for decades, so this is not new. 165 00:08:55,720 --> 00:08:59,080 Speaker 4: There's also what you think of as cheap fakes. That is, 166 00:08:59,120 --> 00:09:03,479 Speaker 4: when an image itself is like technically real and not manipulated, 167 00:09:03,800 --> 00:09:07,680 Speaker 4: but the context is like intentionally misrepresented. A good example 168 00:09:07,720 --> 00:09:10,560 Speaker 4: of a cheap fake was a few years ago somebody 169 00:09:10,679 --> 00:09:13,920 Speaker 4: was passing around an image on social media of a woman, 170 00:09:14,280 --> 00:09:17,920 Speaker 4: like a picture, from her perspective, of her in 171 00:09:17,960 --> 00:09:21,880 Speaker 4: a tub vaping, and if you zoomed way in on 172 00:09:22,080 --> 00:09:25,600 Speaker 4: a blurry reflection on the faucet, you could see part 173 00:09:25,600 --> 00:09:28,720 Speaker 4: of her topless body. And so this picture was going 174 00:09:28,760 --> 00:09:33,120 Speaker 4: around with people saying that it was like a blurry nude 175 00:09:33,440 --> 00:09:37,319 Speaker 4: image of a semi nude Alexandria Ocasio-Cortez in the tub. 176 00:09:37,640 --> 00:09:40,680 Speaker 4: But in actuality it was not AOC. It was actually a 177 00:09:40,559 --> 00:09:44,480 Speaker 3: real image of Sydney Leathers. You might remember her. 178 00:09:44,520 --> 00:09:47,360 Speaker 4: She was the woman who was involved in that texting 179 00:09:47,400 --> 00:09:51,160 Speaker 4: scandal with former US Representative Anthony Weiner. So the image 180 00:09:51,160 --> 00:09:55,360 Speaker 4: itself was real, it was a real image of Sydney Leathers.
However, 181 00:09:55,840 --> 00:09:58,640 Speaker 4: it being mislabeled as AOC is what makes it a 182 00:09:58,720 --> 00:10:00,800 Speaker 4: cheap fake, right? So exactly, it sounds like a 183 00:10:00,880 --> 00:10:05,160 Speaker 4: cheaply manipulated piece of content trying to mislead you. 184 00:10:05,200 --> 00:10:06,560 Speaker 3: So I say all 185 00:10:06,480 --> 00:10:09,319 Speaker 4: of this to say that, like, even though we're having 186 00:10:09,360 --> 00:10:12,920 Speaker 4: this conversation because of the rise of AI, it has 187 00:10:12,960 --> 00:10:14,560 Speaker 4: existed before AI. 188 00:10:14,960 --> 00:10:16,000 Speaker 3: But AI has 189 00:10:15,920 --> 00:10:19,079 Speaker 4: really made the whole situation a lot worse, because with 190 00:10:19,240 --> 00:10:23,440 Speaker 4: things like nudify apps and AI platforms, anybody can 191 00:10:23,480 --> 00:10:27,200 Speaker 4: make a pretty convincing deep fake of anybody, right? Like, 192 00:10:27,520 --> 00:10:29,720 Speaker 4: if you wanted to make a cheap fake of a 193 00:10:29,760 --> 00:10:32,680 Speaker 4: girl in your class to extort her, it probably would 194 00:10:32,679 --> 00:10:34,120 Speaker 4: take a lot of work to come up with a 195 00:10:34,200 --> 00:10:37,040 Speaker 4: cheap fake, or to photoshop her head on somebody else's 196 00:10:37,040 --> 00:10:38,240 Speaker 4: body in a way that's convincing. 197 00:10:38,880 --> 00:10:40,959 Speaker 3: That's no longer the case with AI, and so it's not 198 00:10:41,120 --> 00:10:44,120 Speaker 4: new, but the speed at which, and the ease at which, 199 00:10:44,160 --> 00:10:46,920 Speaker 4: and the effectiveness of doing it, that part is new. 200 00:10:46,920 --> 00:10:48,640 Speaker 4: And that's why it's such a... we're in such a 201 00:10:48,720 --> 00:10:50,320 Speaker 4: dangerous spot for it right now. 202 00:11:00,880 --> 00:11:03,600 Speaker 2: I remember a couple of years ago, there was kind 203 00:11:03,640 --> 00:11:07,440 Speaker 2: of an app that was supposed to like tell you, oh, 204 00:11:07,480 --> 00:11:09,120 Speaker 2: if this is a fake image or not, and then 205 00:11:09,160 --> 00:11:13,640 Speaker 2: it turned out that was fake as well. So it's 206 00:11:13,720 --> 00:11:21,200 Speaker 2: just become really difficult to navigate our reality and what 207 00:11:21,360 --> 00:11:24,120 Speaker 2: is real. And as you said, that was already a problem. 208 00:11:24,400 --> 00:11:29,200 Speaker 2: It has already impacted so many people, but with AI 209 00:11:29,360 --> 00:11:33,360 Speaker 2: and with these nudify apps, it has gotten worse. But 210 00:11:33,400 --> 00:11:38,920 Speaker 2: can you kind of expound upon those nudify apps, for example? 211 00:11:39,000 --> 00:11:41,680 Speaker 4: Yeah, so, I mean it's so gross to think, but 212 00:11:42,000 --> 00:11:46,160 Speaker 4: nudify apps are apps that promise to generate non consensual 213 00:11:46,280 --> 00:11:47,920 Speaker 4: nude images of anyone. 214 00:11:48,200 --> 00:11:49,520 Speaker 3: These are commercially 215 00:11:49,080 --> 00:11:51,600 Speaker 4: available, like they advertise, you can go on and you 216 00:11:51,640 --> 00:11:54,880 Speaker 4: pay money to buy them. And yeah, those apps are 217 00:11:54,960 --> 00:12:00,840 Speaker 4: unfortunately available and surging in popularity.
Last September alone, 218 00:12:00,880 --> 00:12:04,280 Speaker 4: twenty four million people visited undressing websites, according to the 219 00:12:04,320 --> 00:12:08,280 Speaker 4: social media network analysis company Graphika. In Graphika's analysis, it's 220 00:12:08,280 --> 00:12:11,640 Speaker 4: actually very interesting, they talked about how we've really seen 221 00:12:11,679 --> 00:12:15,600 Speaker 4: this shift where these kinds of undressing apps and undressing websites, 222 00:12:16,080 --> 00:12:19,559 Speaker 4: they used to sort of be these niche, custom, underground 223 00:12:19,920 --> 00:12:23,200 Speaker 4: things where it's like a small handful of creeps 224 00:12:23,200 --> 00:12:25,120 Speaker 4: would traffic in them and they would have this 225 00:12:25,240 --> 00:12:29,120 Speaker 4: like kind of underground fringe network. Now these are fully 226 00:12:29,400 --> 00:12:35,440 Speaker 4: monetized online businesses, complete with advertising. So in Graphika's analysis, 227 00:12:35,480 --> 00:12:38,080 Speaker 4: they talked about how we've really seen this shift where 228 00:12:38,240 --> 00:12:41,280 Speaker 4: these kinds of apps have moved from like niche underground 229 00:12:41,960 --> 00:12:45,440 Speaker 4: custom markets with like creeps asking and paying for pictures 230 00:12:45,440 --> 00:12:51,720 Speaker 4: of people. Now they are fully monetized online businesses, complete 231 00:12:51,760 --> 00:12:52,599 Speaker 4: with advertising. 232 00:12:52,640 --> 00:12:55,199 Speaker 3: And so that's really part of the problem too, is 233 00:12:55,160 --> 00:13:00,839 Speaker 4: that, how mainstream these apps have become right now. 234 00:13:01,640 --> 00:13:04,200 Speaker 2: Yes, and I know we're about to get into this, 235 00:13:04,200 --> 00:13:05,840 Speaker 2: but that's one of the things I was thinking about 236 00:13:05,840 --> 00:13:09,839 Speaker 2: with this is, like, it's bizarre and horrifying to me 237 00:13:10,679 --> 00:13:14,800 Speaker 2: how this has become such a weapon. Like I feel 238 00:13:14,800 --> 00:13:18,280 Speaker 2: like if it happens to men, it still sucks. It's bad, 239 00:13:18,800 --> 00:13:22,800 Speaker 2: but it's not like, oh God, your career is ruined, 240 00:13:23,240 --> 00:13:30,040 Speaker 2: or oh god, your family is so upset. And 241 00:13:30,080 --> 00:13:33,280 Speaker 2: there's not really a way, as I said, to prove 242 00:13:33,480 --> 00:13:37,360 Speaker 2: like this isn't real. And these platforms are making money 243 00:13:37,440 --> 00:13:40,800 Speaker 2: off of it in some cases, and in other cases 244 00:13:40,800 --> 00:13:43,280 Speaker 2: they're just not doing anything about it.
245 00:13:43,360 --> 00:13:46,920 Speaker 4: Yes, yeah, I mean, so I hate to say this, 246 00:13:47,000 --> 00:13:49,800 Speaker 4: but a lot of platforms are just kind of allowing 247 00:13:49,840 --> 00:13:53,160 Speaker 4: it, because, you know, again with this being more of 248 00:13:53,200 --> 00:13:57,319 Speaker 4: a, like, fully monetized business, think about all the different 249 00:13:57,440 --> 00:14:01,440 Speaker 4: platforms and institutions that have to be involved for someone 250 00:14:01,480 --> 00:14:03,920 Speaker 4: to make a purchase of a nudify app: a 251 00:14:03,960 --> 00:14:08,240 Speaker 4: payment processor like PayPal, an app store like a Google Play, 252 00:14:08,640 --> 00:14:11,880 Speaker 4: a social media platform like a Twitter or a Facebook 253 00:14:12,280 --> 00:14:15,280 Speaker 4: that advertises that app, and all of these social media 254 00:14:15,320 --> 00:14:19,760 Speaker 4: platforms have unfortunately advertised for these nudify apps. And 255 00:14:19,800 --> 00:14:22,200 Speaker 4: so, I'm also sad to say that, you know, 256 00:14:23,600 --> 00:14:26,160 Speaker 4: as you said, like we are talking about something that 257 00:14:26,240 --> 00:14:30,840 Speaker 4: can ruin people's reputations, that can mess up their careers, 258 00:14:30,920 --> 00:14:34,160 Speaker 4: that can create chaos and havoc with their family life, 259 00:14:34,200 --> 00:14:37,360 Speaker 4: their romantic life, their civic life, their public life. 260 00:14:37,960 --> 00:14:40,560 Speaker 3: And so I hate to say this, but so 261 00:14:40,720 --> 00:14:43,840 Speaker 4: far it is the kind of thing where platforms, for 262 00:14:43,880 --> 00:14:47,360 Speaker 4: the most part, will try to do something and act 263 00:14:47,600 --> 00:14:51,000 Speaker 4: after the fact. But when you're talking about something that 264 00:14:51,040 --> 00:14:55,920 Speaker 4: can be so impactful, reactively taking action after the damage 265 00:14:55,960 --> 00:14:58,240 Speaker 4: has already been done, to me, is not enough. And 266 00:14:58,280 --> 00:15:00,160 Speaker 4: so what I would want to see is more 267 00:15:00,280 --> 00:15:02,960 Speaker 4: proactive measures to keep this kind of thing from happening, 268 00:15:03,000 --> 00:15:07,280 Speaker 4: as opposed to, like, oh well, once there are already non 269 00:15:07,360 --> 00:15:13,000 Speaker 4: consensual AI generated images depicting what is oftentimes essentially like 270 00:15:13,080 --> 00:15:16,040 Speaker 4: child sexual abuse material, then we'll take it down. And 271 00:15:16,080 --> 00:15:18,360 Speaker 4: it's like, well, everybody in our school already saw it. 272 00:15:18,360 --> 00:15:19,920 Speaker 3: Everybody in our school has it on their phone. 273 00:15:20,000 --> 00:15:22,960 Speaker 4: That's not really... in my book, it's not really a 274 00:15:23,000 --> 00:15:24,000 Speaker 4: reasonable response.
275 00:15:24,080 --> 00:15:29,320 Speaker 1: Right, with all this conversation about censorship, with TikTok especially 276 00:15:29,360 --> 00:15:33,080 Speaker 1: and things like that, you would think, with apps like this 277 00:15:33,200 --> 00:15:36,360 Speaker 1: that actually create porn, or at least create some things that 278 00:15:36,440 --> 00:15:40,800 Speaker 1: we would say were porn, that 279 00:15:40,880 --> 00:15:44,320 Speaker 1: there would be conversation about how to regulate that, and 280 00:15:44,400 --> 00:15:47,400 Speaker 1: also that this would be a conversation about what the 281 00:15:47,440 --> 00:15:51,240 Speaker 1: power of AI can do. Before Taylor Swift, like, there's 282 00:15:51,280 --> 00:15:54,080 Speaker 1: so many people, as you've already said, who have gone 283 00:15:54,080 --> 00:15:56,120 Speaker 1: through this. We know that there was an incident that 284 00:15:56,160 --> 00:15:58,960 Speaker 1: happened with Twitch, with a lot of fallout, with a 285 00:15:58,960 --> 00:16:03,560 Speaker 1: couple of women creators that were targeted by other creators 286 00:16:03,600 --> 00:16:06,240 Speaker 1: who just thought it was funny or had their own 287 00:16:06,520 --> 00:16:09,880 Speaker 1: "needs" or whatever, I have that in quotes, using that 288 00:16:10,200 --> 00:16:13,160 Speaker 1: against them and almost, like, ruining these women 289 00:16:13,240 --> 00:16:14,760 Speaker 1: to the point that they felt like they had to 290 00:16:14,840 --> 00:16:16,880 Speaker 1: hide, even though it had nothing to do with them. 291 00:16:16,880 --> 00:16:19,960 Speaker 1: They did nothing, but no one's really paying attention to that 292 00:16:21,040 --> 00:16:22,000 Speaker 1: to this point. 293 00:16:22,200 --> 00:16:25,200 Speaker 4: I remember the Twitch incident, and something that really sticks 294 00:16:25,240 --> 00:16:27,720 Speaker 4: out to me was, boy, do we need to have 295 00:16:27,880 --> 00:16:32,760 Speaker 4: a real cultural conversation and a cultural attitude shift about 296 00:16:32,760 --> 00:16:34,520 Speaker 4: the harm that this can be associated with. 297 00:16:34,960 --> 00:16:36,600 Speaker 3: Because I still 298 00:16:36,360 --> 00:16:38,800 Speaker 4: saw people responding with, well, but it's not really her. 299 00:16:38,880 --> 00:16:42,000 Speaker 4: Why does she care if it's not really her, it's 300 00:16:42,000 --> 00:16:44,480 Speaker 4: just a picture that's fake. It's just pretend, who cares? 301 00:16:44,600 --> 00:16:49,400 Speaker 4: And you know, being a millennial, all of us probably 302 00:16:49,400 --> 00:16:55,400 Speaker 4: remember how we culturally treated it when someone's images got leaked 303 00:16:55,480 --> 00:16:58,240 Speaker 4: or got hacked. It was such a shameful thing, right? 304 00:16:58,320 --> 00:17:00,320 Speaker 4: And so it's wild to me that people would go 305 00:17:00,400 --> 00:17:04,639 Speaker 4: from, like, twenty ten, photos being hacked and released and 306 00:17:04,720 --> 00:17:06,560 Speaker 4: being like, oh, they shouldn't have taken those photos, 307 00:17:06,600 --> 00:17:08,520 Speaker 3: that's so shameful, blah blah blah, 308 00:17:08,320 --> 00:17:10,960 Speaker 4: and then here in twenty twenty four be like, it's 309 00:17:10,960 --> 00:17:13,480 Speaker 4: not a big deal, they're not real photos.
And so 310 00:17:14,080 --> 00:17:16,560 Speaker 4: I would say that the Twitch incident, to me, really 311 00:17:16,600 --> 00:17:19,520 Speaker 4: shows that we need to have a cultural conversation and 312 00:17:19,600 --> 00:17:22,240 Speaker 4: attitude shift that's like, no, this is real. This is 313 00:17:22,240 --> 00:17:24,879 Speaker 4: people's real lives, their real reputations. At the end of 314 00:17:24,920 --> 00:17:27,960 Speaker 4: the day, if you don't consent to be depicted in 315 00:17:28,000 --> 00:17:31,040 Speaker 4: these images, it is a sex crime. And so it's 316 00:17:31,119 --> 00:17:35,600 Speaker 4: a very serious thing. It's not something to just brush off. 317 00:17:35,840 --> 00:17:38,639 Speaker 4: And the women and girls targeted should not be brushing 318 00:17:38,680 --> 00:17:42,080 Speaker 4: it off, and we shouldn't be reinforcing this attitude that 319 00:17:42,080 --> 00:17:44,400 Speaker 4: it's not a big deal, because it's a violation. 320 00:17:44,720 --> 00:17:48,280 Speaker 1: Right, and that's kind of that conversation, is that obviously, 321 00:17:48,359 --> 00:17:51,720 Speaker 1: because the government hasn't caught up with what could be, 322 00:17:52,640 --> 00:17:53,560 Speaker 1: it's gonna get worse. 323 00:17:53,960 --> 00:17:57,159 Speaker 4: Oh, yes, this is something that, like, I will probably 324 00:17:57,160 --> 00:18:01,280 Speaker 4: say this five million times in this conversation. Currently there 325 00:18:01,359 --> 00:18:05,840 Speaker 4: is no federal legislation criminalizing non consensual 326 00:18:05,359 --> 00:18:06,879 Speaker 3: deep fakes, which 327 00:18:08,200 --> 00:18:12,480 Speaker 4: to me is staggering, when we've already... it's, you know, 328 00:18:12,560 --> 00:18:14,800 Speaker 4: I often am talking about tech harms and I'm like, oh, 329 00:18:14,800 --> 00:18:17,760 Speaker 4: this could be a harm, or, like, down the line, 330 00:18:17,840 --> 00:18:21,560 Speaker 4: this could be a harm. It's a harm already. There's 331 00:18:21,640 --> 00:18:26,480 Speaker 4: already been several reports, both in the United States and elsewhere, 332 00:18:26,760 --> 00:18:32,200 Speaker 4: of girls, children, not just being humiliated by non consensual 333 00:18:32,480 --> 00:18:37,440 Speaker 4: AI deep fakes, but sextorted, where boys in their 334 00:18:37,520 --> 00:18:41,280 Speaker 4: class will say, if you don't pay me money, I 335 00:18:41,359 --> 00:18:45,159 Speaker 4: will release these non consensual AI generated deep fakes of 336 00:18:45,200 --> 00:18:47,680 Speaker 4: you to the whole school. And so the fact that 337 00:18:47,840 --> 00:18:52,400 Speaker 4: we're still sort of, I don't know, spinning our wheels... 338 00:18:52,160 --> 00:18:54,639 Speaker 4: There's been a couple of lawmakers that are trying to 339 00:18:54,640 --> 00:18:57,520 Speaker 4: get legislation off the ground. The girls who have been 340 00:18:57,560 --> 00:19:01,800 Speaker 4: targeted have been very loudly and clearly advocating that we 341 00:19:01,840 --> 00:19:05,119 Speaker 4: need more legislation and our kids are at risk. Like, 342 00:19:05,800 --> 00:19:09,239 Speaker 4: we are really deeply failing our kids in such a 343 00:19:09,320 --> 00:19:13,760 Speaker 4: serious way.
And I just can't imagine being thirteen, fourteen, 344 00:19:13,840 --> 00:19:17,400 Speaker 4: a middle schooler, and having to just shoulder the burden 345 00:19:17,520 --> 00:19:22,280 Speaker 4: of this kind of violation, a sexual public humiliation, and 346 00:19:23,000 --> 00:19:25,640 Speaker 4: the lawmakers that are meant to keep us safe are 347 00:19:25,640 --> 00:19:26,719 Speaker 4: just, like, spinning their wheels. 348 00:19:26,720 --> 00:19:27,399 Speaker 3: It's really sad. 349 00:19:28,000 --> 00:19:30,360 Speaker 2: Yeah. And I think it points to a lot, 350 00:19:30,440 --> 00:19:33,840 Speaker 2: like, because again I was thinking about this, and I 351 00:19:33,880 --> 00:19:37,280 Speaker 2: was thinking about, like, what does it say that if these 352 00:19:37,280 --> 00:19:42,280 Speaker 2: images get leaked, for girls and women it's so devastating, 353 00:19:42,560 --> 00:19:46,639 Speaker 2: but if it's for men, it doesn't necessarily equate to that? 354 00:19:46,680 --> 00:19:51,159 Speaker 2: Again, it can, but like also it does lead to 355 00:19:51,400 --> 00:19:54,199 Speaker 2: things like self harm and even death. Like, this is 356 00:19:54,240 --> 00:19:59,679 Speaker 2: a really serious thing, and I think we should examine 357 00:20:00,160 --> 00:20:02,760 Speaker 2: both why it's happening and why it is so 358 00:20:02,920 --> 00:20:09,560 Speaker 2: serious for girls and women. But you also have numbers 359 00:20:09,560 --> 00:20:14,440 Speaker 2: about how it's unfortunately a growing, a worsening 360 00:20:14,480 --> 00:20:15,120 Speaker 2: problem. 361 00:20:15,480 --> 00:20:17,720 Speaker 4: Yeah, and I think part of why it's worsening is 362 00:20:17,760 --> 00:20:21,280 Speaker 4: because we haven't done that examination. We haven't really gotten 363 00:20:21,280 --> 00:20:24,720 Speaker 4: serious about legislation. So it's just like people like me 364 00:20:24,800 --> 00:20:27,440 Speaker 4: are like, oh, this is a problem, and then the 365 00:20:27,480 --> 00:20:29,760 Speaker 4: girls that it happened to are like, this is bad, 366 00:20:29,800 --> 00:20:31,640 Speaker 4: and everyone's like, yep, we agree it's bad, and then 367 00:20:32,080 --> 00:20:33,800 Speaker 4: no one does anything and it just gets worse. So, 368 00:20:34,160 --> 00:20:37,960 Speaker 4: according to research from independent analyst Genevieve Oh, the number 369 00:20:38,000 --> 00:20:41,480 Speaker 4: of new pornographic deep fake videos has already surged more 370 00:20:41,520 --> 00:20:44,000 Speaker 4: than ninefold since twenty twenty. At the end of 371 00:20:44,040 --> 00:20:47,200 Speaker 4: twenty twenty three, the top ten sites offering AI generated 372 00:20:47,200 --> 00:20:51,080 Speaker 4: deep fakes hosted one hundred and fourteen thousand videos. And 373 00:20:51,560 --> 00:20:53,879 Speaker 4: as you were saying, Annie, just to be clear, like, 374 00:20:54,520 --> 00:20:56,760 Speaker 4: women and girls are not the only folks who are 375 00:20:56,800 --> 00:21:01,160 Speaker 4: impacted by this, but it is overwhelmingly women and girls 376 00:21:01,160 --> 00:21:04,399 Speaker 4: who are shouldering the impact here and the burden of 377 00:21:04,440 --> 00:21:06,280 Speaker 4: the harm that it causes. According to a study from 378 00:21:06,320 --> 00:21:10,560 Speaker 4: Deeptrace Labs, ninety six percent of deep fakes are 379 00:21:10,640 --> 00:21:15,520 Speaker 4: non consensual sexual depictions of women and girls.
And so yeah, 380 00:21:15,560 --> 00:21:18,520 Speaker 4: I'm not saying that men are not dealing with this. 381 00:21:18,680 --> 00:21:21,400 Speaker 3: And you know, there's plenty to say 382 00:21:21,240 --> 00:21:26,400 Speaker 4: about the way that technology is harming men and boys. 383 00:21:26,440 --> 00:21:29,239 Speaker 4: Like there's been a lot of developments in terms of, 384 00:21:29,280 --> 00:21:34,520 Speaker 4: like, bad actors chatting with boys, teen boys especially, 385 00:21:34,880 --> 00:21:38,080 Speaker 4: and then sextorting them and being like, oh, actually 386 00:21:38,200 --> 00:21:41,159 Speaker 4: I am a bad actor in the Global South, and 387 00:21:41,240 --> 00:21:43,919 Speaker 4: if you don't give me money, I am going to 388 00:21:44,200 --> 00:21:46,760 Speaker 4: release these images I took of you to all your 389 00:21:46,800 --> 00:21:47,560 Speaker 4: friends and family. 390 00:21:47,920 --> 00:21:49,080 Speaker 3: That is a growing problem. 391 00:21:49,080 --> 00:21:52,520 Speaker 4: And so certainly, you know, the intersection of sex crimes 392 00:21:52,520 --> 00:21:54,920 Speaker 4: and technology is something that harms us all. But when 393 00:21:54,920 --> 00:21:56,800 Speaker 4: it comes to deep fakes, we're really talking about women 394 00:21:56,840 --> 00:22:00,320 Speaker 4: and girls. It is a gender justice issue, and I 395 00:22:00,359 --> 00:22:03,000 Speaker 4: think it's one of those issues that really shows how 396 00:22:04,240 --> 00:22:09,320 Speaker 4: technology and gender can intersect in ways that really keep 397 00:22:09,359 --> 00:22:12,120 Speaker 4: women and girls from being able to fully and safely 398 00:22:12,480 --> 00:22:12,920 Speaker 4: show up 399 00:22:12,880 --> 00:22:13,840 Speaker 3: in online spaces. 400 00:22:13,880 --> 00:22:17,800 Speaker 4: And increasingly, showing up in online spaces is the same 401 00:22:17,840 --> 00:22:20,800 Speaker 4: thing as showing up in civic and public and democratic life. 402 00:22:20,800 --> 00:22:22,920 Speaker 4: And so if you aren't able to show up fully 403 00:22:22,960 --> 00:22:25,800 Speaker 4: online without being at risk 404 00:22:25,600 --> 00:22:27,719 Speaker 3: of, like, AI generated deep fakes, 405 00:22:27,960 --> 00:22:30,119 Speaker 4: you are not able to safely and meaningfully show up 406 00:22:30,119 --> 00:22:31,720 Speaker 4: in public and civic life, right. 407 00:22:41,440 --> 00:22:43,400 Speaker 1: You know, we actually talked about this a little bit 408 00:22:43,400 --> 00:22:48,760 Speaker 1: with our AI episode. The fact that AI specifically, when 409 00:22:48,760 --> 00:22:52,480 Speaker 1: it comes to prompting, goes off of what's on the web, 410 00:22:52,720 --> 00:22:55,439 Speaker 1: goes off what was up there and available, and even 411 00:22:55,560 --> 00:22:58,880 Speaker 1: innocent prompts, because that's how people often create these images, 412 00:22:59,560 --> 00:23:03,919 Speaker 1: turn out to be ridiculously offensive and oftentimes very, like, 413 00:23:04,240 --> 00:23:07,000 Speaker 1: sexual in nature and graphic of women and young girls. 414 00:23:08,200 --> 00:23:11,240 Speaker 1: And it's not because that's what they prompted. It's because 415 00:23:11,240 --> 00:23:13,840 Speaker 1: that's what's available and has been out there and most 416 00:23:13,920 --> 00:23:17,000 Speaker 1: likely to be the most popular or whatever, whatnot.
And 417 00:23:17,000 --> 00:23:19,919 Speaker 1: that's a conversation in itself of like how twisted is 418 00:23:19,960 --> 00:23:23,000 Speaker 1: the Internet and how twisted have we made this Internet? 419 00:23:23,240 --> 00:23:27,000 Speaker 1: I say this very loosely, but that's what is 420 00:23:27,160 --> 00:23:30,680 Speaker 1: immediately seen and immediately brought up, even though that may 421 00:23:30,680 --> 00:23:32,640 Speaker 1: not be what they meant to say or what they 422 00:23:32,640 --> 00:23:34,960 Speaker 1: were intending to do, which can take you down 423 00:23:34,960 --> 00:23:38,400 Speaker 1: a whole different rabbit hole in itself. I remember, real bad, 424 00:23:38,680 --> 00:23:42,440 Speaker 1: real bad example, but, like, being younger, loving NSYNC, 425 00:23:43,240 --> 00:23:46,720 Speaker 1: and I googled something NSYNC and I misspelled the word. 426 00:23:46,760 --> 00:23:48,000 Speaker 1: It took me to a porn site. I was like, 427 00:23:48,040 --> 00:23:52,280 Speaker 1: what, what is this? And I was so confused as 428 00:23:52,320 --> 00:23:55,920 Speaker 1: a teenager. But, like, stuff like that being prompted when 429 00:23:55,920 --> 00:23:59,640 Speaker 1: something is not even like that harmful of a prompt 430 00:23:59,760 --> 00:24:01,960 Speaker 1: and it pulls that stuff up, because that's so much 431 00:24:02,240 --> 00:24:05,320 Speaker 1: of the content that's available and that's allowed on the 432 00:24:05,320 --> 00:24:08,160 Speaker 1: Internet that's not been regulated. Everything else could have been regulated. 433 00:24:08,200 --> 00:24:11,280 Speaker 1: We're not going to talk about, you know, right-wing conspiracy 434 00:24:11,320 --> 00:24:13,959 Speaker 1: things right now, necessarily, but there's so many things that 435 00:24:14,960 --> 00:24:17,560 Speaker 1: you would think would be innocent. But because of the 436 00:24:17,560 --> 00:24:21,320 Speaker 1: way we may have trained the Internet, and again, the 437 00:24:21,359 --> 00:24:23,720 Speaker 1: people who are in charge and allow the stuff that's 438 00:24:23,760 --> 00:24:26,879 Speaker 1: on the Internet that comes up... And that's part of 439 00:24:26,920 --> 00:24:29,360 Speaker 1: the problem in itself, that there's too much available content, 440 00:24:29,640 --> 00:24:32,320 Speaker 1: that the people who controlled it have allowed this to 441 00:24:32,359 --> 00:24:34,200 Speaker 1: be where we are. 442 00:24:35,000 --> 00:24:37,320 Speaker 4: If there is one thing that I want people to 443 00:24:37,359 --> 00:24:40,160 Speaker 4: take away about AI and what it all means from 444 00:24:40,160 --> 00:24:42,679 Speaker 4: this episode, it is that it is so easy to 445 00:24:42,680 --> 00:24:48,320 Speaker 4: think of AI as hyper intelligent robot computer brains and they're 446 00:24:48,119 --> 00:24:50,440 Speaker 3: just thinking on their own. That's not true. 447 00:24:50,600 --> 00:24:53,720 Speaker 4: AI is built by humans, it is trained by humans.
448 00:24:54,000 --> 00:24:57,159 Speaker 4: Everything that AI does, it is learning from us and 449 00:24:57,200 --> 00:24:59,320 Speaker 4: the content that we put out there on the internet, 450 00:24:59,400 --> 00:25:01,240 Speaker 4: from what we see, say, from what we think, from 451 00:25:01,280 --> 00:25:03,800 Speaker 4: what we do. Think about the humans that, you know... 452 00:25:04,560 --> 00:25:09,639 Speaker 4: humans are biased, humans have foibles, humans have flaws, so 453 00:25:09,880 --> 00:25:13,040 Speaker 4: all of the things that, you know, humans can have, 454 00:25:13,600 --> 00:25:18,000 Speaker 4: AI is simply reflecting all of those, all of that bias, 455 00:25:18,080 --> 00:25:20,800 Speaker 4: all of those shortcomings back at us, because it learned 456 00:25:20,840 --> 00:25:21,400 Speaker 4: it from us. 457 00:25:21,440 --> 00:25:26,720 Speaker 3: And so yeah, that's, like... like AI is not neutral. 458 00:25:27,040 --> 00:25:30,119 Speaker 4: AI is just spitting back the same biased stuff that 459 00:25:30,160 --> 00:25:32,720 Speaker 4: we put out there, and so that doesn't surprise me. 460 00:25:32,760 --> 00:25:36,360 Speaker 4: It's like the same reason why, if you are, 461 00:25:36,440 --> 00:25:40,520 Speaker 4: if you ever use one of those AI headshot generators, 462 00:25:40,680 --> 00:25:43,120 Speaker 4: they'll often, if you're a person of color, maybe it'll 463 00:25:43,200 --> 00:25:46,119 Speaker 4: lighten your skin, maybe it'll narrow your features, maybe it'll 464 00:25:46,320 --> 00:25:49,240 Speaker 4: make you look thinner, or make you bustier, 465 00:25:49,320 --> 00:25:52,640 Speaker 4: or make your hair longer, because it's like, oh, 466 00:25:53,240 --> 00:25:57,280 Speaker 4: this program has been trained on what's out there, 467 00:25:57,359 --> 00:25:59,639 Speaker 4: and what they think is like, oh, you want a 468 00:25:59,680 --> 00:26:01,840 Speaker 4: pretty picture of yourself. That means you want to look 469 00:26:01,920 --> 00:26:05,960 Speaker 4: like this, right? It's not like AI... AI is just 470 00:26:06,000 --> 00:26:06,520 Speaker 4: getting that 471 00:26:06,440 --> 00:26:10,359 Speaker 2: from us, right. Yes, but one of the interesting things 472 00:26:10,400 --> 00:26:14,080 Speaker 2: that you put together in this outline is some people 473 00:26:14,119 --> 00:26:18,280 Speaker 2: have tried to stop this, like we've tried to come 474 00:26:18,320 --> 00:26:22,520 Speaker 2: in and make sure that certain things can't happen with 475 00:26:22,600 --> 00:26:27,159 Speaker 2: AI or with deep fakes in general, but people have 476 00:26:27,320 --> 00:26:29,719 Speaker 2: found their ways around them. 477 00:26:30,080 --> 00:26:34,840 Speaker 4: Oh yes, when you have, like, motivated creeps who have 478 00:26:34,920 --> 00:26:37,200 Speaker 4: a lot of time on their hands and are good 479 00:26:37,240 --> 00:26:40,800 Speaker 4: with technology, they will find a way. And so that's 480 00:26:40,840 --> 00:26:45,960 Speaker 4: actually how those Taylor Swift images originated in the first place.
481 00:26:45,960 --> 00:26:50,280 Speaker 4: Those images originated on an alternative messaging platform called Telegram, 482 00:26:50,320 --> 00:26:53,359 Speaker 4: and this specific Telegram channel was actually kind of like 483 00:26:53,720 --> 00:26:57,320 Speaker 4: a marketplace for celebrity deep fakes, where users were, like, 484 00:26:57,520 --> 00:27:01,160 Speaker 4: requesting and trading deep fake images of celebrities since last year. 485 00:27:01,480 --> 00:27:05,640 Speaker 4: And so most commercial AI generators do have some safeguards 486 00:27:05,640 --> 00:27:06,880 Speaker 4: in place to prevent abuse. 487 00:27:06,920 --> 00:27:08,480 Speaker 3: So, like, if you 488 00:27:08,480 --> 00:27:11,480 Speaker 4: were to ask an AI generator, to be like, oh, 489 00:27:11,520 --> 00:27:15,159 Speaker 4: make me an image depicting a child, it's not supposed 490 00:27:14,760 --> 00:27:19,000 Speaker 3: to do that. However, the people in this particular Telegram group, 491 00:27:19,160 --> 00:27:22,639 Speaker 4: their whole thing was, I mean, I hate to say it, 492 00:27:22,640 --> 00:27:26,560 Speaker 4: but, like, almost like the sport of getting around those guardrails. 493 00:27:26,560 --> 00:27:28,440 Speaker 4: And so 404 Media did a lot of 494 00:27:28,480 --> 00:27:32,080 Speaker 4: digging into this and found that, like, basically the people 495 00:27:32,119 --> 00:27:37,320 Speaker 4: on this Telegram platform were, like, trying all these different 496 00:27:37,359 --> 00:27:40,760 Speaker 4: ways to get around the guardrails that were in place 497 00:27:40,880 --> 00:27:44,040 Speaker 4: in Microsoft's AI creation tool, called Designer, in 498 00:27:44,200 --> 00:27:48,600 Speaker 4: order to generate deep fakes of specific celebrities. And so 499 00:27:49,000 --> 00:27:52,040 Speaker 4: if you were to, like, type in Taylor Swift, 500 00:27:52,400 --> 00:27:54,600 Speaker 4: it's not supposed to generate an image of Taylor Swift. 501 00:27:54,640 --> 00:27:58,120 Speaker 4: That's supposed to be a guardrail. However, these creeps found that 502 00:27:58,080 --> 00:28:01,040 Speaker 3: if you put 503 00:28:00,840 --> 00:28:03,920 Speaker 4: in an input that's like Taylor "singer" Swift, that it 504 00:28:04,080 --> 00:28:06,520 Speaker 4: will, right. And so it's also not supposed to generate 505 00:28:06,560 --> 00:28:12,200 Speaker 4: images that depict sexual activity. But if you say, like, oh, 506 00:28:12,560 --> 00:28:16,880 Speaker 4: generate an image of Taylor "singer" Swift eating a hot 507 00:28:16,920 --> 00:28:19,440 Speaker 4: dog or something like that, something that's, like, not sexually 508 00:28:19,480 --> 00:28:22,520 Speaker 4: explicit but could look kind of sexually explicit, there 509 00:28:22,359 --> 00:28:24,160 Speaker 3: are all these little ways of getting around it. 510 00:28:24,480 --> 00:28:28,159 Speaker 4: And yeah, basically, if you are motivated and a creep, 511 00:28:28,720 --> 00:28:31,480 Speaker 4: you can get around these guardrails pretty easily. 512 00:28:31,600 --> 00:28:33,880 Speaker 3: And none of this is terribly surprising. 513 00:28:33,880 --> 00:28:35,440 Speaker 4: It's the same kind of thing that people have been 514 00:28:35,480 --> 00:28:38,000 Speaker 4: warning is really easy to do for a long time.
515 00:28:38,040 --> 00:28:42,480 Speaker 4: And even after the Taylor Swift deep fakes really blew up, 516 00:28:43,360 --> 00:28:47,800 Speaker 4: Microsoft tweaked their guardrails to make this kind of thing 517 00:28:47,960 --> 00:28:51,760 Speaker 4: less possible. And on that Telegram platform, all you had 518 00:28:51,880 --> 00:28:55,040 Speaker 4: was people being like, oh, well, let's continue, let's 519 00:28:55,040 --> 00:28:56,760 Speaker 4: continue to figure out a way to keep doing this. 520 00:28:56,840 --> 00:28:58,760 Speaker 4: Like, this is like a sport for them. This 521 00:28:58,800 --> 00:29:02,680 Speaker 4: is something they enjoyed doing. The guardrails only, for 522 00:29:02,760 --> 00:29:05,680 Speaker 4: whatever reason, like, motivate them to try to figure out 523 00:29:05,680 --> 00:29:09,320 Speaker 4: ways around those guardrails to continue generating this kind of content. 524 00:29:10,440 --> 00:29:15,800 Speaker 2: Right. And I remember when this happened because I had... 525 00:29:16,240 --> 00:29:18,880 Speaker 2: I feel like Twitter has sort of become my Facebook, 526 00:29:19,320 --> 00:29:23,040 Speaker 2: where people contact me through DMs on Twitter, and that's 527 00:29:23,080 --> 00:29:25,440 Speaker 2: like the only time I go to Twitter anymore. And 528 00:29:25,520 --> 00:29:30,320 Speaker 2: I saw it trending and I was like, oh my god, 529 00:29:30,360 --> 00:29:32,640 Speaker 2: what has happened. And that's how I sort of learned 530 00:29:32,640 --> 00:29:36,880 Speaker 2: about this whole thing. And then I read about how 531 00:29:36,920 --> 00:29:39,160 Speaker 2: they handled it, and it wasn't great. 532 00:29:39,080 --> 00:29:43,000 Speaker 4: Right, I hated how they handled it. So I guess 533 00:29:43,000 --> 00:29:45,520 Speaker 4: Twitter realized like, oh my god, these images are everywhere. 534 00:29:45,640 --> 00:29:48,600 Speaker 4: Taylor Swift is a huge figure. We ought to do something. 535 00:29:48,960 --> 00:29:52,200 Speaker 4: And so what they did was they tried to block 536 00:29:52,280 --> 00:29:55,680 Speaker 4: people from being able to search Taylor Swift's name temporarily, 537 00:29:55,680 --> 00:30:00,000 Speaker 4: although just putting quote marks around her name allowed people 538 00:30:00,040 --> 00:30:01,680 Speaker 4: to search the name, so they didn't even do that 539 00:30:01,800 --> 00:30:04,760 Speaker 4: very well. Even the method of dealing with it that I 540 00:30:04,800 --> 00:30:07,560 Speaker 4: thought was so bad, they didn't even really do that effectively. 541 00:30:07,640 --> 00:30:12,400 Speaker 4: So I hated that. But even kind of bigger than that, 542 00:30:12,480 --> 00:30:14,280 Speaker 4: the reason why I hate this approach 543 00:30:14,000 --> 00:30:17,200 Speaker 3: is that it makes it seem like Taylor 544 00:30:16,960 --> 00:30:20,480 Speaker 4: Swift has done something wrong. Right now, Taylor Swift is 545 00:30:20,480 --> 00:30:23,040 Speaker 4: a billionaire, she's huge. I don't think people are, like, 546 00:30:23,080 --> 00:30:24,800 Speaker 4: forgetting who she is if she's not, 547 00:30:24,800 --> 00:30:26,440 Speaker 4: if you're not able to search her name on 548 00:30:26,440 --> 00:30:30,480 Speaker 4: Twitter temporarily.
But just because she has been the 549 00:30:30,520 --> 00:30:34,720 Speaker 4: target of this, like, sexual humiliation campaign, why should she 550 00:30:34,800 --> 00:30:37,720 Speaker 4: have to have her name blocked on social media search? 551 00:30:38,040 --> 00:30:41,080 Speaker 3: And what if this happened to somebody who wasn't as famous? 552 00:30:41,120 --> 00:30:44,240 Speaker 4: Right. It's basically like saying the only way to keep 553 00:30:44,280 --> 00:30:47,440 Speaker 4: you safe in these online spaces is to obscure your 554 00:30:47,480 --> 00:30:51,880 Speaker 4: presence there. And as I said, participation in online platforms 555 00:30:52,000 --> 00:30:57,480 Speaker 4: is like pretty much linked with how people show up socially, politically, civically, democratically 556 00:30:57,560 --> 00:30:59,160 Speaker 3: in twenty twenty four. And so 557 00:30:59,200 --> 00:31:01,560 Speaker 4: saying that, like, oh, you need to obscure your presence there 558 00:31:01,960 --> 00:31:06,600 Speaker 4: is basically marginalizing women in these online spaces where we 559 00:31:06,680 --> 00:31:08,120 Speaker 4: are already marginalized. 560 00:31:08,120 --> 00:31:11,120 Speaker 3: And so I just really don't think... 561 00:31:10,960 --> 00:31:13,280 Speaker 4: Like, philosophically, I feel like I have a problem with 562 00:31:13,360 --> 00:31:14,960 Speaker 4: that as a way of handling 563 00:31:14,560 --> 00:31:17,680 Speaker 1: it. And specifically on Taylor, the fact that twenty percent 564 00:31:17,680 --> 00:31:21,400 Speaker 1: of conservatives believe that Taylor Swift is a plant by Biden. 565 00:31:21,600 --> 00:31:23,200 Speaker 3: Monmouth, I believe that's how you say it, 566 00:31:23,920 --> 00:31:26,120 Speaker 4: just released a poll last week that said it was one 567 00:31:26,160 --> 00:31:29,800 Speaker 4: out of five people. Really, that's eighteen percent, so your 568 00:31:29,880 --> 00:31:30,880 Speaker 4: numbers aren't far off. 569 00:31:31,000 --> 00:31:33,960 Speaker 3: Yeah, eighteen percent of people believe that she 570 00:31:34,080 --> 00:31:37,080 Speaker 4: is being propped up, right, I guess to help 571 00:31:36,960 --> 00:31:38,960 Speaker 3: the Democrats win. Like I'm not even... I kind 572 00:31:38,960 --> 00:31:39,720 Speaker 3: of lose the thread. 573 00:31:40,040 --> 00:31:42,640 Speaker 1: And this is what I saw today. The headline was 574 00:31:42,640 --> 00:31:46,240 Speaker 1: that twenty percent believe that Taylor Swift is a Biden plant, 575 00:31:46,960 --> 00:31:50,480 Speaker 1: specifically for the election year. And I'm like, she's always existed. 576 00:31:50,520 --> 00:31:53,880 Speaker 1: I don't understand what's happening. Not always, but like for 577 00:31:53,920 --> 00:31:55,880 Speaker 1: the last thirty five years she's been here. Well, I 578 00:31:55,880 --> 00:31:59,880 Speaker 1: don't understand. 579 00:31:58,080 --> 00:32:00,480 Speaker 4: would have taken for the Biden administration, and he's only 580 00:32:00,480 --> 00:32:03,000 Speaker 4: in a first term, like the foresight it would have taken. 581 00:32:03,760 --> 00:32:05,920 Speaker 1: So it makes me wonder how far will they go 582 00:32:06,120 --> 00:32:08,000 Speaker 1: to try to keep her off all social media if she 583 00:32:08,200 --> 00:32:11,200 Speaker 1: was truly, if they truly believed this about her?
And 584 00:32:11,400 --> 00:32:16,000 Speaker 1: is this gonna be my conspiracy theory, that the right 585 00:32:16,160 --> 00:32:19,560 Speaker 1: is trying to get rid of Taylor before the season? 586 00:32:19,880 --> 00:32:21,360 Speaker 3: So I'm with you. 587 00:32:21,520 --> 00:32:23,040 Speaker 4: So first of all, people need to know that, like, 588 00:32:23,640 --> 00:32:28,560 Speaker 4: this is not just, like, fringe, like fringe people who 589 00:32:28,560 --> 00:32:33,200 Speaker 4: are saying this, people as big as former Republican presidential 590 00:32:33,200 --> 00:32:37,360 Speaker 4: candidate Ramaswamy. He took to Twitter to, like, amplify this 591 00:32:37,400 --> 00:32:39,840 Speaker 4: conspiracy theory. Right, So this is obviously something that, like, 592 00:32:40,480 --> 00:32:43,080 Speaker 4: people, not just, like, randos on the 593 00:32:43,040 --> 00:32:43,800 Speaker 3: Internet, are saying. 594 00:32:44,480 --> 00:32:47,480 Speaker 4: But I do think, like, it's hard to talk about, 595 00:32:47,520 --> 00:32:50,120 Speaker 4: because you don't want to, like, give the images more oxygen. 596 00:32:50,160 --> 00:32:53,120 Speaker 4: But when I did an episode on the deep fakes, 597 00:32:53,280 --> 00:32:56,520 Speaker 4: I had to confirm if they were still, like, available 598 00:32:56,560 --> 00:32:58,360 Speaker 4: on Twitter, and so I could say, like, oh, as 599 00:32:58,400 --> 00:33:00,680 Speaker 4: of this taping, they're still up. So I looked for them. 600 00:33:00,680 --> 00:33:03,160 Speaker 4: I had to see them, and I can tell you 601 00:33:03,160 --> 00:33:05,680 Speaker 4: that the images are football themed. 602 00:33:05,920 --> 00:33:07,880 Speaker 3: I guess I'll put it that way. They depict Taylor 603 00:33:07,920 --> 00:33:08,400 Speaker 3: Swift at 604 00:33:08,280 --> 00:33:11,840 Speaker 4: a football game, and so you know, everybody on the 605 00:33:11,880 --> 00:33:15,960 Speaker 4: planet knows that she's dating the player from 606 00:33:16,040 --> 00:33:18,400 Speaker 4: Kansas City, like, don't even get me to 607 00:33:18,480 --> 00:33:19,520 Speaker 4: try to talk about sports. 608 00:33:19,520 --> 00:33:21,800 Speaker 3: But and so, she's been in 609 00:33:21,720 --> 00:33:24,200 Speaker 4: the news for, like, being at the games and the 610 00:33:24,240 --> 00:33:26,280 Speaker 4: games cutting to her and blah blah blah. And so 611 00:33:26,560 --> 00:33:29,400 Speaker 4: I believe that the reason those images depict her at 612 00:33:29,400 --> 00:33:31,600 Speaker 4: a football game is that they're meant to not 613 00:33:31,720 --> 00:33:35,240 Speaker 4: just sexually humiliate her, they're meant to, like, take her 614 00:33:35,400 --> 00:33:37,080 Speaker 4: down a peg as a person. 615 00:33:37,520 --> 00:33:40,520 Speaker 3: And so I think that there's clearly something 616 00:33:40,240 --> 00:33:43,600 Speaker 4: going on with Taylor Swift to have these conspiracy theorists 617 00:33:43,640 --> 00:33:46,560 Speaker 4: be targeting her in this way.
But those images really 618 00:33:47,360 --> 00:33:50,320 Speaker 4: solidified something for me that like, oh, whoever made these, 619 00:33:50,840 --> 00:33:54,600 Speaker 4: it is about humiliating her and taking her down a 620 00:33:54,640 --> 00:33:57,680 Speaker 4: peg because of something that she represents, right, Like, I 621 00:33:57,680 --> 00:33:59,200 Speaker 4: can't get into their head, so I don't know what 622 00:33:59,240 --> 00:34:01,840 Speaker 4: it is exactly, but like those images, it was 623 00:34:01,920 --> 00:34:04,320 Speaker 4: really clear to me that somebody was really leveraging 624 00:34:04,800 --> 00:34:06,560 Speaker 4: visual rhetoric to make that point. 625 00:34:06,680 --> 00:34:09,600 Speaker 1: Yeah, and it's very interesting in general when it 626 00:34:09,640 --> 00:34:14,760 Speaker 1: comes to AI, because the level of opportunity that people 627 00:34:14,840 --> 00:34:18,600 Speaker 1: take to make the worst possible things, that you just realize, like, 628 00:34:19,400 --> 00:34:22,360 Speaker 1: what's the point in this? And with that, because 629 00:34:22,360 --> 00:34:26,160 Speaker 1: it crosses such boundaries, that things that exist that would 630 00:34:26,200 --> 00:34:29,359 Speaker 1: be art and that you know, nudity in itself is 631 00:34:29,400 --> 00:34:32,440 Speaker 1: not porn. We don't, like, if we need to be 632 00:34:32,520 --> 00:34:34,719 Speaker 1: very honest, like when we have this conversation about loving 633 00:34:34,760 --> 00:34:37,040 Speaker 1: our bodies and being open with our bodies, and why 634 00:34:37,080 --> 00:34:41,160 Speaker 1: are we like discriminating against women's breasts versus men's chests, 635 00:34:41,200 --> 00:34:43,680 Speaker 1: you know, like stuff like that. Those are real conversations 636 00:34:43,680 --> 00:34:45,400 Speaker 1: that we have to have. But because of these boundaries that 637 00:34:45,440 --> 00:34:48,600 Speaker 1: get crossed, you have to ban all of it, 638 00:34:48,840 --> 00:34:51,320 Speaker 1: and so it makes it feel like everything is dirty 639 00:34:51,360 --> 00:34:55,040 Speaker 1: once again. It's like that old problem of trying to 640 00:34:55,080 --> 00:34:59,080 Speaker 1: differentiate now this is a violation versus this is a 641 00:34:59,080 --> 00:35:02,320 Speaker 1: personal right. Like it's such a hard place. Totally. 642 00:35:02,360 --> 00:35:05,439 Speaker 4: And I can speak to that, because as we're having 643 00:35:05,440 --> 00:35:10,160 Speaker 4: this conversation about using AI to take women's clothes off 644 00:35:10,440 --> 00:35:14,399 Speaker 4: non consensually, there's also a conversation happening online where right 645 00:35:14,440 --> 00:35:17,880 Speaker 4: wing trolls are using AI to do the reverse to 646 00:35:18,120 --> 00:35:21,719 Speaker 4: women who consensually want to show up online, you know, 647 00:35:22,000 --> 00:35:25,400 Speaker 4: wearing bikinis or being in various stages of undress.
They 648 00:35:25,440 --> 00:35:28,800 Speaker 4: are taking those images and using AI 649 00:35:28,960 --> 00:35:31,840 Speaker 4: to put clothes on them, to remove their tattoos, to 650 00:35:31,920 --> 00:35:34,760 Speaker 4: remove their like purple hair and give them more natural 651 00:35:34,800 --> 00:35:37,239 Speaker 4: hair color, to put children around them, to sort of 652 00:35:37,280 --> 00:35:41,360 Speaker 4: demonstrate like, oh, this is... And lest you're thinking, you 653 00:35:41,440 --> 00:35:45,080 Speaker 4: might be thinking like, oh, they're doing this to 654 00:35:45,120 --> 00:35:47,000 Speaker 4: affirm these women, to be like, here's what you would 655 00:35:47,000 --> 00:35:48,440 Speaker 4: look like, you would look so beautiful like this. 656 00:35:48,760 --> 00:35:50,680 Speaker 3: Even if that's what you thought, 657 00:35:50,680 --> 00:35:52,520 Speaker 3: that's not what they're doing. 658 00:35:52,560 --> 00:35:55,680 Speaker 4: They're doing this to shame and humiliate these women. And 659 00:35:55,719 --> 00:35:59,360 Speaker 4: so ultimately it is not even really about the 660 00:35:59,440 --> 00:36:04,400 Speaker 4: nudity and the sexuality. It is about taking away women's agency, 661 00:36:04,640 --> 00:36:08,640 Speaker 4: taking away their autonomy over their own body, reinforcing the 662 00:36:08,680 --> 00:36:12,040 Speaker 4: attitude that says women cannot be trusted to, or do 663 00:36:12,120 --> 00:36:15,120 Speaker 4: not deserve to, have their own choices about how they 664 00:36:15,160 --> 00:36:17,680 Speaker 4: depict their own bodies. And so even if you're someone 665 00:36:17,680 --> 00:36:21,600 Speaker 4: who wants to show up nude online, they will take 666 00:36:21,640 --> 00:36:24,080 Speaker 4: that choice away from you. So ultimately, in my book, 667 00:36:24,120 --> 00:36:28,800 Speaker 4: the nudity and the sex is like happenstance. 668 00:36:29,000 --> 00:36:32,120 Speaker 4: The real thing here is like taking away the agency 669 00:36:32,400 --> 00:36:33,960 Speaker 4: and humiliating the woman in the. 670 00:36:33,960 --> 00:36:38,080 Speaker 1: Process, right right. And again, the fact that this 671 00:36:38,120 --> 00:36:41,400 Speaker 1: has been happening, and this continues to happen to people 672 00:36:41,520 --> 00:36:44,360 Speaker 1: every day, and it's not just Taylor. It just happens 673 00:36:44,400 --> 00:36:47,040 Speaker 1: to be that Taylor was the biggest name that people could 674 00:36:47,040 --> 00:36:48,160 Speaker 1: focus on, right. 675 00:36:48,320 --> 00:36:51,280 Speaker 4: Exactly. So this actually happened before. Before this happened to Taylor, 676 00:36:51,320 --> 00:36:55,440 Speaker 4: it happened to teen Marvel star Xochitl Gomez, and so 677 00:36:55,600 --> 00:36:59,120 Speaker 4: before the Taylor Swift deep fakes made the rounds on Twitter, Gomez, 678 00:36:59,280 --> 00:37:02,520 Speaker 4: who is a minor, was also targeted with deep fake 679 00:37:02,560 --> 00:37:06,120 Speaker 4: images that flooded Twitter. And so the sad part is, 680 00:37:06,160 --> 00:37:08,680 Speaker 4: it's like, it sounds like to me, Twitter just didn't 681 00:37:08,719 --> 00:37:11,680 Speaker 4: do anything.
And I hate to say this, but like 682 00:37:12,280 --> 00:37:15,680 Speaker 4: it sounds like that because Gomez isn't as famous, she's 683 00:37:15,680 --> 00:37:18,120 Speaker 4: a woman of color, it sounds like they just didn't 684 00:37:18,160 --> 00:37:21,440 Speaker 4: really do anything. In a podcast interview, she said, it 685 00:37:21,480 --> 00:37:23,000 Speaker 4: weirded me out and I didn't like it. 686 00:37:23,080 --> 00:37:25,840 Speaker 4: I wanted it taken down. That was my main thought process, 687 00:37:26,000 --> 00:37:28,759 Speaker 4: take this down, please. It wasn't because I felt like 688 00:37:28,800 --> 00:37:31,000 Speaker 4: it was invading my privacy. More like, it wasn't a 689 00:37:31,000 --> 00:37:32,920 Speaker 4: good look for me. This has nothing to do with me, 690 00:37:33,000 --> 00:37:34,160 Speaker 4: and yet it's on here. 691 00:37:34,000 --> 00:37:34,720 Speaker 3: With my face. 692 00:37:34,840 --> 00:37:37,279 Speaker 4: And what's really sad is that she basically was like, 693 00:37:37,320 --> 00:37:39,879 Speaker 4: I talked to Twitter and they didn't do anything. And 694 00:37:39,920 --> 00:37:44,319 Speaker 4: she's basically had to like make personal peace with the 695 00:37:44,320 --> 00:37:46,000 Speaker 4: fact that the images are out there and that they 696 00:37:46,040 --> 00:37:47,799 Speaker 4: always will be out there. They're out there now, right? 697 00:37:47,840 --> 00:37:51,719 Speaker 4: And so she said, nothing good comes from thinking about it. 698 00:37:52,280 --> 00:37:55,000 Speaker 4: I put my phone down, do some skincare, I go 699 00:37:55,040 --> 00:37:57,160 Speaker 4: hang out with my friends, something that will help me 700 00:37:57,200 --> 00:37:58,320 Speaker 4: forget what I saw. 701 00:37:58,400 --> 00:38:02,400 Speaker 3: And keep in mind, she is seventeen, she is a minor. 702 00:38:02,800 --> 00:38:04,120 Speaker 3: This is literal. 703 00:38:03,960 --> 00:38:08,200 Speaker 4: Child sexual abuse material on Twitter, and nobody is doing 704 00:38:08,280 --> 00:38:09,799 Speaker 4: a thing to take it down. 705 00:38:09,920 --> 00:38:19,040 Speaker 3: It's still up today. 706 00:38:21,680 --> 00:38:24,440 Speaker 2: Going back to your point about like we have to 707 00:38:24,480 --> 00:38:28,440 Speaker 2: exist online for so many things. You know, if you're 708 00:38:28,480 --> 00:38:30,880 Speaker 2: in a movie. She was in this big Marvel movie, 709 00:38:31,640 --> 00:38:35,560 Speaker 2: and then a part of being in that is promoting 710 00:38:35,600 --> 00:38:38,799 Speaker 2: online, and then this is what happens, and like why 711 00:38:38,800 --> 00:38:40,600 Speaker 2: would you want to participate in that? Why would you? 712 00:38:41,440 --> 00:38:48,120 Speaker 2: And that damages your career prospects, Like the publicity of it, 713 00:38:48,360 --> 00:38:52,280 Speaker 2: the promotion of it, but it's also like who could.
714 00:38:52,320 --> 00:38:56,000 Speaker 4: Blame her? Like yeah, yeah. And the data is really clear 715 00:38:56,080 --> 00:38:59,400 Speaker 4: that this kind of threat is keeping women out of 716 00:38:59,640 --> 00:39:03,760 Speaker 4: public life, and so whether it's a movie, whether it's running 717 00:39:03,760 --> 00:39:07,920 Speaker 4: for office, whether it's being a teacher, whether it's working 718 00:39:07,960 --> 00:39:11,080 Speaker 4: at the polls, this kind of thing, women are smart 719 00:39:11,160 --> 00:39:12,759 Speaker 4: enough to be like, if this is what it's going 720 00:39:12,800 --> 00:39:14,240 Speaker 4: to mean, if this is going to be the cost 721 00:39:14,360 --> 00:39:17,759 Speaker 4: of participating in civic and public. 722 00:39:17,440 --> 00:39:19,160 Speaker 3: Life, I don't want to pay that cost. 723 00:39:19,239 --> 00:39:21,840 Speaker 4: And so when I say that it's not just 724 00:39:21,880 --> 00:39:24,920 Speaker 4: about celebrities or it's not just about A listers or whatever, 725 00:39:25,320 --> 00:39:28,360 Speaker 3: it is harming us all, because we will not have. 726 00:39:28,320 --> 00:39:31,600 Speaker 4: The representative democracy that we all deserve if women and 727 00:39:31,640 --> 00:39:34,680 Speaker 4: girls cannot fully show up in public and civic life. 728 00:39:34,760 --> 00:39:37,160 Speaker 4: And the data is very clear that threats like this 729 00:39:37,320 --> 00:39:39,960 Speaker 4: are keeping them from doing exactly that, right. 730 00:39:40,040 --> 00:39:45,239 Speaker 2: And you have some examples that are pretty heartbreaking about this. 731 00:39:45,360 --> 00:39:49,120 Speaker 4: Yes, it is so heartbreaking. Like, this is really just 732 00:39:49,160 --> 00:39:52,000 Speaker 4: an issue where we, the adults and the people who 733 00:39:52,000 --> 00:39:55,200 Speaker 4: are supposed to be in charge, are just failing our kids. 734 00:39:55,280 --> 00:39:59,880 Speaker 4: So this is only a handful of times. This 735 00:40:00,080 --> 00:40:03,279 Speaker 4: has happened across the country and across the globe. It's 736 00:40:03,280 --> 00:40:05,160 Speaker 4: happened a lot, and so if I was going to 737 00:40:05,200 --> 00:40:07,759 Speaker 4: list every instance, we would be here all day. Unfortunately, 738 00:40:07,840 --> 00:40:11,200 Speaker 4: but this year, a school district in Aurora, Colorado, released 739 00:40:11,200 --> 00:40:13,879 Speaker 4: the names of schools involved in a sextortion investigation, which 740 00:40:13,920 --> 00:40:18,600 Speaker 4: included two middle schools. The police say that in 741 00:40:18,680 --> 00:40:22,560 Speaker 4: six instances, students reported being the direct target of sextortion 742 00:40:22,719 --> 00:40:27,600 Speaker 4: schemes after being contacted by the suspect or suspects through Instagram. 743 00:40:27,920 --> 00:40:32,120 Speaker 4: In dozens of others, students received unsolicited invitations to pay 744 00:40:32,200 --> 00:40:35,480 Speaker 4: to join a Close Friends list on Instagram where sexually 745 00:40:35,520 --> 00:40:39,080 Speaker 4: explicit material had been posted.
So essentially, what they're describing 746 00:40:39,160 --> 00:40:41,319 Speaker 4: is a marketplace where kids are either being invited to 747 00:40:41,440 --> 00:40:45,560 Speaker 4: join this Close Friends list to see sexually explicit AI 748 00:40:45,680 --> 00:40:49,680 Speaker 4: generated deep fake content, or being extorted to keep that 749 00:40:49,760 --> 00:40:53,560 Speaker 4: content from being shared. Earlier this year, New Jersey high 750 00:40:53,600 --> 00:40:57,400 Speaker 4: schooler Francesca Mani, a teenage victim of non consensual sexually 751 00:40:57,400 --> 00:41:00,759 Speaker 4: explicit deep fakes, joined Representative Joe Morelle to share her 752 00:41:00,800 --> 00:41:03,799 Speaker 4: story and to advocate for a bipartisan bill that would 753 00:41:03,800 --> 00:41:06,880 Speaker 4: criminalize the sharing of such material at the federal level. 754 00:41:07,080 --> 00:41:10,239 Speaker 4: And so at her high school, boys in her school, 755 00:41:10,400 --> 00:41:14,279 Speaker 4: they made these deep fake images of about thirty other 756 00:41:14,400 --> 00:41:17,520 Speaker 4: girls in her class, and she was like, Nah, I'm 757 00:41:17,520 --> 00:41:20,120 Speaker 4: not having this, and so she has been advocating for legislation 758 00:41:20,200 --> 00:41:23,320 Speaker 4: to change this. The bill, called the Preventing Deepfakes 759 00:41:23,360 --> 00:41:26,040 Speaker 4: of Intimate Images Act, was first referred to the 760 00:41:26,040 --> 00:41:28,719 Speaker 4: House Judiciary Committee back in December of twenty twenty two, 761 00:41:29,000 --> 00:41:32,000 Speaker 4: but so far no further action has been taken. So 762 00:41:32,200 --> 00:41:35,160 Speaker 4: this bill would criminalize the non consensual sharing of sexually 763 00:41:35,200 --> 00:41:38,440 Speaker 4: explicit deep fakes and create a right of private action 764 00:41:38,560 --> 00:41:41,480 Speaker 4: so that the victims depicted in these images would be 765 00:41:41,520 --> 00:41:44,320 Speaker 4: able to sue the creators and distributors of that material 766 00:41:44,360 --> 00:41:48,840 Speaker 4: while also remaining anonymous. Under this law, the damages for 767 00:41:48,840 --> 00:41:51,359 Speaker 4: sharing deep faked images without consent could go as high 768 00:41:51,360 --> 00:41:53,880 Speaker 4: as one hundred and fifty thousand dollars and imprisonment of 769 00:41:53,960 --> 00:41:57,240 Speaker 4: up to ten years if sharing these images facilitates violence 770 00:41:57,320 --> 00:41:59,600 Speaker 4: or impacts the proceedings of a government agency. 771 00:42:00,640 --> 00:42:03,399 Speaker 3: Francesca's story is really heartbreaking. 772 00:42:03,520 --> 00:42:07,120 Speaker 4: Again, these boys in her class were generating and trading 773 00:42:07,200 --> 00:42:09,440 Speaker 4: these deep faked nude images of her and about thirty 774 00:42:09,440 --> 00:42:13,560 Speaker 4: other students back on October twentieth, and Mani has been 775 00:42:14,400 --> 00:42:18,040 Speaker 4: really advocating and telling her story. She says the issue 776 00:42:18,040 --> 00:42:20,920 Speaker 4: is pretty black and white. No kid, teen or woman 777 00:42:20,960 --> 00:42:23,200 Speaker 4: should ever have to experience what I went through. 778 00:42:23,480 --> 00:42:25,000 Speaker 3: I felt sad and helpless.
779 00:42:25,200 --> 00:42:28,160 Speaker 4: I'm here standing up and shouting for change, fighting for 780 00:42:28,239 --> 00:42:30,320 Speaker 4: laws so no one else has to feel as lost 781 00:42:30,320 --> 00:42:33,560 Speaker 4: and powerless as I did on October twentieth. The glaring 782 00:42:33,680 --> 00:42:37,560 Speaker 4: lack of laws speaks volumes. Our voices are our secret weapon, 783 00:42:37,640 --> 00:42:40,600 Speaker 4: and our words are like power ups in Fortnite. My 784 00:42:40,680 --> 00:42:42,839 Speaker 4: mom and I are advocating to create a world where 785 00:42:42,880 --> 00:42:46,440 Speaker 4: being safe isn't just a hope, it's a reality for everyone. 786 00:42:46,600 --> 00:42:53,640 Speaker 4: And I really... it's hard for me, because Francesca sounds 787 00:42:53,640 --> 00:42:57,200 Speaker 4: amazing and sounds incredible, and I applaud what she's doing. 788 00:42:57,520 --> 00:43:00,920 Speaker 4: But she shouldn't have to be doing this. She is a child. She 789 00:43:01,040 --> 00:43:03,880 Speaker 4: is a kid. She should be getting ready for prom 790 00:43:03,960 --> 00:43:07,120 Speaker 4: and like making Taylor Swift bracelets or whatever, or 791 00:43:07,239 --> 00:43:10,840 Speaker 4: like you know, having fun with her friends. She should 792 00:43:10,880 --> 00:43:15,920 Speaker 4: not be having to advocate with lawmakers to get laws 793 00:43:16,000 --> 00:43:18,200 Speaker 4: on the books that don't exist to keep kids like 794 00:43:18,239 --> 00:43:21,360 Speaker 4: her safe. And so we are really failing our kids. 795 00:43:21,400 --> 00:43:24,839 Speaker 4: We are taking away their childhood. We are saying this 796 00:43:24,880 --> 00:43:27,640 Speaker 4: is... we are normalizing this as something they have 797 00:43:27,680 --> 00:43:28,280 Speaker 4: to shoulder. 798 00:43:28,640 --> 00:43:31,480 Speaker 3: Meanwhile people are getting rich off of it, people are 799 00:43:31,520 --> 00:43:32,520 Speaker 3: making money from it. 800 00:43:32,520 --> 00:43:35,960 Speaker 5: It is so heartbreaking to me, right. And I feel 801 00:43:36,000 --> 00:43:41,279 Speaker 5: like for, you know, us from a different time, I 802 00:43:41,400 --> 00:43:43,920 Speaker 5: remember you just post things on social media and you 803 00:43:43,960 --> 00:43:46,600 Speaker 5: didn't think like one day this could be used. You 804 00:43:46,640 --> 00:43:51,800 Speaker 5: didn't think one day this could ruin my life, or 805 00:43:51,800 --> 00:43:54,399 Speaker 5: I'm going to have to be fighting for some kind 806 00:43:54,440 --> 00:43:55,279 Speaker 5: of legislation. 807 00:43:55,760 --> 00:44:00,600 Speaker 2: So I think it's ridiculous that we expect younger 808 00:44:00,640 --> 00:44:03,480 Speaker 2: people to, because I've seen that argument before, like, well, 809 00:44:03,560 --> 00:44:08,160 Speaker 2: just don't post anything. Well. 810 00:44:06,800 --> 00:44:09,480 Speaker 1: Right. And then we talked about the families who put 811 00:44:09,520 --> 00:44:12,319 Speaker 1: their children online, not thinking about this.
I know this 812 00:44:12,400 --> 00:44:14,919 Speaker 1: is maybe not the... like, they did this very 813 00:44:14,960 --> 00:44:18,360 Speaker 1: like innocently, just talking about their children, because they don't 814 00:44:18,440 --> 00:44:21,960 Speaker 1: think about the perpetrators that are out there, the perverted 815 00:44:22,000 --> 00:44:24,719 Speaker 1: individuals who will take whatever image they may find and 816 00:44:24,840 --> 00:44:28,799 Speaker 1: use it the way they please. And that's part 817 00:44:28,840 --> 00:44:32,920 Speaker 1: of the problem, is that people didn't realize this 818 00:44:32,960 --> 00:44:35,120 Speaker 1: could be a thing, and now here we are and 819 00:44:35,160 --> 00:44:39,120 Speaker 1: there's all this content available for them to use and manipulate, 820 00:44:39,160 --> 00:44:41,480 Speaker 1: for people who just wanted to share joyful moments. 821 00:44:41,600 --> 00:44:47,280 Speaker 4: Yeah, I hate it. Like, having researched this a lot, 822 00:44:47,480 --> 00:44:52,160 Speaker 4: I know the very real dangers involved in posting minors 823 00:44:52,200 --> 00:44:54,840 Speaker 4: on social media, even if you're doing it innocently. And 824 00:44:54,920 --> 00:44:58,920 Speaker 4: so I would really urge anybody who is a parent 825 00:44:59,120 --> 00:45:00,960 Speaker 4: or a guardian of a child, if you are a 826 00:45:01,000 --> 00:45:04,280 Speaker 4: grown up of a kid, really do some deep research 827 00:45:04,400 --> 00:45:08,319 Speaker 4: into the threats that exist. Doesn't mean that you 828 00:45:08,360 --> 00:45:11,240 Speaker 4: can't share your pictures 829 00:45:11,239 --> 00:45:13,480 Speaker 4: of your kid with friends and family. There are 830 00:45:13,480 --> 00:45:15,799 Speaker 4: ways to do it safely. I would urge anybody to 831 00:45:15,840 --> 00:45:18,239 Speaker 4: do a deep look into that. And if folks want help, 832 00:45:18,320 --> 00:45:19,880 Speaker 4: I am happy to help. I don't want 833 00:45:19,960 --> 00:45:22,200 Speaker 4: to make it seem like I'm shaming or blaming, so 834 00:45:22,400 --> 00:45:24,279 Speaker 4: reach out to me. I'm happy to help. But the 835 00:45:24,320 --> 00:45:27,600 Speaker 4: point is, it's like, it sucks that you would have 836 00:45:27,680 --> 00:45:31,239 Speaker 4: to do that. It sucks that you cannot innocently post 837 00:45:31,239 --> 00:45:34,320 Speaker 4: a picture of yourself or your kid without this happening. 838 00:45:34,360 --> 00:45:37,319 Speaker 4: And so I will be the first person to agree that, like, 839 00:45:37,760 --> 00:45:39,640 Speaker 4: you shouldn't have to be thinking about it that way. 840 00:45:39,880 --> 00:45:43,160 Speaker 4: And when you show up online, whether you're posting selfies 841 00:45:43,360 --> 00:45:46,880 Speaker 4: or whatever, you shouldn't have to wonder if your innocent 842 00:45:46,920 --> 00:45:50,279 Speaker 4: selfie is just another data point for somebody to misuse 843 00:45:50,400 --> 00:45:53,080 Speaker 4: and create an image of yourself that you didn't consent to. 844 00:45:53,360 --> 00:45:53,640 Speaker 1: It is. 845 00:45:53,719 --> 00:45:57,239 Speaker 4: It is such a messed up dynamic that we're in. But 846 00:45:57,640 --> 00:46:00,640 Speaker 4: my worry is that, as messed up as it is, 847 00:46:00,880 --> 00:46:03,719 Speaker 4: it is quickly becoming the norm.
And 848 00:46:04,120 --> 00:46:07,120 Speaker 4: I don't want to fast forward ten years later and 849 00:46:07,239 --> 00:46:11,759 Speaker 4: have, you know, it be a rite of passage for 850 00:46:11,840 --> 00:46:15,240 Speaker 4: a high school girl to have boys in her class 851 00:46:15,600 --> 00:46:19,640 Speaker 4: circulate non consensual deep fake images of her that she 852 00:46:19,719 --> 00:46:21,520 Speaker 4: had nothing to do with. And I'm worried 853 00:46:21,560 --> 00:46:23,640 Speaker 4: that that's what's gonna happen in ten years' time. It's 854 00:46:23,680 --> 00:46:27,120 Speaker 4: just going to be another milestone. And now is the time, 855 00:46:27,120 --> 00:46:29,400 Speaker 4: actually the time to pump the brakes and examine this 856 00:46:29,440 --> 00:46:31,640 Speaker 4: was about ten years ago, but we can still do 857 00:46:31,760 --> 00:46:34,479 Speaker 4: it now and it's not too late to figure this out, 858 00:46:34,560 --> 00:46:39,000 Speaker 4: and I worry. So I just really worry that we're 859 00:46:39,000 --> 00:46:42,080 Speaker 4: not doing it, and that teenagers like Francesca are 860 00:46:42,080 --> 00:46:45,400 Speaker 4: going to have spent their teen years advocating for something 861 00:46:45,680 --> 00:46:48,800 Speaker 4: and no change will happen. We owe it to people 862 00:46:48,840 --> 00:46:52,839 Speaker 4: like Francesca to actually do something, to not have her 863 00:46:53,320 --> 00:46:56,319 Speaker 4: spend all of this energy at her young age and 864 00:46:56,400 --> 00:46:57,520 Speaker 4: have that be for nothing. 865 00:46:57,880 --> 00:47:01,480 Speaker 2: Absolutely. And again, like this impacts so much, because it 866 00:47:01,520 --> 00:47:07,480 Speaker 2: impacts all of us who are not in any way famous. 867 00:47:07,480 --> 00:47:09,720 Speaker 2: But it also, as we talked about in that episode 868 00:47:09,760 --> 00:47:12,880 Speaker 2: about journalism in India, it impacts like what gets reported, 869 00:47:13,000 --> 00:47:16,359 Speaker 2: because people get threatened. It impacts like people showing up. 870 00:47:16,640 --> 00:47:18,960 Speaker 2: It impacts people wanting to run in politics, like it is. 871 00:47:20,280 --> 00:47:22,440 Speaker 2: It's something in the back of our minds and it 872 00:47:22,480 --> 00:47:26,839 Speaker 2: shouldn't be, like that shouldn't be a thing, but it is. 873 00:47:27,840 --> 00:47:32,600 Speaker 2: But there are things that listeners can do, yes? 874 00:47:32,400 --> 00:47:34,560 Speaker 3: There are. So this is a tough issue. 875 00:47:34,560 --> 00:47:37,120 Speaker 4: It's one of those issues where I get very frustrated, 876 00:47:37,120 --> 00:47:40,000 Speaker 4: because I'm like, I am not a lawmaker. 877 00:47:40,160 --> 00:47:41,840 Speaker 3: What can I do? I'm a podcaster. 878 00:47:42,160 --> 00:47:44,279 Speaker 4: All I can do is make podcasts that are very 879 00:47:44,280 --> 00:47:47,200 Speaker 4: shouty about it, and boy do I. And so 880 00:47:47,360 --> 00:47:50,480 Speaker 4: I don't want it to be all doom and gloom, 881 00:47:50,560 --> 00:47:52,919 Speaker 4: because there are things that folks can do.
I would say, 882 00:47:52,920 --> 00:47:55,160 Speaker 4: make sure that you're following people, a lot of whom 883 00:47:55,160 --> 00:47:57,120 Speaker 4: are women and women of color, who have really been 884 00:47:57,160 --> 00:48:00,200 Speaker 4: advocating to make AI safer for a long time, like 885 00:48:00,239 --> 00:48:02,719 Speaker 4: doctor Joy Buolamwini, who we just did an interview with 886 00:48:02,760 --> 00:48:04,680 Speaker 4: on my podcast, There Are No Girls on the Internet, 887 00:48:04,880 --> 00:48:07,560 Speaker 4: who has been talking about the threats that AI poses, 888 00:48:07,600 --> 00:48:10,920 Speaker 4: specifically to marginalized people, for a really long time. Check 889 00:48:10,920 --> 00:48:14,279 Speaker 4: out her advocacy organization, the Algorithmic Justice League, and so. 890 00:48:14,480 --> 00:48:17,279 Speaker 4: And also know that there is movement on this, right, 891 00:48:17,360 --> 00:48:20,359 Speaker 4: because of the advocacy of people like doctor Joy Buolamwini. 892 00:48:20,680 --> 00:48:24,120 Speaker 4: You know, the Biden administration released that executive order putting 893 00:48:24,239 --> 00:48:26,120 Speaker 4: some guardrails. 894 00:48:25,440 --> 00:48:27,120 Speaker 3: On AI a couple of months ago. 895 00:48:27,480 --> 00:48:30,799 Speaker 4: And as I said, lawmakers like Representative Joe Morelle and 896 00:48:30,920 --> 00:48:34,640 Speaker 4: Representative Yvette Clarke have introduced legislation 897 00:48:34,680 --> 00:48:38,720 Speaker 4: that would criminalize deep fakes. And so there is some movement, 898 00:48:38,840 --> 00:48:42,120 Speaker 4: like people are trying to do stuff. But I would 899 00:48:42,160 --> 00:48:47,720 Speaker 4: also say, like, really see this issue as a real issue. 900 00:48:47,760 --> 00:48:49,799 Speaker 4: Like I think this is an issue where 901 00:48:49,840 --> 00:48:53,080 Speaker 4: it's going to take legislative solutions and policy solutions and 902 00:48:53,120 --> 00:48:55,719 Speaker 4: platform solutions, but also cultural solutions. 903 00:48:55,800 --> 00:48:58,200 Speaker 3: Right, don't normalize this. 904 00:48:58,760 --> 00:49:01,600 Speaker 4: Really, if someone you know is being targeted, or 905 00:49:02,239 --> 00:49:04,360 Speaker 4: it comes up, like if someone you're talking to 906 00:49:04,520 --> 00:49:07,279 Speaker 4: is talking about those Taylor Swift deep fakes, make sure 907 00:49:07,320 --> 00:49:09,920 Speaker 4: they understand that it is a violation, and it is 908 00:49:09,960 --> 00:49:12,200 Speaker 4: a sex crime, and that it's not 909 00:49:12,320 --> 00:49:15,759 Speaker 4: just a funny picture to be laughed about or looked 910 00:49:15,760 --> 00:49:18,600 Speaker 4: at or traded around. I think that we, in the 911 00:49:18,640 --> 00:49:21,399 Speaker 4: absence of lawmakers kind of getting it together and moving 912 00:49:21,440 --> 00:49:25,479 Speaker 4: on federal legislation, we can institute cultural change the same 913 00:49:25,520 --> 00:49:29,160 Speaker 4: way that like the Me Too movement really got people 914 00:49:29,200 --> 00:49:32,439 Speaker 4: talking about gender based violence and sexual violence, the same 915 00:49:32,440 --> 00:49:35,320 Speaker 4: way that Black Lives Matter got people thinking about racial justice. 916 00:49:35,440 --> 00:49:39,080 Speaker 3: We need to have the same kind of cultural shift.
917 00:49:38,920 --> 00:49:41,600 Speaker 4: In how we talk about these kinds of technology facilitated 918 00:49:41,640 --> 00:49:43,280 Speaker 4: harms like deep fakes. 919 00:49:45,600 --> 00:49:49,560 Speaker 2: Yeah, oh, once again, Bridget, I have so many other 920 00:49:49,640 --> 00:49:51,120 Speaker 2: things I want to talk to you about. 921 00:49:52,560 --> 00:49:53,960 Speaker 3: Well, I'll come back anytime. 922 00:49:54,320 --> 00:49:59,800 Speaker 2: Oh alas, yes, oh so many things. Our time is 923 00:50:00,160 --> 00:50:03,160 Speaker 2: up for now, though. But how can the good listeners find you? 924 00:50:03,600 --> 00:50:05,840 Speaker 4: Well, you can listen to my podcast, There Are No 925 00:50:05,920 --> 00:50:07,960 Speaker 4: Girls on the Internet. You can find me on Twitter 926 00:50:08,000 --> 00:50:10,719 Speaker 4: at Bridget Marie and on Instagram at Bridget Marie DC, 927 00:50:11,239 --> 00:50:14,240 Speaker 4: and on TikTok at Bridget Makes Pods. 928 00:50:15,160 --> 00:50:18,160 Speaker 2: Yes, and again, listeners, if you have not done those things, 929 00:50:18,239 --> 00:50:21,799 Speaker 2: please go do them. Thank you so much, Bridget. We're 930 00:50:21,840 --> 00:50:24,680 Speaker 2: still trying to hang out in real life. 931 00:50:24,520 --> 00:50:24,840 Speaker 4: And do it. 932 00:50:27,560 --> 00:50:28,880 Speaker 3: Thank you, listeners. 933 00:50:29,160 --> 00:50:32,480 Speaker 2: Yes, in the meantime, listeners, if you would like to 934 00:50:32,480 --> 00:50:35,600 Speaker 2: contact us, you can. Our email is steffaniea mom stuff 935 00:50:35,600 --> 00:50:37,839 Speaker 2: at iHeartMedia dot com. You can find us on Twitter 936 00:50:37,880 --> 00:50:40,239 Speaker 2: at mom Stuff Podcast, or on Instagram and TikTok at 937 00:50:40,239 --> 00:50:42,319 Speaker 2: Stuff Mom Never Told You. We have a TeePublic store, 938 00:50:42,360 --> 00:50:44,080 Speaker 2: and we have a book you can get wherever you 939 00:50:44,120 --> 00:50:46,880 Speaker 2: get your books. Thanks as always to our super producer 940 00:50:46,960 --> 00:50:49,719 Speaker 2: Christina, our executive producer Maya, and our contributor Joey. Thank 941 00:50:49,760 --> 00:50:51,920 Speaker 2: you, and thanks to you for listening. Stuff Mom Never 942 00:50:52,000 --> 00:50:54,000 Speaker 2: Told You is a production of iHeartRadio. For more podcasts from 943 00:50:54,040 --> 00:50:55,560 Speaker 2: iHeartRadio, you can check out the iHeartRadio 944 00:50:55,600 --> 00:50:58,120 Speaker 2: app, Apple Podcasts, or wherever you listen to your favorite shows. 945 00:51:03,320 --> 00:51:03,600 Speaker 1: Yeah