1 00:00:01,720 --> 00:00:09,119 Speaker 1: Cool Zone Media, Welcome to It Could Happen Here, a 2 00:00:09,200 --> 00:00:12,360 Speaker 1: show about things falling apart, and today the thing falling 3 00:00:12,360 --> 00:00:16,840 Speaker 1: apart is the Internet. And today we have a special 4 00:00:17,000 --> 00:00:21,639 Speaker 1: guest episode with Bridget Todd. Hello Bridget. So Garrison, it's. 5 00:00:21,600 --> 00:00:24,720 Speaker 2: Kind of funny that we are talking just a few 6 00:00:24,800 --> 00:00:28,520 Speaker 2: days after the Trump administration put out their Woke AI 7 00:00:28,800 --> 00:00:29,760 Speaker 2: Executive Order. 8 00:00:30,000 --> 00:00:32,199 Speaker 1: Yes, I have not read this yet. I have to 9 00:00:32,320 --> 00:00:35,879 Speaker 1: for next week's Executive Disorder. I'm not looking forward to it. 10 00:00:36,920 --> 00:00:40,840 Speaker 2: I like that the Cool Zone team kind of sections 11 00:00:40,920 --> 00:00:44,040 Speaker 2: off all the Trump federal nonsense so you don't have 12 00:00:44,120 --> 00:00:46,040 Speaker 2: to be mired in it all the goddamn time. 13 00:00:46,560 --> 00:00:48,800 Speaker 1: I still kind of am. I just schedule it throughout 14 00:00:48,840 --> 00:00:51,519 Speaker 1: my week. I guess there's certain days where I have 15 00:00:51,600 --> 00:00:52,040 Speaker 1: to do it. 16 00:00:52,360 --> 00:00:54,200 Speaker 3: Yeah, you gotta pepper it in. You gotta pepper it in. 17 00:00:54,240 --> 00:00:57,520 Speaker 2: Well, yeah, not to give you a spoiler for when 18 00:00:57,560 --> 00:01:01,680 Speaker 2: you dive into it yourself, but it's all nonsense. Basically, 19 00:01:01,920 --> 00:01:05,160 Speaker 2: the Trump administration is saying that right now, the biggest 20 00:01:05,319 --> 00:01:08,959 Speaker 2: threat regarding AI is it being too woke, and essentially 21 00:01:09,440 --> 00:01:13,280 Speaker 2: telling folks who make AI, tech leaders, essentially to be 22 00:01:13,360 --> 00:01:16,200 Speaker 2: more like Elon Musk and Grok and make sure that. 23 00:01:16,200 --> 00:01:17,280 Speaker 3: Your AI models. 24 00:01:17,319 --> 00:01:19,440 Speaker 2: The only AI models that we will accept in this 25 00:01:19,520 --> 00:01:23,240 Speaker 2: country are the non woke ones, ones that don't incorporate 26 00:01:23,360 --> 00:01:25,720 Speaker 2: DEI. I would love to know more about what he thinks 27 00:01:25,760 --> 00:01:27,640 Speaker 2: that means, but that's a little preview for you. 28 00:01:28,000 --> 00:01:31,080 Speaker 1: Fantastic, you know, seems like the most important issue facing 29 00:01:31,080 --> 00:01:32,000 Speaker 1: our nation right now. 30 00:01:32,640 --> 00:01:33,960 Speaker 3: Definitely, definitely. 31 00:01:34,040 --> 00:01:37,880 Speaker 2: And so it's funny that we're talking about AI because 32 00:01:38,560 --> 00:01:40,920 Speaker 2: I don't know if you're on TikTok, but there have 33 00:01:41,040 --> 00:01:47,120 Speaker 2: been these kind of shockingly racist AI generated videos all 34 00:01:47,160 --> 00:01:49,840 Speaker 2: over TikTok, to the point where I would say that 35 00:01:49,880 --> 00:01:53,760 Speaker 2: we are witnessing the revival of the minstrel show using 36 00:01:53,800 --> 00:01:56,040 Speaker 2: AI on social media. This is not a claim I 37 00:01:56,120 --> 00:01:59,000 Speaker 2: make lightly. That is how extreme some of this content is.
38 00:02:00,120 --> 00:02:02,240 Speaker 1: I'm not on TikTok, but I think I've seen some 39 00:02:02,320 --> 00:02:08,480 Speaker 1: of this content permeate across platforms, certainly on like Instagram, 40 00:02:08,560 --> 00:02:12,079 Speaker 1: Reels and even even bits of X the Everything app. 41 00:02:13,840 --> 00:02:16,239 Speaker 3: I love that you call it that. That's the full name. 42 00:02:18,120 --> 00:02:20,400 Speaker 2: So for folks who don't know, I want to ground 43 00:02:20,480 --> 00:02:22,959 Speaker 2: the conversation in what a minstrel show is. So the 44 00:02:23,000 --> 00:02:26,760 Speaker 2: minstrel show was an incredibly popular form of American theater 45 00:02:26,880 --> 00:02:30,760 Speaker 2: and entertainment in the nineteenth century, where mostly, but not all, 46 00:02:30,880 --> 00:02:34,600 Speaker 2: white performers would wear blackface makeup to make themselves 47 00:02:34,680 --> 00:02:39,120 Speaker 2: look like these exaggerated racist versions of black people and 48 00:02:39,320 --> 00:02:44,639 Speaker 2: essentially portray very racist stereotypes of black folks being lazy buffoons. 49 00:02:44,680 --> 00:02:48,799 Speaker 2: And a common trope in these skits was black people 50 00:02:49,240 --> 00:02:52,840 Speaker 2: trying and failing to gain American citizenship because at the time, 51 00:02:52,840 --> 00:02:56,040 Speaker 2: black Americans did not have full citizenship, and so a 52 00:02:56,080 --> 00:02:58,960 Speaker 2: big plotline would be like, Oh, we had to take 53 00:02:58,960 --> 00:03:01,240 Speaker 2: a test for citizenship, but we were too stupid to 54 00:03:01,240 --> 00:03:03,280 Speaker 2: figure it out, or we spaced the date and overslept 55 00:03:03,280 --> 00:03:06,880 Speaker 2: because we're very lazy. When these shows would depict black women, 56 00:03:06,960 --> 00:03:08,960 Speaker 2: we were often shown as what you might think of 57 00:03:09,000 --> 00:03:13,679 Speaker 2: as like a Sapphire caricature, which is rude, loud, malicious, stubborn, 58 00:03:13,720 --> 00:03:17,000 Speaker 2: and overbearing, kind of like the angry black woman trope 59 00:03:17,120 --> 00:03:20,240 Speaker 2: that you probably are familiar with in media today. So 60 00:03:20,720 --> 00:03:26,040 Speaker 2: these skits were incredibly popular entertainment, but they also served 61 00:03:26,400 --> 00:03:30,280 Speaker 2: the purpose of reaffirming political and social ideologies, and so 62 00:03:30,639 --> 00:03:34,160 Speaker 2: you know, the dominant way that people consumed media regarding 63 00:03:34,200 --> 00:03:37,400 Speaker 2: black people showed us as lazy, stupid, angry, loud, and 64 00:03:37,720 --> 00:03:41,760 Speaker 2: importantly not really able to conform to the dominant culture 65 00:03:41,880 --> 00:03:44,880 Speaker 2: of like mainstream, hardworking white Americans. 66 00:03:45,160 --> 00:03:47,520 Speaker 3: That is obviously an incredibly. 67 00:03:46,960 --> 00:03:51,280 Speaker 2: Powerful tool to uphold and reaffirm the idea that black 68 00:03:51,320 --> 00:03:53,960 Speaker 2: folks should not be given full citizenship, should not be 69 00:03:54,000 --> 00:03:58,080 Speaker 2: given full rights, cannot be you know, integrated into polite 70 00:03:58,080 --> 00:04:00,920 Speaker 2: white society.
And it almost kind of became this for 71 00:04:01,080 --> 00:04:05,600 Speaker 2: their own good attitude that provided like a polite justification 72 00:04:05,640 --> 00:04:08,320 Speaker 2: for things like segregation. Well like, oh, well, you know, 73 00:04:08,400 --> 00:04:10,680 Speaker 2: I've seen in minstrel shows that black folks are very 74 00:04:10,760 --> 00:04:13,440 Speaker 2: lazy and stupid, so it's for their own 75 00:04:13,520 --> 00:04:16,120 Speaker 2: good that we treat them like shit in society, do 76 00:04:16,240 --> 00:04:16,719 Speaker 2: you feel me? 77 00:04:16,920 --> 00:04:17,599 Speaker 3: Yeah, yeah, yeah. 78 00:04:17,720 --> 00:04:20,560 Speaker 1: It's a sort of like infantilization exactly. 79 00:04:21,160 --> 00:04:23,840 Speaker 2: And so even though the minstrel show did die out, 80 00:04:24,160 --> 00:04:26,240 Speaker 2: I would argue that we are kind of seeing a 81 00:04:26,320 --> 00:04:30,080 Speaker 2: little bit of a comeback using AI in the digital realm, 82 00:04:30,200 --> 00:04:32,039 Speaker 2: and just like the minstrel shows of. 83 00:04:32,120 --> 00:04:34,080 Speaker 3: Yesteryear, were used to affirm. 84 00:04:34,160 --> 00:04:37,760 Speaker 2: Political and social ideologies under the guise of just being 85 00:04:37,960 --> 00:04:41,120 Speaker 2: entertainment or just being jokes or just being funny. I 86 00:04:41,120 --> 00:04:43,520 Speaker 2: really think it's not a coincidence that we're also seeing 87 00:04:44,080 --> 00:04:47,760 Speaker 2: the rise of digital blackface, where non black creators are 88 00:04:47,880 --> 00:04:51,680 Speaker 2: using AI to create these viral racist skits that are 89 00:04:51,800 --> 00:04:55,000 Speaker 2: steeped in black stereotypes, and they're really taking off 90 00:04:55,040 --> 00:04:56,360 Speaker 2: all over social media today. 91 00:04:56,800 --> 00:04:59,440 Speaker 1: That sounds not fun to hear about, but I'm excited 92 00:04:59,480 --> 00:05:01,200 Speaker 1: for you to explain it to me. 93 00:05:02,000 --> 00:05:06,279 Speaker 2: Yes, So I will say, initially, the first iteration of 94 00:05:06,279 --> 00:05:09,600 Speaker 2: one of these videos that I saw was not really racist. 95 00:05:09,640 --> 00:05:12,320 Speaker 2: It was made by a black creator, I think, trying 96 00:05:12,400 --> 00:05:15,320 Speaker 2: to use AI to create sort of humorous skits. But 97 00:05:15,440 --> 00:05:19,000 Speaker 2: when that first video took off, people on TikTok started 98 00:05:19,080 --> 00:05:22,320 Speaker 2: using AI to create more and more extreme, more and 99 00:05:22,400 --> 00:05:26,000 Speaker 2: more racist iterations of these kinds of videos, which is 100 00:05:26,000 --> 00:05:28,840 Speaker 2: what we're seeing today. So I will play a little 101 00:05:28,880 --> 00:05:30,560 Speaker 2: snippet of an example for you. 102 00:05:30,839 --> 00:05:31,440 Speaker 3: What's up, bitch? 103 00:05:31,480 --> 00:05:33,440 Speaker 2: Is this Bigfoot one hand the baddest bitch in the woods? 104 00:05:33,440 --> 00:05:35,600 Speaker 3: Part time cryptid, full time problem. Don't follow me if 105 00:05:35,600 --> 00:05:36,800 Speaker 3: you scared a please.
106 00:05:37,640 --> 00:05:41,320 Speaker 2: So this is a TikTok that got over two million views, 107 00:05:41,480 --> 00:05:45,360 Speaker 2: and it basically uses AI to generate this black 108 00:05:45,480 --> 00:05:51,520 Speaker 2: woman stereotypical version of Bigfoot, and this account is so 109 00:05:51,720 --> 00:05:54,880 Speaker 2: popular that it has generated so many copycats, like this is 110 00:05:54,920 --> 00:05:57,680 Speaker 2: a format that has really hit with TikTok. 111 00:05:58,279 --> 00:06:00,920 Speaker 3: There also is another kind of bucket of. 112 00:06:00,880 --> 00:06:04,640 Speaker 2: These that people call slave talk, where it uses AI 113 00:06:05,000 --> 00:06:08,920 Speaker 2: to sort of reimagine enslaved people on plantations if they 114 00:06:09,240 --> 00:06:12,400 Speaker 2: had social media and were doing vlogs and so a 115 00:06:12,440 --> 00:06:15,680 Speaker 2: lot of those videos were taken down by TikTok, which 116 00:06:15,720 --> 00:06:18,800 Speaker 2: is I think good, but essentially it would reimagine these 117 00:06:18,839 --> 00:06:23,880 Speaker 2: AI generated enslaved people basically saying like, oh, well, yeah, 118 00:06:23,920 --> 00:06:25,920 Speaker 2: I do have to work out here in the cotton fields, 119 00:06:25,920 --> 00:06:27,800 Speaker 2: but at least I'm gonna get meals. At least I 120 00:06:27,839 --> 00:06:31,120 Speaker 2: have a roof over my head, essentially really affirming the 121 00:06:31,160 --> 00:06:33,719 Speaker 2: idea that, like, slavery wasn't that bad. 122 00:06:34,279 --> 00:06:36,240 Speaker 3: One of the more heinous. 123 00:06:35,920 --> 00:06:38,480 Speaker 2: Examples that I saw of these that was removed from 124 00:06:38,520 --> 00:06:43,000 Speaker 2: TikTok was a TikTok shop sponsored video that showed an 125 00:06:43,000 --> 00:06:46,320 Speaker 2: AI generated enslaved person working in the fields wearing a 126 00:06:46,360 --> 00:06:49,640 Speaker 2: solar powered hat with a fan in it, and basically 127 00:06:49,680 --> 00:06:52,200 Speaker 2: he was like, Oh, this work in the field would 128 00:06:52,240 --> 00:06:54,560 Speaker 2: be so horrible if I did not have this hat. 129 00:06:54,800 --> 00:06:56,919 Speaker 2: And then there's a little link to the TikTok shop 130 00:06:57,160 --> 00:06:59,640 Speaker 2: and you can buy the actual hat, which is. 131 00:06:59,600 --> 00:07:01,760 Speaker 3: Just some really dystopian awful shit. 132 00:07:02,279 --> 00:07:05,360 Speaker 1: No, that is like quite literally it's like evocative of 133 00:07:05,440 --> 00:07:08,720 Speaker 1: like cyberpunk tropes that people I would assume would not 134 00:07:08,800 --> 00:07:12,480 Speaker 1: want to use due to fears of insensitivity. But it's 135 00:07:12,640 --> 00:07:15,200 Speaker 1: just on your phone like as like a real thing. 136 00:07:15,720 --> 00:07:18,280 Speaker 3: Yeah, I completely agree, and I love that comparison. 137 00:07:18,640 --> 00:07:21,200 Speaker 2: And I think, like I would imagine if I were 138 00:07:21,280 --> 00:07:25,080 Speaker 2: running a TikTok shop that using the AI generated image 139 00:07:25,120 --> 00:07:27,760 Speaker 2: of an enslaved person, I would think like, oh, well, 140 00:07:27,760 --> 00:07:29,920 Speaker 2: this is certainly not something that I would use to 141 00:07:30,040 --> 00:07:32,120 Speaker 2: like sell some cheap fan hat.
142 00:07:32,600 --> 00:07:34,320 Speaker 3: But I mean, I think it is exactly what you're 143 00:07:34,360 --> 00:07:34,920 Speaker 3: saying that. 144 00:07:35,400 --> 00:07:39,680 Speaker 2: I think that, given the extreme quality of these videos, people 145 00:07:39,720 --> 00:07:42,280 Speaker 2: are just like, well, it'll get views and then I'll 146 00:07:42,280 --> 00:07:44,480 Speaker 2: get more eyeballs on my TikTok shop. 147 00:07:44,640 --> 00:07:45,720 Speaker 3: I don't think there's any kind of. 148 00:07:45,600 --> 00:07:48,880 Speaker 1: Sure. Yeah, no, it's a very gross way of 149 00:07:48,920 --> 00:07:53,800 Speaker 1: doing like outrage farming for engagement. I guess, like because 150 00:07:53,840 --> 00:07:55,400 Speaker 1: like surely they know that these are not going to 151 00:07:55,480 --> 00:07:58,400 Speaker 1: like go over easy. Like I think a part of 152 00:07:58,440 --> 00:08:01,720 Speaker 1: part of this is generating some degree of like attention 153 00:08:01,880 --> 00:08:05,800 Speaker 1: based on it being offensive or extremely gross and knowing 154 00:08:05,800 --> 00:08:08,440 Speaker 1: that people will like comment things of that nature. 155 00:08:09,440 --> 00:08:10,000 Speaker 3: Exactly. 156 00:08:10,080 --> 00:08:13,160 Speaker 2: And it's funny that you mentioned that, because the AI 157 00:08:13,840 --> 00:08:16,200 Speaker 2: component of this is sort of what makes this novel 158 00:08:16,200 --> 00:08:17,560 Speaker 2: and new. But that kind of thing has been all 159 00:08:17,600 --> 00:08:20,200 Speaker 2: over our social media for the longest time. Sure, I 160 00:08:20,280 --> 00:08:24,840 Speaker 2: remember how big stuff like skit culture was on TikTok. 161 00:08:24,920 --> 00:08:28,239 Speaker 2: And I don't mean skits like Saturday Night Live or Portlandia. 162 00:08:28,320 --> 00:08:30,440 Speaker 2: I mean skits where they are trying to get you 163 00:08:30,520 --> 00:08:33,240 Speaker 2: to think this is somebody's cell phone footage of something 164 00:08:33,280 --> 00:08:35,080 Speaker 2: that happened, but really it's like, well that. 165 00:08:35,040 --> 00:08:36,160 Speaker 3: Those are two actors. 166 00:08:36,360 --> 00:08:38,640 Speaker 2: And there was a type of these skits that would 167 00:08:38,640 --> 00:08:41,960 Speaker 2: really take off on TikTok, where it was purporting to 168 00:08:42,000 --> 00:08:45,520 Speaker 2: be oh, this is a parent who is going off 169 00:08:45,559 --> 00:08:48,840 Speaker 2: on a trans teacher for trying to indoctrinate their kid, 170 00:08:49,120 --> 00:08:50,840 Speaker 2: and all the comments would be like good for them, 171 00:08:50,920 --> 00:08:53,120 Speaker 2: good for that mom, And then the screen flips and 172 00:08:53,120 --> 00:08:54,680 Speaker 2: it's like, oh, well, the woman you were just telling 173 00:08:54,720 --> 00:08:58,000 Speaker 2: me is the trans teacher, Now she's the mom who. 174 00:08:57,840 --> 00:09:00,800 Speaker 3: was in the next video, yes, exactly. 175 00:09:01,240 --> 00:09:03,480 Speaker 1: No, I like the ones that are set on airplanes 176 00:09:03,480 --> 00:09:05,720 Speaker 1: where they all use the same airplane set, Yes, and 177 00:09:05,760 --> 00:09:10,000 Speaker 1: they get into like fake fights on airplanes using the 178 00:09:10,040 --> 00:09:12,200 Speaker 1: same like five actors playing different roles. 179 00:09:12,320 --> 00:09:13,360 Speaker 3: Yeah, and then if.
180 00:09:13,280 --> 00:09:15,959 Speaker 2: You look carefully in the background, you start thinking, well, 181 00:09:16,240 --> 00:09:20,000 Speaker 2: airplanes don't have those strip LED lights that you can 182 00:09:20,040 --> 00:09:20,960 Speaker 2: buy on Amazon. 183 00:09:21,400 --> 00:09:24,920 Speaker 3: Those are actually the TikTok lights, and the hallways are like 184 00:09:25,000 --> 00:09:25,800 Speaker 3: five feet wide? 185 00:09:26,920 --> 00:09:31,480 Speaker 2: Yeah, exactly, And listen, I am not above getting taken 186 00:09:31,520 --> 00:09:34,199 Speaker 2: in by those kinds of skits. And I guess I 187 00:09:34,760 --> 00:09:37,720 Speaker 2: don't love the idea that someone would be dedicating energy 188 00:09:37,760 --> 00:09:40,680 Speaker 2: and brain space to getting upset about a set of 189 00:09:40,679 --> 00:09:42,360 Speaker 2: circumstances that never really happened. 190 00:09:42,400 --> 00:09:45,000 Speaker 1: But it's the Internet. Come on, that's that's like, that's 191 00:09:45,000 --> 00:09:49,080 Speaker 1: half of the Internet. Yes, you know, I don't love it. 192 00:09:49,120 --> 00:09:51,000 Speaker 2: But when the stakes so like when the stakes are 193 00:09:51,080 --> 00:09:53,920 Speaker 2: low and it's just like a random fight on an airplane, fine, 194 00:09:53,960 --> 00:09:56,360 Speaker 2: when the stakes are higher and it's like, this is 195 00:09:56,360 --> 00:10:00,640 Speaker 2: a skit meant to like attack or demonize trans people, 196 00:10:00,720 --> 00:10:02,960 Speaker 2: queer people, black people, that's where I'm like, well, what 197 00:10:03,000 --> 00:10:04,040 Speaker 2: are we really doing here? 198 00:10:14,280 --> 00:10:16,199 Speaker 3: I think whether or not this. 199 00:10:16,280 --> 00:10:19,200 Speaker 2: Kind of content, like when it's AI generated, we're looking 200 00:10:19,240 --> 00:10:22,439 Speaker 2: at things that never actually happened, even though these 201 00:10:22,480 --> 00:10:26,160 Speaker 2: circumstances and these situations never really happened, they still very 202 00:10:26,240 --> 00:10:29,760 Speaker 2: much affirm the worldview of the people who are consuming it, right, 203 00:10:29,800 --> 00:10:33,240 Speaker 2: And so if you are consuming a skit involving whether 204 00:10:33,280 --> 00:10:37,520 Speaker 2: it's human actors or AI generated black people, if that 205 00:10:37,679 --> 00:10:41,720 Speaker 2: skit reaffirms your worldview that these people cannot be trusted, 206 00:10:41,920 --> 00:10:44,240 Speaker 2: these people are bad in some way, it kind of 207 00:10:44,240 --> 00:10:45,520 Speaker 2: doesn't matter if it's real. 208 00:10:45,440 --> 00:10:47,760 Speaker 3: Or not, you know what I'm saying. Yeah, yeah, totally.
209 00:10:48,240 --> 00:10:52,280 Speaker 1: That's like the concept of like hyperreality, where you're trying 210 00:10:52,320 --> 00:10:55,600 Speaker 1: to like blend the Internet's exaggerated version of reality with 211 00:10:55,720 --> 00:10:59,720 Speaker 1: our physical lived existence, and how these things start combining 212 00:10:59,800 --> 00:11:02,199 Speaker 1: into each other to create this idea of reality 213 00:11:02,240 --> 00:11:03,960 Speaker 1: in our heads that's more real than it actually is, 214 00:11:03,960 --> 00:11:05,680 Speaker 1: to the point where we take things on the screen 215 00:11:06,040 --> 00:11:08,920 Speaker 1: to be more accurately reflective of what's going on in 216 00:11:08,960 --> 00:11:11,000 Speaker 1: the world than what we actually experience in our day 217 00:11:11,040 --> 00:11:13,000 Speaker 1: to day lives. And so much of that concept is 218 00:11:13,000 --> 00:11:17,160 Speaker 1: what drives like American like reactionary politics exactly. 219 00:11:17,240 --> 00:11:20,680 Speaker 2: And when you actually go into the comments of these videos, 220 00:11:20,960 --> 00:11:24,680 Speaker 2: which in my opinion are very clearly AI generated, people 221 00:11:24,720 --> 00:11:25,920 Speaker 2: don't even comment, well, I. 222 00:11:25,960 --> 00:11:29,280 Speaker 1: mean, well, I mean that's easy for you to say, 223 00:11:30,360 --> 00:11:33,920 Speaker 1: someone who spends their time like researching what's going on 224 00:11:33,920 --> 00:11:34,679 Speaker 1: on the internet. 225 00:11:34,800 --> 00:11:37,680 Speaker 3: I'm not sure if Mema and Pop are finding. 226 00:11:37,360 --> 00:11:39,640 Speaker 1: These videos, they're gonna be like, well, this one's obviously 227 00:11:39,679 --> 00:11:40,600 Speaker 1: AI generated. 228 00:11:41,080 --> 00:11:43,480 Speaker 2: No, And that's my point is like, I don't even 229 00:11:43,559 --> 00:11:45,200 Speaker 2: think they're thinking about it that way, and I don't 230 00:11:45,240 --> 00:11:47,600 Speaker 2: think they care that it's not real. In the comments 231 00:11:47,640 --> 00:11:50,640 Speaker 2: of these videos, it'll be a video, an AI generated 232 00:11:50,720 --> 00:11:54,319 Speaker 2: video of a black woman behaving in this very stereotypical 233 00:11:54,400 --> 00:11:57,920 Speaker 2: racist way, and the comments will say they're all like that, 234 00:11:58,280 --> 00:12:01,319 Speaker 2: and it kind of misses the point of like, well, there's. 235 00:12:00,760 --> 00:12:03,320 Speaker 3: no "they" in this video because it's AI generated. This 236 00:12:03,400 --> 00:12:06,400 Speaker 3: is just a computer puppet. This isn't real. Like, yeah, 237 00:12:06,520 --> 00:12:08,199 Speaker 3: I completely agree. 238 00:12:08,440 --> 00:12:11,199 Speaker 2: But I think when you see something online, whether it's 239 00:12:11,240 --> 00:12:15,199 Speaker 2: obviously AI generated or not, if it reaffirms your worldview, 240 00:12:15,240 --> 00:12:16,240 Speaker 2: it kind of doesn't matter. 241 00:12:16,280 --> 00:12:18,959 Speaker 3: It's the same reason why when there's. 242 00:12:18,760 --> 00:12:22,480 Speaker 2: like four legged veterans in AI slop holding a sign 243 00:12:22,520 --> 00:12:26,400 Speaker 2: that says everyone forgot about me, wish me happy birthday. 244 00:12:26,320 --> 00:12:28,520 Speaker 3: Three billion likes on Facebook. 245 00:12:30,040 --> 00:12:32,000 Speaker 2: I mean, what do you think is going on there?
246 00:12:32,040 --> 00:12:33,560 Speaker 2: I find that so fascinating. 247 00:12:34,080 --> 00:12:38,640 Speaker 1: Oh, I mean, I'm not a psychologist, but I don't know. 248 00:12:38,760 --> 00:12:41,760 Speaker 1: I think it isn't just the simple reaffirming of someone's 249 00:12:41,760 --> 00:12:44,520 Speaker 1: previously held view people are very receptive to. And we 250 00:12:44,600 --> 00:12:46,360 Speaker 1: even see this with like you know, with like fake 251 00:12:46,440 --> 00:12:49,760 Speaker 1: news headlines, right, and people might point out that this 252 00:12:49,800 --> 00:12:53,200 Speaker 1: story isn't isn't actually real. And when people are confronted 253 00:12:53,360 --> 00:12:56,320 Speaker 1: with this idea of that they've been tricked by unreality, 254 00:12:56,320 --> 00:12:59,400 Speaker 1: they'll be like, no, maybe this one isn't real, but 255 00:12:59,520 --> 00:13:02,600 Speaker 1: it could be real. And that's what really matters is 256 00:13:02,640 --> 00:13:05,760 Speaker 1: that this this feels true, not that it is true, 257 00:13:06,000 --> 00:13:08,680 Speaker 1: but the fact that I feel it resonating is actually 258 00:13:08,720 --> 00:13:12,840 Speaker 1: more important than any kind of physical trueness out inside, 259 00:13:12,920 --> 00:13:16,400 Speaker 1: like the flesh world like that. That is honestly that 260 00:13:16,400 --> 00:13:19,400 Speaker 1: that matters far less than how it impacts how I 261 00:13:19,440 --> 00:13:22,160 Speaker 1: feel and how it reflects the world as I see it. 262 00:13:22,600 --> 00:13:25,040 Speaker 2: So I did an episode of my podcast There Are 263 00:13:25,080 --> 00:13:27,800 Speaker 2: No Girls on the Internet, all about the sort of weird 264 00:13:27,960 --> 00:13:33,840 Speaker 2: economy of AI generated disinformation, essentially fan fiction that came 265 00:13:33,880 --> 00:13:35,400 Speaker 2: out of the trial of Sean P. 266 00:13:35,520 --> 00:13:38,640 Speaker 3: Diddy Combs. Oh, that sounds incredibly upsetting. 267 00:13:38,880 --> 00:13:40,880 Speaker 2: It was so upsetting, and the reason I looked into 268 00:13:40,960 --> 00:13:43,160 Speaker 2: it is because I have to be honest and say, 269 00:13:43,760 --> 00:13:46,400 Speaker 2: one of these AI generated videos got me, right. It 270 00:13:46,440 --> 00:13:50,400 Speaker 2: was a video that claimed that the late musician Prince 271 00:13:51,160 --> 00:13:54,120 Speaker 2: was able to testify in Diddy's trial from beyond 272 00:13:54,120 --> 00:13:56,800 Speaker 2: the grave, and that they played a video that Prince 273 00:13:56,880 --> 00:13:59,719 Speaker 2: made warning everybody that Diddy is this bad guy. 274 00:13:59,800 --> 00:14:02,200 Speaker 3: Right. I am probably the world's biggest Prince fan. 275 00:14:02,200 --> 00:14:04,760 Speaker 2: Well, I was like, Prince always knew. Like, it got me 276 00:14:04,840 --> 00:14:05,280 Speaker 2: and it. 277 00:14:05,240 --> 00:14:07,839 Speaker 3: totally affirmed what I wanted to be true. 278 00:14:08,040 --> 00:14:08,960 Speaker 2: But it was all a lie. 279 00:14:09,559 --> 00:14:12,040 Speaker 1: It's compelling, it's trying to like, it's trying to impact 280 00:14:12,080 --> 00:14:14,760 Speaker 1: you emotionally, especially for people who who like Prince, who 281 00:14:14,760 --> 00:14:18,600 Speaker 1: who miss Prince. This could be emotionally compelling, and like, 282 00:14:18,640 --> 00:14:21,640 Speaker 1: that's that's what they're like intentionally going after.
I think 283 00:14:21,680 --> 00:14:24,080 Speaker 1: that's that's why something like that could work so well. 284 00:14:24,400 --> 00:14:25,000 Speaker 3: It got me. 285 00:14:25,320 --> 00:14:28,520 Speaker 2: And when I looked into kind of how these videos 286 00:14:28,600 --> 00:14:32,800 Speaker 2: are cranked out on YouTube, so basically any celebrity that 287 00:14:32,840 --> 00:14:36,360 Speaker 2: you can imagine there is an AI generated video on 288 00:14:36,440 --> 00:14:39,480 Speaker 2: YouTube saying that they were somehow involved in the Diddy trial. 289 00:14:39,800 --> 00:14:42,040 Speaker 2: And what's so interesting is in the comments of these 290 00:14:42,120 --> 00:14:45,440 Speaker 2: videos that are again pretty obviously AI generated or not real, 291 00:14:45,520 --> 00:14:48,200 Speaker 2: and even the description of the YouTube account will say 292 00:14:48,280 --> 00:14:50,720 Speaker 2: this is just for entertainment. Nothing here is supposed to 293 00:14:50,760 --> 00:14:54,120 Speaker 2: be true. People won't read that part. Basically, if you've 294 00:14:54,160 --> 00:14:57,320 Speaker 2: ever had a bad feeling about a celebrity, which who 295 00:14:57,360 --> 00:15:00,600 Speaker 2: hasn't, totally, see, there's a video that aligns with that 296 00:15:00,640 --> 00:15:02,240 Speaker 2: worldview that is like, well, did you know they were 297 00:15:02,280 --> 00:15:03,720 Speaker 2: involved in the Diddy freak-offs? 298 00:15:03,760 --> 00:15:04,960 Speaker 3: And everybody's like, I knew it. 299 00:15:05,320 --> 00:15:07,880 Speaker 1: That person always gave me the ick, fine, I 300 00:15:07,960 --> 00:15:10,120 Speaker 1: knew it. I was smart enough to pick it up. 301 00:15:10,160 --> 00:15:13,080 Speaker 1: Not everyone else was smart enough, but I was. And 302 00:15:13,360 --> 00:15:16,040 Speaker 1: that's that's a whole other emotional feeling that it's being 303 00:15:16,120 --> 00:15:19,560 Speaker 1: targeted by these like AI slop creators where they're trying 304 00:15:19,560 --> 00:15:23,000 Speaker 1: to, like, affirm people's like like narcissism about their 305 00:15:23,040 --> 00:15:25,640 Speaker 1: ability to judge the moral character of strangers. 306 00:15:26,080 --> 00:15:29,920 Speaker 2: That is so it, because the people, the celebrities they choose, 307 00:15:30,280 --> 00:15:32,760 Speaker 2: it's people that maybe you would have like, I have 308 00:15:32,760 --> 00:15:34,880 Speaker 2: no real reason for this, but I hate Kevin Hart 309 00:15:35,080 --> 00:15:37,880 Speaker 2: and so in the videos. Don't even ask me why. 310 00:15:38,000 --> 00:15:39,400 Speaker 2: I don't even have a real reason. I just don't 311 00:15:39,440 --> 00:15:41,760 Speaker 2: like him. Well, he is short, he is short. There 312 00:15:41,840 --> 00:15:44,480 Speaker 2: you go love to my short kings. One of the 313 00:15:44,480 --> 00:15:46,160 Speaker 2: reasons I don't like him. This is just me speculating, 314 00:15:46,400 --> 00:15:48,440 Speaker 2: like, he just does a lot of ads and you 315 00:15:48,480 --> 00:15:51,600 Speaker 2: can't get on social media without his cryptocurrency ad, his 316 00:15:51,720 --> 00:15:52,480 Speaker 2: DraftKings ad. 317 00:15:52,520 --> 00:15:55,000 Speaker 3: I just like hate seeing it short Yeah. 318 00:15:55,040 --> 00:15:57,720 Speaker 2: In the AI generated video claiming that he was mixed 319 00:15:57,800 --> 00:16:00,800 Speaker 2: up in the Diddy trial, every comment is like, I knew it.
320 00:16:00,880 --> 00:16:04,520 Speaker 2: I always hated him, And that's affirming people like feeling 321 00:16:04,600 --> 00:16:06,960 Speaker 2: like they knew something that other people didn't see, and 322 00:16:06,960 --> 00:16:07,960 Speaker 2: they knew it early on. 323 00:16:08,440 --> 00:16:10,560 Speaker 1: Well, And I think what's something that's similar to this 324 00:16:10,600 --> 00:16:13,600 Speaker 1: that's happening right now is there's a massive media campaign 325 00:16:13,760 --> 00:16:18,280 Speaker 1: right now against Pedro Pascal with with AI generated videos 326 00:16:18,280 --> 00:16:21,600 Speaker 1: of him like touching his female co stars, and these 327 00:16:21,720 --> 00:16:25,160 Speaker 1: these videos have been have been digitally altered, and it's 328 00:16:25,200 --> 00:16:28,240 Speaker 1: in service of this this big harassment campaign against someone 329 00:16:28,240 --> 00:16:32,360 Speaker 1: who's like very vocally pro trans rights, there's other possible reasons 330 00:16:32,360 --> 00:16:36,160 Speaker 1: for why he's he's being targeted by these videos, but no, Similarly, 331 00:16:36,200 --> 00:16:39,120 Speaker 1: it's trying to create this like ick around Pedro Pascal 332 00:16:39,600 --> 00:16:42,080 Speaker 1: using AI altered media, and it's it's gaining a lot 333 00:16:42,120 --> 00:16:44,720 Speaker 1: of traction right now, and it's something that people need 334 00:16:44,760 --> 00:16:47,360 Speaker 1: to be like very very cautious of. But yeah, it's 335 00:16:47,360 --> 00:16:49,720 Speaker 1: trying to affirm whatever. Maybe you, for some reason have 336 00:16:49,800 --> 00:16:52,520 Speaker 1: never liked Pedro Pascal. I can't imagine why. But if 337 00:16:52,520 --> 00:16:54,840 Speaker 1: you find a video like this talking about how how 338 00:16:54,840 --> 00:16:59,640 Speaker 1: he's using a social anxiety diagnosis to inappropriately touch his. 339 00:16:59,560 --> 00:17:01,880 Speaker 3: co stars, like I knew it. I knew it. 340 00:17:01,960 --> 00:17:05,240 Speaker 1: I never trusted Pedro Pascal and I don't like it. 341 00:17:05,320 --> 00:17:07,359 Speaker 1: He's pro trans rights, and you're like, there you go. 342 00:17:07,480 --> 00:17:10,080 Speaker 1: They've completely got you. They've been able to like automate 343 00:17:10,119 --> 00:17:14,640 Speaker 1: and monetize internet hate campaigns against people that you don't know. 344 00:17:15,280 --> 00:17:19,040 Speaker 2: Garrison, literally, right before you and I got on this episode, 345 00:17:19,320 --> 00:17:21,400 Speaker 2: I saw a video on Reddit and it's a it's 346 00:17:21,440 --> 00:17:24,040 Speaker 2: a scene from an episode of Always Sunny where one 347 00:17:24,040 --> 00:17:27,879 Speaker 2: of the guys is like essentially lifting the female 348 00:17:28,000 --> 00:17:31,320 Speaker 2: lead up by her crotch, and the caption was Pedro 349 00:17:31,440 --> 00:17:34,080 Speaker 2: Pascal when he feels anxiety next to 350 00:17:34,040 --> 00:17:36,000 Speaker 2: a female co star. And I remember thinking, like, this is 351 00:17:36,040 --> 00:17:38,639 Speaker 2: such a weird fucking video. But what corner of the 352 00:17:38,640 --> 00:17:40,200 Speaker 2: Internet have I wandered into? 353 00:17:40,400 --> 00:17:43,000 Speaker 3: But I didn't. I did not know that there.
354 00:17:42,840 --> 00:17:45,160 Speaker 2: are forces trying to make me get the ick about 355 00:17:45,160 --> 00:17:48,760 Speaker 2: Pedro Pascal. Coincidentally, he is someone who speaks up for 356 00:17:49,040 --> 00:17:51,600 Speaker 2: LGBTQ rights, you know, progressive causes. 357 00:17:52,200 --> 00:17:52,880 Speaker 3: Of course. 358 00:17:53,560 --> 00:17:55,240 Speaker 1: Yeah, No, it's it's it's a it's a it's a 359 00:17:55,320 --> 00:17:56,920 Speaker 1: huge thing sweeping the internet right now. 360 00:17:57,240 --> 00:18:00,080 Speaker 2: And I think it really goes to show how so 361 00:18:00,560 --> 00:18:04,080 Speaker 2: kind of easily we can be manipulated using digital content, 362 00:18:04,080 --> 00:18:06,679 Speaker 2: whether it's AI generated or AI manipulated or not. Like 363 00:18:07,240 --> 00:18:11,960 Speaker 2: our understandings of the sort of general temperature of what's 364 00:18:12,000 --> 00:18:14,760 Speaker 2: going on are so so much more tenuous than we think, 365 00:18:14,800 --> 00:18:17,280 Speaker 2: and so much more easily manipulated than we realize. 366 00:18:17,480 --> 00:18:20,480 Speaker 1: No, absolutely, no one is immune to propaganda. That is 367 00:18:20,520 --> 00:18:21,919 Speaker 1: a great way of putting it. 368 00:18:32,640 --> 00:18:36,400 Speaker 2: I'm happy that you used the word propaganda, because that's 369 00:18:36,440 --> 00:18:41,320 Speaker 2: what I really do think these AI generated, essentially minstrel 370 00:18:41,359 --> 00:18:45,240 Speaker 2: show videos are. I think it's not a surprise that 371 00:18:45,280 --> 00:18:48,439 Speaker 2: we are seeing them the same way that back in 372 00:18:48,480 --> 00:18:51,480 Speaker 2: the day, minstrel shows were very popular at a time 373 00:18:51,520 --> 00:18:55,439 Speaker 2: when there was an active campaign of attacking black folks 374 00:18:55,440 --> 00:18:57,439 Speaker 2: and saying they weren't smart enough and did not deserve 375 00:18:57,480 --> 00:19:00,240 Speaker 2: full citizenship, did not deserve rights, all of that. I 376 00:19:00,240 --> 00:19:03,400 Speaker 2: think we're basically seeing the same thing today. I think 377 00:19:03,440 --> 00:19:06,159 Speaker 2: the rise of popularity of this kind of content is 378 00:19:06,560 --> 00:19:10,120 Speaker 2: against the backdrop of a very real attack on marginalized 379 00:19:10,119 --> 00:19:13,000 Speaker 2: people from this administration. You know, there was just this 380 00:19:13,320 --> 00:19:17,520 Speaker 2: piece in ProPublica about how Trump and Musk, their 381 00:19:17,640 --> 00:19:22,200 Speaker 2: DOGE stuff really was an attack on black women, specifically, 382 00:19:22,240 --> 00:19:25,199 Speaker 2: like black women with stable federal jobs totally, and that 383 00:19:25,400 --> 00:19:27,639 Speaker 2: these attacks essentially it was like you were able to 384 00:19:27,680 --> 00:19:31,239 Speaker 2: smear black women career civil servants as you know, they 385 00:19:31,280 --> 00:19:34,639 Speaker 2: were DEI hires, they were undeserving of these jobs, they 386 00:19:34,680 --> 00:19:38,240 Speaker 2: really just deserved to be fired. And you know, really 387 00:19:38,680 --> 00:19:42,119 Speaker 2: black women just became these easy targets for an administration 388 00:19:42,320 --> 00:19:44,879 Speaker 2: hostile to marginalized people.
So if we have all of 389 00:19:44,880 --> 00:19:48,240 Speaker 2: that happening against the rise of this form of digital 390 00:19:48,280 --> 00:19:51,800 Speaker 2: media that is using AI to reaffirm these stereotypes about 391 00:19:51,840 --> 00:19:54,280 Speaker 2: black women that we aren't able to behave ourselves in 392 00:19:54,280 --> 00:19:57,200 Speaker 2: polite society, cannot figure out a way to solve conflicts 393 00:19:57,200 --> 00:20:00,119 Speaker 2: without resorting to violence, are loud and obnoxious. Then when 394 00:20:00,160 --> 00:20:03,399 Speaker 2: you hear about real life human black women getting pushed 395 00:20:03,400 --> 00:20:06,119 Speaker 2: out of their employment or attacked by this administration, you 396 00:20:06,200 --> 00:20:08,720 Speaker 2: might think, well, maybe it's for the best because they're 397 00:20:08,720 --> 00:20:10,960 Speaker 2: not suited for that work anyway, because the kind of 398 00:20:11,000 --> 00:20:13,600 Speaker 2: content that I have been consuming on TikTok, and I 399 00:20:13,640 --> 00:20:17,080 Speaker 2: think it just reaffirms this worldview that real 400 00:20:17,320 --> 00:20:21,360 Speaker 2: life human black folks are not self actualized human beings. 401 00:20:21,400 --> 00:20:24,679 Speaker 2: We're just a collection of tropes and stereotypes and caricatures. 402 00:20:25,560 --> 00:20:29,480 Speaker 3: I don't know what to say there, but I agree, yes, And. 403 00:20:29,520 --> 00:20:34,320 Speaker 2: I do think there's a kind of platform accountability question in. 404 00:20:34,320 --> 00:20:37,320 Speaker 3: All this because oh, most certainly. Yeah, Like, the. 405 00:20:37,280 --> 00:20:40,440 Speaker 2: reason why we're seeing the rise of these videos is 406 00:20:40,480 --> 00:20:43,840 Speaker 2: because of the recent introduction of Google's Veo 3 creator. 407 00:20:44,000 --> 00:20:44,800 Speaker 3: It came out about a. 408 00:20:44,760 --> 00:20:48,399 Speaker 2: month ago and it's Google's latest AI video generation model, 409 00:20:48,640 --> 00:20:51,920 Speaker 2: and essentially it's designed to create these realistic looking videos 410 00:20:51,920 --> 00:20:54,400 Speaker 2: from text prompts. And the thing that kind of makes 411 00:20:54,400 --> 00:20:57,200 Speaker 2: it a step above is that you can incorporate things 412 00:20:57,280 --> 00:21:01,040 Speaker 2: like synchronized audio, dialogue, sound effects, music. It has 413 00:21:01,160 --> 00:21:04,399 Speaker 2: really taken off with creators online who are using this 414 00:21:04,480 --> 00:21:06,040 Speaker 2: tool to create everything from. 415 00:21:06,040 --> 00:21:09,600 Speaker 3: These AI skits to AI influencers to AI muk. 416 00:21:09,440 --> 00:21:11,800 Speaker 2: bangs, you know, where people eat tons and tons of food. 417 00:21:12,000 --> 00:21:15,040 Speaker 3: Oh, this is so upsetting it is. 418 00:21:15,200 --> 00:21:18,120 Speaker 2: And then like another kind of offshoot of this is 419 00:21:18,400 --> 00:21:22,119 Speaker 2: you have people who use Veo 3 to make content 420 00:21:22,280 --> 00:21:24,639 Speaker 2: like this and they get tons of views and then 421 00:21:24,680 --> 00:21:26,919 Speaker 2: they're like, oh, if you want to learn how to 422 00:21:27,040 --> 00:21:30,040 Speaker 2: make this yourself, pay me and I'll teach you how 423 00:21:30,080 --> 00:21:32,439 Speaker 2: to do it too.
So it's like there's always a 424 00:21:32,480 --> 00:21:35,040 Speaker 2: weird like MLM grift in there somewhere. 425 00:21:35,359 --> 00:21:38,680 Speaker 1: That is the content creator classic, like a mid tier influencer 426 00:21:39,160 --> 00:21:41,240 Speaker 1: who's not like that good at what they do, but 427 00:21:41,440 --> 00:21:44,439 Speaker 1: is able to supplement their income by offering courses to 428 00:21:44,520 --> 00:21:49,160 Speaker 1: people to teach them how to make similarly subpar content. 429 00:21:49,640 --> 00:21:52,280 Speaker 1: And it's interesting that we've reached the full AI automation 430 00:21:52,400 --> 00:21:53,920 Speaker 1: aspect of this, right? This used to be a 431 00:21:53,960 --> 00:21:56,960 Speaker 1: big thing among like YouTubers. I was not aware that 432 00:21:57,000 --> 00:22:00,800 Speaker 1: this is now a thing among like AI TikTok influencers, 433 00:22:00,840 --> 00:22:03,240 Speaker 1: But that makes sense because this is like the easiest 434 00:22:03,240 --> 00:22:05,320 Speaker 1: thing to automate, So of course there's going to be 435 00:22:05,320 --> 00:22:08,160 Speaker 1: like an influx of people trying to make a quick 436 00:22:08,200 --> 00:22:10,640 Speaker 1: buck on racist AI slop. 437 00:22:11,200 --> 00:22:14,560 Speaker 2: It makes me so sad, and I do think, I 438 00:22:14,600 --> 00:22:19,440 Speaker 2: mean when I guess I would be curious how Google 439 00:22:19,600 --> 00:22:21,960 Speaker 2: feels about the fact that like this is what their. 440 00:22:22,240 --> 00:22:23,439 Speaker 3: Tool is being used for. 441 00:22:23,680 --> 00:22:27,040 Speaker 2: Right, I wonder like if leaders have a sense that 442 00:22:27,119 --> 00:22:29,200 Speaker 2: this is harmful, not just harmful to black women like 443 00:22:29,280 --> 00:22:31,119 Speaker 2: me who are depicted in this kind of content, but 444 00:22:31,240 --> 00:22:33,679 Speaker 2: harmful for the Internet as a whole. It makes the 445 00:22:33,760 --> 00:22:37,920 Speaker 2: Internet experience worse for everybody. And I guess, I guess 446 00:22:37,920 --> 00:22:41,240 Speaker 2: I would imagine that like Google probably doesn't care that 447 00:22:41,320 --> 00:22:43,800 Speaker 2: this is what their their technology is being used for. 448 00:22:43,840 --> 00:22:45,680 Speaker 2: Like if I had a direct line to Sundar Pichai, 449 00:22:45,760 --> 00:22:47,679 Speaker 2: the head of Google, I would show him these clips 450 00:22:47,680 --> 00:22:49,480 Speaker 2: and say, like, is this what you had in mind 451 00:22:49,720 --> 00:22:52,439 Speaker 2: for Veo 3, or is this a misuse of this 452 00:22:52,480 --> 00:22:54,200 Speaker 2: tool that you just put out and unleashed on all 453 00:22:54,200 --> 00:22:54,719 Speaker 2: of us? 454 00:22:55,040 --> 00:22:57,840 Speaker 1: Yeah, And are you going to dedicate some like millions 455 00:22:57,880 --> 00:23:01,919 Speaker 1: of dollars of research into stopping this from happening? No, 456 00:23:02,320 --> 00:23:04,640 Speaker 1: of course not, Like they're not going to build comprehensive 457 00:23:04,680 --> 00:23:07,359 Speaker 1: tools that prevent platform abuse like this, Like that's not 458 00:23:07,359 --> 00:23:09,840 Speaker 1: going to happen as long as people are using it, 459 00:23:10,280 --> 00:23:12,240 Speaker 1: and then people are hearing about it and it's spreading, 460 00:23:12,280 --> 00:23:14,600 Speaker 1: Like that's that's what they want.
If there happens to 461 00:23:14,640 --> 00:23:17,000 Speaker 1: be offensive use cases of it. If anything, That's good 462 00:23:17,040 --> 00:23:18,880 Speaker 1: because that drives engagement. It gets people to know about 463 00:23:18,920 --> 00:23:19,280 Speaker 1: the product. 464 00:23:19,600 --> 00:23:21,800 Speaker 2: And I think that's another one of the reasons why 465 00:23:22,359 --> 00:23:26,200 Speaker 2: Trump's, you know, executive orders on AI. 466 00:23:26,080 --> 00:23:28,160 Speaker 3: That we saw earlier. 467 00:23:28,200 --> 00:23:30,320 Speaker 2: I mean, like, I will be the first person to 468 00:23:30,359 --> 00:23:33,720 Speaker 2: admit that we have very deep problems when it comes 469 00:23:33,720 --> 00:23:37,600 Speaker 2: to AI. Anybody who listens to Better Offline knows this, Like, 470 00:23:37,920 --> 00:23:38,880 Speaker 2: this is not a secret. 471 00:23:39,200 --> 00:23:40,479 Speaker 3: AI is often biased. 472 00:23:40,520 --> 00:23:43,320 Speaker 2: AI is often wrong because it is trained on us 473 00:23:43,440 --> 00:23:45,720 Speaker 2: humans, the biased little fucks that we are, right, and 474 00:23:45,760 --> 00:23:48,679 Speaker 2: so that shouldn't be a surprise to anybody. I also 475 00:23:49,240 --> 00:23:51,040 Speaker 2: will say, like, some of the solutions of how we 476 00:23:51,119 --> 00:23:54,000 Speaker 2: fix that are complex and not super simple. 477 00:23:54,160 --> 00:23:56,520 Speaker 3: But with Trump's executive order, he basically is. 478 00:23:56,560 --> 00:24:00,560 Speaker 2: Signing an order saying all AI must be objective, it 479 00:24:00,600 --> 00:24:03,360 Speaker 2: must adhere to the objective truth. 480 00:24:03,200 --> 00:24:05,480 Speaker 3: Of the United States. And it's like, well, who determines that? 481 00:24:05,680 --> 00:24:09,920 Speaker 1: Who who determines the objective truth of the United States? 482 00:24:10,200 --> 00:24:10,920 Speaker 3: The President? 483 00:24:11,320 --> 00:24:14,439 Speaker 2: I mean, if you ask Trump, yes, him, And I 484 00:24:14,440 --> 00:24:17,360 Speaker 2: guess that's the thing that pisses me off is that 485 00:24:17,480 --> 00:24:21,760 Speaker 2: there actually are complex issues and problems when it comes 486 00:24:21,800 --> 00:24:25,640 Speaker 2: to AI. But this executive order just is like, oh, 487 00:24:25,720 --> 00:24:27,760 Speaker 2: the problem is that it's woke. The solution is 488 00:24:27,840 --> 00:24:30,639 Speaker 2: me signing an executive order saying no woke in AI, 489 00:24:30,840 --> 00:24:34,119 Speaker 2: and rather than getting any kind of actual solution or 490 00:24:34,160 --> 00:24:36,800 Speaker 2: having the conversation, we just get fucking nonsense. 491 00:24:37,760 --> 00:24:40,560 Speaker 1: You know, it is worrying on multiple levels, including the 492 00:24:40,640 --> 00:24:44,400 Speaker 1: fact that the president thinks he's the arbiter of objective 493 00:24:44,560 --> 00:24:47,560 Speaker 1: truth and thinks he can legislate that, or thinks 494 00:24:47,600 --> 00:24:51,840 Speaker 1: he can executive order that into being by either you know, 495 00:24:51,920 --> 00:24:55,879 Speaker 1: benefiting or punishing tech companies who follow his policies. 496 00:24:56,200 --> 00:24:58,800 Speaker 2: Yeah, I mean a spoiler alert for that executive order. 497 00:24:58,880 --> 00:25:02,360 Speaker 2: That's exactly what he's saying.
And you know you used 498 00:25:02,359 --> 00:25:05,960 Speaker 2: the word propaganda earlier, and that really is if there 499 00:25:06,040 --> 00:25:08,040 Speaker 2: was like a thesis statement of what I wanted to 500 00:25:08,040 --> 00:25:10,760 Speaker 2: say in this episode, is that that is exactly what I. 501 00:25:10,680 --> 00:25:11,560 Speaker 3: Think is going on here. 502 00:25:11,640 --> 00:25:14,360 Speaker 2: It really does remind me of minstrel shows because even 503 00:25:14,400 --> 00:25:17,119 Speaker 2: though minstrel shows back in the nineteenth century were this 504 00:25:17,320 --> 00:25:21,600 Speaker 2: popular form of entertainment, it also was an entire manufacturing 505 00:25:21,720 --> 00:25:25,480 Speaker 2: enterprise where people made very good money selling racist blackface 506 00:25:25,520 --> 00:25:28,639 Speaker 2: figurines as novelties and all of that. David Pilgrim, the 507 00:25:28,640 --> 00:25:31,280 Speaker 2: founder of the Jim Crow Museum of Racist Memorabilia at 508 00:25:31,280 --> 00:25:34,000 Speaker 2: Ferris State University in Michigan, put it like this: They 509 00:25:34,040 --> 00:25:38,120 Speaker 2: were everyday objects which portrayed black people as ugly, different, 510 00:25:38,240 --> 00:25:41,240 Speaker 2: and fun to laugh at. They were, in a word, propaganda. 511 00:25:41,560 --> 00:25:44,000 Speaker 2: And I think that's exactly what's going on here, Like 512 00:25:44,040 --> 00:25:46,320 Speaker 2: people like to think about racism as if it's just 513 00:25:46,400 --> 00:25:48,440 Speaker 2: this thing that hangs in the air, as opposed to 514 00:25:48,480 --> 00:25:52,800 Speaker 2: a system that specific people are personally and intentionally perpetuating 515 00:25:52,920 --> 00:25:54,199 Speaker 2: because they are cashing in on it. 516 00:25:54,240 --> 00:25:55,760 Speaker 3: I don't see how Google letting. 517 00:25:55,560 --> 00:25:58,760 Speaker 2: creators use their tools to create content like this is 518 00:25:58,800 --> 00:26:01,760 Speaker 2: any different, Like, yeah, it's that is exactly what's going 519 00:26:01,840 --> 00:26:02,240 Speaker 2: on in my. 520 00:26:02,160 --> 00:26:05,080 Speaker 1: book, that's flatly like that's just like one to one, 521 00:26:05,320 --> 00:26:09,720 Speaker 1: Like you're using tech to create like unreal depictions of 522 00:26:09,840 --> 00:26:14,800 Speaker 1: racist caricatures, to please audiences, to reaffirm their own their 523 00:26:14,800 --> 00:26:18,440 Speaker 1: own biases, to reaffirm their own racism, and you're monetizing 524 00:26:18,440 --> 00:26:21,560 Speaker 1: it and you're automating it to create hashtag viral moments. 525 00:26:21,600 --> 00:26:26,200 Speaker 1: Like it's it's the most explicit and like gross blatant 526 00:26:26,320 --> 00:26:28,720 Speaker 1: form of this that I've like seen, Like I think 527 00:26:28,800 --> 00:26:31,000 Speaker 1: Robert a few years ago reported on people using AI 528 00:26:31,119 --> 00:26:33,680 Speaker 1: to like make like you know, like true crime videos 529 00:26:33,720 --> 00:26:37,520 Speaker 1: of like like animating like victims of crimes or 530 00:26:37,560 --> 00:26:40,000 Speaker 1: like like murder victims and talking about how they were 531 00:26:40,040 --> 00:26:43,960 Speaker 1: killed or something, which is very gross and very very disgusting.
532 00:26:44,560 --> 00:26:47,440 Speaker 1: But this sort of like organized like like racist video 533 00:26:47,640 --> 00:26:50,399 Speaker 1: propaganda stuff can lead to a lot more like actual, 534 00:26:50,440 --> 00:26:52,480 Speaker 1: like real world damage. 535 00:26:52,760 --> 00:26:55,560 Speaker 2: I completely agree. I mean those true crime videos, I 536 00:26:55,600 --> 00:26:59,520 Speaker 2: remember that. Imagine if your kid was murdered and then no. 537 00:26:59,520 --> 00:27:00,000 Speaker 3: It's so gross. 538 00:27:00,000 --> 00:27:01,919 Speaker 2: Twenty years later someone is like, oh, I've made an 539 00:27:01,920 --> 00:27:05,080 Speaker 2: AI depiction of your murdered child telling their story. 540 00:27:05,840 --> 00:27:09,359 Speaker 1: No, yeah, it's it's evil. But I think the damage 541 00:27:09,400 --> 00:27:12,080 Speaker 1: that can do is kind of limited. The damage 542 00:27:12,160 --> 00:27:16,040 Speaker 1: that this whole altered reality where racism can get affirmed 543 00:27:16,600 --> 00:27:19,200 Speaker 1: leads to I think a lot more actual, like, political 544 00:27:19,200 --> 00:27:20,400 Speaker 1: and personal consequences. 545 00:27:20,640 --> 00:27:21,800 Speaker 3: Completely agree. 546 00:27:21,880 --> 00:27:24,399 Speaker 2: And I also think just taking a step back in 547 00:27:24,480 --> 00:27:28,119 Speaker 2: the conversation about AI, we're all being told how the 548 00:27:28,160 --> 00:27:31,080 Speaker 2: proliferation of AI is going to be the lynchpin of 549 00:27:31,080 --> 00:27:33,520 Speaker 2: our economy. It's so important, it's going to change everything, 550 00:27:33,640 --> 00:27:35,359 Speaker 2: and then you actually look at some of these use 551 00:27:35,400 --> 00:27:36,919 Speaker 2: cases that are taking off, and it's like, well, was 552 00:27:36,920 --> 00:27:39,880 Speaker 2: this really worth all the fucking climate degradation to make 553 00:27:39,920 --> 00:27:43,720 Speaker 2: this racist AI version of a Bigfoot that looks like 554 00:27:43,720 --> 00:27:44,359 Speaker 2: a black woman. 555 00:27:44,840 --> 00:27:47,639 Speaker 1: No more rainforest, but at least we get racist bigfoot. 556 00:27:47,840 --> 00:27:50,639 Speaker 1: So, oh my god. Well, I think 557 00:27:50,680 --> 00:27:52,760 Speaker 1: that's a good place to end. Thank you so much 558 00:27:52,800 --> 00:27:55,320 Speaker 1: for letting me rant at you about this. I really 559 00:27:55,400 --> 00:27:56,080 Speaker 1: appreciate it. 560 00:27:56,560 --> 00:27:58,879 Speaker 3: Where else can people find your work, Bridget? Well, 561 00:27:58,920 --> 00:28:01,080 Speaker 2: You can listen to my podcast There Are No Girls 562 00:28:01,080 --> 00:28:03,040 Speaker 2: on the Internet. You can listen to my other podcast 563 00:28:03,040 --> 00:28:06,280 Speaker 2: with Mozilla Foundation about ethics in AI called IRL, and. 564 00:28:06,280 --> 00:28:09,959 Speaker 3: You can find me on Instagram at Bridget Marie in DC. Fantastic, 565 00:28:10,640 --> 00:28:17,040 Speaker 3: Oh the Internet. It Could Happen Here is a production 566 00:28:17,119 --> 00:28:18,080 Speaker 3: of Cool Zone Media. 567 00:28:18,280 --> 00:28:21,280 Speaker 1: For more podcasts from Cool Zone Media, visit our website 568 00:28:21,400 --> 00:28:24,960 Speaker 1: coolzonemedia dot com, or check us out on the iHeartRadio app, 569 00:28:25,040 --> 00:28:28,600 Speaker 1: Apple Podcasts, or wherever you listen to podcasts.
You can 570 00:28:28,640 --> 00:28:31,000 Speaker 1: now find sources for It Could Happen Here listed directly 571 00:28:31,000 --> 00:28:32,160 Speaker 1: in episode descriptions. 572 00:28:32,480 --> 00:28:33,280 Speaker 3: Thanks for listening.