Speaker 1: Hey, this is Annie and Samantha, and welcome to Stuff Mom Never Told You, a production of iHeartRadio.

Speaker 2: And welcome to another edition of Monday Mini. Quick content warning: we are talking about pornography, sexual harassment, stalking, and exploitation, all of the gross stuff. I'm not necessarily going to get into specifics. I am going to talk about certain incidents and what's happening around the world.

Speaker 3: So yeah, there you go.

Speaker 2: And in the world of "this is why we can't have anything nice"...

Speaker 3: We're talking about AI again.

Speaker 2: So we've had several episodes about the world of AI and how it's recently grown rapidly, with all the talk about the advantages and disadvantages, including some of the backlash we talked about with Bridget, who also talks a lot about it on her show, There Are No Girls on the Internet. And yeah, just overall, the usage of AI has grown, along with that continued backlash and concern. So recently in South Korea (I know, I think my feed has a lot of South Korean information, but hey), a major incident occurred where it was discovered that hundreds of men and boys were using AI to create deepfakes in order to harass or blackmail women. So here's a bit more from the BBC: Authorities, journalists, and social media users recently identified a large number of chat groups where members were creating and sharing sexually explicit deepfake images, including some of underage girls. Deepfakes are generated using artificial intelligence and often combine the face of a real person with a fake body.

Speaker 3: And it continues.

Speaker 2: The spate of chat groups, linked to individual schools and universities across the country, were discovered on the social media app Telegram over the past week. Users, mainly teenage students, would upload photos of people they knew, both classmates and teachers, and other users would then turn them into sexually explicit deepfake images.
Speaker 2: The discoveries followed the arrest of the Russian-born founder of Telegram, Pavel Durov, on Saturday, after it was alleged that child pornography, drug trafficking, and fraud were taking place on the encrypted messaging app. And this is from The Guardian, with a little more detail: Police will, quote, "aggressively pursue" people who make and spread the material in a seven-month campaign due to start on Wednesday, the Yonhap news agency said, with a focus on those who exploit children and teenagers. After a long struggle to stamp out molka, secretly filmed material of a sexual nature, South Korea is now battling a wave of deepfake images and deepfake videos targeting unspecified individuals, which have been rapidly spreading through social media, President Yoon Suk Yeol told a cabinet meeting, according to his office. Many victims are minors, and most perpetrators have also been identified as teenagers. He called on authorities to, quote, "thoroughly investigate and address these digital sex crimes to eradicate them." According to the country's police agency, two hundred and ninety-seven cases of deepfake crimes of a sexual nature were reported in the first seven months of the year, up from the one hundred and eighty last year and nearly double the number in twenty twenty-one, when data first began to be collated. Of the one hundred and seventy-eight people charged, one hundred and thirteen were teenagers. But the problem is believed to be more serious than the official figures suggest. One Telegram chat room has attracted about two hundred and twenty thousand members, who create and share deepfake images by doctoring photographs of women and girls. South Korean media said the victims include university students, teachers, and military personnel. And one analysis by South Korea's Hankyoreh newspaper highlighted Telegram channels that were being used to share deepfakes of female students, both high school and middle school students.
Speaker 2: The Korean Teachers and Education Workers' Union said it had learned of sexual deepfakes involving school students and had asked the Education Ministry to investigate. The investigation into sexually explicit deepfake images is expected to inflict further damage on Telegram's reputation in South Korea, where the app was used to operate an online sexual blackmail ring. Which was a whole different conversation that was super gross and super disgusting, with hardly anybody being punished. So while this incident just recently came to light, it isn't anything new, and it sounded like they had been investigating it. It took them a very long time to get to this point. In fact, South Korea is saying that incidents like this have created a need to declare a national emergency. It hasn't yet, though. In South Korea alone, these deepfake sex crimes, as they are called, have only increased. It's interesting because it says teenagers were responsible for a major amount of these offenses, so there's a lot to be said. They have talked about how there were men using their wives' pictures as well, so there was enough of that happening that they found it, which is odd.

Speaker 3: I'm like, I've just been recently hearing...

Speaker 2: More and more stories of husbands doing this to their own wives, but then also taking videos, explicit videos, after...

Speaker 3: Their wives are asleep.

Speaker 2: I don't know why this is popping up in all my stuff, which is really concerning, and I'm wondering how often this happens and it's just being talked about now and being found out about. And again, though this incident is specifically about South Korea, it's a bigger issue. We know this all around the world. In fact, as I was researching, I was trying to get some statistics, and the first site that popped up was a site completely dedicated to deepfake celebrity porn videos.
Speaker 2: That was my first result, and it was advertising like a Pornhub, saying, look, we have all of these, come and find your video, come and find the celebrity you want to watch, and also how to find deepfakes online, like showing all the taglines and where to find these taglines. So those were the first things that popped up. So yeah, I've messed up my algorithm for y'all.

Speaker 3: I'm just kidding. Anyway, please, if you're listening to this, police people, I'm not trying to find it. I was researching. I'm only researching. For those surveilling me, you know, you would have known what I'm talking about.

Speaker 2: Sorry. There's so much that we need to talk about. For example, there are incidents that happened here in the US.

Speaker 3: I just haven't seen it.

Speaker 2: It hasn't popped up on my feed, which is really unfortunate. Including a student creating deepfake porn of other students at a New Jersey high school, and another incident in a Beverly Hills middle school. And I am very, very disturbed by both of these things, obviously, but also by the fact that we're not talking about it, the fact that there's not really much being done or handled here.

Speaker 4: Uh.

Speaker 2: And by the way, as I was writing this one, the word deepfake kept getting caught by my spell check, and I'm having to add it to my dictionary, and I'm like, wow, Google's not even up on this, even though they have a whole AI company that's not doing well. Apparently, the Beverly Hills Unified School District says it's working with the Beverly Hills Police Department in investigating this incident, and those responsible will face disciplinary action. A mother of one of the students at the school said that many girls were targeted by the deepfake pornographic images, and she goes on to say that though her daughter was not one of the victims, she still feels victimized knowing that these are her friends.
Speaker 2: And of course the school came out with statements that they would thoroughly punish: this is a form of bullying and we understand this, this is not going to be tolerated in the schools, and if other people have been involved, or you find images or know that someone has those images, please let us know. So all those things are happening.

Speaker 3: However, again, there are no real state laws about this.

Speaker 2: We know that it's difficult to go after people for bullying online, which we can show that this is. We know it's difficult to go after people for revenge porn, where a tape was made that they didn't know was being made, or they did know but only consented to personal use, and then it's used against them or to blackmail them. Again, there's no real law stopping this. Like, there are laws that you can try to interpret, but that's a really sleazy game to play for someone who's been a victim of this. And again, if you're an up-to-date listener, you may remember us talking about the Taylor Swift deepfake that caused a flurry of conversation earlier this year, and how that did initially push lawmakers to take a deeper look and present a bill to make a law allowing victims to sue the people who have created these deepfakes, under some circumstances. As reported by Time magazine, the Disrupt Explicit Forged Images and Non-Consensual Edits Act, or the DEFIANCE Act, allows victims to sue if those who created the deepfakes knew, or recklessly disregarded, that the victim did not consent to its making. But the reality is, there's no federal law making this illegal at all. And even with this law, it's very limiting, and honestly, I'm assuming only rich people, people with some type of income, are going to be able to take advantage of this type of law and legislation. Which, again... what? Mm hm. That whole take of, like, they're allowed to sue.
Speaker 2: Yeah, if it's "recklessly disregarding," that's... that's almost an excuse.

Speaker 4: I know.

Speaker 1: It's like, I wasn't allowed to sue before? I feel like I should have already had that right. Right?

Speaker 3: And all they did was make a mistake.

Speaker 2: It sounds like what we're saying is that the person who created this just made a mistake.

Speaker 3: It was just... it was just a little bit reckless.

Speaker 1: Yeah. It is interesting, because our laws are so behind on a lot of technology things. But as you've probably heard, listeners, also related to Taylor Swift, Donald Trump released some AI images of her, and they're trying to figure out how to deal with that, because there is a law that says you can't do that, but it's, like, not for technology, right? So we're just behind. We're just behind, everybody. Yeah, everybody's behind.

Speaker 2: I mean, the fact that South Korea does have a digital sex crime unit, that's great, but it also was because there have been so many complaints of women being like, hey, I'm being illegally taped. Yeah, and they still really don't stop that, and it's only, like, a fine, right?

Speaker 1: And we've talked about this, like, I believe we talked about it in that episode with Bridget, how this kind of thing is largely impacting women, and I think that points to a larger issue, like a systemic, societal issue. But the fact is, it is impacting more women. And it's once again another instance where it seems like a lot of the, perhaps, men in power are sort of like, oh, well, that's the price of being online as a woman. That's just the price of existing as a woman. It doesn't impact me, so it's not a priority.

Speaker 3: I will say...

Speaker 2: The one thing that might happen now, because the Lincoln Project apparently released an ad, which I'm very surprised by, of an AI image of Donald Trump crying. So I have a feeling Donald Trump's gonna be pissed about that.
Speaker 2: Yeah, and his campaign. So they're going to try to do something now that that's been released.

Speaker 3: But again, and this is where I felt icky, and, like, I don't know, I don't... I will...

Speaker 2: Say, from the jump, I know nothing about the Lincoln Project, and from what I understand, they are very capitalistic opportunists who don't like Donald Trump.

Speaker 3: So I don't know what that means. I don't believe that...

Speaker 2: I think, if I looked deeper into it... I know enough that I don't agree with what they are doing, completely. Like, I think some of the things could be funny, but at the same time, I'm like, hmm, you are not my cup of tea, I have a feeling we are still on opposing sides, type of conversation. So put that in as a caveat. And with that, like, I don't like AI images anyway, and I feel like it should just be done with. Like, there's no need for any of this, unless you're making me a cute image of a raccoon, or a squid with a hat, or an octopus with a hat, or a cute mini animal that I'm like, I wish this existed.

Speaker 3: I don't understand.

Speaker 1: Have I told you about search?

Speaker 2: Yes, you have. You have yet to send me... you have not sent me any pictures, AI or real, so I'm still waiting. I've watched a clip with the octopus bargaining. We've all talked about this so many times, part of our conversation even about deepfakes. But yeah, either way, it's still kind of concerning. But as a reminder, here's some more information from Time about deepfakes: Non-consensual pornographic deepfakes are alarmingly easy to access and create. Starting at the very top, there's a search engine where you can search "how do I make a deepfake?" and they will give you a bunch of links.
Speaker 2: Carrie Goldberg, an attorney who represents tech abuse victims, previously told Time that deepfake software takes a person's photos and face-swaps them onto pornographic videos, making it appear as if the subject is partaking in sexual acts. And a twenty nineteen study found that ninety-six percent of all deepfake videos were non-consensual pornography. Ninety-six percent. Yeah, and I can't imagine what the number's gonna be now. Well, I guess the percentage may not change, because that's how percentages work, but the actual numbers, I bet, are huge. And yeah, there's currently no law or legislation to regulate deepfakes federally, but there is an act to research it. So here's some information from ThomsonReuters.com: Currently, there's no comprehensive enacted federal legislation in the United States that bans or even regulates deepfakes. However, the Identifying Outputs of Generative Adversarial Networks Act requires the Director of the National Science Foundation to support research for the development of measurements and standards needed to examine GAN outputs and any other comparable techniques developed in the future. Real talk, I don't know what that's telling me, other than they're just researching.

Speaker 3: Am I wrong?

Speaker 1: They're saying that they're gonna, like, make... yeah, they're gonna support research for maybe coming up with standards, maybe, later.

Speaker 3: That's what they used to do before it got, like, outlawed.

Speaker 1: With gun violence, all the time, they're like, well, we'll look into it, and then they never do.

Speaker 2: So we're just on the same route as gun regulation.

Speaker 3: Great, great. Sounds great. And...

Speaker 2: Back to that possible federal legislation we mentioned earlier, Thomson Reuters writes: Congress is considering additional legislation that, if passed, would regulate the creation, disclosure, and dissemination of deepfakes.
Speaker 2: Some of the legislation includes: the Deepfake Report Act of twenty nineteen, which requires the Science and Technology Directorate in the US Department of Homeland Security to report at specified intervals on the state of digital content forgery technology. The Deep Fakes Accountability Act, which aims to protect national security against the threats posed by deepfake technology and to provide legal recourse to victims of harmful deepfakes. The DEFIANCE Act of twenty twenty-four, which would improve rights to relief for individuals affected by non-consensual activities involving intimate digital forgeries, and for other purposes. And the Protecting Consumers from Deceptive AI Act, which requires the National Institute of Standards and Technology to establish a task force to facilitate and inform the development of technical standards and guidelines relating to the identification of content created by generative AI, to ensure that audio or visual content created or substantially modified by generative AI includes a disclosure acknowledging the generative AI origin of such content, and for other purposes. So specifically, we're talking about four different acts that could possibly help regulate. Now, this again comes with the fact that we know the Supreme Court has essentially decided standards of regulation are no longer necessary in the US, and they don't have to regulate anything. So that's a little concerning, so just keep that as a reminder. Also, looking at these four different approaches, it really doesn't seem like it's helping victims so much as punishing companies.

Speaker 3: And not even that hard, right?

Speaker 1: And we've seen that a million times. Like, a company paying a fine that, for them, is a drop in the bucket. To us, who don't have millions of dollars, if not billions, it might look like, oh my god, but to them, that's like nothing. And then they get to continue doing what they're doing, right? And probably making enough money off of it to cover that fine, right? But they get to continue what they're doing.
Speaker 2: Yeah, it makes more sense, money-wise, just to pay the fine and continue on, like, they're making so much. Again, and then the Protecting Consumers from Deceptive AI Act...

Speaker 3: I really have a feeling this is going to go along the lines of the standards of, like...

Speaker 2: Oh, let's put, uh, an FDA level on this so we can protect people. Nah, we don't need it.

Speaker 3: They're smart. If not, oh well, they'll get over it.

Speaker 4: Yeah.

Speaker 1: I bet there's going to be some arguments about... I know there already have been, but I bet there's continued arguments about, like, freedom of speech, and this infringing on freedom of speech, and...

Speaker 2: There's so many conversations, and then even when you start going down the slippery slope of, like, but it's not really harming these people, we're not actually touching them, and better that we do this than actually go kidnap somebody...

Speaker 3: I mean, that is absolutely a conversation.

Speaker 2: Yeah. I remember an SVU episode this kind of reminds me of, in which they are shooting porn and they get a report saying that a fifteen-year-old is having to shoot porn, so they go and bust it down. It turns out she was actually nineteen, and they just used... they didn't say AI, they said Photoshop, which is essentially the same thing. They used Photoshop to de-age her, and they're like, look at this technology, it's gonna save so many children.

Speaker 3: And you're like, oh no, that's not what's happening.

Speaker 2: And SVU wasn't trying to settle the debate. Law and Order: SVU, y'all, if you know, you know. Hate to tell you, I love you, yes. But, like, they weren't arguing for either side. They were like, ehh. And this is the bigger debate of, like, what is going to happen with the children and the porn industry and child abuse and all this. And by the way, as big as that conversation about "save the children" is...
Speaker 2: This is not being mentioned much. Yeah, this is not being mentioned much. Uh, like, I'm really wondering the level of, like, is anybody taking this seriously?

Speaker 3: Why are you not taking this seriously? Literally...

Speaker 2: They were using this type of technology in South Korea to blackmail women and for sexual coercion. Like, they were using this to be like, if you don't do this with me, I'm gonna release this and shame you. And I will say, this is in South Korea, and in South Korea it's like, no, they're going to go after the women more than they are going to go after the perpetrators, no matter what, no matter what. The victim blaming is heavy, and it's not just in South Korea. Just existing, like, people saying, you put her on the internet, what did you expect? You have a picture on the internet, what did you expect? That level of conversation is so disheartening, but it's too real.

Speaker 3: It's just real.

Speaker 2: And the fact that they are not caring enough that all they can do is, we'll give you the power to sue...

Speaker 3: How about that? How about that?

Speaker 2: And you may or may not win, because you have to prove that it was a reckless disregard of your safety.

Speaker 3: What does that mean?

Speaker 1: That's... I mean, I can already tell that's designed to be difficult to prove.

Speaker 2: You really think you're doing something here? Like, what is this that you're doing? So I will say, as of today, around ten states in the US have placed their own regulations on it. But three of those states seem only to be concerned about how candidates and elections will be affected, so it literally specifies, if it damages the candidate's campaign. Three out of ten are those. I was like, wait, wow. Let's all run for office, right? I guess that's the only way you're going to be protected.
Speaker 2: Do it every year, no matter what, just to keep coming in as a candidate. Then if you win, sorry. And around five of them are specific to usage of sexual content, but it's more specifically related to minors, which is necessary, yes, but I feel like, again...

Speaker 3: A victim is a victim is a victim.

Speaker 2: Yeah, so they had to put that under, like, well, first let's care about the children, right, then we'll talk about everything else. Again, this is a bit confusing in general. And the others, when I looked them up, were just generic. It was just a generic conversation about, like, don't do this, don't do that. But really, nobody talked about jail time. No one really talks about criminal charges.

Speaker 4: Mm hmm.

Speaker 2: So there are no conversations, because we know that if you possess child porn, you should be on the sex offender registry, which has its goods and bads too, y'all. Like, that's a whole separate conversation that we can have. And I'm not saying that we don't need to tell people about sexual offenders.

Speaker 3: We definitely do.

Speaker 2: But this has been a power play, and is a racist play, that needs to be talked about as well, because people can come off of it really quickly if they have money. Anyway. Oh, the system is corrupt. This system is corrupt. But AI is not helping in the least. The fact that the first thing that popped up in my search was, like, here, you have access to all this porn, was so deeply... like, I just kind of sat there and stared at it.

Speaker 1: Yeah.

Speaker 3: First I was like, I messed up. I done messed up. That was that moment. I was like, oh, I done messed up.

Speaker 2: Second, I was like, why, why is this at the top of the search? Shouldn't it be, like, something else? Like, shouldn't it be, like, yeah, this is a concern?
Speaker 2: And when you look it up, it does show the South Korean incident as the biggest headline, which I think is interesting as well, because we're so quick to, like, pinpoint other areas first when there's a huge issue here, y'all.

Speaker 3: It's a lot.

Speaker 1: It is, and we're definitely going to have to come back and revisit this, because technology is changing so quickly. And I do think, as Bridget always puts it so well in the episodes she comes on and does with us, a lot of the victim blaming also comes back to: you agreed to that thing, the contract that you're going to use this product online, and therefore you basically have agreed to it. But it's not true. Like, the way that we live, you can't live without a lot of those things. You can't succeed without a lot of those things. And a company just kind of shrugging at you and your safety and the way that you exist online, I don't think that's legit. I don't think that should be a thing. So yeah, we have a lot of conversations that we need to have about this, and we will. But in the meantime, listeners, if you have any thoughts on this, any stories or topics we need to cover, any resources, please let us know. You can email us at Stephania Mom Stuff at iHeartMedia dot com. You can find us on Twitter at Mom Stuff Podcast, or on Instagram and TikTok at Stuff Mom Never Told You. You can find us on YouTube as well. We have a TeePublic store, and we have a book you can get wherever you get your books. Thanks as always to our super producer Christina, our executive producer Maya, and our contributor Joey. Thank you, and thanks to you for listening. Stuff Mom Never Told You is a production of iHeartRadio. For more podcasts from iHeartRadio, you can check out the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.