1 00:00:00,440 --> 00:00:03,920 Speaker 1: Hey, this is Olivia Carville, one of the hosts of Levittown. 2 00:00:04,800 --> 00:0008,200 Speaker 1: Since we released this series, the US has moved closer 3 00:00:08,320 --> 00:00:11,160 Speaker 1: to passing a bill that would crack down on deep 4 00:00:11,200 --> 00:00:15,600 Speaker 1: fake pornography. The US House passed the Take It Down 5 00:00:15,680 --> 00:00:19,640 Speaker 1: Act with a near unanimous vote, and now that bill 6 00:00:19,840 --> 00:00:23,520 Speaker 1: is heading to President Trump's desk. I spoke about this 7 00:00:23,680 --> 00:00:27,240 Speaker 1: legislation with the folks over at the Tech Stuff podcast, 8 00:00:27,760 --> 00:00:32,839 Speaker 1: which is produced by our partners Kaleidoscope and iHeart. We 9 00:00:32,920 --> 00:00:35,960 Speaker 1: wanted to share my conversation with them here with you, 10 00:00:36,600 --> 00:00:39,480 Speaker 1: our Levittown listeners. So here it is. 11 00:00:43,440 --> 00:00:47,440 Speaker 2: Welcome to Tech Stuff, a production of iHeart Podcasts and Kaleidoscope. 12 00:00:47,720 --> 00:00:50,440 Speaker 2: I'm Oz Woloshyn, and today Karah Preiss and I will talk 13 00:00:50,479 --> 00:00:53,960 Speaker 2: to Bloomberg's Olivia Carville about the Take It Down Act 14 00:00:54,200 --> 00:00:56,280 Speaker 2: and what it means for the future of the Internet. 15 00:00:59,160 --> 00:01:02,600 Speaker 2: There is a landmark bill aimed at combating AI harms, 16 00:01:02,920 --> 00:01:04,240 Speaker 2: specifically deep fakes. 17 00:01:04,520 --> 00:01:08,080 Speaker 3: They're used in scams, they're used in spreading misinformation online, 18 00:01:08,400 --> 00:01:11,679 Speaker 3: and I'd say most notably, they have been used in 19 00:01:11,720 --> 00:01:14,080 Speaker 3: the non consensual creation of porn. 20 00:01:14,240 --> 00:01:17,039 Speaker 2: Right, and that's what this legislation is all about. This 21 00:01:17,080 --> 00:01:19,760 Speaker 2: week Congress passed the Take It Down Act, which aims 22 00:01:19,760 --> 00:01:22,680 Speaker 2: to crack down on the creation of revenge porn, i.e. 23 00:01:22,840 --> 00:01:26,560 Speaker 2: pornographic images that are shared non consensually. The Act specifies 24 00:01:26,560 --> 00:01:29,480 Speaker 2: that those who distribute revenge porn, whether, quote, real 25 00:01:29,640 --> 00:01:33,319 Speaker 2: or computer generated, could be fined or subject to prison time. 26 00:01:33,959 --> 00:01:36,600 Speaker 2: It's had rare backing from both sides of the political 27 00:01:36,640 --> 00:01:41,000 Speaker 2: aisle and from First Lady Melania Trump. As of Wednesday afternoon, 28 00:01:41,120 --> 00:01:43,959 Speaker 2: the time of this taping, the bill heads to President Trump, 29 00:01:44,000 --> 00:01:45,199 Speaker 2: who's likely to make it law. 30 00:01:45,680 --> 00:01:47,720 Speaker 3: Here to walk us through the Take It Down Act 31 00:01:47,760 --> 00:01:50,400 Speaker 3: and what it means for tech companies is Olivia Carville, 32 00:01:50,680 --> 00:01:54,000 Speaker 3: investigative reporter for Bloomberg News and co host of the 33 00:01:54,040 --> 00:01:58,080 Speaker 3: podcast Levittown, which is a must listen, agreed, wherever you 34 00:01:58,120 --> 00:02:01,440 Speaker 3: get your podcasts. It covers the rise of deep 35 00:02:01,440 --> 00:02:04,520 Speaker 3: fake porn. It also happens to be a co production 36 00:02:04,640 --> 00:02:07,600 Speaker 3: of Kaleidoscope. Olivia, welcome to Tech Stuff.
37 00:02:07,800 --> 00:02:09,560 Speaker 1: Thank you so much for having me. It's great to 38 00:02:09,600 --> 00:02:11,280 Speaker 1: be back with Kaleidoscope's team. 39 00:02:11,720 --> 00:02:14,200 Speaker 2: Thanks, thanks for being here, Olivia. You've been tracking this 40 00:02:14,240 --> 00:02:16,679 Speaker 2: bill for a long time. When did the push for 41 00:02:17,040 --> 00:02:20,400 Speaker 2: legislation on deep fake pornography begin? 42 00:02:21,160 --> 00:02:23,640 Speaker 1: I mean, it has been a very long journey to 43 00:02:23,720 --> 00:02:26,920 Speaker 1: get here. We've seen quite a lot of states across 44 00:02:26,960 --> 00:02:30,760 Speaker 1: the US rolling out legislation to try and target deep 45 00:02:30,760 --> 00:02:34,600 Speaker 1: fake porn since the revolution really began a number of 46 00:02:34,680 --> 00:02:38,160 Speaker 1: years ago now. At the moment, more than twenty states 47 00:02:38,200 --> 00:02:41,760 Speaker 1: across the country have introduced new laws. But one of 48 00:02:41,760 --> 00:02:44,440 Speaker 1: the criticisms we heard time and time again, and something 49 00:02:44,520 --> 00:02:47,760 Speaker 1: we raised in the Levittown podcast, is the fact that 50 00:02:47,760 --> 00:02:52,000 Speaker 1: there was no federal law criminalizing this across the US. 51 00:02:52,760 --> 00:02:56,400 Speaker 1: And this bill was first introduced last summer in twenty 52 00:02:56,440 --> 00:03:01,480 Speaker 1: twenty four. It's bipartisan legislation. Senators Cruz and Klobuchar put 53 00:03:01,520 --> 00:03:04,880 Speaker 1: it forward and it unanimously passed in the Senate, but 54 00:03:05,040 --> 00:03:08,080 Speaker 1: unfortunately it stalled in the House last year, and that 55 00:03:08,160 --> 00:03:11,520 Speaker 1: led to a lot of frustration from the victims. Earlier 56 00:03:11,560 --> 00:03:14,520 Speaker 1: this year we saw it once again: Take It Down 57 00:03:14,680 --> 00:03:18,680 Speaker 1: was reintroduced, unanimously passed in the Senate, and then earlier 58 00:03:18,720 --> 00:03:22,760 Speaker 1: this week, in very exciting news, it was also unanimously 59 00:03:22,840 --> 00:03:25,320 Speaker 1: passed in the House. And we're talking a vote of 60 00:03:25,400 --> 00:03:29,280 Speaker 1: four hundred and nine to two, and that's kind of remarkable 61 00:03:29,280 --> 00:03:32,639 Speaker 1: at the moment, given the current polarized political climate we're 62 00:03:32,680 --> 00:03:35,360 Speaker 1: living in right now. The bill is en route to 63 00:03:35,400 --> 00:03:38,920 Speaker 1: President Trump's desk and there's a lot of expectation that 64 00:03:38,920 --> 00:03:39,920 Speaker 1: he's going to sign it soon. 65 00:03:40,920 --> 00:03:44,160 Speaker 3: So just to go back for a second, what is 66 00:03:44,400 --> 00:03:46,560 Speaker 3: the Take It Down Act? And what does it say? 67 00:03:47,160 --> 00:03:47,280 Speaker 2: So? 68 00:03:47,320 --> 00:03:49,760 Speaker 1: The Take It Down Act is actually an acronym for 69 00:03:49,840 --> 00:03:54,520 Speaker 1: a very long piece of legislation: Tools to Address 70 00:03:54,760 --> 00:04:01,200 Speaker 1: Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks. 71 00:04:02,000 --> 00:04:04,040 Speaker 3: I think whoever came up with Take It Down, it is 72 00:04:04,080 --> 00:04:07,360 Speaker 3: pretty easy to remember. Great. Yeah, you know, it's, it's 73 00:04:07,360 --> 00:04:08,000 Speaker 3: an acronym.
74 00:04:09,040 --> 00:04:12,640 Speaker 1: Yeah, so it is an acronym. And the law really 75 00:04:13,320 --> 00:04:17,279 Speaker 1: does exactly what that title implies, which is provide a way 76 00:04:17,320 --> 00:04:20,719 Speaker 1: to ensure this content can be taken down from the Internet, 77 00:04:20,800 --> 00:04:24,080 Speaker 1: because that's where it's particularly harmful, where it starts 78 00:04:24,120 --> 00:04:27,200 Speaker 1: to be shared across high schools and in friendship groups. 79 00:04:27,760 --> 00:04:31,719 Speaker 1: So the law goes after two main parties. One, it 80 00:04:31,839 --> 00:04:36,720 Speaker 1: makes it a crime for offenders to knowingly publish deep 81 00:04:36,720 --> 00:04:40,680 Speaker 1: fake pornography or intimate images, whether they're real or created 82 00:04:40,800 --> 00:04:45,000 Speaker 1: with AI, and then if they do, they can serve 83 00:04:45,160 --> 00:04:47,839 Speaker 1: up to two or three years in prison, depending on if 84 00:04:48,080 --> 00:04:50,560 Speaker 1: the individual in the photo is an adult or a minor. 85 00:04:51,360 --> 00:04:54,039 Speaker 1: And then it also challenges, or holds to account, the 86 00:04:54,080 --> 00:04:58,400 Speaker 1: technology companies, the social media platforms where this content 87 00:04:58,520 --> 00:05:02,120 Speaker 1: is often shared and disseminated, and it forces them to 88 00:05:02,200 --> 00:05:05,359 Speaker 1: remove these deep fake images within forty eight hours of 89 00:05:05,440 --> 00:05:06,440 Speaker 1: being notified of them. 90 00:05:07,040 --> 00:05:10,200 Speaker 2: I have two questions for you, Olivia. Firstly, as this 91 00:05:10,279 --> 00:05:14,400 Speaker 2: phenomenon becomes more and more ubiquitous, what will this law 92 00:05:14,560 --> 00:05:17,720 Speaker 2: mean practically if you discover you're a victim? What will 93 00:05:17,760 --> 00:05:21,560 Speaker 2: it allow you to do that you can't do today? And secondly, 94 00:05:21,600 --> 00:05:24,599 Speaker 2: you mentioned the liability of the platforms. How does this 95 00:05:24,640 --> 00:05:26,760 Speaker 2: intersect with Section two thirty? 96 00:05:27,200 --> 00:05:30,039 Speaker 1: So for a victim of deep fake porn, a young 97 00:05:30,080 --> 00:05:34,760 Speaker 1: person who maybe finds or discovers that fake pornographic non 98 00:05:34,800 --> 00:05:39,040 Speaker 1: consensual images are circulating online, now this law gives them 99 00:05:39,080 --> 00:05:41,960 Speaker 1: a path forward to get those photos taken down, to 100 00:05:41,960 --> 00:05:45,680 Speaker 1: get them scrubbed from the internet, finally. So it enables 101 00:05:45,720 --> 00:05:48,920 Speaker 1: them to file a report with the social media platform 102 00:05:49,000 --> 00:05:52,480 Speaker 1: or the website or app where these images have been 103 00:05:52,520 --> 00:05:55,800 Speaker 1: published or disseminated, and to inform them that it's deep 104 00:05:55,880 --> 00:05:58,279 Speaker 1: fake porn, that it's non consensual and that they want 105 00:05:58,320 --> 00:06:01,880 Speaker 1: it removed, and then within two days it has to be removed, 106 00:06:02,240 --> 00:06:06,039 Speaker 1: and the FTC, the Federal Trade Commission, is responsible for 107 00:06:06,200 --> 00:06:08,920 Speaker 1: holding those companies to account to get that taken down. 108 00:06:09,440 --> 00:06:12,520 Speaker 1: The other thing it gives victims is a path to justice.
109 00:06:13,000 --> 00:06:16,000 Speaker 1: It's a way to go after the offenders who publish 110 00:06:16,120 --> 00:06:20,040 Speaker 1: this content or even threaten to publish this content against 111 00:06:20,040 --> 00:06:24,080 Speaker 1: the survivors. Well, you asked about two thirty, and that's 112 00:06:24,120 --> 00:06:27,120 Speaker 1: a great question, because this is one of the only 113 00:06:27,200 --> 00:06:31,720 Speaker 1: pieces of consumer tech legislation where federal regulators have been 114 00:06:31,760 --> 00:06:35,000 Speaker 1: able to come in and actually sign a law in 115 00:06:35,120 --> 00:06:39,800 Speaker 1: place that impacts young people using these platforms. Section two thirty 116 00:06:39,920 --> 00:06:42,440 Speaker 1: comes from the Communications Decency Act. It's a 117 00:06:42,560 --> 00:06:46,320 Speaker 1: very controversial piece of legislation and it really did change 118 00:06:46,360 --> 00:06:49,000 Speaker 1: the Internet. And it was written into law back in 119 00:06:49,040 --> 00:06:52,720 Speaker 1: the mid nineties. And don't forget that that's before Facebook 120 00:06:52,839 --> 00:06:56,520 Speaker 1: was even created. This law, which governs all these social 121 00:06:56,560 --> 00:07:00,479 Speaker 1: media platforms, was written at a time before social 122 00:07:00,520 --> 00:07:04,080 Speaker 1: media even existed. And what it does is it provides 123 00:07:04,120 --> 00:07:08,359 Speaker 1: an immunity shield. So these platforms are not responsible for 124 00:07:08,440 --> 00:07:11,760 Speaker 1: the content that is uploaded onto them. So anything that 125 00:07:11,840 --> 00:07:16,280 Speaker 1: is posted on Facebook, Instagram, Snapchat, TikTok, Twitter, now X, 126 00:07:16,920 --> 00:07:20,840 Speaker 1: the platforms themselves cannot be held legally responsible for that 127 00:07:21,000 --> 00:07:24,240 Speaker 1: content or the choices they make around removing it or 128 00:07:24,280 --> 00:07:28,000 Speaker 1: allowing it to stay up. Under this law, the platforms 129 00:07:28,120 --> 00:07:31,720 Speaker 1: are being held to account to take down deep fake porn, 130 00:07:31,960 --> 00:07:34,960 Speaker 1: to take down this specific form of content. And that's 131 00:07:35,000 --> 00:07:37,400 Speaker 1: why it's so controversial, and that's why there are critics 132 00:07:37,400 --> 00:07:41,040 Speaker 1: of this act, because some people think that this law 133 00:07:41,080 --> 00:07:43,840 Speaker 1: will be weaponized or abused, and it's going to result 134 00:07:43,840 --> 00:07:46,480 Speaker 1: in the platforms taking down a lot more content than 135 00:07:46,520 --> 00:07:48,040 Speaker 1: what this legislation covers. 136 00:07:48,960 --> 00:07:49,200 Speaker 3: Wasn't 137 00:07:49,240 --> 00:07:52,720 Speaker 3: Section two thirty in part introduced because of concerns over 138 00:07:52,760 --> 00:07:53,800 Speaker 3: online pornography? 139 00:07:54,600 --> 00:07:59,000 Speaker 1: So two thirty was first introduced because at the time, 140 00:07:59,600 --> 00:08:03,880 Speaker 1: judges and the legal system were ruling that platforms were 141 00:08:04,000 --> 00:08:07,040 Speaker 1: liable for any content that was posted on their sites.
142 00:08:07,400 --> 00:08:12,440 Speaker 1: And that meant that if a platform decided to remove harmful, grotesque, vile, 143 00:08:12,720 --> 00:08:18,400 Speaker 1: or violent content, say someone being cyber bullied or punched, 144 00:08:18,680 --> 00:08:22,800 Speaker 1: or content about drugs or alcohol, content that they just 145 00:08:22,800 --> 00:08:24,800 Speaker 1: didn't want to share with their other users, if they 146 00:08:24,840 --> 00:08:28,960 Speaker 1: took that down, they were actually being held responsible for 147 00:08:29,080 --> 00:08:32,400 Speaker 1: that decision in the legal system. Judges were saying they 148 00:08:32,400 --> 00:08:36,559 Speaker 1: would be held accountable and legally responsible for removing content 149 00:08:36,600 --> 00:08:39,800 Speaker 1: and people could sue the platforms for doing so. So 150 00:08:39,880 --> 00:08:42,800 Speaker 1: the law was written to actually protect the platforms and 151 00:08:42,960 --> 00:08:46,199 Speaker 1: enable them to moderate their content to try and make 152 00:08:46,240 --> 00:08:49,600 Speaker 1: the Internet a safer space. It's kind of counterintuitive when 153 00:08:49,640 --> 00:08:53,400 Speaker 1: you think about it, because unfortunately now what's resulted is 154 00:08:53,880 --> 00:08:57,000 Speaker 1: it's enabled these platforms to have so much power over 155 00:08:57,040 --> 00:08:59,520 Speaker 1: the content that's up and enabled them to wash their 156 00:08:59,520 --> 00:09:02,520 Speaker 1: hands and say this isn't our responsibility. We can't be 157 00:09:02,559 --> 00:09:05,559 Speaker 1: held legally liable for this. We're effectively walking away. 158 00:09:05,720 --> 00:09:09,400 Speaker 3: And necessitated a law like this one to come into play. 159 00:09:09,440 --> 00:09:10,760 Speaker 3: I mean in a certain sense. 160 00:09:11,120 --> 00:09:15,040 Speaker 1: Yeah, I mean it definitely did. And here the law 161 00:09:15,160 --> 00:09:18,840 Speaker 1: is relatively narrow. We're not talking about any form of content. 162 00:09:19,200 --> 00:09:24,640 Speaker 1: We're talking about only content that involves non consensual intimate imagery, 163 00:09:25,040 --> 00:09:28,600 Speaker 1: whether that's real or created by AI. So that enables 164 00:09:28,640 --> 00:09:32,679 Speaker 1: people who see photos of themselves which have been manipulated 165 00:09:32,800 --> 00:09:37,280 Speaker 1: using technology to undress them or turn them naked or 166 00:09:37,360 --> 00:09:41,600 Speaker 1: put them into sexual acts, which is something we explored 167 00:09:41,600 --> 00:09:44,520 Speaker 1: in Levittown. Those images and that content can 168 00:09:44,559 --> 00:09:45,800 Speaker 1: be taken down with this act. 169 00:09:46,120 --> 00:09:51,160 Speaker 2: Some tech companies and adult websites, OnlyFans, Pornhub, Meta, 170 00:09:51,840 --> 00:09:55,480 Speaker 2: already have policies in place where users can request that 171 00:09:55,559 --> 00:09:58,640 Speaker 2: revenge porn be taken down. What will be the change 172 00:09:58,720 --> 00:10:02,800 Speaker 2: from a user or victim point of view once this becomes law? 173 00:10:03,720 --> 00:10:06,240 Speaker 1: Yeah, you're right. I mean even NCMEC, the National 174 00:10:06,280 --> 00:10:08,880 Speaker 1: Center for Missing and Exploited Children, has a tool which 175 00:10:08,960 --> 00:10:12,199 Speaker 1: is actually called Take It Down, which does exactly the 176 00:10:12,240 --> 00:10:15,439 Speaker 1: same thing.
It enables people to plug in a photo or 177 00:10:15,480 --> 00:10:18,320 Speaker 1: a hash, which is like a unique ID of each image, 178 00:10:18,360 --> 00:10:20,840 Speaker 1: to say I don't want this online and I'm a 179 00:10:20,880 --> 00:10:24,160 Speaker 1: victim of this, and please remove it. But the law 180 00:10:24,920 --> 00:10:27,640 Speaker 1: regulates this, and it makes it a federal law to 181 00:10:27,679 --> 00:10:29,720 Speaker 1: say you have to remove it, and you have to 182 00:10:29,760 --> 00:10:32,280 Speaker 1: remove it within two days. So I guess it's just 183 00:10:32,360 --> 00:10:36,000 Speaker 1: applying a stricter approach to this, so the platforms know 184 00:10:36,120 --> 00:10:39,520 Speaker 1: they have to oblige and they have to get that 185 00:10:39,600 --> 00:10:41,439 Speaker 1: content scrubbed from their websites. 186 00:10:42,480 --> 00:10:46,440 Speaker 2: There's an amazing moment in the Levittown podcast where one 187 00:10:46,480 --> 00:10:49,880 Speaker 2: of the high school students realizes she's been a 188 00:10:49,920 --> 00:10:52,880 Speaker 2: victim of deep fake porn. Her father's actually a police officer, 189 00:10:53,600 --> 00:10:55,360 Speaker 2: so they try and figure out is there any legal 190 00:10:55,400 --> 00:10:58,840 Speaker 2: recourse, and the response from the police is basically, there's 191 00:10:58,840 --> 00:11:00,959 Speaker 2: nothing we can do. It's kind of amazing in the 192 00:11:01,080 --> 00:11:04,080 Speaker 2: arc of your career as a reporter that the law 193 00:11:04,200 --> 00:11:06,600 Speaker 2: is actually changing in real time in response to the 194 00:11:06,640 --> 00:11:10,600 Speaker 2: stories that you've been covering, these very moving, horrifying stories. 195 00:11:11,160 --> 00:11:14,160 Speaker 2: What do the victims think about this law and 196 00:11:14,160 --> 00:11:15,840 Speaker 2: what's been the response among your sources? 197 00:11:17,080 --> 00:11:19,680 Speaker 1: The victims have been waiting for this for a very 198 00:11:19,720 --> 00:11:22,679 Speaker 1: long time. When you think about the origin story of 199 00:11:22,800 --> 00:11:26,160 Speaker 1: Take It Down, it was when Elliston Berry, a young 200 00:11:26,240 --> 00:11:30,760 Speaker 1: teen from Texas, actually went to Senator Cruz's office and 201 00:11:30,840 --> 00:11:33,240 Speaker 1: told him that a deep fake image of her had 202 00:11:33,320 --> 00:11:36,640 Speaker 1: been circulating on Snapchat and she had asked the platform 203 00:11:36,679 --> 00:11:39,720 Speaker 1: to remove it, and after a year, the platform still 204 00:11:39,760 --> 00:11:43,920 Speaker 1: hadn't taken that image down. That's what really sparked this 205 00:11:43,960 --> 00:11:48,640 Speaker 1: particular piece of legislation. And we've seen young teenage, you know, 206 00:11:48,720 --> 00:11:53,280 Speaker 1: high school students, college students speaking before Congress pleading for 207 00:11:53,320 --> 00:11:56,560 Speaker 1: a law like this, asking for help to find a 208 00:11:56,640 --> 00:11:59,760 Speaker 1: path to get these images removed from the internet. Because 209 00:11:59,760 --> 00:12:04,760 Speaker 1: unfortunately, you know, in teenagers' lives today the digital world 210 00:12:04,800 --> 00:12:07,600 Speaker 1: is ubiquitous. They exist within it, and they 211 00:12:07,640 --> 00:12:11,079 Speaker 1: merge between the online world and the offline world.
They 212 00:12:11,080 --> 00:12:13,280 Speaker 1: don't call their friends on the phone, they don't call 213 00:12:13,320 --> 00:12:15,559 Speaker 1: their parents on the phone. You know, they'd be more 214 00:12:15,600 --> 00:12:18,439 Speaker 1: inclined to send a DM through Instagram or a message 215 00:12:18,440 --> 00:12:22,400 Speaker 1: on Snapchat. And when you exist in, and your social fabric 216 00:12:22,720 --> 00:12:27,080 Speaker 1: exists within, the digital world, that means that when images 217 00:12:27,160 --> 00:12:31,280 Speaker 1: like this are shared, everybody sees them. And I think 218 00:12:31,400 --> 00:12:34,959 Speaker 1: that's the real harm here: the photo is created, it's fake, 219 00:12:35,200 --> 00:12:39,840 Speaker 1: it looks unbelievably, convincingly real, and it gets shared to 220 00:12:39,960 --> 00:12:44,240 Speaker 1: everyone in your social network within seconds. These young women 221 00:12:44,320 --> 00:12:48,000 Speaker 1: have been fighting for help and support, some at the 222 00:12:48,000 --> 00:12:51,360 Speaker 1: state level, and they've been successful, but really they wanted 223 00:12:51,400 --> 00:12:54,080 Speaker 1: this at the federal level. So for a lot of 224 00:12:54,080 --> 00:12:56,000 Speaker 1: the young women, I think it's been like a sigh 225 00:12:56,000 --> 00:12:59,360 Speaker 1: of relief that finally we're here, and you've given us 226 00:12:59,400 --> 00:13:03,560 Speaker 1: and other young women who have been victimized or had 227 00:13:03,559 --> 00:13:07,439 Speaker 1: their images weaponized in this way a path to justice, 228 00:13:07,760 --> 00:13:11,520 Speaker 1: but also a path to get those photos removed from 229 00:13:11,520 --> 00:13:12,920 Speaker 1: the Internet once and for all. 230 00:13:13,440 --> 00:13:15,560 Speaker 3: Well, this all sounds like a very positive thing, and 231 00:13:15,600 --> 00:13:20,079 Speaker 3: it has bipartisan support. Are there people arguing against it? 232 00:13:20,280 --> 00:13:23,000 Speaker 3: And are there criticisms of the bill despite it being 233 00:13:23,080 --> 00:13:24,880 Speaker 3: overwhelmingly positive? 234 00:13:25,160 --> 00:13:28,280 Speaker 1: There definitely are. As is the way when it comes 235 00:13:28,280 --> 00:13:31,679 Speaker 1: to social media or consumer tech, there is an ongoing 236 00:13:31,720 --> 00:13:35,880 Speaker 1: tension and like a push and pull between privacy and safety. 237 00:13:36,480 --> 00:13:40,760 Speaker 1: You have those who, you know, prioritize safety and say 238 00:13:41,360 --> 00:13:44,560 Speaker 1: protecting children online is the most important thing we can do. 239 00:13:44,960 --> 00:13:47,600 Speaker 1: And then you have those who value privacy and say, 240 00:13:47,640 --> 00:13:52,400 Speaker 1: if we're going to create safety regulations or rules that 241 00:13:52,960 --> 00:13:55,360 Speaker 1: in any way weaken our privacy, you know, that's 242 00:13:55,400 --> 00:13:58,840 Speaker 1: a bad thing to do, because privacy is something that 243 00:13:58,880 --> 00:14:02,480 Speaker 1: we need to prioritize as well.
And so in 244 00:14:02,520 --> 00:14:06,200 Speaker 1: this case, you do have free speech and privacy advocates 245 00:14:06,360 --> 00:14:11,240 Speaker 1: criticizing this law for being unconstitutional, saying that it could 246 00:14:11,320 --> 00:14:15,040 Speaker 1: chill free expression, that it could foster censorship, that it 247 00:14:15,080 --> 00:14:18,120 Speaker 1: could result in what they describe as a knee jerk 248 00:14:18,320 --> 00:14:21,360 Speaker 1: takedown of content. And what I mean by that is, 249 00:14:21,760 --> 00:14:25,600 Speaker 1: because these platforms, and I'm talking about Meta, Snapchat, TikTok, 250 00:14:25,720 --> 00:14:29,560 Speaker 1: because they've grown so big and we're talking billions of 251 00:14:29,560 --> 00:14:32,200 Speaker 1: pieces of content uploaded on a daily basis, if you're 252 00:14:32,240 --> 00:14:36,400 Speaker 1: going to enforce regulation or legislation that says they have 253 00:14:36,480 --> 00:14:39,840 Speaker 1: to take down certain content within forty eight hours, and 254 00:14:39,920 --> 00:14:42,800 Speaker 1: say they get flooded with millions of requests on a 255 00:14:42,880 --> 00:14:45,560 Speaker 1: daily basis, they are not going to have the bandwidth 256 00:14:46,120 --> 00:14:50,200 Speaker 1: to actually review each request, and that could result in 257 00:14:50,240 --> 00:14:53,600 Speaker 1: them just deciding to remove everything that gets reported to them. 258 00:14:54,160 --> 00:14:57,000 Speaker 1: And that is what free speech and, kind of, privacy 259 00:14:57,040 --> 00:15:00,160 Speaker 1: advocates fear: it's going to result in a level of 260 00:15:00,200 --> 00:15:02,880 Speaker 1: censorship that we haven't seen before, because no one's been 261 00:15:02,920 --> 00:15:05,680 Speaker 1: able to really adjust two thirty since it was written 262 00:15:05,680 --> 00:15:10,080 Speaker 1: into law. We've also, interestingly, seen some criticism coming from 263 00:15:10,120 --> 00:15:13,640 Speaker 1: the child safety advocacy space, and they've come out swinging, 264 00:15:13,760 --> 00:15:17,720 Speaker 1: saying that while this bill, this legislation, is necessary, 265 00:15:18,160 --> 00:15:21,320 Speaker 1: it's far from game changing, that it's taken too long 266 00:15:21,400 --> 00:15:24,840 Speaker 1: to get here, and that the penalties aren't severe enough, 267 00:15:25,160 --> 00:15:26,680 Speaker 1: that this is going to put a lot of pressure 268 00:15:26,680 --> 00:15:31,280 Speaker 1: on local and state authorities, prosecutors, law enforcement to actually 269 00:15:31,320 --> 00:15:35,000 Speaker 1: go after the perpetrators in a more severe way. Because 270 00:15:35,080 --> 00:15:37,520 Speaker 1: when you look at Take It Down, we're talking two 271 00:15:37,600 --> 00:15:41,080 Speaker 1: years in prison for publishing an intimate image of an adult, 272 00:15:41,440 --> 00:15:44,080 Speaker 1: deep fake or real, and up to three years for 273 00:15:44,160 --> 00:15:44,600 Speaker 1: a minor. 274 00:15:45,560 --> 00:15:48,000 Speaker 2: What about the tech companies, I mean, are they viewing 275 00:15:48,040 --> 00:15:50,720 Speaker 2: this as the first battle line in the 276 00:15:50,800 --> 00:15:53,960 Speaker 2: fight over the future of Section two thirty?
Have their 277 00:15:54,040 --> 00:15:57,920 Speaker 2: lobbyists been active on this issue, and how are they 278 00:15:58,000 --> 00:16:02,960 Speaker 2: preparing for this extraordinary new set of responsibilities that will 279 00:16:02,960 --> 00:16:04,320 Speaker 2: come with the passage of this bill, if, as you 280 00:16:04,360 --> 00:16:05,600 Speaker 2: mentioned, it does get signed by President Trump? 281 00:16:06,000 --> 00:16:08,840 Speaker 1: Well, the tech companies, a lot of them actually do 282 00:16:08,920 --> 00:16:12,800 Speaker 1: have rules in place that say non consensual intimate or 283 00:16:12,840 --> 00:16:16,920 Speaker 1: sexual images can't be shared. I mean, even on Meta's 284 00:16:16,960 --> 00:16:21,160 Speaker 1: platforms alone, it's against the rules to post any nude photos. 285 00:16:21,480 --> 00:16:23,680 Speaker 1: But in this case, now that they're being kind of 286 00:16:23,720 --> 00:16:26,640 Speaker 1: forced to do so by regulation, Meta's come out in 287 00:16:26,680 --> 00:16:29,200 Speaker 1: support of this, saying, you know, we do think that 288 00:16:29,440 --> 00:16:32,480 Speaker 1: deep fake porn shouldn't exist on our platform, and we 289 00:16:32,520 --> 00:16:35,080 Speaker 1: will do what we can to take it down. I 290 00:16:35,120 --> 00:16:40,600 Speaker 1: think that from the platforms' perspectives, they don't want fake photos, 291 00:16:41,000 --> 00:16:45,200 Speaker 1: fake naked photos of teenage girls shared on their platforms, 292 00:16:45,200 --> 00:16:48,239 Speaker 1: like that's not a positive use case of their networks 293 00:16:48,320 --> 00:16:51,560 Speaker 1: at all. They don't want their users sharing or distributing 294 00:16:51,600 --> 00:16:55,400 Speaker 1: this content. And now they're being told and held to 295 00:16:55,480 --> 00:16:58,480 Speaker 1: account to ensure that it's taken down within two days. 296 00:16:58,560 --> 00:17:02,280 Speaker 1: And it'll be interesting to see how the companies internally 297 00:17:02,320 --> 00:17:05,800 Speaker 1: are responding to this, and what the process is going 298 00:17:05,840 --> 00:17:07,920 Speaker 1: to be and whether it's actually going to change anything. 299 00:17:08,520 --> 00:17:11,520 Speaker 2: Olivia, just to close, I mean, you've had kind of 300 00:17:11,560 --> 00:17:16,320 Speaker 2: an extraordinary run this year, putting out the Levittown podcast, 301 00:17:17,320 --> 00:17:21,080 Speaker 2: also having an extraordinary documentary called Can't Look Away that Bloomberg 302 00:17:21,080 --> 00:17:24,760 Speaker 2: produced and distributed about the harms of social media. Can you 303 00:17:24,920 --> 00:17:27,760 Speaker 2: sort of take a step back and describe this moment, 304 00:17:27,800 --> 00:17:30,359 Speaker 2: because one thing that Karah and I talk about and 305 00:17:30,400 --> 00:17:34,320 Speaker 2: think about is that five years ago, the idea that 306 00:17:34,359 --> 00:17:37,240 Speaker 2: the law might catch up to the tech companies and 307 00:17:37,280 --> 00:17:41,240 Speaker 2: there would be enough social pressure to insist on changes 308 00:17:41,320 --> 00:17:45,040 Speaker 2: to protect users from harm seemed like a fantasy. 309 00:17:45,560 --> 00:17:48,200 Speaker 2: But in this moment, there seems to be some promise 310 00:17:48,240 --> 00:17:50,439 Speaker 2: that it's actually happening. Can you speak about that?
311 00:17:51,200 --> 00:17:54,119 Speaker 1: I've been covering the dangers of the digital world for 312 00:17:54,160 --> 00:17:58,919 Speaker 1: Bloomberg for going on almost four years now, and I 313 00:17:59,080 --> 00:18:04,359 Speaker 1: have been terrified by what I've seen online. And I'm 314 00:18:04,400 --> 00:18:08,120 Speaker 1: not talking just deep fake porn and, you know, witnessing 315 00:18:08,840 --> 00:18:14,119 Speaker 1: the real world consequences of these photographs being shared among 316 00:18:14,280 --> 00:18:17,480 Speaker 1: teenagers in high schools, and I'm talking the impact on 317 00:18:17,600 --> 00:18:20,840 Speaker 1: the young women who are targeted, but also the young 318 00:18:20,920 --> 00:18:25,119 Speaker 1: men who think that it's normal to create and share 319 00:18:25,240 --> 00:18:27,919 Speaker 1: photos like this, think it's a joke. The way in 320 00:18:28,000 --> 00:18:32,320 Speaker 1: which teens and this generation are kind of warped by technology, 321 00:18:33,000 --> 00:18:35,840 Speaker 1: I think we don't fully understand what the long term 322 00:18:35,880 --> 00:18:39,719 Speaker 1: consequences of that are going to be. But the harms 323 00:18:39,720 --> 00:18:43,359 Speaker 1: of the digital world exist far beyond deep fakes, and 324 00:18:43,760 --> 00:18:46,240 Speaker 1: that's what we were exploring in the Can't Look Away film, 325 00:18:46,280 --> 00:18:49,480 Speaker 1: and the film itself explores the other ways in which 326 00:18:49,520 --> 00:18:55,880 Speaker 1: social media can harm kids, from recommendation algorithms pushing suicide 327 00:18:55,920 --> 00:19:00,679 Speaker 1: glorifying content, content that is going to lead to 328 00:19:00,720 --> 00:19:05,080 Speaker 1: mental health harms or eating disorders. It explores the ways 329 00:19:05,119 --> 00:19:09,000 Speaker 1: in which kids have been targeted by predators online who 330 00:19:09,040 --> 00:19:12,360 Speaker 1: want to sell them drugs, and in many cases they 331 00:19:12,400 --> 00:19:17,439 Speaker 1: think they're buying counterfeit pills like Xanax or oxycodone, and 332 00:19:17,480 --> 00:19:20,359 Speaker 1: it turns out to be laced with enough fentanyl to 333 00:19:20,440 --> 00:19:24,040 Speaker 1: kill their entire household, and parents are discovering their children 334 00:19:24,320 --> 00:19:28,119 Speaker 1: dead in their bedrooms. So it's been a really difficult 335 00:19:28,160 --> 00:19:32,960 Speaker 1: topic to explore, but also just such a crucial one. 336 00:19:33,000 --> 00:19:35,879 Speaker 1: This is one of the most essential issues of our time, 337 00:19:36,400 --> 00:19:39,720 Speaker 1: and I think that this has been a challenging yet 338 00:19:40,320 --> 00:19:43,879 Speaker 1: very rewarding area to explore. And I know there's a 339 00:19:43,880 --> 00:19:45,960 Speaker 1: lot of criticism of the Take It Down Act, but 340 00:19:46,080 --> 00:19:49,679 Speaker 1: regardless of the controversy, most people agree this is a 341 00:19:49,720 --> 00:19:53,720 Speaker 1: step in the right direction. And I think this act 342 00:19:53,800 --> 00:19:56,639 Speaker 1: is a good thing. But it's very narrow. You know, 343 00:19:56,680 --> 00:20:02,520 Speaker 1: we're only talking about removing content that is non consensual 344 00:20:02,800 --> 00:20:06,679 Speaker 1: intimate imagery. We're not talking about all the other content 345 00:20:06,720 --> 00:20:09,720 Speaker 1: that could potentially harm kids.
So while the fight here 346 00:20:10,320 --> 00:20:13,960 Speaker 1: is a win and we should celebrate that, the broader 347 00:20:14,040 --> 00:20:18,439 Speaker 1: concern around protecting our children in the online world is ongoing. 348 00:20:19,440 --> 00:20:21,120 Speaker 3: Olivia, thank you. Thanks, Olivia.