Speaker 1: Hey guys, it's Andrea Gunning with some big Betrayal news. I have been on location with some of the people you heard in season two, Ashley Avea and their family, to shoot a docuseries for Hulu. I'll let you know when the docuseries is available on Hulu later this year. We're all so excited to announce that Betrayal will become a weekly series starting this summer. Thanks to your support of this podcast, we'll be able to bring you many real-life stories of betrayal, making this community even stronger. So if you've been thinking about sharing your story, now is the time. Email us at Betrayal Pod at gmail dot com. That's Betrayal Pod at gmail dot com. I want to share some news that affects children everywhere. Our second season of Betrayal focused on families destroyed by child sexual abuse material, also called CSAM. The National Center for Missing and Exploited Children has reviewed over three hundred and twenty-two million images and videos of child sexual exploitation. It's hard to wrap your head around that. It's why we couldn't stay away from the topic last season. It's also been a big issue in Washington recently. Betrayal producer Kerry Hartman has been following developments. Kerry, I know you watched it. What did you see?
Speaker 2: Yeah, I watched it. It was fascinating. The Senate Judiciary Committee subpoenaed five CEOs of some of the biggest tech companies: Discord, Snap, Meta, X (you know, formerly Twitter), and TikTok. The committee wants to advance several bills that address online safety for children. And this hearing got a ton of publicity, and at the beginning, the Senate Judiciary Chair explained how the committee was feeling.
Speaker 3: These apps have changed the ways we live, work, and play, but as investigations have detailed, social media and messaging apps have also given predators powerful new tools to sexually exploit children. Your carefully crafted algorithms can be a powerful force in the lives of our children. Today we'll hear from the CEOs of those companies.
Their constant pursuit of engagement and profit over basic safety have all put our kids and grandkids at risk. But the tech industry alone is not to blame for the situation we're in. Those of us in Congress need to look in the mirror.
Speaker 2: This was a major issue for two New York Times reporters that you talked with earlier this season.
Speaker 1: Yeah, why don't we actually revisit that interview with Gabriel Dance and Michael Keller.
Speaker 4: We spoke with people who said that as early as two thousand, tech companies knew this was a very serious problem and were doing nothing to solve it. In two thousand and nine, when they introduced scanning technology, we knew that it could be effective in helping stem the problem. Still, tech companies were not using it.
Speaker 5: I would say if you talk with most technology policy people, their answer would be that technology companies don't have that much pressure to get rid of harmful content on their platforms, because Section two thirty of the Communications Decency Act shields technology companies from any liability for content that users post.
Speaker 1: Can you explain more about what Section two thirty does?
Speaker 2: Okay. So, Section two thirty means any lawsuit holding a tech company liable for damages won't go anywhere; they have immunity. So if Facebook or Snapchat or X is storing or transmitting images of CSAM, for example, parents can't hold the company responsible. And try to imagine if it was your child's photo, and if that child was tricked into sending it. But Section two thirty was passed almost thirty years ago, back in nineteen ninety six. No one could have imagined back then TikTok or Instagram or even sextortion. People still had their photos developed at the drugstore. And I have to tell you how real this is. I mean, this happened to a close friend of mine, to her child. You take a vulnerable kid and a savvy adult with no conscience and no barriers.
Speaker 1: Right. So why was there a hearing now?
Speaker 2: It seems in recent months that frustration with tech's immunity is just getting bigger on both sides of the aisle. And look, this isn't the first time Congress has summoned tech leaders for a shaming session. But I was really curious: was this more than a shaming session? So I reached out to Politico technology reporter Rebecca Kern. She was in the room for this whole thing, and she shared some of her thoughts.
Speaker 6: Oh, interesting. I've been covering efforts in Congress to regulate social media companies and how they handle kids' online safety issues. Typically there's a lot of posturing from the senators, but in the room the emotion was very palpable, because this time the committee members invited families whose children have died as a result, they say, of content they've been exposed to on the platforms. A number of children have committed suicide over cyberbullying, over a new phenomenon that I know you guys have covered in the podcast called sextortion, where organized criminal groups create fake accounts that purport to be other children, solicit illicit images from children, and then extort them financially.
Speaker 2: My gosh. Yeah. And the committee chair, Dick Durbin, co-sponsored the STOP CSAM bill. That bill would hold platforms responsible if they host CSAM or make it available. And you're probably thinking, well, who would make those images available? But haven't you ever searched for something? Like, you just took up skiing recently, right, so you want to see more images of skiing, and then the platform's algorithm recommends more content because they think that you like that. Well, it does the same thing with nefarious and dangerous content. And Senator Ted Cruz went after Meta on exactly that point.
Speaker 7: Mr. Zuckerberg, in June of twenty twenty three, The Wall Street Journal reported that Instagram's recommendation systems were actively connecting pedophiles to accounts that were advertising the sale of child sexual abuse material.
In other words, this material wasn't just living on the dark corners of Instagram. Instagram was helping pedophiles find it by promoting graphic hashtags, including hashtag pedwhore and hashtag preteen sex, to potential buyers. Instagram also displayed the following warning screen to individuals who were searching for child abuse material: "These results may contain images of child sexual abuse." And then you gave users two choices: get resources or see results anyway. In what sane universe is there a link for "see results anyway"?
Speaker 1: How did Mark Zuckerberg respond to that?
Speaker 2: There's no good answer for that, but here's what he said.
Speaker 8: Well, because we might be wrong, we try to trigger this warning, or we tried to, when we think that there's any chance that there is.
Speaker 2: Okay. Here's more from Rebecca Kern.
Speaker 6: Tech companies will admit it is for sure not something they want on their platforms. They don't want to be hosting CSAM, and they take great efforts to remove it, and I will give them credit: they invest millions of dollars into AI and machine learning to detect it early. But it's still there, and it gets spread across multiple platforms.
Speaker 1: These companies are self-policing and self-reporting, but we're depending on them to find it and shut it down.
Speaker 2: It's interesting that you bring that up, because a senator from Rhode Island, Senator Sheldon Whitehouse, commented exactly on that issue.
Speaker 9: We are here in this hearing because, as a collective, your platforms really suck at policing themselves. In my view, Section two thirty is a very significant part of that problem.
Speaker 2: Listen, there were great soundbites from senators, but that doesn't translate to policy, right? Rebecca Kern pointed out that Section two thirty served an important purpose, at least for a while.
Speaker 6: We wouldn't be leading the globe in these innovations without Section two thirty allowing them to flourish without lawsuits. But a lot of other senators are saying, okay, we allowed them to flourish and grow.
Now we need to rein them in, and we're an outlier in the whole globe. Europe has been able to pass regulations and hold them accountable, and so a lot of people say it's time to take away this quote-unquote sweetheart deal that we have given to tech companies.
Speaker 1: Did any comments stand out to you while you were watching?
Speaker 2: There were a lot of them, but this one from Amy Klobuchar kind of got me.
Speaker 10: When a Boeing plane lost a door in mid-flight several weeks ago, nobody questioned the decision to ground a fleet of over seven hundred planes. So why aren't we taking this same type of decisive action on the danger of these platforms when we know these kids are dying?
Speaker 1: She has a point, right? When everyone is worried about their own physical safety...
Speaker 2: Boom, it's done. Exactly. And I've got to tell you about another moment that really took the room down, and that was when Meta CEO Zuckerberg testified that social media doesn't really do any harm to kids.
Speaker 8: With so much of our lives spent on mobile devices and social media, it's important to look into the effects on teen mental health and well-being. I take this very seriously. Mental health is a complex issue, and the existing body of scientific work has not shown a causal link between using social media and young people having worse mental health outcomes.
Speaker 1: Did he say that with a straight face?
Speaker 2: He did, and there was some laughter. I mean, it was one very short moment of levity. But you know, it's just so absurd. You don't have to be a social scientist or a psychologist to understand that social media impacts kids a lot.
Speaker 1: Was there anyone there defending the work of technology companies? I mean, there are ways they've enriched all of our lives. Can you even remember life before Amazon?
Speaker 2: Life before Amazon? You mean going to a store and having to wait in line?
No, of course not. But all kidding aside, some senators mentioned that and did praise these companies for adding some value to society. But this hearing wasn't set up for pushback. It was really about these tech companies being told: draconian measures are coming if you don't do a better job. But outside of this, there is an advocate for the tech companies called NetChoice, and they are pushing back pretty hard. They have filed several lawsuits against states that are tired of waiting for the federal government to do something.
Speaker 1: Can you give me an example?
Speaker 2: Sure, here's one. NetChoice is suing the Ohio Attorney General over the Social Media Parental Notification Act. This law requires companies to obtain parental consent before individuals younger than sixteen can use platforms like Facebook, Instagram, YouTube, Snapchat. So NetChoice does not support any of these bills being pushed by the Judiciary Committee. What do they support? Well, free speech is what they hang their hat on. Free speech, free speech all the way. But one thing that they did promote that will be familiar to our season two listeners is to hold child abusers accountable by prosecuting more of them. You know, far too many reports of CSAM offenses are not investigated, not prosecuted, because, and we talked about this, Andrea, it's like triage, right? There's not enough law enforcement to go after all the people that are breaking these laws, and when they're able to go after them, they can prosecute them and at least put them in for some kind of prison time. But despite NetChoice, there was some movement on one of the bills, called KOSA, or the Kids Online Safety Act. Now, this bill wouldn't repeal Section two thirty, so we asked Rebecca Kern what it would do.
Speaker 6: That one specifically would hold tech companies accountable, imposing a duty of care for them to make sure that their recommendation systems, their algorithms, do not recommend harmful, quote, content.
That is the key word: how do you define harmful? For them, they're saying it's suicide content, it's eating disorder content.
Speaker 2: And Rebecca pointed out that some groups are worried about KOSA moving forward.
Speaker 6: Progressive LGBTQ groups are saying, we're worried that this bill also empowers state attorneys general to sue over harmful content. And how they would define harmful content may be, like, trans content or LGBTQ content that these communities would want to see on the platforms. Some conservative-leaning AGs may want to take that down. So they said this could have an inadvertent negative impact on certain vulnerable youth.
Speaker 2: While the CEOs were on the hot seat, and you know, the day before they were called to the hearing, they did make some concessions that are worth mentioning. Here is X CEO Linda Yaccarino.
Speaker 11: X supports the STOP CSAM Act. The Kids Online Safety Act should continue to progress, and we will continue to engage with it and ensure the protection of the freedom of speech.
Speaker 2: And you know, Snap CEO Evan Spiegel also came out in support of KOSA, and look, it's not everything, but maybe it's a start. Here's Politico's Rebecca Kern again.
Speaker 6: These are the constant battles these platforms have to deal with, between privacy, which is such a strong protection in our country, and free speech and other protections, and safety. And there's, you know, no real mandate to put safety first.
Speaker 1: Do you think Section two thirty has a chance of being repealed?
Speaker 2: I asked Rebecca that question, and she seemed pretty doubtful. You know, it's not just the law passing, it's the lawsuits that would follow, and how many years it would be caught up in court.
Speaker 1: I can't help but wonder: did this hearing make a difference?
Speaker 2: If you're asking will it create more safety for children online? I think there is a reason for hope. There was some movement we've never seen before.
But people need to keep applying pressure, because that does make a difference.
Speaker 1: Thank you to Politico's Rebecca Kern for her insight, and thanks to our listeners for your support of Betrayal. Remember, if you want to share your story for the new weekly series of Betrayal coming this summer, email us at Betrayal Pod at gmail dot com. That's Betrayal Pod at gmail dot com. Betrayal is a production of Glass Podcasts, a division of Glass Entertainment Group, in partnership with iHeart Podcasts. The show was executive produced by Nancy Glass and Jennifer Faison, hosted and produced by me, Andrea Gunning, written and produced by Kerry Hartman, also produced by Ben Fetterman, associate producer Kristin Melcurie. Our iHeart team is Ali Perry and Jessica Krincheck. Audio editing and mixing by Matt Alvecchio. Betrayal's theme composed by Oliver Bains. Music library provided by My Music. For more podcasts from iHeart, visit the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.