Speaker 1: On this episode of Newt's World. On Monday, President Donald Trump signed the Take It Down Act, bipartisan legislation that enacts stricter penalties for the distribution of non-consensual intimate imagery, sometimes called revenge porn, as well as deepfakes created by artificial intelligence. The legislation, which goes into effect immediately, was introduced by Senator Ted Cruz, a Republican from Texas, and Senator Amy Klobuchar, a Democrat from Minnesota, and gained the support of First Lady Melania Trump. The law makes it illegal to knowingly publish or threaten to publish intimate images without a person's consent, including AI-created deepfakes. It also requires websites and social media companies to remove such material within forty-eight hours of notice from a victim. Here to talk about the Take It Down Act, I'm really pleased to welcome my two guests: Senator Ted Cruz, representing the great State of Texas, and Scott Berkowitz, founder and president of RAINN. Ted, welcome to Newt's World.

Speaker 2: It's great to be with you, Newt.

Speaker 1: I'm really curious, how did you become involved in the Take It Down Act? How big of an impact was the meeting you had with Elliston Berry and her mom? Snapchat had refused for nearly a year to remove an AI-generated deepfake of the then fourteen-year-old.

Speaker 2: What had happened to Elliston was infuriating, and it was wrong. Over a year ago, Elliston was then fourteen. She was a freshman in high school, and one morning her phone started blowing up. All her friends were texting her. And what had happened is a classmate of hers, a boy in her class, had taken a perfectly innocent picture of her from social media and had used an app that uses AI, or artificial intelligence, to create a deepfake, and then sent to all of her classmates in ninth grade what appeared to be naked pictures of Elliston. And the technology is such that it's not a Photoshop with someone's head stuck on a body; it appears perfectly real.
But it was entirely fake, and Elliston understandably was in tears. Listen, it's hard to be a teenage girl today. As the father of two daughters, I know the pressures that are on our girls in particular. It was even more frustrating because, the way the law was, what the boy had done was not illegal. It was perfectly legal to do that. The Take It Down Act corrects that now. And what it does is two things. One, it makes it a crime, a felony, to post non-consensual intimate imagery, either real pictures, so-called revenge porn, or, in Elliston's case, deepfakes of real people. But two, and this goes right to your question, Newt, it puts a legal obligation on the tech platforms to pull the content down.

Speaker 1: When you decided to sponsor the Take It Down Act, I noticed you got your colleague, Senator Amy Klobuchar. How did you and Amy come together to make this a bipartisan bill?

Speaker 2: The way the issue came to my attention initially is because Elliston is a Texan and her mom, Anna, is a Texan. They live in Aledo, Texas, which is North Texas, outside Dallas-Fort Worth. And her mom called my office and said, you're my senator, can you help me? And my staff elevated it to me, and we looked at what happened to Elliston and thought it was horrible. And we discovered this was a phenomenon that was happening all over the country. Last year there was a three thousand percent increase in deepfakes, and over ninety percent of the victims are women and teenage girls. And so we drafted the Take It Down Act to fix this problem. And Amy Klobuchar and I have worked together quite a bit. We're on two committees together: we're on Judiciary together, and we're on Commerce together. I'm the chairman of the Senate Commerce Committee. And she and I have had a good relationship for thirteen years. So I approached Amy, who has been very active in the tech space, as have I, in reining in abuses of big tech, and I asked, do you want to join me on this? And she said yes.
And in fact, the first time I met Elliston and Anna was when they came to D.C. to join Amy and me for the press conference announcing the bill. And we're sitting in my office, and I asked, I said, well, whatever happened to the pictures? And Anna, Elliston's mother, expressed enormous frustration. She said, you know, this happened nine months ago. And she said, I've been on the phone, I've been emailing Snapchat over and over and over again, and I've been running into a complete stone wall. And Newt, I turned to my staff and said, I want you to get the CEO of Snapchat on the phone today. I want those pictures down today. Within two hours, they pulled them down. But it shouldn't take a sitting senator making a phone call to make that happen. And the Take It Down Act, now that President Trump has signed it into law, gives a legal right to any victim: if you notify a tech platform, that picture is me, it is explicit, and you do not have my consent, they have a federal statutory obligation to take it down immediately.

Speaker 1: It's impressive, at a time of great partisanship, that the bill you developed passed the Senate unanimously and then passed the House in a 409 to 2 vote. I mean, there are not many things that have that level of unanimity, so obviously you had found a sweet spot. And I'm curious, how did you craft this to fit within First Amendment rights?

Speaker 2: Well, we made it explicitly focused on non-consensual intimate imagery. And there are a lot of laws across the country that deal with so-called revenge porn, where, say, a boyfriend and girlfriend are in a romantic relationship and they choose to take explicit pictures or videos, and then they have a bad breakup and one or the other decides to post that to the world, to stick it to their former boyfriend or girlfriend. And that, I think, is a grotesque violation of privacy.
And those laws have been upheld across the country, because you don't have a right to do that to somebody else. Texas has a revenge porn law. The problem is, Texas's law did not cover deepfakes. Deepfakes are a new enough phenomenon that most of the laws in place don't address them; that's a hole in the law. And so this did not meet the legal definition of child pornography: even though these appeared to be naked pictures of a fourteen-year-old girl, and a real fourteen-year-old girl, because they were deepfakes, they didn't fall under the definitions of child porn. And so we defined it. It's clear the First Amendment does not cover the right to put out revenge porn and target an individual, and it's also clear the First Amendment doesn't cover child pornography, and so we put deepfakes in that explicit category. And I'll tell you, on the take-it-down obligation, Newt, what we did is we borrowed from an existing legal framework. There's a long-standing law called the Digital Millennium Copyright Act: if you tweet out a song from The Lion King, they'll take it down within hours, because you don't have a right to violate someone else's copyright or trademark. And so every tech platform has an office that deals with notice and takedowns, where they get notified, okay, this is copyrighted material, and they pull it down. Well, what we did is we put this in that same bucket. So that same office that is pulling down the tweet of The Lion King, now, if a teenager or a woman notifies them, hey, this is non-consensual intimate imagery, that same office and that same mechanism has the statutory obligation to remove the content.

Speaker 1: I know that as you were developing this, First Lady Melania Trump had been leading a Be Best campaign, and one of her concerns was online safety. How helpful was she in drawing attention to the whole concept of Take It Down?

Speaker 2: So the First Lady was incredibly helpful.
In the last Congress, we passed Take It Down through the Senate, passed it unanimously, and then in the House, unfortunately, it failed to get passed. House leadership had added it to the Continuing Resolution in December that was going to pass, but then, if you remember, that Continuing Resolution got pulled down, and much of the stuff that had been added got stripped out of it. So it did not pass last Congress. And so this Congress, starting in January, we passed it again through the Senate. And the real challenge, Newt, you've been Speaker, you know House leadership has a million different demands in terms of where to allocate floor time, and so the challenge was getting this to rise up the priority list so that it gets a vote on the House floor. And the First Lady reached out and called my office and said she was very interested in this bill and she wanted to help, which I thought was fantastic. So I invited the First Lady to come to Capitol Hill, and we did a roundtable with the victims. And so the First Lady heard from Elliston directly, heard her story. She heard from Francesca Mani, who is a fifteen-year-old girl in New Jersey; the exact same thing that happened to Elliston happened to her. She also heard from Brandon Guffey, who is a state rep in South Carolina, whose story is even more tragic. His son got a direct message from what he thought was a cute girl, and this supposed cute girl convinced him to send naked pictures of himself to her. Well, it turns out that that cute girl was in fact a con man, who began sextorting him, saying, send me money now, or I'm going to take these naked pictures you just sent, I'm going to send them to your parents, to your family, to your friends. And tragically, this young man, Gavin, took his own life. He committed suicide. And Brandon told that story. The time between the first direct message from the con man and when his son took his life was ninety minutes.
And we've seen suicides all across the country from this growing problem. And so when the First Lady came and heard these stories firsthand, she leaned in. At the roundtable, the Speaker of the House was there, Steve Scalise, the Majority Leader, was there, and Brett Guthrie, the relevant committee chairman in the House, was there. And Melania asked them, will you commit to getting this done? And House leadership said absolutely. And I think the First Lady's involvement elevated and sped up the progress dramatically and helped us get it over the finish line.

Speaker 1: Teenage suicide has become a terrible problem, and it now extends down to eight-, nine-, and ten-year-olds. A lot of it, I think, comes straight out of the effect of social media and the effect of isolation, because people pay so much attention to their cell phones. Let me ask you, do you see additional legislation evolving as we learn more about these kinds of problems, and do you think it can evolve in a bipartisan way?

Speaker 2: I think Congress needs to do more to protect kids online. When you and I were teenagers, there were challenges to being a teenager, but we didn't face all of these forces. Our kids, we give them a phone, and every predator on earth has access to them, every evil force. The pressures that are on our kids: you look at social media, they push substance abuse, they push self-harm, they push body image issues and self-worth issues, and it increases depression, increases anxiety, and increases suicidal ideation. So one example of a bill that I want to move is the Kids Off Social Media Act, or KOSMA, which I've introduced with Brian Schatz, Democrat from Hawaii, and it does three things. Number one, it prohibits children under thirteen from having social media accounts. I think there's no reason for a child that young to have a social media account.
Number two, it prohibits tech platforms from using algorithmic boosting for kids under sixteen, and boosting is how they push particular messages at kids. And then number three, there's a provision from a bill that I introduced separately, called the Eyes on the Board Act, that says any schools that receive federal funds have to block social media on campus. If you're in class, there's no reason for you to be on Snapchat; you ought to be listening to the lessons. And that bill we passed out of the Commerce Committee with overwhelming bipartisan support. It has yet to pass the Senate, but I think that would be another important step. And I'll tell you what's interesting, Newt: our colleagues that are in their seventies and eighties, frankly, when it comes to this issue, most of them are a little puzzled by it. They just haven't dealt with it firsthand. You know, the co-sponsors that I have on this bill are almost exclusively senators in their forties and fifties who have kids at home, who have teenagers or adolescents. And every parent I know who's dealing with this right now is frustrated and doesn't know how to fully protect their kids online and on their phone. And I think we need to do a lot more to help them.

Speaker 1: I say this as somebody in the age group you're describing: you may want to encourage the older senators to talk to their grandchildren.

Speaker 2: Yeah, and I am doing exactly that.

Speaker 1: Suddenly they'd be in a new world. You are a leader on so many different topics, but on this one, I suspect you will literally be able to say over the years that you saved several thousand lives from suicide, and that you saved tens of thousands of people who otherwise would have been deeply humiliated and deeply affected psychologically. So, Ted, thank you for your time.
I know how amazingly busy you are, and this was a real act of leadership on your part, something for which the country owes you a debt of gratitude.

Speaker 2: Well, Newt, thank you. And let me just say, I appreciate the First Lady very much; her leadership was critical. I appreciate President Trump: in the State of the Union, he introduced Elliston Berry, and he called on Congress to pass the bill. And I would be remiss if I didn't say thank you, especially to the victim advocates, because they could have taken their pain and just hurt from it, and instead they decided to stand up and lead. And the bravery, particularly of Elliston and Francesca, but also certainly Brandon and Breeze Liu and other victim advocates, the bravery they've shown is extraordinary, and this would not have gotten done without their courage.

Speaker 1: That's great, that's tremendous. Listen, have a wonderful day, and good luck in Texas.

Speaker 2: All right, take care. Thanks, Newt.

Speaker 1: Scott, thank you for joining me. Can you talk about how the Take It Down Act gained bipartisan support?

Speaker 3: Yeah, absolutely. A lot of credit to Senator Cruz and Senator Klobuchar. Sexual abuse has always been a non-partisan issue. We're fortunate to have really great support from both sides in Congress. I think what Congress saw here is that tech-enabled sexual abuse is the fastest growing form of sexual abuse. Every month, we're seeing a huge increase in cases. Rather than waiting until this is a problem that is too big to be fixed, Senator Cruz and Senator Klobuchar jumped in and wanted to do something about it, so that we can try and slow it down, try and put an end to it before it hits every school in the country.

Speaker 1: The whole issue of the internet's psychological impact is enormous. You wouldn't have thought it thirty years ago.
But a piece of making America healthy again is getting our hands around this whole internet and social media sector, which has become, particularly for very young people, an enormous part of their lives.

Speaker 3: You're absolutely right. Creating non-consensual intimate images has become just shockingly easy. There are dozens of apps and websites now where, within minutes, a kid can submit pictures of their classmates and generate nude photos of them, and within minutes those can be distributed around the school and all around the web. So this is a new form of abuse that is having a really devastating effect on victims, and it's just growing like crazy. And keep in mind, we're just at the beginning of this AI revolution. A few years from now, when these tools become even more accessible and easier to use and more universal, this problem is going to be so far out of control if we don't do something about it now.

Speaker 1: The degree to which younger people in particular live in that world is striking; their reality is in many ways an electronic reality. It shapes them. A lot of states, as I understand it, were trying to find a way to address this. I think there were something like forty-eight different states that enacted some kind of law. What's your sense of the state-level effort?

Speaker 3: There has been some good action in the states, but the laws are very inconsistent, and states sometimes have difficulty because the Internet is open, it's worldwide. State-by-state regulation ultimately isn't going to work. There needs to be some federal regulation so that we can criminalize the distribution of these images across state lines.

Speaker 1: Forty-eight states had enacted some kind of law criminalizing non-consensual distribution of intimate images. South Carolina and Massachusetts had not. What was happening in those two states?

Speaker 3: They were slow to come to the game.
I think that initially there was some opposition on one side on First Amendment grounds, the idea that we're going to be making certain images illegal. But I think the Take It Down Act did a really good job of carving out legitimate uses. So doctors can still share intimate images for medical purposes, law enforcement can still distribute them for investigative purposes, but when they're used to harass or abuse the subject, we've now criminalized them.

Speaker 1: So if somebody actually goes out now and deliberately sends out an artificial-intelligence-generated image, are they personally at risk for having sent it out?

Speaker 3: They are, if they are distributing it without the consent of the person pictured in the image. And I should say, distributing intimate images of children has always been illegal and still is. This expands that to intimate images that might not have been caught under child sexual abuse material laws, as well as to images of adults that are being distributed.

Speaker 1: The key part of implementing this is going to be companies like Meta, TikTok, Google. What's their responsibility?

Speaker 3: They do have a really key role here. One of the most important provisions of the Take It Down Act is a requirement that the big tech platforms, once they're notified of a non-consensual intimate image on their platform, have to take it down within forty-eight hours. That's the provision the Take It Down Act is named for. We are hoping that the tech companies will really embrace the responsibility here. This is about looking out for their customers, and this is about treating their users well. It's incredibly traumatic for a kid to find out that their naked image is circulating around on TikTok, around on Instagram, and not have a way to get it down. And so this finally makes it easy for someone to identify an image, tell the platform, and the platform then has two days to take it down.
Speaker 1: This bill gets signed into law in May, and yet you're setting a deadline of July fourth, which is, by normal federal standards, amazingly fast. What was your rationale for these very big corporations moving this quickly?

Speaker 3: We're being optimistic here, but the law gives the federal government a year to fully implement it, and there's the whole process of the Federal Trade Commission having to come up with regulations around it. But what we're asking is, and we've worked with a lot of these tech companies before, they can move very quickly. This is not a complicated process or complicated technology that they need to build to implement this. They need to add a button on their site that lets people click and report an image, and then they need to set up an automated process so that image gets taken down off their site. So this is something that, if they are committed to it, they could get done in days.

Speaker 1: Given the normal process of lobbyists in Washington, I'm surprised there wasn't a huge effort to delay implementation. What would the cost in human terms have been if you had delayed the implementation for a year?

Speaker 3: Literally millions of new victims who would have had no recourse. Every one of these images, once it's out on the web, can be distributed a thousand times and can have millions of viewers. So the human cost would have just been extraordinary. And one of the reasons that this was passed by the House so quickly this year was the involvement of the First Lady. Melania Trump decided to take up this cause and to push for the passage of the Take It Down Act. This is actually the first policy that she worked on when she became First Lady this term, and that really motivated the leadership of the House and members of Congress to do something about this, to move this quickly.
So I've got to give her huge credit for being in the lead on this and understanding that this is such an important issue for the health and well-being of kids.

Speaker 1: I suspect, given her background as a world-class model, she probably fully understood the importance of imagery and how devastating it could be to have the wrong image out there, and what it does to kids, because there's a period in adolescence where they're very sensitive, very insecure. And of course you had a frightening increase in suicide, which I think is partially associated with cyberbullying and cyber isolation. And so I think her role in this was a great place for her to focus her prestige and her experience. But it wouldn't have happened, I think, without your organization, RAINN. Talk about how RAINN came to be and what its focus is.

Speaker 3: Sure. So RAINN is the largest anti-sexual-violence organization in the US. We work on public policy, trying to improve the criminal justice system, trying to make sure that more perpetrators are caught and convicted. We work on public education, trying to help affect how the country understands and reacts to sexual violence, and motivate them to help prevent further harm. And then we help victims. We run the National Sexual Assault Hotline. We also run a hotline for the Department of Defense for members of the US military around the world, and our victim services programs help more than thirty thousand survivors every month. We've helped more than five million survivors of rape and their loved ones since we started up.

Speaker 1: What does RAINN stand for?

Speaker 3: The Rape, Abuse & Incest National Network.

Speaker 1: When was it created?

Speaker 3: We started it up in nineteen ninety-four.

Speaker 1: And what was the impetus, and who started it?
Speaker 3: I started it, and the impetus was a service gap. There was a need for a national hotline, somewhere for victims of rape to go to get support, to get advice, to get information and help. We originally started by launching the National Sexual Assault Hotline, and then within a couple of years started our work on public policy. And we really had great support going back to when you were Speaker; you were terrifically helpful on these issues and supportive of the work we were trying to do, which I'm really grateful for.

Speaker 1: During the thirty-one years that you've been doing this, has the hotline had a series of trends that evolved, or are there patterns you can see that are somehow informative?

Speaker 3: We've seen a huge growth in usage of the National Sexual Assault Hotline. We were helping about forty thousand people a year back in nineteen ninety-five. Now we're helping nearly forty thousand people a month, so the demand has grown. People being willing to reach out for help, we've seen that grow. And we've seen a big evolution: every year, more and more kids reach out to us. About half of victims are minors, and that's reflected in the people who are contacting the National Sexual Assault Hotline and asking for help.

Speaker 1: Do you think the increase represents an increased problem in society or an increased awareness of the hotline?

Speaker 3: I think the increase is primarily because of increased awareness, increased comfort in talking about the issue, and the diminishment of the stigma around it. I think kids more and more have a better understanding that they're not to blame here, any more than a victim of a mugging or a victim of any other violent crime is to blame. This is a violent crime; the FBI ranks it second only to murder in terms of seriousness. So there's nothing to be ashamed about for a kid who's a victim, or for an adult.
The other trend we're seeing is a huge increase every month in the number of calls that we're getting about technology-enabled sexual abuse, like non-consensual images. We're seeing that increase every single month, and I think that five years from now, that's probably going to be the majority of the cases that we're getting.

Speaker 1: If people do want to call for help and support, what number do they call?

Speaker 3: They can reach us by phone at one eight hundred, six five six, HOPE, or they can get help through online chat at hotline dot rainn dot org.

Speaker 1: And we'll post both of those on our show page, so anybody who knows somebody who needs help can give them that information and help them get to Scott and his team. I think this is a significant step forward in dealing with what sadly is a crisis, and I think, frankly, what you've done with the Take It Down Act begins to get ahead of the curve. We're going to face huge challenges in how we deal with an electronic world that we're beginning to realize has many different negative effects on our kids. And I think you're a piece of that solution. And I think it's great that you were able to work in a bipartisan way and work with the First Lady to get this moved all the way into law, which, in this Congress, is a substantial achievement.

Speaker 3: Well, thank you, we appreciate it. We're thrilled to have worked on this. And I think that this is the first big piece of legislation that has passed that puts restrictions on this sort of abuse and puts requirements on the big tech companies to address sexual abuse. So we're really optimistic that this was passed quickly and early enough to really make a difference and hopefully slow down the growth of this and help victims.

Speaker 1: Scott, I want to thank you for joining me. Our listeners can learn more about the Take It Down Act by visiting your website at rainn dot org.
And I encourage people to read about it and understand that this law will help protect our children and our grandchildren. Thank you to my guests, Senator Ted Cruz and Scott Berkowitz. You can learn more about the Take It Down Act on our show page at newtsworld dot com. Newt's World is produced by Gingrich 360 and iHeartMedia. Our executive producer is Guernsey Sloan. Our researcher is Rachel Peterson. The artwork for the show was created by Steve Penley. Special thanks to the team at Gingrich 360. If you've been enjoying Newt's World, I hope you'll go to Apple Podcasts and both rate us with five stars and give us a review so others can learn what it's all about. Right now, listeners of Newt's World can sign up for my three free weekly columns at gingrich360 dot com slash newsletter. I'm Newt Gingrich. This is Newt's World.