Speaker 1: There is a new warning out for children in this country, and if you're a parent or a grandparent, you need to understand what is happening to so many kids right now, and it is really being underreported. It is kids that are taking their own lives because they are being bullied online with content that many times is explicit in nature and actually is not even the child that is being targeted by their peers. It is AI and deepfake technology. There are now AI programs and websites, places that you can go for free and take the face of someone you know and put it on the body of someone else that is doing explicit content or activity. There is also something else that's called revenge pornography, and that is where maybe you had something that was supposed to stay private between you and someone else, and then someone gets angry or gets broken up with or the relationship goes sideways, and people will then post it online to get revenge on that person. This is also causing a significant amount of harm to so many kids in this country now having to deal with what happens online and on social media. There is new legislation that I want to tell you about, and it is important legislation that I need you to get behind and call your congressman and your senator about now. Senator Ted Cruz is going to join me to talk about that in a few minutes, but I wanted to lay that out there for you. There's also something else politically that's happening, and that is we are seeing some new data here. And I mentioned on the show before that Barack Obama coming in to try to save Joe Biden actually may have made things worse because of how senile Joe Biden looked on stage. Now we're going to talk about that, but there's something else that's happening.
When they brought in Barack Obama, they were hoping specifically that he would be able to stop the bleeding with the African American vote that is leaving Joe Biden at record numbers, and where they're going is they are actually going to Donald Trump. The support among minorities right now is at a historic high for a Republican presidential candidate, in the Hispanic community as well as the African American community. We're going to get into that also. Now, before we get into the deepfake aspect of this, I want to lay some groundwork, just so you understand how bad it's gotten online for so many kids in this country. Los Angeles, okay, not exactly a conservative bastion of elected officials, has now come out in their unified school district, and they have just banned cell phones and they have also just banned social media in the classroom. The Los Angeles school district voted five to two in favor of banning cell phone and social media use during the day, amid a massive outcry from educators about the harm it is doing to kids, specifically when it comes to bullying, but also when it comes to their education. Now, this ban in Los Angeles, again, this is a liberal city with a liberal, liberal, liberal ideology and school board. What they have now said is this is going to go into effect in the spring of twenty twenty five and goes beyond the previous district policy of only banning cell phones during class instruction while limiting social media to quote educational purposes. Now, the ban comes following a Pew Research Center research paper showing that seventy two percent of high school teachers in the United States believe cell phone use has become a quote major problem inside of their classrooms, because kids now have the ability with their phones and with their AirPods, their earbuds I should say, and their watches to constantly be on social media while they're supposed to be learning.
How would you like to know that every time you make a phone call, you're actually giving back and standing up for the conservative values that you stand for? Now that is pretty fun, because when you're with Patriot Mobile, every time you pay your bill, they take about five percent of your bill, at no extra cost to you, and they give it back to organizations that support free speech, religious freedom, the sanctity of life, as well as our Second Amendment. And then they do something else. They stand and help our veterans, whether it is with mental health issues or it is our wounded warriors that were wounded in battle. They also stand with our police, our fire, our EMTs, our first responder heroes. That's one reason why I love Patriot Mobile. But the real thing you need to know is this. If you're with Big Mobile, you may not understand this or even realize it, but now you do because I'm telling you: they're taking your money and they give massive donations to the Democratic Party, Democratic candidates, and liberal causes, including Planned Parenthood. Now, if you don't want your money going there, you need to make the switch. And switching now has never been easier when it comes to your cell phone. And the number one reason why people don't switch is because they're afraid of coverage issues when they switch. That doesn't happen anymore either. You're on the same exact towers you're on right now when you switch to Patriot Mobile, but you're getting rid of the woke agenda of the radical left. So call them, save money on your bill, and not only will you be doing that, but you'll be standing up for the values you believe in every time you make a call. Nine seven two Patriot. It's nine seven two Patriot. Nine seven two Patriot. Get free activation when you use the offer code Ben. You can call them at nine seven two Patriot or go online at Patriotmobile dot com slash ben. That's Patriotmobile dot com slash ben.
Now, the previous policy that I just mentioned took effect in twenty eleven and became relatively toothless in the face of smartphone use exploding over the past decade, where many are claiming, well, hey, it's for educational purposes, so I need my phone. Quote, our students are glued to their cell phones, not unlike adults, said board member Nick Melvoin, who supports the ban, saying quote, they're scrolling in school, in class time, they have their heads in their hands walking down the hallways. They're not talking to each other or playing at lunch or recess because they have their AirPods in. District members supporting the ban wrote that research shows quote excessive cell phone use impacts adolescents' mental health and well being and is associated with increased stress, anxiety, depression, sleep issues, feelings of aggression, and suicidal thoughts. Let me say that again, because if you're a parent or grandparent, you really need to understand what you are up against right now. The use of a cell phone, and excessive cell phone use, has now become the norm among kids. It impacts the mental health and the well being of your child or grandchild, and it is directly now associated with increased stress, anxiety, depression, sleep issues, feelings of aggression, and the worst one, suicidal thoughts. Research indicates that limiting cell phone usage and social media access during the school day increases academic performance and has positive effects on student mental health. Parents expressed concerns, of course, that the blanket ban could put their children at risk in the case of an emergency. By the way, we survived and had fewer emergencies when we didn't have cell phones. And if a kid needs to get in touch with a parent, it's not hard. You can go to the office and make that phone call. Quote,
I think in emergencies and with parent communication, this is definitely where a lot of parents have expressed their concerns to me, one school board member said. I think it's such a tragic sign of the times that that is what we initially think of, and we all need to do better in this country when it comes to, he says, gun violence prevention and keeping our students safe. Now, I'm not going to go in on the gun violence issue right now. We can save that for another show. But what this boils down to is they're telling you and warning you that your kid's cell phone is not only used to exploit your child, to attack your child. Your kid is dealing with anxiety and depression and suicidal thoughts from having these phones twenty four seven and the bullying that comes along with it. But they're also, and this is extremely important, not getting an education. They're not receiving a quality education because they're all on their phones twenty four seven. I was sitting at the pool recently with my kids. My kids are younger, seven and five, and there was a birthday party, and it was a birthday party for, I'm guessing, a kid that was thirteen or fourteen. It was a large birthday party. It was probably ten to fifteen kids of each sex, boys and girls. And what was shocking in the moment that I witnessed this birthday party was none of the kids were swimming. They were all, quote, eating dinner at that point in the party. Not a single one of the boys was speaking to any of the other boys, and they were completely isolated from the girls whose party it was. So the boys showed up and they were all sitting on their cell phones. None of them were talking to one another. None of them. You think about it, you go to a birthday party, and all of these kids, thirteen, fourteen, whatever they were, all of them on their phones.
Not a single one of the boys was conversing with one another. They weren't even showing each other things on their phones. They literally were all sitting there eating the food at the party, not talking to anyone else. This is just how bad it's gotten. And it's not just that. It is making it where these kids do not know how to have conversations. They don't know how to have eye contact, they don't know how to converse with one another, they don't know how to communicate in basic, I guess you could say, terms. That is the biggest problem here that I see with all of this. Now, let me just say this. You look at what's happening in our schools and you can directly tie it to this other issue and this legislation, incredibly important legislation that we desperately need, that is bipartisan now, to protect people online. When someone is putting out information about them that is horrific or embarrassing, you need a way to take it down online. And right now, if you're the victim of this exploitation, it is virtually impossible to get it taken down online. Take a listen to what Senator Cruz had to say about this legislation that all of you need to get behind. I want to start with painting a picture here, especially so people understand just how important this bipartisan legislation is. You're working with Amy Klobuchar, introducing the tools to address known exploitation of innocent people. This has become a moneymaker online, by the way, for people that want to do this to celebrities. We saw that recently happen with Taylor Swift, and it really brought to light just how damaging these deepfakes can be, and this kind of revenge that some people will take on exes and others, or people they just want to harm or even just make money off of. But also, I know people personally who have been affected by this that are not famous, that are not a celebrity, and it has ruined their reputation.
We have seen warnings from parents on the issue of suicide with young girls, young teens, young boys who have been exploited in this way with things that aren't real, that are fake, and they can't get them taken down. They are then bullied, and it's been tragic and cost many young people their lives. And that's why you guys are doing this.

Speaker 2: Well, that's exactly right, and I'll tell you, one of the major things that brought this issue to my attention was actually a call we got from a constituent, a mom, a family that lives in North Texas in Aledo, and her daughter, a fourteen year old girl, was targeted, targeted by a classmate, a classmate that took her face and her image from social media, from Snapchat and other social media, and used AI, used deepfake technology, to graft her face onto explicit nude images that were not her, and publicized those to all of her classmates. She's fourteen years old. This poor girl goes into school with all of her classmates believing that they've seen explicit sexual images of her, and it was a total lie. But obviously it traumatized this teenage girl significantly, as any young person, or for that matter most adults, would. That is a horrific experience to go through. And listen, there's always been the ability for someone to photoshop and, you know, put Ben Ferguson's head on Arnold Schwarzenegger's body. And by the way, that was you in Conan the Barbarian. I was really impressed, Ben. You should not punch the camel. I'm gonna sic PETA on you. Punching a camel is a bad move. But look, if you grafted your head onto a different body, it was fairly obvious to someone watching. What's changed is, with the deepfake technology, it can now look completely real, where people can look at it and go, dude, I didn't know Ferguson was so ripped. And here's an amazing stat.
Up to ninety five percent of all Internet deepfake videos depict non consensual intimate images, up to ninety five percent, and the vast majority of them target women and girls. And so this is becoming a tool, a tool for cyberbullying, a tool for attacking, maybe attacking an ex girlfriend or an ex wife or an ex partner that you're mad at, and using it just to smear them and attack them. And so this bill, I've introduced it along with Amy Klobuchar. Amy Klobuchar is a Democrat, as you know, from Minnesota. It's also cosponsored by Cynthia Lummis, who's a Republican from Wyoming, Richard Blumenthal, who's a Democrat from Connecticut, Shelley Moore Capito, who is a Republican from West Virginia, and Jacky Rosen, who's a Democrat from Nevada. So we brought together both Republicans and Democrats. And the two elements of this are, number one, making it a crime, making it a felony, to publish non consensual intimate images without consent, against someone's wishes. But number two, as I mentioned, and I think this is the piece. So there are other bills people have introduced that make it a crime to do so, and there are multiple bills, and I'm a cosponsor of other bills to do that. That's a good idea. The piece that is not reflected in any other legislation I know of that's been filed is the take it down component, which is the obligation on the social media company or the big tech company to take the content down.

Speaker 1: And victims have said this to me, and would say this to you, it's one thing that someone does it. It's another thing when there's no mechanism to take it down when you see it, when it's being used against you. We saw this recently with a doctor and nurses whose images were used, and they tried to get it taken down, and they couldn't get it taken down. Like you just said, there's no incentive, in essence, for these big tech companies to even care or notice or respond to you.
Speaker 2: Well, look, that's exactly right. And as we all know, this happened to Taylor Swift. So Taylor Swift, you had people who took her image and made deepfake, false, explicit images. And because Taylor Swift is a global superstar, she spoke out and big tech took it all down. So if you happen to be a global superstar, you can get it taken down. But if you're just an ordinary kid, you're powerless. And let me tell you a story that really illustrates this. At the press conference we did this week, we had victims, and we had the moms of victims who have been targeted by this, who came and joined us. And one of the victims was this fourteen year old girl, and I mentioned that her mom initially reached out to my office, and she's a Texan. When I sat down with the mom last week, she expressed to me frustration. She said, the images are still up. Snapchat has left them up. And she said, I've called repeatedly, I've emailed, I've tried to say, take down these false images of my daughter, they're horrific, they're wrong, and Snapchat basically said, go jump in a lake. She got the runaround, she couldn't talk to a human being, she got referred over and over and over, and they're still up. And I'm sitting there in my office and I turned to my team and I said, that is garbage. And I told my team, I said, I want us to get on the phone with Snapchat today. And I told my team, if need be, put me on the phone with the CEO of Snapchat. And I told the mom, we are going to get this taken down within twenty four hours. They took the images down.

Speaker 1: But it took, and this is the scary part, it took someone getting a hold of you, yes, and then having a United States Senator from Texas then call and basically say, I want to talk to the CEO, to get them to respond. The reality is the majority of Americans are not going to be able to pull that off.
Speaker 2: And that's exactly right, and it shows it's effortless. They could do it instantaneously. They know how to pull it down. They're just so unaccountable. There's such a hubris and arrogance that they don't feel the need to respond. So if you're a global superstar, you can get it taken down. If you're able to get a member of Congress to intervene on your behalf, maybe you can get it taken down. But if you're just an ordinary teenager in Texas who wakes up one day to the friggin' nightmare of walking into eighth grade and every kid there is looking at an image on their phone that appears to be you doing explicit sexual activity, that is an absolute nightmare. And right now, big tech acts with impunity. This legislation, I think we're going to get this legislation passed. It needs to be passed, because you should have a right to say, take this damn thing down, and take it down now.

Speaker 1: So let's talk about what the big tech world is going to respond with, and we've already seen this play. They've said, well, look, this is a state issue, the states should deal with this. And there are some states that have; there are, I think, over twenty states that have dealt with this. But you guys are doing it differently, from a federal perspective, and that is where there's going to be real teeth in the repercussions for those that don't abide by this federal law that you guys are working to pass.

Speaker 2: Correct. Correct, and listen, we've talked before about that. That's one of the benefits of my role as the ranking member on the Senate Committee on Commerce, Science, and Transportation. The Commerce Committee has jurisdiction over about forty percent of the US economy, and as the ranking member, the senior Republican, I'm able to drive legislation like this, as you know.
You know, the FAA Reauthorization Bill that was just signed into law a few weeks ago was the one hundredth piece of legislation that I've authored and passed into law, and from the Commerce Committee we have jurisdiction over big tech. And so few things focus the mind of an industry more than the government with jurisdiction over that industry. And so states are acting, and I'm glad the state of Texas has acted, and that's important. But what we're not seeing is the mechanism to require that the content be taken down. And actually, we modeled it after existing federal legislation. When you're dealing with a violation of trademarks or copyrights, you have a right, you know, if you are Jimi Hendrix. And by the way, if you're Jimi Hendrix, you're dead, so I hope you're not Jimi Hendrix. But if you're Jimi Hendrix or the estate of Jimi Hendrix and someone puts your content online, you have a right to say, go take it down, because that's my content and I own it. And so that's an existing legal mechanism in the world of intellectual property, and we borrowed that model for taking down non consensual intimate images. Now, I'll say, earlier this week I was talking with an editorial board of a major paper in Texas, and they were asking, I think, devil's advocate questions. But one of the things they said is, they said, well, what about the First Amendment? Isn't there a First Amendment right to do this? And I said, no, no, there's not. And I said a couple of things. I said, number one, from the very beginning of the First Amendment, it has never covered defamation. It's never covered libel or slander. And so when you have false content that depicts someone in a way that is defamatory, and intimate and sexual images that are not you are by definition defamatory, that has always been understood, from the very first days the First Amendment was adopted, as not being within the protection of the First Amendment.
But secondly, the First Amendment free speech protections have never covered the intentional infliction of emotional distress. And so if you engage in this, and this is connected to defamation, but the First Amendment does not empower you to harass and exploit and abuse other people, and you can be sued and held liable for doing so. And it's no defense to say, well, gosh, under the First Amendment I can say anything I want. That's not how our legal system works. And it's why I said I think we are going to get this done. I think we are likely to get this legislation passed. And when we do, it is going to be a major protection for kids and for Americans across the country, in Texas and across the country.

Speaker 1: And that brings me to this question. There are a lot of people, especially parents and grandparents, who are going to be listening, and this is something they say, hey, this isn't Republican or Democrat. And this is why I think you guys are working together, you and Amy Klobuchar and the others you mentioned. This is just an issue of right and wrong. So what do they need to do to make sure that their voice is heard, especially if they're in favor of this?

Speaker 2: Yeah, look, speak out and make clear that this is the right thing to do. Look, you and I are both parents. You can imagine the frustration of so many parents listening, dealing with big tech generally, and as you know, this has been a passion of mine for a long time. I've authored and supported multiple bills designed to protect kids online. So a different bill that I have is a bill called KOSMA, where I've joined with Brian Schatz, who's a Democrat from Hawaii, and KOSMA would prohibit children under thirteen from being on social media at all, so it would ban that across the board, and then for children under seventeen,
it would prohibit big tech from engaging in algorithmic boosting, which is the process by which they push content at teenagers. And as you know, big tech, in the pursuit of the dollar, pushes all sorts of horrible content at our kids. They push substance abuse and alcohol abuse, they push self harm, they push suicidal ideation, they push body image, particularly for young girls, all sorts of body image issues. We're seeing dramatically increasing instances of depression, of anxiety. You know, the Surgeon General under Joe Biden just wrote an op ed in the New York Times decrying the risk to mental health from social media for young people. The solution, or at least part of the solution, I think, is the legislation I have with Brian Schatz. It also includes another bill that I've authored with John Fetterman, obviously another Democrat, called the Eyes on the Board Act, which says that any school that receives federal funds, which is pretty much all of them, is required to ban social media from their Wi-Fi on campus, because kids on campus ought to be paying attention to school and not on Snapchat or Instagram or something else. And what I will say on all of these, Ben, is I literally don't know a parent who is not scared, who is not frustrated about how do you protect your kids online.

Speaker 1: It's shocking, and it's scary. It's the Wild West, and there seems to be no real protections for children.

Speaker 2: And look, when you and I were kids, we didn't deal with this. This was not a world we lived in. You know, when I was a teenager, I'd, you know, lay on the floor on, like, the landline phone talking to my girlfriend till one in the morning, and that was, like, as much as you dealt with. You now have these phones that are these portals to just about everything evil in the world, from sexual predators, to bullies, to people just targeting our kids.
And you know what's interesting, Ben, so in the Senate, as you know, look, there are a lot of people who are really damn old. I mean, a lot of my colleagues are in their seventies or eighties or even nineties, and I've got to admit that my colleagues that are of that generation are kind of a little bit befuddled by this issue. They look at it and, frankly, you know, their kids weren't dealing with it when their kids were children, and their grandkids are now, but they're often less connected to it than they would be. And what's interesting is the people that I'm joining with, the senators that are teaming up with me on this, both Democrats and Republicans, almost all of them are either in their forties or fifties. They're younger senators, and almost all of us have kids that are either teenagers or adolescents, almost to a person. If you're a parent and you've got kids who are younger, so you're just starting to face this, I'm sure you're worried about it now, and I promise you, in five or ten years you're going to be terrified about it, because every parent I know is.

Speaker 1: Make sure you share this podcast with your family and your friends. Please write us a five star review. It helps us reach new people, and I'll see you back here tomorrow.