1 00:00:01,840 --> 00:00:06,120 Speaker 1: Broadcasting live from the Abraham Lincoln Radio Studio, the George 2 00:00:06,200 --> 00:00:10,880 Speaker 1: Washington Broadcast Center, Jack Armstrong and Joe Getty. Armstrong and 3 00:00:10,960 --> 00:00:17,240 Speaker 1: Getty, and he, Armstrong and Getty. 4 00:00:23,200 --> 00:00:26,079 Speaker 2: That's a good way to end up dead, especially in Florida. 5 00:00:26,239 --> 00:00:28,200 Speaker 2: You got to think you're about to become a victim 6 00:00:28,200 --> 00:00:30,840 Speaker 2: of a home invasion robbery under the castle doctrine. You're 7 00:00:30,840 --> 00:00:33,800 Speaker 2: gonna shoot first and ask questions later. You're endangering your 8 00:00:33,880 --> 00:00:35,880 Speaker 2: future with this TikTok challenge. 9 00:00:36,360 --> 00:0038,560 Speaker 3: Oh my god, a TikTok challenge. And I love anytime 10 00:00:38,560 --> 00:00:42,080 Speaker 3: the castle doctrine is invoked. What are we talking about? Well, 11 00:00:42,159 --> 00:00:47,640 Speaker 3: let's see, the Sheriff of Volusia County in sunny Florida. There's 12 00:00:47,680 --> 00:00:52,720 Speaker 3: a new viral social media trend where teenagers are kicking 13 00:00:52,760 --> 00:00:55,040 Speaker 3: in people's doors in the middle of the night. 14 00:00:55,840 --> 00:00:56,960 Speaker 1: You've got to be kidding. 15 00:00:57,560 --> 00:01:01,040 Speaker 4: I know. I have at least decent memories of being 16 00:01:01,040 --> 00:01:03,800 Speaker 4: a teenager, and my response would have been immediately, we're 17 00:01:03,800 --> 00:01:06,880 Speaker 4: gonna get shot. There's no effing way I'm doing that. 18 00:01:07,880 --> 00:01:12,960 Speaker 4: I thought for a long time, whenever I'd 19 00:01:12,959 --> 00:01:15,399 Speaker 4: hear the term TikTok challenge, that it wasn't really happening and 20 00:01:15,440 --> 00:01:20,240 Speaker 4: nobody was really doing it. 
There are more people doing 21 00:01:20,240 --> 00:01:22,600 Speaker 4: it than I think, because, that's... I don't know what 22 00:01:22,800 --> 00:01:26,120 Speaker 4: to say, that's nuts. Yeah. So they... 23 00:01:27,720 --> 00:01:32,000 Speaker 3: Everybody on TikTok pounded a nail into their eye, so 24 00:01:32,040 --> 00:01:33,039 Speaker 3: I guess I'd better do it. 25 00:01:33,160 --> 00:01:34,959 Speaker 1: I just, I don't understand the thinking here. 26 00:01:35,600 --> 00:01:37,920 Speaker 4: So the sheriff, Mike Chitwood, who we just heard, said 27 00:01:37,920 --> 00:01:41,600 Speaker 4: teenagers are kicking in doors and scaring people inside. They give 28 00:01:41,680 --> 00:01:45,520 Speaker 4: a couple of examples. These folks had security cameras, and 29 00:01:45,560 --> 00:01:47,840 Speaker 4: it took deputies about two hours to catch a thirteen 30 00:01:47,880 --> 00:01:51,960 Speaker 4: year old girl and a fifteen year old boy. Let's see, 31 00:01:53,000 --> 00:01:54,960 Speaker 4: teens are accused of taking part in a viral prank 32 00:01:55,000 --> 00:01:57,920 Speaker 4: known as the Door Kicking Challenge on TikTok. 33 00:01:58,160 --> 00:01:59,640 Speaker 3: Oh, well, I'm sure. And then you have a friend 34 00:01:59,680 --> 00:02:01,960 Speaker 3: stand and video it. That's the whole thing, right? Then 35 00:02:02,000 --> 00:02:04,040 Speaker 3: you post it and how hilarious it is that some 36 00:02:04,080 --> 00:02:06,560 Speaker 3: people inside were scared to death and thought that they 37 00:02:06,600 --> 00:02:07,680 Speaker 3: and their kids were gonna die. 38 00:02:07,880 --> 00:02:09,640 Speaker 1: See, that's, that's crazy, you are. 39 00:02:09,800 --> 00:02:13,679 Speaker 3: Yeah, even if you don't get shot, just the morality 40 00:02:13,760 --> 00:02:18,040 Speaker 3: of it, that you scared to death the people inside. 41 00:02:18,560 --> 00:02:20,919 Speaker 1: Yeah, there's a combination. 
42 00:02:20,680 --> 00:02:24,240 Speaker 4: of stupidity and moral bankruptcy there. That's a little bit challenging. 43 00:02:24,320 --> 00:02:26,920 Speaker 1: Yeah, but you know, I don't come 44 00:02:26,680 --> 00:02:30,600 Speaker 4: from the sort of neighborhood where we would, like, kill 45 00:02:30,639 --> 00:02:34,639 Speaker 4: people over a minor dispute or a hard look. And 46 00:02:34,800 --> 00:02:38,239 Speaker 4: so, nah, I think it's... I blame Johnny Knoxville. 47 00:02:38,240 --> 00:02:41,880 Speaker 4: The whole Jackass thing just started decades ago 48 00:02:41,960 --> 00:02:44,080 Speaker 4: and it's just continued to where people are willing to 49 00:02:44,160 --> 00:02:47,160 Speaker 4: abuse anyone for supposed laughs. 50 00:02:46,840 --> 00:02:49,480 Speaker 1: As a teenager. Yeah, well. 51 00:02:50,919 --> 00:02:53,880 Speaker 4: You open up, uh, camps, draconian punishments. 52 00:02:54,400 --> 00:02:57,520 Speaker 3: I have an update on diabetes Barbie, just because it's come 53 00:02:57,600 --> 00:02:59,720 Speaker 3: up a couple of times. I mean, we're making the 54 00:02:59,800 --> 00:03:03,280 Speaker 3: joke about, it's a, it's a condition. You can't 55 00:03:03,280 --> 00:03:06,640 Speaker 3: look at somebody and tell they've got diabetes. So like you said, 56 00:03:06,680 --> 00:03:09,560 Speaker 3: what's next, a Barbie that was born with one kidney? I mean, 57 00:03:09,600 --> 00:03:14,480 Speaker 3: is that a Barbie? No, diabetic Barbie wears a glucose, 58 00:03:14,600 --> 00:03:17,840 Speaker 3: glucose monitor on her arm, comes with, the Barbie set, 59 00:03:18,280 --> 00:03:20,960 Speaker 3: an insulin infusion set on her body, and carries, it 60 00:03:20,960 --> 00:03:26,440 Speaker 3: carries the meter. So very visible markers of type one diabetes. Okay, 61 00:03:26,600 --> 00:03:30,919 Speaker 3: I don't know how this makes a little... Yeah, exactly, 62 00:03:31,040 --> 00:03:31,760 Speaker 3: that's a good question. 
63 00:03:31,800 --> 00:03:32,560 Speaker 1: Who's wanting that? 64 00:03:34,160 --> 00:03:37,240 Speaker 4: How about Lyme disease? People get that, ticks bite them, 65 00:03:37,280 --> 00:03:41,320 Speaker 4: et cetera. How about Lyme disease? We deserve a Barbie, 66 00:03:41,400 --> 00:03:42,120 Speaker 4: Lyme disease Barbie. 67 00:03:42,240 --> 00:03:44,960 Speaker 3: So you want a Barbie who's, for whatever reason, pulling 68 00:03:45,000 --> 00:03:46,920 Speaker 3: one of those things, you know, the metal things on 69 00:03:46,960 --> 00:03:48,760 Speaker 3: wheels I had at the hospital when I was 70 00:03:49,000 --> 00:03:51,000 Speaker 3: getting chemo. You know, you gotta go to the bathroom and 71 00:03:51,040 --> 00:03:52,720 Speaker 3: you gotta walk down the hall with that thing behind 72 00:03:52,800 --> 00:03:55,600 Speaker 3: you because you got a drip in your arm. Right? Is 73 00:03:55,640 --> 00:03:56,520 Speaker 3: that the Barbie you want? 74 00:03:56,960 --> 00:04:00,440 Speaker 4: Uh, yeah, maybe like an all-purpose, not entirely 75 00:04:00,520 --> 00:04:04,600 Speaker 4: healthy Barbie, right? 76 00:04:04,520 --> 00:04:09,560 Speaker 3: An odd idea, right? Yeah, I agree, very odd idea. 77 00:04:10,960 --> 00:04:13,040 Speaker 3: And let's, instead of having a whole bunch of different 78 00:04:13,040 --> 00:04:15,760 Speaker 3: Barbies, I like yours, the all-purpose, let's have all 79 00:04:15,840 --> 00:04:18,719 Speaker 3: the ailments or bad things that could happen to a 80 00:04:18,760 --> 00:04:22,200 Speaker 3: human being in one Barbie. Right, a set of accessories you 81 00:04:22,200 --> 00:04:25,080 Speaker 3: can buy. It's like the beach house, you know. Yeah, she's 82 00:04:25,120 --> 00:04:28,640 Speaker 3: got skin problems, she's missing an eye, she's mangy, she's 83 00:04:28,680 --> 00:04:29,560 Speaker 3: just all kinds of things. 84 00:04:29,640 --> 00:04:32,440 Speaker 1: Head lice Barbie, Barbie. 
85 00:04:33,040 --> 00:04:37,080 Speaker 4: Well, right, but I mean, diabetes can be a serious condition, 86 00:04:37,279 --> 00:04:41,080 Speaker 4: of course, so why not, like, and I don't mean 87 00:04:41,080 --> 00:04:43,880 Speaker 4: to make light of this obviously, but why not leukemia 88 00:04:43,920 --> 00:04:48,040 Speaker 4: Barbie? Or, you know, how did they come 89 00:04:47,880 --> 00:04:53,280 Speaker 1: to this decision? I don't know. She's, uh... Yeah, you 90 00:04:53,360 --> 00:04:56,719 Speaker 1: gotta pile it all in one thing, get it over with. 91 00:04:57,560 --> 00:05:01,480 Speaker 4: Yeah, just all-purpose unhealthy Barbie. Speaking of beloved American products, 92 00:05:01,520 --> 00:05:06,479 Speaker 4: I just wanted to mention this very quickly: Air India's 93 00:05:06,560 --> 00:05:10,440 Speaker 4: probe into that horrific crash that just one guy walked 94 00:05:10,440 --> 00:05:16,080 Speaker 4: away from and two hundred and some people died, absolutely horrific. 95 00:05:16,200 --> 00:05:17,720 Speaker 1: Obviously. Whatever happened to... 96 00:05:18,520 --> 00:05:21,120 Speaker 3: Has he, like I expected, ended up on Dancing 97 00:05:21,120 --> 00:05:22,120 Speaker 3: with the Stars or anything? 98 00:05:23,279 --> 00:05:25,120 Speaker 4: Not yet. I haven't heard a word from him. He 99 00:05:25,200 --> 00:05:27,920 Speaker 4: probably realized, wait a minute, they're exploiting me, all these 100 00:05:27,960 --> 00:05:30,360 Speaker 4: media people. They don't care about me. And he has 101 00:05:31,000 --> 00:05:33,520 Speaker 4: gone with a lower profile. 
Yeah, two hundred and 102 00:05:33,520 --> 00:05:36,800 Speaker 4: sixty people died, which is just terrible, obviously, but the 103 00:05:36,880 --> 00:05:40,880 Speaker 4: investigation is focusing on the actions of the jet's pilots 104 00:05:42,320 --> 00:05:45,120 Speaker 4: and not the Boeing seven eighty seven Dreamliner, which has 105 00:05:45,200 --> 00:05:50,599 Speaker 4: an unbelievably great safety record, and the engines appeared to 106 00:05:50,640 --> 00:05:55,400 Speaker 4: be fine too. Preliminary findings indicate that the switches controlling 107 00:05:55,480 --> 00:05:58,800 Speaker 4: fuel flow to the jet's two engines were turned off, 108 00:05:59,120 --> 00:06:01,960 Speaker 4: leading to an apparent loss of thrust shortly after takeoff. 109 00:06:02,000 --> 00:06:04,720 Speaker 3: People said, hey, motorcycle riders, you ever leave your 110 00:06:05,200 --> 00:06:07,839 Speaker 3: little switch to the wrong side and then your motorcycle 111 00:06:07,839 --> 00:06:09,120 Speaker 3: starts to die a block from your house? 112 00:06:09,120 --> 00:06:11,400 Speaker 1: Oh yeah, who hasn't flipped that one? Show of hands. Yeah, 113 00:06:11,440 --> 00:06:14,080 Speaker 1: everybody's done that. But these pilots did that. 114 00:06:14,839 --> 00:06:18,680 Speaker 4: Unclear why they were turned off or how. Unclear whether 115 00:06:18,760 --> 00:06:22,400 Speaker 4: it was accidental or intentional, or whether 116 00:06:22,440 --> 00:06:23,640 Speaker 4: there was an attempt to turn 117 00:06:23,480 --> 00:06:24,200 Speaker 1: them back on. 
118 00:06:24,800 --> 00:06:26,600 Speaker 4: If the switches were off, that could explain why the 119 00:06:26,640 --> 00:06:29,600 Speaker 4: jet's emergency power generator, known as the ram air turbine, 120 00:06:29,720 --> 00:06:32,840 Speaker 4: or RAT, appears to have activated in the moments before 121 00:06:32,880 --> 00:06:36,960 Speaker 4: the aircraft plummeted into a nearby hostel for medical students. 122 00:06:37,120 --> 00:06:38,960 Speaker 1: Nightmarish, blah blah blah. 123 00:06:39,000 --> 00:06:42,919 Speaker 4: So, you know, it is not a blow to Boeing's 124 00:06:43,000 --> 00:06:44,280 Speaker 4: already bruised reputation, 125 00:06:44,400 --> 00:06:47,040 Speaker 1: it would seem. Boy, very few people will hear that. 126 00:06:47,200 --> 00:06:48,359 Speaker 1: But God dang it. 127 00:06:48,400 --> 00:06:49,880 Speaker 3: So they didn't get more than, what was it, like 128 00:06:49,920 --> 00:06:52,040 Speaker 3: six hundred feet off the ground. Why can't we get 129 00:06:52,040 --> 00:06:53,039 Speaker 3: this plane to go higher? 130 00:06:53,160 --> 00:06:55,720 Speaker 1: Dude, where's the thrust? Where's the throttle? I have the 131 00:06:55,760 --> 00:06:56,200 Speaker 1: gas on. 132 00:06:57,040 --> 00:07:01,080 Speaker 4: Oh, good Lord, that's a bad story. You want better 133 00:07:01,120 --> 00:07:01,919 Speaker 4: pilots than that. 134 00:07:02,839 --> 00:07:06,400 Speaker 3: God. I would say, we're going to get more into 135 00:07:06,480 --> 00:07:09,560 Speaker 3: the immigration raids in SoCal coming up a little bit later. 136 00:07:09,840 --> 00:07:16,080 Speaker 3: The, uh, ICE raided an illegal pot farm. It turned 137 00:07:16,280 --> 00:07:20,280 Speaker 3: ugly when Antifa turned out to be there and resisted 138 00:07:20,280 --> 00:07:22,400 Speaker 3: the ICE people in the way that they do, violently. 139 00:07:22,840 --> 00:07:25,760 Speaker 3: So more on that. 
This has got the potential 140 00:07:25,800 --> 00:07:29,560 Speaker 3: to turn into a really big story, not this individual one, 141 00:07:29,640 --> 00:07:35,559 Speaker 3: but just the overall Antifa violence against ICE agents, right, 142 00:07:35,680 --> 00:07:39,840 Speaker 3: and the militant left more broadly. But yeah, these Antifa 143 00:07:39,880 --> 00:07:42,800 Speaker 3: goons, who, you remember, the mainstream media told us didn't 144 00:07:42,920 --> 00:07:46,120 Speaker 3: exist just a few years ago when they were raining 145 00:07:46,200 --> 00:07:49,080 Speaker 3: violence down on Portland, for instance, one hundred and twelve 146 00:07:49,160 --> 00:07:50,960 Speaker 3: nights in a row. Yeah, they're out and proud now. 147 00:07:51,000 --> 00:07:51,640 Speaker 1: Let me 148 00:07:51,600 --> 00:07:53,720 Speaker 3: read this from Byron York of the Washington Examiner. I 149 00:07:53,760 --> 00:07:56,320 Speaker 3: thought this was really interesting. There's no doubt Antifa is 150 00:07:56,360 --> 00:07:59,400 Speaker 3: a fringe extremist group. Y'all know what Antifa is, right? 151 00:07:59,440 --> 00:08:03,000 Speaker 3: That's short for anti-fascist, even though they're absolutely not that. 152 00:08:03,440 --> 00:08:07,680 Speaker 3: But a lot of the East Coast mainstream media, who 153 00:08:07,760 --> 00:08:11,160 Speaker 3: didn't know what Antifa was on the West Coast, believed they 154 00:08:11,000 --> 00:08:13,440 Speaker 1: were anti-fascists and went with that for a while. 155 00:08:13,440 --> 00:08:17,480 Speaker 3: But Antifa is a fringe extremist group, not isolated. There 156 00:08:17,480 --> 00:08:20,880 Speaker 3: are cells throughout the country, but extremists, says Byron York. 
157 00:08:21,200 --> 00:08:24,240 Speaker 3: Now the issue of opposition to Trump's immigration enforcement has 158 00:08:24,280 --> 00:08:27,840 Speaker 3: brought the fanatics of Antifa more closely than ever in 159 00:08:27,960 --> 00:08:32,640 Speaker 3: line with the beliefs of progressive Democrats, and perhaps those 160 00:08:32,640 --> 00:08:36,120 Speaker 3: in the party's mainstream too. Would it be fair to 161 00:08:36,160 --> 00:08:39,240 Speaker 3: call Antifa the militant wing of the Democratic Party? 162 00:08:39,360 --> 00:08:39,880 Speaker 1: Maybe so. 163 00:08:40,240 --> 00:08:42,280 Speaker 3: And if it's not fair, it is closer to true 164 00:08:42,280 --> 00:08:45,800 Speaker 3: than many Democrats would ever want to acknowledge, writes Byron York. 165 00:08:47,480 --> 00:08:51,680 Speaker 3: I saw Fetterman, Senator Fetterman of Pennsylvania, say something fairly 166 00:08:51,760 --> 00:08:55,760 Speaker 3: similar on Fox TV today, that ICE needs to 167 00:08:55,760 --> 00:09:01,360 Speaker 3: be funded and supported and they're just doing their job. He's, 168 00:09:01,960 --> 00:09:06,640 Speaker 3: you know, an outlier, I think. But, uh, how closely 169 00:09:06,679 --> 00:09:09,000 Speaker 3: allied is Antifa with the thinking of a lot of 170 00:09:09,040 --> 00:09:11,560 Speaker 3: the Democratic Party? Like Byron York says, I 171 00:09:11,559 --> 00:09:15,120 Speaker 3: think a lot closer than on previous Antifa issues. 172 00:09:15,840 --> 00:09:20,040 Speaker 4: I'm really intrigued by his question, are they the militant 173 00:09:20,080 --> 00:09:24,240 Speaker 4: wing of the Democratic Party, because they are absolutely organized. 174 00:09:24,280 --> 00:09:27,840 Speaker 4: They are decentralized, but they're one hundred percent organized, which 175 00:09:27,880 --> 00:09:30,520 Speaker 4: runs counter to one of the other lies 176 00:09:30,559 --> 00:09:32,320 Speaker 4: that the media tried to tell us for a long time. 
177 00:09:33,000 --> 00:09:38,560 Speaker 4: Their views overlap a great deal with the Democratic Party, 178 00:09:38,600 --> 00:09:42,000 Speaker 4: certainly the progressive wing. They have been aided, abetted, and 179 00:09:42,080 --> 00:09:45,760 Speaker 4: covered up for by the left wing of the Democratic Party. 180 00:09:45,800 --> 00:09:49,040 Speaker 3: Well, if the politics were flipped, because of the 181 00:09:49,120 --> 00:09:52,600 Speaker 3: way the media leans left so much, every Democrat would 182 00:09:52,600 --> 00:09:55,360 Speaker 3: be asked, do you support what Antifa is doing or not? 183 00:09:55,400 --> 00:09:56,679 Speaker 1: They'd have to answer for it. 184 00:09:56,760 --> 00:10:00,679 Speaker 3: But... Oh, Republicans, you mean. Yeah, if it were Republicans, yeah, yeah, 185 00:10:01,640 --> 00:10:04,840 Speaker 3: every Republican, every time they showed up anywhere, would be 186 00:10:04,880 --> 00:10:07,400 Speaker 3: asked about that and have to denounce them, you know, 187 00:10:07,480 --> 00:10:10,200 Speaker 3: like the people at Charlottesville after that, and you know, 188 00:10:10,320 --> 00:10:13,520 Speaker 3: all that sort of thing, endlessly. But Democrats won't be 189 00:10:13,559 --> 00:10:16,320 Speaker 3: asked to speak to this, at least not yet. But 190 00:10:16,559 --> 00:10:18,600 Speaker 3: this story is not over by far. 
191 00:10:19,520 --> 00:10:23,679 Speaker 4: And in a related story, I came across some really interesting scholarship 192 00:10:23,720 --> 00:10:26,520 Speaker 4: that finally thoroughly answers the question I've been asking over 193 00:10:26,559 --> 00:10:29,160 Speaker 4: and over again and trying to dig into: why are 194 00:10:29,440 --> 00:10:32,240 Speaker 4: young women, not only in America but around the world, 195 00:10:32,280 --> 00:10:39,160 Speaker 4: becoming so radicalized and so left? And way disproportionately women, 196 00:10:39,640 --> 00:10:42,959 Speaker 4: like that big fracas at Columbia University, where eighty people 197 00:10:42,960 --> 00:10:45,560 Speaker 4: were arrested after taking over and vandalizing the library and 198 00:10:45,559 --> 00:10:48,920 Speaker 4: holding people hostage briefly. Eighty people arrested, sixty-one of 199 00:10:48,960 --> 00:10:53,760 Speaker 4: them women. What's going on? Finally got a great, thorough answer. 200 00:10:54,480 --> 00:10:57,240 Speaker 4: It's troubling, but man, you have to, you have to 201 00:10:57,360 --> 00:11:00,800 Speaker 4: diagnose the problem before you can start talking about a cure. 202 00:11:01,720 --> 00:11:03,880 Speaker 1: I did my first ever cold plunge. 203 00:11:03,920 --> 00:11:06,839 Speaker 3: It was kind of a baby cold plunge, in that 204 00:11:06,960 --> 00:11:10,600 Speaker 3: I just jumped into a cold shower, although it was 205 00:11:10,640 --> 00:11:13,439 Speaker 3: shocking enough. This morning when I jumped in the cold shower, 206 00:11:13,679 --> 00:11:17,320 Speaker 3: I went... I actually made that noise. I'm glad, my... 207 00:11:17,440 --> 00:11:19,880 Speaker 3: thanks, you get credit. Yeah, I'm glad. 
My bedroom's far 208 00:11:19,920 --> 00:11:21,880 Speaker 3: away from the kids, so they don't think something horrible 209 00:11:21,880 --> 00:11:23,880 Speaker 3: has happened, or, or I don't want them to hear 210 00:11:23,880 --> 00:11:24,920 Speaker 3: their dad make that noise. 211 00:11:25,400 --> 00:11:29,160 Speaker 4: I'm picturing multisyllabic obscenities. I don't think I cussed. 212 00:11:29,200 --> 00:11:33,360 Speaker 3: I think I just made that very whooshy sound and 213 00:11:33,440 --> 00:11:38,040 Speaker 3: expelled all my breath. It was shocking. But man, I 214 00:11:38,120 --> 00:11:40,000 Speaker 3: came out of that and just like, let's 215 00:11:39,760 --> 00:11:43,079 Speaker 1: go, come on, world, bring it on. I could take it. 216 00:11:43,080 --> 00:11:44,280 Speaker 1: It was, it was fantastic. 217 00:11:44,880 --> 00:11:47,880 Speaker 4: Oh, I need to consult a cardiologist on whether that's a 218 00:11:47,880 --> 00:11:49,920 Speaker 4: good idea for me. But I gotta admit, I know 219 00:11:50,080 --> 00:11:54,720 Speaker 4: that feeling, that exhilarated feeling, because I've had to take 220 00:11:54,760 --> 00:11:56,920 Speaker 4: cold showers at campgrounds and that sort of thing. 221 00:11:57,640 --> 00:12:01,160 Speaker 1: And you do come out ready to take on the world. 222 00:12:01,320 --> 00:12:03,920 Speaker 3: Why isn't everybody doing this if it's a good idea? Well, 223 00:12:03,960 --> 00:12:05,319 Speaker 3: because it sucks. 224 00:12:05,120 --> 00:12:07,160 Speaker 1: It sucks. It sucks. 225 00:12:07,280 --> 00:12:08,800 Speaker 3: I don't even know if I can do it tomorrow, 226 00:12:08,800 --> 00:12:11,920 Speaker 3: even with the positive results. It was hard to do. 227 00:12:12,600 --> 00:12:14,160 Speaker 3: I just had to do it without thinking about it, 228 00:12:14,559 --> 00:12:17,920 Speaker 3: just did it, because it just seemed horrible. 
I could 229 00:12:17,920 --> 00:12:19,959 Speaker 3: feel the cold water splashing on my feet and I 230 00:12:20,000 --> 00:12:24,040 Speaker 3: thought, this is gonna be awful. And it was, briefly. 231 00:12:24,800 --> 00:12:26,360 Speaker 3: So we got a lot more stuff to get to, Tuesday 232 00:12:26,400 --> 00:12:32,800 Speaker 3: here. A little AI talk. One of my favorite pundits 233 00:12:32,840 --> 00:12:35,720 Speaker 3: was ranting on a podcast yesterday about how underwhelmed they 234 00:12:35,760 --> 00:12:38,040 Speaker 3: are with ChatGPT, and I thought, well, that's interesting. 235 00:12:38,080 --> 00:12:40,520 Speaker 3: That is the opposite of my experience. I think it's 236 00:12:40,720 --> 00:12:41,520 Speaker 3: freaking amazing. 237 00:12:41,920 --> 00:12:45,560 Speaker 1: What's he doing with it? That's what I wondered. Yeah. Interesting. 238 00:12:45,760 --> 00:12:49,600 Speaker 4: So, speaking of AI, this is kind of tangentially connected 239 00:12:49,640 --> 00:12:53,360 Speaker 4: to it. Special Report on Fox News, Bret Baier, had 240 00:12:53,360 --> 00:12:56,760 Speaker 4: a really interesting report about how much power AI systems 241 00:12:56,840 --> 00:13:01,360 Speaker 4: need, and it's astonishing amounts of power, and how it 242 00:13:01,440 --> 00:13:05,160 Speaker 4: might be provided, and how it is fueling a return 243 00:13:05,240 --> 00:13:09,520 Speaker 4: to nuclear energy, which only went away because of Graham Nash, 244 00:13:09,520 --> 00:13:12,840 Speaker 4: Bruce Springsteen, and their band of merry leftist idiots. 245 00:13:13,120 --> 00:13:14,880 Speaker 1: Maybe the seventies. 246 00:13:16,280 --> 00:13:16,480 Speaker 5: On you. 247 00:13:16,559 --> 00:13:19,080 Speaker 4: That's a short, short list of the people involved with 248 00:13:19,120 --> 00:13:25,160 Speaker 4: the No Nukes movement. 
But I found these clips very interesting, 249 00:13:25,200 --> 00:13:28,240 Speaker 4: particularly some of the statistics held within them. 250 00:13:28,480 --> 00:13:30,079 Speaker 1: Let's just go ahead and roll it, Michael. 251 00:13:30,360 --> 00:13:33,800 Speaker 6: In order to supply the increasing demand, data centers are 252 00:13:33,840 --> 00:13:38,679 Speaker 6: providing a twenty-four-hour connection to continue advancing AI technology. 253 00:13:39,040 --> 00:13:43,719 Speaker 7: Running all of these computational resources that modern AI needs 254 00:13:43,760 --> 00:13:46,079 Speaker 7: requires an awful lot of electricity. 255 00:13:47,760 --> 00:13:52,320 Speaker 6: AI models are frequently trained to remain relevant, software requires 256 00:13:52,400 --> 00:13:56,280 Speaker 6: regular updates, and data centers need large cooling systems to 257 00:13:56,360 --> 00:13:57,280 Speaker 6: keep everything running. 258 00:13:57,600 --> 00:14:01,080 Speaker 7: The plans for the largest computing clusters to run the 259 00:14:01,160 --> 00:14:04,080 Speaker 7: largest AI algorithms in the world in the not too 260 00:14:04,080 --> 00:14:07,200 Speaker 7: distant future is in the range of one gigawatt to 261 00:14:07,320 --> 00:14:12,920 Speaker 7: five gigawatts. One gigawatt is about one Hoover Dam worth 262 00:14:12,960 --> 00:14:17,480 Speaker 7: of electricity. So imagine five Hoover Dams being used to 263 00:14:17,600 --> 00:14:21,520 Speaker 7: just power one data center full of one company's AI. 264 00:14:22,160 --> 00:14:26,640 Speaker 3: That seems like an underappreciated aspect of AI. 265 00:14:27,800 --> 00:14:32,800 Speaker 4: How are we gonna feed the grid? I mean, it's already over 266 00:14:32,800 --> 00:14:37,160 Speaker 4: taxed in a lot of areas. Of course, my understanding 267 00:14:37,200 --> 00:14:39,360 Speaker 4: of it is, in many cases they'll have their own 268 00:14:39,440 --> 00:14:43,120 Speaker 4: quote unquote grid. 
There will be a nuclear reactor. It 269 00:14:43,160 --> 00:14:47,000 Speaker 4: will provide power to the data center right next door. 270 00:14:47,240 --> 00:14:47,720 Speaker 1: Period. 271 00:14:48,080 --> 00:14:49,720 Speaker 3: I wonder if AI is never going to get going 272 00:14:49,840 --> 00:14:52,040 Speaker 3: until we perfect that whole fusion thing, where we have 273 00:14:52,160 --> 00:14:54,480 Speaker 3: kind of unlimited free energy. 274 00:14:55,120 --> 00:14:57,240 Speaker 4: That would be exciting. There's one more clip. I found 275 00:14:57,240 --> 00:14:58,880 Speaker 4: this interesting as well. 276 00:14:59,280 --> 00:15:02,560 Speaker 6: US reactors supply nearly twenty percent of the nation's power. 277 00:15:02,800 --> 00:15:07,360 Speaker 6: The ninety three nuclear generators create more electricity annually than 278 00:15:07,400 --> 00:15:12,000 Speaker 6: the eight thousand wind, solar, and geothermal power plants combined. 279 00:15:12,400 --> 00:15:15,040 Speaker 8: The grid operators tell us, and we develop a lot 280 00:15:15,080 --> 00:15:18,000 Speaker 8: of solar, that we have to develop twenty times as 281 00:15:18,120 --> 00:15:21,960 Speaker 8: much solar to get the same impact as one megawatt 282 00:15:22,200 --> 00:15:23,840 Speaker 8: of nuclear energy. 283 00:15:25,080 --> 00:15:29,000 Speaker 4: Yeah, so again, the ninety-three, did they say, nuclear 284 00:15:29,160 --> 00:15:31,600 Speaker 4: power plants that are functioning right now in the US 285 00:15:31,960 --> 00:15:35,160 Speaker 4: produce more than the eight thousand wind and solar and 286 00:15:35,200 --> 00:15:38,400 Speaker 4: geothermal sites around the country that are enormously large and 287 00:15:38,440 --> 00:15:39,680 Speaker 4: disruptive to the environment. 
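The gigawatt figures tossed around in those clips can be put in perspective with some quick arithmetic. A minimal sketch follows; the constant-draw assumption and the roughly 10,700 kWh-per-year average US household figure are ballpark assumptions for illustration, not numbers from the broadcast:

```python
# Rough scale check on the "one to five gigawatt" data-center figures.
# Assumes the facility draws its full rated power around the clock, and
# an average US household uses about 10,700 kWh per year (ballpark).

HOURS_PER_YEAR = 8760

def annual_twh(gigawatts: float) -> float:
    """Energy consumed per year, in terawatt-hours, at constant draw."""
    return gigawatts * HOURS_PER_YEAR / 1000  # GW * h = GWh; /1000 -> TWh

def equivalent_households(gigawatts: float, kwh_per_home: float = 10_700) -> int:
    """How many average homes the same annual energy would supply (rough)."""
    kwh = gigawatts * 1e6 * HOURS_PER_YEAR  # GW -> kW, times hours in a year
    return round(kwh / kwh_per_home)

for gw in (1, 5):
    print(f"{gw} GW continuous is roughly {annual_twh(gw):.1f} TWh/yr, "
          f"on the order of {equivalent_households(gw):,} homes")
```

On these assumptions, a single one-gigawatt cluster running flat out consumes energy on the scale of several hundred thousand households, which is the point the clips are driving at.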
288 00:15:39,760 --> 00:15:42,880 Speaker 3: So if the hippies hadn't killed nuclear power, with the 289 00:15:43,000 --> 00:15:45,880 Speaker 3: United States being, you know, better than everybody in the 290 00:15:45,920 --> 00:15:49,960 Speaker 3: world at everything for the past eighty years, God, 291 00:15:49,760 --> 00:15:51,560 Speaker 1: how far down the road would we have been? 292 00:15:51,600 --> 00:15:53,320 Speaker 3: I mean, if we'd have gone all in on nuclear 293 00:15:53,320 --> 00:15:55,560 Speaker 3: power back in the seventies, how far down the road 294 00:15:55,560 --> 00:15:58,440 Speaker 3: would we be to just everything is run on nuclear power? 295 00:16:00,000 --> 00:16:03,160 Speaker 3: And the odd circumstance being that the very people 296 00:16:03,160 --> 00:16:05,480 Speaker 3: that killed nuclear power are the people that hate fossil 297 00:16:05,480 --> 00:16:08,200 Speaker 3: fuel power. Well, we wouldn't need hardly any fossil fuel 298 00:16:08,240 --> 00:16:09,160 Speaker 3: power if we'd 299 00:16:09,000 --> 00:16:10,320 Speaker 1: have gone all in on nukes. 300 00:16:10,840 --> 00:16:12,920 Speaker 4: I was just gonna say, hey, Bruce, we would have 301 00:16:13,000 --> 00:16:17,800 Speaker 4: put five zillion metric tons less carbon into the air 302 00:16:18,200 --> 00:16:20,520 Speaker 4: if y'all had just said, hey, let's make sure our 303 00:16:20,600 --> 00:16:23,320 Speaker 4: safety is up to snuff on this stuff, as opposed 304 00:16:23,360 --> 00:16:26,080 Speaker 4: to just reacting emotionally and acting like it was an 305 00:16:26,120 --> 00:16:27,880 Speaker 4: evil spirit or something like that. 306 00:16:28,120 --> 00:16:30,520 Speaker 3: Be interesting if it turns out that AI is what 307 00:16:31,320 --> 00:16:36,400 Speaker 3: gets us over our nuclear energy phobia, just out of practicality. 308 00:16:36,480 --> 00:16:38,240 Speaker 1: You gotta have it. 
309 00:16:38,240 --> 00:16:41,880 Speaker 4: It's just another example of, I love, you know, dreamers 310 00:16:41,880 --> 00:16:44,600 Speaker 4: and poets and songwriters and artists and stuff. I just 311 00:16:44,680 --> 00:16:50,240 Speaker 4: don't want them in charge, right? Eh, boy. So, speaking 312 00:16:50,400 --> 00:16:55,840 Speaker 4: of people on the far left, really interesting thinking slash 313 00:16:55,960 --> 00:17:00,480 Speaker 4: science about why young women are so prone to radicalization 314 00:17:00,720 --> 00:17:01,240 Speaker 4: right now. 315 00:17:01,560 --> 00:17:03,200 Speaker 1: It's a huge issue. 316 00:17:03,840 --> 00:17:07,160 Speaker 4: Ask any young dude, they'll tell you chicks are crazy. 317 00:17:07,320 --> 00:17:10,240 Speaker 3: We'll describe why, scientifically. Cool. And if you missed the 318 00:17:10,240 --> 00:17:12,120 Speaker 3: segment, get the podcast, Armstrong and Getty 319 00:17:11,920 --> 00:17:17,359 Speaker 5: On Demand. Armstrong and Getty. You have stolen my 320 00:17:17,480 --> 00:17:20,840 Speaker 5: dreams and my childhood with your empty words. 321 00:17:21,680 --> 00:17:27,960 Speaker 1: I'm sorry about that. My bad. That's on me. That's, 322 00:17:27,960 --> 00:17:28,359 Speaker 1: of course, 323 00:17:28,640 --> 00:17:33,320 Speaker 4: young Greta Thunberg. Back in twenty nineteen or eighteen, Time 324 00:17:33,480 --> 00:17:36,440 Speaker 4: named her the youngest ever Person of the Year. 325 00:17:36,600 --> 00:17:38,000 Speaker 1: She was... that speech. 326 00:17:38,040 --> 00:17:39,960 Speaker 3: Do you remember that? Do we have more from her? 327 00:17:40,920 --> 00:17:43,000 Speaker 3: We love her, we love these clips from her. Yeah, 328 00:17:43,040 --> 00:17:44,600 Speaker 3: she was, how old was she at the time when 329 00:17:44,600 --> 00:17:50,880 Speaker 3: she was Person of the Year? Young. Six months old. 330 00:17:51,160 --> 00:17:54,000 Speaker 7: Blah blah blah, blah blah blah. 
331 00:17:54,040 --> 00:17:56,720 Speaker 4: Well, she might have been a little older at that point. Anyway, 332 00:17:57,119 --> 00:18:01,960 Speaker 4: I came across this piece by Claire Lehmann about when 333 00:18:02,000 --> 00:18:05,560 Speaker 4: women are radicalized, and I'm going to characterize some of 334 00:18:05,560 --> 00:18:08,080 Speaker 4: it and read some of it. Obviously, Katie, Jack, anybody, 335 00:18:08,160 --> 00:18:12,359 Speaker 4: jump in anytime you want. And she writes that women 336 00:18:12,400 --> 00:18:16,040 Speaker 4: moving to the left is a global phenomenon. A 337 00:18:16,119 --> 00:18:21,200 Speaker 4: study on a radical environmental group in the UK described 338 00:18:21,200 --> 00:18:24,680 Speaker 4: it as a highly feminized protest culture. Surveys found that 339 00:18:24,720 --> 00:18:28,160 Speaker 4: attendees at climate demonstrations in cities around the world tend 340 00:18:28,160 --> 00:18:32,480 Speaker 4: to be about sixty percent female. Recent American progressive movements, 341 00:18:32,600 --> 00:18:36,720 Speaker 4: from Black Lives Matter to Gaza encampments, many of which 342 00:18:36,720 --> 00:18:40,679 Speaker 4: were supported or led by the female-founded Jewish Voice for Peace. 343 00:18:41,680 --> 00:18:45,800 Speaker 4: A bunch of examples: South Korea, the United States, Germany, the 344 00:18:45,880 --> 00:18:47,760 Speaker 4: United Kingdom. Gen Z women 345 00:18:47,560 --> 00:18:50,040 Speaker 1: have shifted toward hyper-progressive political 346 00:18:49,640 --> 00:18:52,840 Speaker 4: positions, while men in the same age cohort have held 347 00:18:52,840 --> 00:18:54,280 Speaker 4: steady or moved to the right. 348 00:18:54,520 --> 00:18:59,000 Speaker 3: Well, that doesn't help with the global lack of babies problem. No, 349 00:18:59,040 --> 00:19:02,080 Speaker 3: in fact, we're now splitting completely along political lines. 350 00:19:02,640 --> 00:19:03,439 Speaker 1: Yeah, no kidding. 
351 00:19:03,680 --> 00:19:06,400 Speaker 4: In the US, according to Gallup data, women age eighteen 352 00:19:06,400 --> 00:19:10,440 Speaker 4: to thirty are now thirty percentage points more liberal than 353 00:19:10,440 --> 00:19:12,040 Speaker 4: their male peers. 354 00:19:12,200 --> 00:19:16,800 Speaker 3: And if I remember correctly from various polls I've looked 355 00:19:16,800 --> 00:19:20,640 Speaker 3: at, it used to be we were pretty much in line. 356 00:19:20,920 --> 00:19:24,880 Speaker 3: Right, right. Why is this happening? We'll get to that in a minute or two, 357 00:19:25,119 --> 00:19:28,680 Speaker 3: and, trust me, it is really interesting. And then 358 00:19:28,720 --> 00:19:31,600 Speaker 3: she mentions that there's a growing awareness of how young 359 00:19:31,640 --> 00:19:35,479 Speaker 3: men are drawn into radicalization, and there are studies about it, 360 00:19:35,680 --> 00:19:37,680 Speaker 3: and people are curious about it, partly because men tend 361 00:19:37,680 --> 00:19:41,440 Speaker 3: to be more violent and so there's a more immediate 362 00:19:41,480 --> 00:19:45,399 Speaker 3: need to understand it. But there's practically zero study of 363 00:19:45,480 --> 00:19:53,400 Speaker 3: radicalization among women, other than some in radical Islam. Sometimes 364 00:19:53,440 --> 00:19:57,879 Speaker 3: mainstream institutions don't just overlook female extremism, they actively encourage it. 365 00:19:58,119 --> 00:20:00,919 Speaker 3: And she gives a bunch of examples that are interesting 366 00:20:00,960 --> 00:20:03,720 Speaker 3: but would take a lot of time, and says this 367 00:20:04,240 --> 00:20:07,440 Speaker 3: dynamic is perhaps best reflected in the career of Greta Thunberg. 368 00:20:07,520 --> 00:20:09,560 Speaker 3: Since she began skipping school at the age of fifteen 369 00:20:09,600 --> 00:20:14,440 Speaker 3: to demand action on climate change.
Well, see, that's 370 00:20:14,480 --> 00:20:19,680 Speaker 3: funny, we agree on that, sweetheart. Thunberg has been showered 371 00:20:19,680 --> 00:20:21,359 Speaker 3: with encouragement and awards. 372 00:20:21,440 --> 00:20:22,840 Speaker 1: They mentioned the Time thing. 373 00:20:23,400 --> 00:20:27,760 Speaker 4: She's received multiple Nobel Peace Prize nominations and an array of 374 00:20:27,800 --> 00:20:30,840 Speaker 4: awards from media, philanthropic, scientific, and academic institutions. 375 00:20:31,280 --> 00:20:33,199 Speaker 1: It makes up for the fact that we stole her dreams. 376 00:20:33,720 --> 00:20:34,120 Speaker 1: They paid... 377 00:20:34,320 --> 00:20:37,840 Speaker 5: They painted, like, a fifty-foot mural, a terrifying mural of 378 00:20:37,880 --> 00:20:39,879 Speaker 5: her on the side of a building in San Francisco 379 00:20:40,280 --> 00:20:41,920 Speaker 5: that you'd see every time you left the city. 380 00:20:41,920 --> 00:20:44,720 Speaker 1: And it was awful. Wow. Yeah. 381 00:20:46,040 --> 00:20:50,640 Speaker 4: And so Thunberg's trajectory illustrates a broader pattern: radical behavior 382 00:20:50,640 --> 00:20:53,320 Speaker 4: from young women is not just tolerated, but actively encouraged 383 00:20:53,359 --> 00:20:57,320 Speaker 4: through awards, platforms, and institutional support. This creates a feedback loop. 384 00:20:57,760 --> 00:20:59,920 Speaker 4: But wait, Joe, you said you were going to object: 385 00:21:00,119 --> 00:21:03,280 Speaker 4: that's true of men too, that sort of thing. Okay, 386 00:21:03,320 --> 00:21:07,720 Speaker 4: here we go. We're going to take one more step, 387 00:21:07,960 --> 00:21:10,560 Speaker 4: kind of a preliminary step, and get to why women, 388 00:21:10,800 --> 00:21:15,320 Speaker 4: or girls in particular.
The incentive structures that rewarded Thunberg 389 00:21:15,400 --> 00:21:18,200 Speaker 4: so handsomely for her climate activism have since encouraged her 390 00:21:18,200 --> 00:21:23,080 Speaker 4: to expand into pro-Palestinian activism, or what some, including me, 391 00:21:23,240 --> 00:21:27,800 Speaker 4: call the permanent omni-cause. As she said, quote, 392 00:21:29,240 --> 00:21:32,199 Speaker 4: if you as a climate activist don't also fight for 393 00:21:32,240 --> 00:21:35,640 Speaker 4: a free Palestine and an end of colonialism and oppression 394 00:21:35,760 --> 00:21:38,720 Speaker 4: all over the world, then you should not be able 395 00:21:38,760 --> 00:21:40,760 Speaker 4: to call yourself a climate activist. 396 00:21:41,040 --> 00:21:43,560 Speaker 1: Well, that's nonsensical. 397 00:21:43,960 --> 00:21:48,880 Speaker 4: Now we get back into Claire's absolutely excellent writing and scholarship. 398 00:21:48,960 --> 00:21:51,719 Speaker 3: If you're for the death penalty, then you 399 00:21:51,760 --> 00:21:54,040 Speaker 3: can't call yourself a fiscal conservative. 400 00:21:54,560 --> 00:21:55,960 Speaker 1: What? Right? Right? 401 00:21:57,040 --> 00:22:02,520 Speaker 4: Let's see. This demand for ideological purity across unrelated 402 00:22:02,560 --> 00:22:06,879 Speaker 4: causes is a signature move of female radicalism and a 403 00:22:06,960 --> 00:22:12,120 Speaker 4: feature of how intersectionality is used in activist cultures.
Intersectionality, 404 00:22:12,160 --> 00:22:15,560 Speaker 4: which was originally, like, an academic framework for understanding different 405 00:22:15,560 --> 00:22:18,600 Speaker 4: forms of disadvantage and how they can overlap, is now 406 00:22:18,680 --> 00:22:22,439 Speaker 4: a litmus test for moral conformity, not only on issues 407 00:22:22,480 --> 00:22:25,320 Speaker 4: like climate and Gaza, but also on heavily charged topics 408 00:22:25,359 --> 00:22:28,439 Speaker 4: like abortion, where deviation from the dominant view is treated 409 00:22:28,480 --> 00:22:32,320 Speaker 4: as betrayal. While generally not coercing people through violence, female 410 00:22:32,440 --> 00:22:36,360 Speaker 4: radicals coerce through threats of shaming and social exclusion. It's 411 00:22:36,440 --> 00:22:39,240 Speaker 4: easy to dismiss such actions as inconsequential compared to 412 00:22:39,280 --> 00:22:42,760 Speaker 4: the violence of male radicals, and she gets into the 413 00:22:42,840 --> 00:22:46,320 Speaker 4: damage that some of the social coercion does among young women. 414 00:22:46,400 --> 00:22:48,560 Speaker 4: But I promised you the really interesting stuff; here 415 00:22:48,560 --> 00:22:55,320 Speaker 4: it comes. Still, existing studies in moral psychology and 416 00:22:55,400 --> 00:22:58,919 Speaker 4: social behavior offer valuable clues about the underlying dynamics of 417 00:22:58,920 --> 00:23:03,800 Speaker 4: what we're talking about. Moral foundations theory, developed by 418 00:23:03,840 --> 00:23:07,639 Speaker 4: social psychologist Jonathan Haidt and his colleagues, and 419 00:23:07,680 --> 00:23:10,560 Speaker 4: I'm a big admirer of his, argues that human moral 420 00:23:10,600 --> 00:23:14,200 Speaker 4: reasoning is built on a set of intuitive foundations. All right, 421 00:23:14,240 --> 00:23:17,840 Speaker 4: we all have moral reasoning built on the following things.
422 00:23:18,040 --> 00:23:19,960 Speaker 1: And you might quibble with some of the things, 423 00:23:19,680 --> 00:23:28,919 Speaker 4: but this is his theory: loyalty, authority, care, fairness, and purity. 424 00:23:29,160 --> 00:23:32,479 Speaker 4: And yes, we will explain these. A twenty twenty study 425 00:23:32,560 --> 00:23:36,080 Speaker 4: using this framework across sixty-seven countries found that women 426 00:23:36,200 --> 00:23:39,720 Speaker 4: consistently scored higher than men on the latter three. That 427 00:23:39,760 --> 00:23:44,399 Speaker 4: would be care, fairness, and purity. The care foundation relates 428 00:23:44,400 --> 00:23:47,560 Speaker 4: to our sensitivity to the suffering of others, an extension 429 00:23:47,560 --> 00:23:51,080 Speaker 4: of the instinct that compels parents, especially mothers, to respond 430 00:23:51,119 --> 00:23:54,640 Speaker 4: to infant distress. I tell you what, if you're a parent, 431 00:23:54,720 --> 00:23:57,680 Speaker 4: especially a woman, and there's a baby crying, you cannot 432 00:23:57,720 --> 00:24:04,920 Speaker 4: maintain an even keel; a poker face is impossible. Anyway, fairness, 433 00:24:05,280 --> 00:24:07,560 Speaker 4: the second of those three, is tied to notions of 434 00:24:07,760 --> 00:24:12,640 Speaker 4: justice and equality, while purity, which originally evolved to protect against disease, 435 00:24:13,160 --> 00:24:17,280 Speaker 4: can manifest as a desire for ideological or moral cleanliness. 436 00:24:17,920 --> 00:24:21,520 Speaker 4: These tendencies, while adaptive in many contexts, can also make 437 00:24:21,600 --> 00:24:26,920 Speaker 4: young women particularly receptive to political narratives framed in terms 438 00:24:26,960 --> 00:24:32,760 Speaker 4: of trauma, injustice, and moral absolutism. And they also create 439 00:24:32,880 --> 00:24:38,120 Speaker 4: vulnerability to ideologies that use victimhood as currency.
And then 440 00:24:38,400 --> 00:24:42,560 Speaker 4: the way young women organize their social lives compounds this vulnerability. 441 00:24:42,880 --> 00:24:46,600 Speaker 4: Studies by developmental psychologists, whom she mentions, have found that 442 00:24:46,680 --> 00:24:49,600 Speaker 4: female friend groups tend to be less resilient than those 443 00:24:49,680 --> 00:24:53,040 Speaker 4: of males, and many women suffer from an intense fear 444 00:24:53,119 --> 00:24:54,560 Speaker 4: of social exclusion. 445 00:24:54,920 --> 00:24:55,600 Speaker 1: That's the pressure. 446 00:24:56,400 --> 00:24:58,679 Speaker 4: Yeah. And this is what I've been talking about, with 447 00:24:58,800 --> 00:25:02,080 Speaker 4: kind of an imperfect knowledge, having raised boys and girls 448 00:25:02,080 --> 00:25:03,600 Speaker 4: and coached them and mentored them and that sort 449 00:25:03,600 --> 00:25:08,560 Speaker 4: of thing. The fear of being kicked out of the 450 00:25:08,640 --> 00:25:14,960 Speaker 4: friend group among girls is, like, mostly unfamiliar to guys. Yeah, 451 00:25:15,000 --> 00:25:17,280 Speaker 4: they might realize that the cool guys don't want to 452 00:25:17,280 --> 00:25:19,480 Speaker 4: hang out with them, but they'll find their own friend 453 00:25:19,480 --> 00:25:21,360 Speaker 4: group and be pretty comfortable with it and not think 454 00:25:21,359 --> 00:25:23,440 Speaker 4: about it much anymore. Right? And they'll call the other 455 00:25:23,520 --> 00:25:26,720 Speaker 4: guys dicks and, again, just won't think about it anymore. 456 00:25:28,280 --> 00:25:29,720 Speaker 1: Anyway, I'm so good. 457 00:25:29,960 --> 00:25:33,159 Speaker 3: If you're a woman, you're really, really constantly on the 458 00:25:33,160 --> 00:25:33,639 Speaker 3: lookout for 459 00:25:33,720 --> 00:25:35,200 Speaker 1: "Am I about to get kicked out of this group?" 460 00:25:35,200 --> 00:25:36,840 Speaker 1: and worried about it. Right?
461 00:25:36,960 --> 00:25:39,960 Speaker 4: Most women, anyway. Many women suffer from an intense fear 462 00:25:40,040 --> 00:25:44,520 Speaker 4: of social exclusion, the pressure to fit in. True with you, Katie? 463 00:25:45,359 --> 00:25:47,960 Speaker 5: Uh, the fear of being kicked out of a group? 464 00:25:48,000 --> 00:25:50,480 Speaker 5: Not so much, because I wasn't really part of one. My 465 00:25:50,520 --> 00:25:51,600 Speaker 5: group was guys. 466 00:25:51,880 --> 00:25:54,680 Speaker 1: And you're the girliest girl I've ever known. 467 00:25:54,720 --> 00:25:58,320 Speaker 5: No, I'm not. But I do get that whole, 468 00:25:58,359 --> 00:26:01,320 Speaker 5: like, maybe FOMO. Like, I did have a group of 469 00:26:01,359 --> 00:26:04,359 Speaker 5: girlfriends that all hung out together, and when they 470 00:26:04,400 --> 00:26:05,920 Speaker 5: would do that, I wasn't included. 471 00:26:06,359 --> 00:26:09,400 Speaker 1: That stung a little. Yeah. 472 00:26:09,480 --> 00:26:13,760 Speaker 4: Yeah. So anyway, again, we're talking about tendencies and averages 473 00:26:13,800 --> 00:26:17,760 Speaker 4: and the typical. But again, the pressure to fit into 474 00:26:17,880 --> 00:26:20,800 Speaker 4: a group is stronger for girls than for boys, possibly 475 00:26:20,880 --> 00:26:23,720 Speaker 4: leading girls to support beliefs or ideas out of a 476 00:26:23,840 --> 00:26:28,399 Speaker 4: desire for social harmony rather than true conviction. These dynamics 477 00:26:28,400 --> 00:26:33,920 Speaker 4: create perfect conditions for availability cascades, a social phenomenon described 478 00:26:34,000 --> 00:26:37,040 Speaker 4: by social scientists in which a group comes to 479 00:26:37,080 --> 00:26:40,240 Speaker 4: hold a belief through chain reactions. I found this super 480 00:26:40,240 --> 00:26:43,720 Speaker 4: interesting too.
Take, for example, Greta Thunberg's declaration that climate 481 00:26:43,760 --> 00:26:47,640 Speaker 4: activists must also fight for Palestinian liberation. In progressive social 482 00:26:47,680 --> 00:26:50,640 Speaker 4: circles where Thunberg is held up as a moral authority... 483 00:26:50,880 --> 00:26:55,400 Speaker 4: good lord, so, so wrong... some girls might think 484 00:26:55,440 --> 00:27:00,440 Speaker 4: this ar... Yes, again, we find ourselves agreeing, Greta. Anyway, 485 00:27:01,720 --> 00:27:04,399 Speaker 4: in progressive social circles where Thunberg is held up as 486 00:27:04,440 --> 00:27:07,800 Speaker 4: a moral authority, some girls might think this argument makes 487 00:27:07,880 --> 00:27:12,880 Speaker 4: no sense, but they won't say so. Collectively, such silence 488 00:27:12,920 --> 00:27:16,480 Speaker 4: can be mistaken for universal agreement, pressuring others to mold 489 00:27:16,520 --> 00:27:20,399 Speaker 4: their views to fit in. This artificial consensus can snowball 490 00:27:20,640 --> 00:27:24,040 Speaker 4: as individuals assume everyone else in their peer group agrees 491 00:27:24,080 --> 00:27:27,879 Speaker 4: with a given sentiment, completely unaware that many don't. The 492 00:27:27,960 --> 00:27:31,440 Speaker 4: result is a fragile system held together by fear rather 493 00:27:31,560 --> 00:27:32,320 Speaker 4: than belief. 494 00:27:32,560 --> 00:27:37,440 Speaker 3: Boy, and the two loudest, most aggressive people with their 495 00:27:37,640 --> 00:27:40,080 Speaker 3: view would be spouting it, and everybody else would be like, 496 00:27:40,119 --> 00:27:41,679 Speaker 3: I guess we're all agreeing with this. 497 00:27:41,800 --> 00:27:43,639 Speaker 4: "I assume everybody's agreeing with that. I don't, but I 498 00:27:43,640 --> 00:27:46,120 Speaker 4: guess I'll go along with it." And this, 499 00:27:46,160 --> 00:27:48,840 Speaker 4: you know, putting aside the male-female thing...
That last 500 00:27:48,880 --> 00:27:53,360 Speaker 4: part especially really helps describe, or answer, the question Jack 501 00:27:53,359 --> 00:27:55,680 Speaker 4: and I have asked over and over again: how did 502 00:27:55,720 --> 00:27:59,720 Speaker 4: the very, very small number of people, although they have 503 00:27:59,840 --> 00:28:03,159 Speaker 4: the megaphone of the media and education, but the very 504 00:28:03,160 --> 00:28:04,919 Speaker 4: small number of people who believe a lot of this 505 00:28:05,040 --> 00:28:10,520 Speaker 4: progressive nonsense, the radical gender theory, the trans thing, for instance, 506 00:28:11,600 --> 00:28:13,959 Speaker 4: how did such a small group of people hold sway 507 00:28:14,040 --> 00:28:16,679 Speaker 4: over so many people? Well, part of it is that 508 00:28:17,200 --> 00:28:23,240 Speaker 4: availability cascade, or whatever they call it: the assumption 509 00:28:23,320 --> 00:28:27,840 Speaker 4: that everybody agrees because nobody is disagreeing, just because everybody 510 00:28:27,840 --> 00:28:30,520 Speaker 4: doesn't want to stand up to the bullies, and so 511 00:28:30,600 --> 00:28:33,159 Speaker 4: it snowballs. And then here's the final step in how 512 00:28:33,240 --> 00:28:37,760 Speaker 4: this works, especially with girls. Social media intensifies these cascades. 513 00:28:38,040 --> 00:28:43,120 Speaker 4: When female friendship groups migrate online, superficial displays of consensus, 514 00:28:43,400 --> 00:28:48,000 Speaker 4: like the sharing of memes, badges, and hashtags, can feel mandatory.
515 00:28:48,520 --> 00:28:51,560 Speaker 4: Platforms like Instagram and TikTok serve up a stream of 516 00:28:51,640 --> 00:28:56,280 Speaker 4: trauma-related content, activating the care instinct while exposing young 517 00:28:56,320 --> 00:29:00,400 Speaker 4: women to constant cues that their safety, belonging, and self-worth 518 00:29:00,440 --> 00:29:05,560 Speaker 4: depend on adopting pure ideological postures. The result is 519 00:29:05,560 --> 00:29:10,400 Speaker 4: a technological and ideological hijacking of female psychology. Can you 520 00:29:10,440 --> 00:29:13,560 Speaker 4: imagine a young woman who is part of one of 521 00:29:13,560 --> 00:29:15,840 Speaker 4: those friend groups, and they all hashtag, and they all 522 00:29:15,880 --> 00:29:18,280 Speaker 4: agree on all the issues, saying, you know, I agree 523 00:29:18,280 --> 00:29:21,040 Speaker 4: with ninety percent of that, but that idea you just 524 00:29:21,280 --> 00:29:25,480 Speaker 4: expressed there, that's bunk? That would take a hell of 525 00:29:25,520 --> 00:29:27,760 Speaker 4: a lot of moral courage. Yeah, and not a lot 526 00:29:27,760 --> 00:29:31,080 Speaker 4: of people have a lot of moral courage. And 527 00:29:31,120 --> 00:29:33,760 Speaker 4: this is especially true when you're young and, you know, 528 00:29:33,840 --> 00:29:36,240 Speaker 4: you just want to have friends and hang out and 529 00:29:36,240 --> 00:29:41,080 Speaker 4: fit in. Right? Yeah, yeah. And radical politics were not 530 00:29:41,160 --> 00:29:42,960 Speaker 4: part of that for our generation. 531 00:29:43,120 --> 00:29:44,280 Speaker 1: No, really, hardly at all. 532 00:29:44,240 --> 00:29:48,200 Speaker 4: But it is, almost constantly, for kids, especially online.
533 00:29:48,360 --> 00:29:50,840 Speaker 4: It comes as no surprise, then, that progressive girls were 534 00:29:50,840 --> 00:29:53,920 Speaker 4: the first group to suffer a major mental health decline 535 00:29:54,160 --> 00:29:57,000 Speaker 4: following the mass adoption of smartphones and social media around 536 00:29:57,040 --> 00:29:59,760 Speaker 4: twenty twelve. As Haidt points out in his excellent book 537 00:29:59,760 --> 00:30:03,120 Speaker 4: The Anxious Generation and in his newsletter, gen Z girls 538 00:30:03,120 --> 00:30:06,640 Speaker 4: have been socialized online in a culture based on hypervigilance 539 00:30:06,760 --> 00:30:16,160 Speaker 4: toward harm, accompanied by demands for moral absolutism and purity. 540 00:30:17,240 --> 00:30:18,600 Speaker 1: I find that 541 00:30:18,560 --> 00:30:23,320 Speaker 4: reasoning, combined with the science behind it, damn near airtight. 542 00:30:23,560 --> 00:30:27,520 Speaker 4: Claire Lehmann, writing in The Dispatch. Well done, Claire. 543 00:30:27,360 --> 00:30:30,240 Speaker 3: So what do we do about this? Take away the vote? 544 00:30:30,520 --> 00:30:31,959 Speaker 1: Oh boy. 545 00:30:32,880 --> 00:30:36,360 Speaker 4: Oh, I'm gonna have you shocked. 546 00:30:36,440 --> 00:30:37,440 Speaker 1: And I... 547 00:30:39,560 --> 00:30:44,640 Speaker 4: We need a little more study of female violence now. Yeah, 548 00:30:44,680 --> 00:30:49,760 Speaker 4: but on a serious level, what do we do is 549 00:30:49,800 --> 00:30:51,200 Speaker 4: a great question. 550 00:30:51,800 --> 00:30:52,000 Speaker 1: Yeah.
551 00:30:53,600 --> 00:30:56,840 Speaker 3: I've read a number of books about the Vietnam War 552 00:30:56,920 --> 00:31:02,640 Speaker 3: in which, prominently, were young women fighting 553 00:31:02,680 --> 00:31:06,560 Speaker 3: for the North Vietnamese who were great at infiltrating 554 00:31:06,600 --> 00:31:10,200 Speaker 3: because they weren't suspected and often could pull off looking 555 00:31:10,280 --> 00:31:12,360 Speaker 3: like thirteen-year-olds when they were nineteen-year-olds 556 00:31:12,440 --> 00:31:16,960 Speaker 3: or whatever. But I mean, they were a major force 557 00:31:17,040 --> 00:31:19,280 Speaker 3: to be dealt with because they were so committed. 558 00:31:19,520 --> 00:31:23,880 Speaker 5: Well, and I watched a crazy documentary about how the 559 00:31:23,920 --> 00:31:28,600 Speaker 5: big-time ISIS recruiters were the women. Oh wow. Right, 560 00:31:28,640 --> 00:31:31,120 Speaker 5: they were the ones out there pulling people in. 561 00:31:31,280 --> 00:31:33,920 Speaker 1: Wow, that's interesting too. Yeah, similar reasons. 562 00:31:34,320 --> 00:31:36,960 Speaker 4: Well, and as Orwell wrote in Nineteen Eighty-Four, the 563 00:31:37,000 --> 00:31:40,280 Speaker 4: book, not the year, and it's a work of fiction, 564 00:31:40,360 --> 00:31:45,800 Speaker 4: but it's describing how socialism reaches its ultimate, you know, 565 00:31:45,840 --> 00:31:48,960 Speaker 4: its inevitable endpoint of totalitarianism. 566 00:31:49,600 --> 00:31:50,120 Speaker 1: He wrote: 567 00:31:50,680 --> 00:31:53,280 Speaker 4: It was always the women, and above all the young 568 00:31:53,280 --> 00:31:56,160 Speaker 4: ones, who were the most bigoted adherents of the party, 569 00:31:56,400 --> 00:31:59,840 Speaker 4: the swallowers of slogans, the amateur spies and nosers-out 570 00:31:59,840 --> 00:32:04,680 Speaker 4: of unorthodoxy.
And that goes right to her, you know, 571 00:32:04,840 --> 00:32:10,959 Speaker 4: her set of arguments: that it's about conformity and purity 572 00:32:11,240 --> 00:32:16,600 Speaker 4: of belief. You can't express any doubts about the set 573 00:32:16,640 --> 00:32:20,200 Speaker 4: of beliefs or you will be cast out. He observed 574 00:32:20,200 --> 00:32:22,560 Speaker 4: that in the forties... it was the forties or fifties 575 00:32:22,600 --> 00:32:23,280 Speaker 4: when he wrote 576 00:32:23,080 --> 00:32:26,400 Speaker 3: that? He wrote Nineteen Eighty-Four in nineteen forty-eight. That's why 577 00:32:26,400 --> 00:32:28,480 Speaker 3: he named it that; he just reversed the year. It's an easy 578 00:32:28,560 --> 00:32:32,400 Speaker 3: way to remember. Anyway. Fascinating stuff. Yeah, a lot more 579 00:32:32,440 --> 00:32:34,680 Speaker 3: on the way. Stay here. 580 00:32:36,960 --> 00:32:39,480 Speaker 6: One of the biggest French fry factories in North America 581 00:32:39,560 --> 00:32:42,680 Speaker 6: is closing because Americans are eating fewer French fries. 582 00:32:44,280 --> 00:32:45,760 Speaker 1: People are eating fewer 583 00:32:45,520 --> 00:32:47,840 Speaker 6: French fries than ever? Every Uber Eats driver was like, 584 00:32:47,840 --> 00:32:48,640 Speaker 6: speak for yourself. 585 00:32:51,560 --> 00:32:54,640 Speaker 3: Wow. You should never eat French fries. I should never 586 00:32:54,640 --> 00:32:58,840 Speaker 3: eat French fries. I do, though, fairly regularly. And some 587 00:32:58,920 --> 00:33:02,440 Speaker 3: of them are delicious. They're Satan's own side dish to me. 588 00:33:02,640 --> 00:33:07,960 Speaker 3: They are the combination of deliciousness and utter inadvisability. On 589 00:33:08,400 --> 00:33:11,880 Speaker 3: top of it, they have, like, no nutritional value, 590 00:33:11,920 --> 00:33:15,280 Speaker 3: is that correct? Terrible idea. There's, like, no benefit.
Oh, 591 00:33:15,400 --> 00:33:20,840 Speaker 3: so good. Okay, so I'll just read this. I came 592 00:33:20,880 --> 00:33:24,160 Speaker 3: across it. That doesn't mean I endorse it. Just came 593 00:33:24,200 --> 00:33:27,920 Speaker 3: across it, thought it was funny. And we got Katie 594 00:33:27,960 --> 00:33:29,880 Speaker 3: here for this because we need a woman to react. 595 00:33:30,200 --> 00:33:32,640 Speaker 3: Date a girl who wears glasses. It's like dating two 596 00:33:32,720 --> 00:33:36,800 Speaker 3: girls when she takes them off. Oh geez, this goes 597 00:33:37,000 --> 00:33:41,360 Speaker 3: on, this goes on, this goes on. Wash 598 00:33:41,400 --> 00:33:44,600 Speaker 3: off her makeup, and then you date three girls. Wow. 599 00:33:45,040 --> 00:33:48,160 Speaker 3: Remove her Instagram filter and you could be dating four 600 00:33:48,200 --> 00:33:51,680 Speaker 3: girls with one girl. Take her meds away and you 601 00:33:51,680 --> 00:33:52,600 Speaker 3: could have up to ten. 602 00:33:52,920 --> 00:34:01,320 Speaker 1: All right, enough. Good god. Who... I mean, Getty, of course. 603 00:34:01,360 --> 00:34:03,320 Speaker 3: Yeah, I told you I didn't endorse it. I just 604 00:34:03,320 --> 00:34:07,320 Speaker 3: came across it and found it funny enough to present it to 605 00:34:07,360 --> 00:34:08,120 Speaker 3: a national audience. 606 00:34:08,200 --> 00:34:08,480 Speaker 1: That's how... 607 00:34:09,160 --> 00:34:11,880 Speaker 4: And you said you did one hundred percent agree with 608 00:34:12,000 --> 00:34:12,680 Speaker 4: those sentiments, 609 00:34:12,760 --> 00:34:14,960 Speaker 1: correct? Is that what you said? Take her meds away, like 610 00:34:15,080 --> 00:34:21,600 Speaker 1: ten different girls. Jack, that's horrible. Stop. Okay, retweets do 611 00:34:21,680 --> 00:34:23,000 Speaker 1: not necessarily imply...
612 00:34:23,280 --> 00:34:29,279 Speaker 3: No, no. The internet, ladies and gentlemen, the internet. 613 00:34:29,320 --> 00:34:32,719 Speaker 3: Ate at IHOP last night, speaking of inadvisable 614 00:34:32,760 --> 00:34:35,760 Speaker 3: eating choices. Although I had the scrambled eggs with bacon, 615 00:34:35,880 --> 00:34:38,080 Speaker 3: you know, it's not a horrible thing, with one pancake, 616 00:34:38,160 --> 00:34:41,520 Speaker 3: one small pancake. But they were advertising there, while we 617 00:34:41,560 --> 00:34:45,319 Speaker 3: were there, the cookie butter pancake combo. Well, as 618 00:34:45,360 --> 00:34:47,160 Speaker 3: I always say to my kids whenever we come across 619 00:34:47,160 --> 00:34:50,400 Speaker 3: this stuff, what is causing America's obesity problem? 620 00:34:50,560 --> 00:34:50,680 Speaker 2: Uh... 621 00:34:51,040 --> 00:34:54,600 Speaker 3: Where's the comma or the dash there in that cookie 622 00:34:54,640 --> 00:34:56,200 Speaker 3: butter pancake combo? 623 00:34:56,280 --> 00:34:59,120 Speaker 1: You get bacon and eggs and hash browns with your 624 00:34:59,080 --> 00:35:03,320 Speaker 3: cookie butter pancakes if you order cookie butter pancakes, unless 625 00:35:03,320 --> 00:35:04,239 Speaker 3: you're doing it as, like, a 626 00:35:04,160 --> 00:35:08,160 Speaker 1: dessert, for... Oh, what's cookie butter? I've never 627 00:35:08,200 --> 00:35:12,120 Speaker 1: heard of cookie butter, but I'm sure it's very, very good. 628 00:35:13,960 --> 00:35:15,080 Speaker 4: Armstrong and Getty