1 00:00:01,880 --> 00:00:06,160 Speaker 1: Broadcasting live from the Abraham Lincoln Radio Studio the George 2 00:00:06,240 --> 00:00:10,280 Speaker 1: Washington Broadcast Center, Jack Armstrong and Joe Getty. 3 00:00:10,200 --> 00:00:17,279 Speaker 2: Armstrong and Getty, and he's Armstrong and Getty. 4 00:00:23,800 --> 00:00:26,920 Speaker 3: More than a dozen airports across the country have refused 5 00:00:26,960 --> 00:00:28,920 Speaker 3: to play a video of Homeland 6 00:00:28,600 --> 00:00:31,520 Speaker 4: Security Secretary Kristi Noem at TSA checkpoints. 7 00:00:31,760 --> 00:00:38,760 Speaker 2: It just really riles up the dogs. I had forgotten 8 00:00:38,760 --> 00:00:41,120 Speaker 2: about that story. Took me a second. That's a hell 9 00:00:41,159 --> 00:00:45,200 Speaker 2: of, that's a funny joke. I get jokes. 10 00:00:46,680 --> 00:00:49,600 Speaker 1: A couple of words about the whole climate change thing. 11 00:00:50,200 --> 00:00:52,120 Speaker 1: Oh you know what, before I get to that, it's 12 00:00:52,159 --> 00:00:54,800 Speaker 1: become clear to me. I don't know who blew the 13 00:00:54,840 --> 00:00:59,680 Speaker 1: whistle or what, but the Democrat media establishment has gotten 14 00:00:59,800 --> 00:01:03,760 Speaker 1: permission, or looked around and given itself permission, 15 00:01:04,040 --> 00:01:07,440 Speaker 1: to say our left flank are lunatics and are going 16 00:01:07,440 --> 00:01:08,759 Speaker 1: to ruin the party. 17 00:01:08,600 --> 00:01:09,679 Speaker 2: And we got to stop that. 18 00:01:10,280 --> 00:01:12,400 Speaker 1: New York Times with a big article about that today, 19 00:01:12,440 --> 00:01:16,480 Speaker 1: and Jonathan Martin, the senior political columnist for Politico, out 20 00:01:16,560 --> 00:01:19,080 Speaker 1: with a piece titled something or other, what is it? 21 00:01:20,959 --> 00:01:24,720 Speaker 1: Democrats keep falling for political fantasies? When will they learn?
22 00:01:25,120 --> 00:01:27,200 Speaker 5: Well, that's interesting. And then the Wall Street Journal, which 23 00:01:27,280 --> 00:01:30,760 Speaker 5: leans right. They've got an opinion piece on, hey, Democrats, 24 00:01:30,800 --> 00:01:33,040 Speaker 5: wake up, the mainstream 25 00:01:32,520 --> 00:01:36,080 Speaker 2: media is doing you more harm than good. Yeah, and yeah, 26 00:01:36,440 --> 00:01:37,160 Speaker 2: it's interesting. 27 00:01:37,520 --> 00:01:41,360 Speaker 1: Yeah, somebody, whether spoken or unspoken, said hey, 28 00:01:41,319 --> 00:01:43,640 Speaker 2: now, now we got to do this. Maybe it was 29 00:01:43,680 --> 00:01:47,199 Speaker 2: after the big No Kings protest and everybody 30 00:01:46,800 --> 00:01:50,200 Speaker 1: just laughed at it except the people who were there 31 00:01:50,240 --> 00:01:51,720 Speaker 1: and charged up about it. 32 00:01:51,800 --> 00:01:52,200 Speaker 2: Anyway. 33 00:01:52,720 --> 00:01:55,240 Speaker 1: So speaking of that sort of thing, National Review had 34 00:01:55,360 --> 00:01:59,000 Speaker 1: a piece about the mayor, the woke mayor of Boulder, Colorado, 35 00:01:59,040 --> 00:02:03,120 Speaker 1: Aaron Brockett. He's made so-called social justice his whole 36 00:02:03,240 --> 00:02:06,760 Speaker 1: public career. The website of Boulder, which is a very 37 00:02:06,800 --> 00:02:10,680 Speaker 1: woke town, says, as part of the city's commitment 38 00:02:10,720 --> 00:02:14,160 Speaker 1: to advancing racial equity, Mayor Brockett has attended the advancing 39 00:02:14,240 --> 00:02:19,519 Speaker 1: racial equity, role of government, and bias and microaggression trainings, so 40 00:02:19,400 --> 00:02:21,000 Speaker 2: you can trust him.
41 00:02:22,160 --> 00:02:24,760 Speaker 1: What makes no sense, they write in the National Review, 42 00:02:25,000 --> 00:02:28,880 Speaker 1: is that this mayor wants to have the power to make national, 43 00:02:28,919 --> 00:02:31,560 Speaker 1: even global energy policy. 44 00:02:31,720 --> 00:02:34,600 Speaker 2: So he has filed 45 00:02:34,480 --> 00:02:42,640 Speaker 1: suit against oil companies, saying the problems of fossil fuels 46 00:02:43,080 --> 00:02:47,840 Speaker 1: have hurt Boulder, and therefore the energy companies owe Boulder. 47 00:02:48,720 --> 00:02:51,400 Speaker 1: And they point out the suit and others like it 48 00:02:51,480 --> 00:02:56,040 Speaker 1: is preposterous on its face and threatens our economic well 49 00:02:56,040 --> 00:02:58,880 Speaker 1: being and constitutional order. But the Colorado Supreme Court has 50 00:02:58,919 --> 00:03:00,880 Speaker 1: allowed the suit to go forward despite the fact that 51 00:03:00,919 --> 00:03:04,320 Speaker 1: it is a clear attempt to supersede federal environmental regulations. 52 00:03:04,320 --> 00:03:06,840 Speaker 1: And the US Supreme Court is now considering whether to 53 00:03:06,880 --> 00:03:08,919 Speaker 1: grant cert in the case and take up the novel 54 00:03:08,960 --> 00:03:16,640 Speaker 1: and consequential issues it presents. Boulder has asserted specifically 55 00:03:16,960 --> 00:03:23,800 Speaker 1: that the oil companies have committed a public nuisance. Boulder 56 00:03:23,840 --> 00:03:28,600 Speaker 1: has asserted claims for public and private nuisance, trespass, unjust enrichment, 57 00:03:28,760 --> 00:03:34,960 Speaker 1: and civil conspiracy. Anyway, what they're claiming doesn't 58 00:03:34,960 --> 00:03:37,240 Speaker 1: fit any of the legal definitions, and it's ridiculous that 59 00:03:37,280 --> 00:03:40,000 Speaker 1: the Colorado Supreme Court let it go.
But that is 60 00:03:40,040 --> 00:03:43,880 Speaker 1: the state of woke environmental, well, not even environmental, 61 00:03:44,440 --> 00:03:49,480 Speaker 1: like, climate change activism. And I came across this virtually 62 00:03:49,520 --> 00:03:52,640 Speaker 1: at the same time that article came out, and it's 63 00:03:52,640 --> 00:03:56,360 Speaker 1: written by Ted Nordhaus, who if you are super into 64 00:03:56,400 --> 00:03:59,720 Speaker 1: the climate change thing, you might recognize his name. He's 65 00:03:59,760 --> 00:04:03,800 Speaker 1: a climate expert, a scientist and author, and he's written 66 00:04:03,840 --> 00:04:07,080 Speaker 1: a piece for the Free Press entitled I thought climate 67 00:04:07,160 --> 00:04:07,840 Speaker 1: change would 68 00:04:07,640 --> 00:04:09,080 Speaker 2: end the world. I was wrong. 69 00:04:09,680 --> 00:04:15,160 Speaker 1: My worldview was built on apocalyptic models sprung from faulty assumptions. 70 00:04:15,520 --> 00:04:19,280 Speaker 5: It always really gets my attention when people go against 71 00:04:19,360 --> 00:04:23,640 Speaker 5: their own previous position, because you can really trust them, 72 00:04:23,680 --> 00:04:24,919 Speaker 5: most likely on that. 73 00:04:25,680 --> 00:04:26,440 Speaker 2: In a lot of cases. 74 00:04:26,520 --> 00:04:32,280 Speaker 1: Yeah, if they're so convinced of their new point of 75 00:04:32,360 --> 00:04:36,719 Speaker 1: view and concerned that they had led people wrong, that 76 00:04:36,720 --> 00:04:39,039 Speaker 1: they're saying, hey, remember when I said that? I was wrong, 77 00:04:39,279 --> 00:04:44,440 Speaker 1: I was totally wrong. Yeah, that makes an impression. Really interestingly, 78 00:04:44,800 --> 00:04:48,279 Speaker 1: he co-wrote his big 2007 book Break Through with 79 00:04:48,440 --> 00:04:55,360 Speaker 1: Michael Shellenberger, anyway, the great investigative journalist.
I used 80 00:04:55,360 --> 00:04:57,440 Speaker 1: to argue that if the world kept burning fossil fuels 81 00:04:57,440 --> 00:05:01,240 Speaker 1: at current rates, catastrophe was virtually assured. Quote, the heating 82 00:05:01,320 --> 00:05:04,000 Speaker 1: of the Earth will cause the sea levels to rise 83 00:05:04,040 --> 00:05:07,040 Speaker 1: and the Amazon to collapse, and, according to scenarios commissioned 84 00:05:07,040 --> 00:05:09,520 Speaker 1: by the Pentagon, will trigger a series of wars over 85 00:05:09,560 --> 00:05:12,800 Speaker 1: the basic resources like food and water. Now, that was 86 00:05:12,920 --> 00:05:15,559 Speaker 1: back in oh seven, right. He says, I no longer 87 00:05:15,560 --> 00:05:19,039 Speaker 1: believe this hyperbole. At the time, I, like most climate experts, 88 00:05:19,080 --> 00:05:21,680 Speaker 1: thought that business-as-usual emissions would lead to around 89 00:05:21,760 --> 00:05:24,600 Speaker 1: five degrees of warming by the end of this century. 90 00:05:25,200 --> 00:05:27,000 Speaker 2: That assumption was never plausible. 91 00:05:27,200 --> 00:05:30,880 Speaker 1: It assumed high population growth, high economic growth, and slow 92 00:05:30,960 --> 00:05:35,200 Speaker 1: technological change. But fertility rates have been falling, global economic 93 00:05:35,240 --> 00:05:39,040 Speaker 1: growth is slowing, and the global economy has been decarbonizing 94 00:05:39,080 --> 00:05:39,720 Speaker 1: for decades. 95 00:05:40,640 --> 00:05:41,680 Speaker 2: Nor is there a good reason to 96 00:05:41,680 --> 00:05:44,160 Speaker 1: think that the combination of these three trends could possibly 97 00:05:44,200 --> 00:05:46,640 Speaker 1: be sustained in concert. And he goes into some of 98 00:05:46,640 --> 00:05:50,919 Speaker 1: the practicalities: economic growth is strongly associated with falling 99 00:05:50,920 --> 00:05:53,920 Speaker 1: fertility rates.
Technological change is the primary driver of long 100 00:05:53,960 --> 00:05:54,760 Speaker 1: term economic growth. 101 00:05:55,000 --> 00:05:59,600 Speaker 5: Blah blah. We were talking about fertility in hour one and how 102 00:05:59,640 --> 00:06:01,880 Speaker 5: Europe's going the wrong direction by a lot. But yeah, 103 00:06:02,080 --> 00:06:05,800 Speaker 5: we're probably at about the peak of world population, 104 00:06:05,960 --> 00:06:08,640 Speaker 2: and then it's going to start going down. Yeah. 105 00:06:08,680 --> 00:06:10,960 Speaker 1: And he points out how the 106 00:06:11,000 --> 00:06:13,440 Speaker 1: worst-case scenarios are much, much less bad now. 107 00:06:13,760 --> 00:06:16,240 Speaker 2: I'm glad they came at it quietly. 108 00:06:16,760 --> 00:06:19,960 Speaker 1: Yeah, moving the goalposts. And he says this is all 109 00:06:20,000 --> 00:06:22,960 Speaker 1: the more confounding given the good news extends well beyond 110 00:06:22,960 --> 00:06:25,839 Speaker 1: projections of long-term warming. Despite close to a degree 111 00:06:25,839 --> 00:06:28,800 Speaker 1: and a half of warming over the last century, global mortality, 112 00:06:29,200 --> 00:06:33,599 Speaker 1: listen to this, folks, global mortality from climate and weather 113 00:06:33,680 --> 00:06:37,320 Speaker 1: extremes has fallen by more than ninety-six percent on 114 00:06:37,400 --> 00:06:40,440 Speaker 1: a per capita basis. Is that because people don't 115 00:06:40,440 --> 00:06:42,640 Speaker 1: get hit in the head with a falling beam in 116 00:06:42,720 --> 00:06:44,720 Speaker 1: a hurricane, because we just build things better? 117 00:06:44,800 --> 00:06:47,039 Speaker 2: Is it a lot of that? Yeah, there's a 118 00:06:47,080 --> 00:06:48,120 Speaker 2: lot of that. Yeah.
119 00:06:49,400 --> 00:06:52,240 Speaker 1: You could make the argument that modernity, which has been 120 00:06:52,279 --> 00:06:56,200 Speaker 1: fueled by fossil fuels, has been way, way, way better 121 00:06:56,279 --> 00:06:59,839 Speaker 1: for protecting human lives than any downside, which I think 122 00:06:59,880 --> 00:07:03,159 Speaker 1: is his point. He said, the world is on track 123 00:07:03,240 --> 00:07:05,480 Speaker 1: this year for what is almost certainly the lowest level 124 00:07:05,480 --> 00:07:08,440 Speaker 1: of climate-related mortality in recorded human history. 125 00:07:08,520 --> 00:07:08,839 Speaker 3: Wow. 126 00:07:08,960 --> 00:07:12,200 Speaker 5: Yeah, wow, that's quite a stat given the fact that, 127 00:07:12,320 --> 00:07:15,400 Speaker 5: you know, if there's a hurricane, climate change gets blamed, 128 00:07:15,400 --> 00:07:18,120 Speaker 5: eight dead because of climate change. But we're going 129 00:07:18,200 --> 00:07:21,120 Speaker 5: to have the lowest number of people dead due to 130 00:07:21,160 --> 00:07:23,520 Speaker 5: climate since they've been keeping track. 131 00:07:24,080 --> 00:07:27,640 Speaker 1: Experts say the tornado across Missouri is an example of 132 00:07:27,680 --> 00:07:31,320 Speaker 1: climate change, when any responsible scientist will tell you it 133 00:07:31,400 --> 00:07:36,200 Speaker 1: is impossible to attribute an individual weather phenomenon 134 00:07:35,720 --> 00:07:36,880 Speaker 2: to climate change anyway. 135 00:07:37,280 --> 00:07:41,360 Speaker 1: Just to finish his thought, yes, the economic costs of 136 00:07:41,360 --> 00:07:44,520 Speaker 1: climate extremes continue to rise, but this is almost entirely 137 00:07:44,600 --> 00:07:48,080 Speaker 1: due to affluence, population growth, and the migration of global 138 00:07:48,080 --> 00:07:52,800 Speaker 1: populations toward climate hazards, mainly cities and coastal regions and floodplains.
139 00:07:53,200 --> 00:07:55,760 Speaker 1: So the far more interesting question is not why my 140 00:07:55,800 --> 00:08:00,280 Speaker 1: colleagues and I at the Breakthrough Institute have revised our views 141 00:08:00,280 --> 00:08:04,200 Speaker 1: about climate risk, but why so many progressive environmentalists have not. 142 00:08:05,760 --> 00:08:09,760 Speaker 1: And you know, it's such a good piece. Well, I 143 00:08:09,800 --> 00:08:12,480 Speaker 1: got two reasons, go ahead. 144 00:08:12,920 --> 00:08:15,800 Speaker 5: One, you're in on the making-money part of it, 145 00:08:15,840 --> 00:08:17,720 Speaker 5: so you don't want it to go away. And then two, 146 00:08:18,040 --> 00:08:19,680 Speaker 5: it was your religion, and who wants to give up 147 00:08:19,720 --> 00:08:20,360 Speaker 5: their religion? 148 00:08:21,200 --> 00:08:21,680 Speaker 2: Yeah? 149 00:08:21,960 --> 00:08:27,760 Speaker 1: Yeah. He talks about the narrative. The harum-scarum Al 150 00:08:27,800 --> 00:08:31,840 Speaker 1: Gore narrative conflicts with existing evidence, including data collected 151 00:08:31,840 --> 00:08:36,040 Speaker 1: by political scientist and former environmental studies professor Roger Pielke Jr. 152 00:08:36,360 --> 00:08:39,080 Speaker 1: His work going back to the mid nineties showed again 153 00:08:39,160 --> 00:08:42,599 Speaker 1: and again that the normalized economic costs of climate-related disasters, 154 00:08:42,720 --> 00:08:45,840 Speaker 1: when adjusted for wealth and economic growth, were 155 00:08:45,880 --> 00:08:50,000 Speaker 1: not increasing despite the documented warming of the climate. Here's 156 00:08:50,040 --> 00:08:53,240 Speaker 1: the part that might tie back to yesterday's show, when 157 00:08:53,280 --> 00:08:56,240 Speaker 1: we were talking about the great feminization of America and 158 00:08:56,360 --> 00:09:02,320 Speaker 1: how cancel culture is a very feminine phenomenon.
He writes, 159 00:09:02,360 --> 00:09:04,840 Speaker 1: The reason for my shift in opinion wasn't only that 160 00:09:04,880 --> 00:09:08,040 Speaker 1: Pielke had produced strong evidence that undermined a key claim 161 00:09:08,080 --> 00:09:12,480 Speaker 1: of the climate advocacy community. It wasn't even witnessing Pielke's cancellation, 162 00:09:12,840 --> 00:09:16,160 Speaker 1: which was brutal. It was rather that I came to 163 00:09:16,240 --> 00:09:18,920 Speaker 1: understand why you couldn't find a climate change signal in 164 00:09:18,960 --> 00:09:21,880 Speaker 1: the disaster loss data despite close to a degree 165 00:09:21,880 --> 00:09:23,800 Speaker 1: and a half of warming over the last century. And 166 00:09:23,840 --> 00:09:25,839 Speaker 1: then he goes into some technical stuff that takes a 167 00:09:25,840 --> 00:09:29,760 Speaker 1: little bit of time to explain, but I just appreciated 168 00:09:29,840 --> 00:09:38,560 Speaker 1: him reminding us of the brutal ostracization and cancellation that 169 00:09:38,720 --> 00:09:43,360 Speaker 1: takes place in science if you have quote unquote the 170 00:09:43,400 --> 00:09:48,840 Speaker 1: wrong opinion. And that is the most diametrically, you know, 171 00:09:48,920 --> 00:09:53,720 Speaker 1: anti-science thing you could possibly do in science, is 172 00:09:53,800 --> 00:09:57,079 Speaker 1: to ostracize someone for disagreeing and saying, hey, we need 173 00:09:57,120 --> 00:10:00,280 Speaker 1: to take another look at the data or collect more data. 174 00:10:00,840 --> 00:10:02,000 Speaker 2: That's an example of 175 00:10:02,000 --> 00:10:05,120 Speaker 1: what, I can never remember her name, obviously, 176 00:10:05,120 --> 00:10:08,360 Speaker 1: Helen Andrews was writing about in The Great Feminization.
177 00:10:08,000 --> 00:10:10,040 Speaker 5: You were not going to keep your job in academia 178 00:10:10,360 --> 00:10:12,959 Speaker 5: or get grants in academia if you were going to say, 179 00:10:13,000 --> 00:10:15,120 Speaker 5: you know, I'm not sure about this whole climate change thing. 180 00:10:15,559 --> 00:10:16,079 Speaker 2: Obviously. 181 00:10:16,760 --> 00:10:20,120 Speaker 1: Yeah, yeah, it's a good piece. I think you might 182 00:10:20,120 --> 00:10:22,920 Speaker 1: get paywalled. We'll post it at Armstrong and Getty dot 183 00:10:22,920 --> 00:10:26,760 Speaker 1: com under hot links. And again, for the umpteenth time, 184 00:10:26,840 --> 00:10:28,720 Speaker 1: if you did not hear the segment on the Great 185 00:10:28,720 --> 00:10:34,480 Speaker 1: Feminization yesterday, it's October twenty-first, hour two, and October 186 00:10:34,559 --> 00:10:38,559 Speaker 1: the twenty-first's hot links also has Helen Andrews' 187 00:10:39,160 --> 00:10:40,600 Speaker 1: essay if you'd like to read it yourself. 188 00:10:42,000 --> 00:10:45,920 Speaker 5: What do we have there, Michael? PrizePicks. Yeah, should 189 00:10:45,960 --> 00:10:50,199 Speaker 5: have gone more on Luka Doncic last night. Maybe for 190 00:10:50,240 --> 00:10:52,880 Speaker 5: the whole season. PrizePicks is an opportunity to take 191 00:10:52,920 --> 00:10:55,960 Speaker 5: your strong sports opinions and turn them into cash, whether 192 00:10:56,000 --> 00:10:59,760 Speaker 5: we're talking World Series, which starts this Friday, or obviously 193 00:10:59,800 --> 00:11:01,360 Speaker 5: the NFL season. We're right in the midst of it, 194 00:11:01,400 --> 00:11:04,720 Speaker 5: and basketball just started. Yeah, I got a full slate 195 00:11:04,720 --> 00:11:07,280 Speaker 5: of basketball games tonight. All you do is pick two 196 00:11:07,440 --> 00:11:10,520 Speaker 5: players or more and say less or more on their 197 00:11:10,520 --> 00:11:11,359 Speaker 5: stat projections.
198 00:11:11,520 --> 00:11:14,640 Speaker 1: It's easy, it's fun, and if you download the PrizePicks 199 00:11:14,640 --> 00:11:16,400 Speaker 1: app today and use the code Armstrong, you get 200 00:11:16,400 --> 00:11:19,160 Speaker 1: fifty bucks in lineups. You get fifty dollars in lineups 201 00:11:19,200 --> 00:11:21,080 Speaker 1: after you play your first five dollar lineup. 202 00:11:21,440 --> 00:11:23,600 Speaker 2: It's automatic, you don't need to win. You use the 203 00:11:23,640 --> 00:11:24,480 Speaker 2: code Armstrong. 204 00:11:24,679 --> 00:11:27,800 Speaker 1: They give you fifty dollars worth of lineups after you 205 00:11:27,880 --> 00:11:30,319 Speaker 1: play just a five dollar lineup. 206 00:11:30,520 --> 00:11:32,880 Speaker 5: And PrizePicks has something called Stacks, where you can 207 00:11:32,880 --> 00:11:35,520 Speaker 5: pick the same player, like Steph Curry. You could do points, 208 00:11:35,520 --> 00:11:37,679 Speaker 5: three-pointers and assists, for instance. 209 00:11:37,880 --> 00:11:40,800 Speaker 1: Right, and you can combine players from different sports in one lineup. 210 00:11:40,920 --> 00:11:42,880 Speaker 1: So, you know, you got a theory on the Thursday 211 00:11:43,000 --> 00:11:46,600 Speaker 1: night football and a basketball game, go ahead. PrizePicks. Use 212 00:11:46,640 --> 00:11:49,040 Speaker 1: that code Armstrong for that fifty dollars in lineups after 213 00:11:49,040 --> 00:11:50,920 Speaker 1: you play your first five dollar lineup. PrizePicks. 214 00:11:51,000 --> 00:11:51,880 Speaker 2: It's good to be right. 215 00:11:52,160 --> 00:11:55,520 Speaker 5: Why are so many people getting what influencers are calling 216 00:11:55,720 --> 00:11:57,400 Speaker 5: facial circumcisions? 217 00:11:57,520 --> 00:12:02,120 Speaker 2: Why did that become a thing? Did not see that coming. 218 00:12:02,679 --> 00:12:03,160 Speaker 2: Who would? 219 00:12:04,520 --> 00:12:06,720 Speaker 5: Who would have predicted?
I was going to say that 220 00:12:06,840 --> 00:12:09,000 Speaker 5: nobody. The old F.C. 221 00:12:10,800 --> 00:12:11,079 Speaker 4: F.C. 222 00:12:11,760 --> 00:12:13,680 Speaker 5: Why are people getting them? It's pretty obvious if you 223 00:12:13,679 --> 00:12:14,680 Speaker 5: think about it for a second. 224 00:12:14,720 --> 00:12:20,360 Speaker 6: And other news on the way. Stay here. 225 00:12:20,200 --> 00:12:23,840 Speaker 2: Can I get your thoughts on the game? I hate 226 00:12:23,840 --> 00:12:27,640 Speaker 2: this team. I was born into this and I'm not 227 00:12:27,679 --> 00:12:32,080 Speaker 2: gonna ever... I'm always a Jets fan, but, like, I 228 00:12:32,160 --> 00:12:33,120 Speaker 2: just, I hate this team. 229 00:12:34,760 --> 00:12:38,240 Speaker 5: Somebody interviewed a fan after the Jets game the other day. 230 00:12:38,320 --> 00:12:40,000 Speaker 5: You don't have to be a fan of the NFL 231 00:12:40,120 --> 00:12:42,440 Speaker 5: to hear this. The Jets are oh and seven. They're the 232 00:12:42,440 --> 00:12:46,000 Speaker 5: only winless team, and they're, uh, they're looking, according 233 00:12:46,000 --> 00:12:49,000 Speaker 5: to experts, like one of the all-time worst teams. 234 00:12:49,120 --> 00:12:52,760 Speaker 5: Not just, not just bad, but like shockingly bad. 235 00:12:53,160 --> 00:12:57,880 Speaker 2: So yeah, there you go. They need a new owner. Trade 236 00:12:57,880 --> 00:13:01,120 Speaker 2: for a new owner. Before we get to facial circumcisions, 237 00:13:02,480 --> 00:13:04,160 Speaker 2: what was the other thing? I don't even remember what 238 00:13:04,200 --> 00:13:04,679 Speaker 2: the other thing was. 239 00:13:04,840 --> 00:13:07,200 Speaker 1: Sometimes I think you bring things up just to shock. 240 00:13:07,440 --> 00:13:09,160 Speaker 1: I don't appreciate it. It's childish.
241 00:13:09,160 --> 00:13:12,719 Speaker 5: Well, do facial circumcisions first. Yeah, and yes, I do 242 00:13:12,760 --> 00:13:13,840 Speaker 5: bring things up just to shock. 243 00:13:13,880 --> 00:13:16,120 Speaker 2: But the name of it is shocking. 244 00:13:16,160 --> 00:13:19,559 Speaker 5: It's basically a facelift, but it's talking about the popularity 245 00:13:19,600 --> 00:13:23,120 Speaker 5: of it because of Ozempic face, people losing weight and 246 00:13:23,160 --> 00:13:26,560 Speaker 5: having all that extra skin, and this influencer making the argument, like, 247 00:13:26,800 --> 00:13:29,080 Speaker 5: you cut your toenails if they're too long. Why wouldn't 248 00:13:29,080 --> 00:13:30,760 Speaker 5: you cut off skin if you got too much? I mean, 249 00:13:30,760 --> 00:13:33,520 Speaker 5: I don't understand why this is even a question. 250 00:13:35,720 --> 00:13:36,520 Speaker 2: Good point. 251 00:13:39,480 --> 00:13:42,600 Speaker 1: Wow, it is. I don't know if it's considered major surgery, 252 00:13:42,600 --> 00:13:46,160 Speaker 1: but it is surgery. Oh yeah, it's a pretty serious deal. 253 00:13:46,400 --> 00:13:46,720 Speaker 2: Yeah. 254 00:13:46,800 --> 00:13:48,760 Speaker 1: I'm going to be at a plastic surgeon today as 255 00:13:48,760 --> 00:13:50,480 Speaker 1: a matter of fact, getting my stitches out of my nose. 256 00:13:50,520 --> 00:13:52,000 Speaker 2: Maybe I get a boob lift while I'm in. I 257 00:13:52,040 --> 00:13:52,360 Speaker 2: don't know. 258 00:13:52,480 --> 00:13:55,920 Speaker 5: It is a problem, though. When you lose weight, you 259 00:13:55,960 --> 00:14:00,199 Speaker 5: look older. You often look worse, your face anyway, 260 00:14:00,960 --> 00:14:02,880 Speaker 5: and so that's a drag.
I mean, you know, you 261 00:14:02,880 --> 00:14:04,480 Speaker 5: put in all that effort and everything like that to 262 00:14:04,480 --> 00:14:06,880 Speaker 5: make your body look better, or you care about health. 263 00:14:06,880 --> 00:14:08,400 Speaker 5: I don't care about health, I just want to look better. 264 00:14:08,679 --> 00:14:11,959 Speaker 5: That's why I've elected to stay porky. 265 00:14:12,000 --> 00:14:13,800 Speaker 2: It really fills in the lines, it does. 266 00:14:13,880 --> 00:14:17,320 Speaker 5: It absolutely does. You look worse when you lose weight, 267 00:14:17,360 --> 00:14:20,560 Speaker 5: which sucks. Another one of nature's and God's cruel tricks. 268 00:14:20,560 --> 00:14:25,400 Speaker 5: Why does God hate us so? 269 00:14:26,040 --> 00:14:26,320 Speaker 2: Uh? 270 00:14:26,360 --> 00:14:28,440 Speaker 5: Sora is one of the worst things to happen to mankind. 271 00:14:28,440 --> 00:14:30,960 Speaker 5: We're headed in that direction. Anyway, that's the new 272 00:14:31,000 --> 00:14:33,720 Speaker 5: social media app that is all AI videos. Like my son, 273 00:14:33,840 --> 00:14:36,920 Speaker 5: my thirteen-year-old, he even said himself, I gotta 274 00:14:36,960 --> 00:14:39,520 Speaker 5: limit myself on this, because it's just too damned entertaining 275 00:14:39,600 --> 00:14:42,840 Speaker 5: scrolling through all the AI videos that get posted every 276 00:14:42,880 --> 00:14:46,040 Speaker 5: single day. But I saw one last night. I got 277 00:14:46,040 --> 00:14:48,760 Speaker 5: some site that gives you the best five Sora videos 278 00:14:48,840 --> 00:14:51,440 Speaker 5: of the day, and it takes a total of a 279 00:14:51,680 --> 00:14:54,040 Speaker 5: minute to watch all five of them, as long as 280 00:14:54,080 --> 00:14:57,480 Speaker 5: you stop. What's the big deal about stopping? And it's 281 00:14:57,480 --> 00:14:58,840 Speaker 5: like eating one potato chip.
282 00:14:59,040 --> 00:15:00,440 Speaker 2: But what I liked so much, 283 00:15:00,440 --> 00:15:02,560 Speaker 5: with baseball season going on right now, I don't know 284 00:15:02,600 --> 00:15:06,920 Speaker 5: what year it was, it looked like mid-seventies playoff baseball. 285 00:15:06,960 --> 00:15:10,080 Speaker 5: It's probably Yankees, Dodgers or Red Sox or whoever. And 286 00:15:10,280 --> 00:15:13,880 Speaker 5: Elvis is at the plate, and he's in full, he's 287 00:15:13,960 --> 00:15:16,520 Speaker 5: in full, and it's a wide shot, and it's, 288 00:15:16,600 --> 00:15:19,520 Speaker 5: he's in full, like, karate gear and the red 289 00:15:19,560 --> 00:15:23,240 Speaker 5: scarf and the sunglasses. Anyway, uh, three and, three and 290 00:15:23,240 --> 00:15:25,360 Speaker 5: two in the count on the King, and oh, he 291 00:15:25,440 --> 00:15:27,280 Speaker 5: got a hold of that one, and it goes deep, 292 00:15:27,320 --> 00:15:29,760 Speaker 5: and it's a home run, and he rounds the bases and 293 00:15:29,760 --> 00:15:31,720 Speaker 5: he's pumping his fists, and then they even have the 294 00:15:31,760 --> 00:15:33,640 Speaker 5: post-game interview. Yeah, yeah, I was just trying to 295 00:15:33,640 --> 00:15:34,880 Speaker 5: get the bat on it and get a good hit. 296 00:15:34,920 --> 00:15:35,760 Speaker 5: I was just trying to get a 297 00:15:35,720 --> 00:15:44,440 Speaker 6: good hit. Freaking hilarious, and just so incredibly real looking. 298 00:15:44,600 --> 00:15:46,520 Speaker 5: I don't know who comes up with these ideas. They're gonna 299 00:15:46,560 --> 00:15:49,680 Speaker 5: run out of them eventually, right? Of ideas. Well, and 300 00:15:49,720 --> 00:15:50,680 Speaker 5: for free.
301 00:15:50,760 --> 00:15:53,680 Speaker 1: In roughly twenty-five seconds last night, I reached out 302 00:15:53,680 --> 00:15:55,160 Speaker 1: to Jack to remind me of the name of an 303 00:15:55,160 --> 00:15:57,800 Speaker 1: app, and I took a picture of my wife and 304 00:15:57,840 --> 00:15:59,400 Speaker 1: I in Britain. 305 00:15:59,440 --> 00:16:00,320 Speaker 2: We are in a 306 00:16:00,080 --> 00:16:03,000 Speaker 1: punt, which is like a gondola, taking a little ride 307 00:16:03,120 --> 00:16:07,240 Speaker 1: in Cambridge, and I just typed in, and again, this 308 00:16:07,280 --> 00:16:11,160 Speaker 1: took, what, thirty, forty-five seconds, an animation of the 309 00:16:11,160 --> 00:16:14,400 Speaker 1: photograph of us in the boat leaping into the water. 310 00:16:17,080 --> 00:16:22,040 Speaker 5: For free, right, in seconds. You kind of fooled me on 311 00:16:22,120 --> 00:16:23,560 Speaker 5: that second page. But I'll tell you, I just want 312 00:16:23,600 --> 00:16:24,680 Speaker 5: to get a stick on it and see if I 313 00:16:24,680 --> 00:16:25,440 Speaker 5: could get it out of here. 314 00:16:26,440 --> 00:16:28,880 Speaker 2: You got, you got serious stuff. I had to 315 00:16:28,920 --> 00:16:30,560 Speaker 2: hang out. Full count. 316 00:16:30,560 --> 00:16:37,120 Speaker 5: You gotta protect the plate, Jack. Okay. Wow, we got 317 00:16:37,120 --> 00:16:41,280 Speaker 5: more craziness from Mamdani, who's going to be the first 318 00:16:41,320 --> 00:16:45,600 Speaker 5: communist mayor of a major city in America. Major editorial: 319 00:16:46,040 --> 00:16:50,280 Speaker 5: Mamdani will make New York unsafe for New York's Jews. 320 00:16:50,920 --> 00:16:53,240 Speaker 2: Wow, remember that front page stuff? 321 00:16:53,320 --> 00:16:57,160 Speaker 5: Remember that story we had last week questioning whether Jews 322 00:16:57,200 --> 00:16:59,120 Speaker 5: could continue to live in Great Britain?
323 00:16:59,800 --> 00:17:03,720 Speaker 1: Right, they're wondering, and maybe not. What's more, he is an Islamist 324 00:17:03,880 --> 00:17:07,320 Speaker 1: as well as a communist. It's the Red-Green Alliance, 325 00:17:07,359 --> 00:17:09,679 Speaker 1: which we've talked about before, in one person. 326 00:17:09,840 --> 00:17:14,000 Speaker 5: That's probably a bigger threat, right? Isn't it, Islamism? 327 00:17:15,000 --> 00:17:15,200 Speaker 7: Yeah? 328 00:17:15,480 --> 00:17:15,879 Speaker 2: I don't know. 329 00:17:15,920 --> 00:17:18,879 Speaker 1: Yeah, because you can boot communists out of power, unless 330 00:17:18,920 --> 00:17:23,000 Speaker 1: they, like, get actual power. But it's very difficult to 331 00:17:23,040 --> 00:17:25,679 Speaker 1: dislodge Islamists because they'll 332 00:17:25,480 --> 00:17:26,720 Speaker 2: kill you for trying. Yeah. 333 00:17:26,800 --> 00:17:28,200 Speaker 5: I don't know if he has the power to make 334 00:17:28,240 --> 00:17:31,359 Speaker 5: many of his communist dreams come true just because he's the mayor, 335 00:17:31,480 --> 00:17:35,639 Speaker 5: but he could do a lot of things around allowing 336 00:17:35,920 --> 00:17:38,480 Speaker 5: crazy versions of Islam 337 00:17:38,119 --> 00:17:42,200 Speaker 2: to run wild in New York. Yeah, yep, agreed. What 338 00:17:42,240 --> 00:17:42,680 Speaker 2: the heck? 339 00:17:45,720 --> 00:17:52,960 Speaker 1: We have tolerated ourselves into a terrible situation, a 340 00:17:53,280 --> 00:17:58,040 Speaker 1: toxic tolerance. Gad Saad has written a book about that 341 00:17:58,200 --> 00:18:02,640 Speaker 1: recently, the great professor, about how excessive tolerance will doom 342 00:18:02,680 --> 00:18:03,200 Speaker 1: a society. 343 00:18:03,200 --> 00:18:03,919 Speaker 2: It's so obvious. 344 00:18:03,960 --> 00:18:05,840 Speaker 5: I don't understand why everybody doesn't get it.
If you 345 00:18:05,880 --> 00:18:10,480 Speaker 5: are tolerant of a group that's intolerant, they will take over. 346 00:18:10,680 --> 00:18:14,840 Speaker 5: It's not complicated, correct. Anyway, We've got a lot of 347 00:18:14,840 --> 00:18:16,200 Speaker 5: stuff on the way. If you missed a segment, get 348 00:18:16,200 --> 00:18:17,560 Speaker 5: the podcast Armstrong and Getty on. 349 00:18:17,560 --> 00:18:20,600 Speaker 2: Demand Armstrong and Getty. 350 00:18:22,160 --> 00:18:23,800 Speaker 7: But he would be I would say he would be 351 00:18:23,840 --> 00:18:26,200 Speaker 7: the leader of the party. He's not. Schumer's shot. 352 00:18:26,920 --> 00:18:29,199 Speaker 7: He's shot. This poor guy. I feel sorry for him, 353 00:18:29,760 --> 00:18:32,280 Speaker 7: known him for a long time, but he's he's I don't think 354 00:18:32,320 --> 00:18:38,440 Speaker 7: he's mentally good. He's been beat up by young radical lunatics, 355 00:18:39,560 --> 00:18:44,080 Speaker 7: and I think Chuck Schumer is, he's gone. I really do. 356 00:18:44,359 --> 00:18:45,520 Speaker 7: I think he's probably not. 357 00:18:45,480 --> 00:18:45,960 Speaker 6: Going to run. 358 00:18:46,400 --> 00:18:49,080 Speaker 7: It shows that he's losing in every poll. Now, this 359 00:18:49,240 --> 00:18:51,159 Speaker 7: is hard. You know, he wants to meet with me. 360 00:18:51,280 --> 00:18:53,320 Speaker 7: It's sort of hard to be with the guy after 361 00:18:53,359 --> 00:18:55,159 Speaker 7: I make a statement like that. But I'm just giving 362 00:18:55,240 --> 00:18:56,640 Speaker 7: you the facts. I think Chuck is. 363 00:18:56,600 --> 00:18:59,280 Speaker 5: Probably kind of hard to meet with a guy after 364 00:18:59,280 --> 00:19:01,280 Speaker 5: making a statement like that. But I'm just giving you 365 00:19:01,280 --> 00:19:07,719 Speaker 5: the facts. He's gone. Wow, hey, I'm looking up at 366 00:19:07,720 --> 00:19:11,960 Speaker 5: the TV.
Edie Falco and Jeremy Renner have a new 367 00:19:12,000 --> 00:19:14,520 Speaker 5: show coming out of some sort. But I didn't know 368 00:19:14,640 --> 00:19:17,320 Speaker 5: he was recovered enough from when his snowblower ran over 369 00:19:17,400 --> 00:19:19,919 Speaker 5: him that he was like sitting on the couch promoting 370 00:19:19,960 --> 00:19:20,959 Speaker 5: new shows and stuff like that. 371 00:19:21,080 --> 00:19:22,879 Speaker 2: Okay, glad he... when was that? 372 00:19:23,080 --> 00:19:24,639 Speaker 5: Was that a couple of years, a few years ago? 373 00:19:24,680 --> 00:19:27,280 Speaker 5: But he was like really, really, he nearly died. I 374 00:19:27,280 --> 00:19:29,160 Speaker 5: didn't know he was... Okay, it looks to be fun. 375 00:19:30,200 --> 00:19:32,600 Speaker 5: One other thing I wanted to mention before we get 376 00:19:32,640 --> 00:19:36,760 Speaker 5: back into the bulk of this segment. Interesting AI story 377 00:19:36,800 --> 00:19:40,280 Speaker 5: that just came out, breaking news, that Zuckerberg is going 378 00:19:40,320 --> 00:19:42,920 Speaker 5: to lay off about six hundred AI workers, but the 379 00:19:43,080 --> 00:19:45,680 Speaker 5: why is dang interesting. 380 00:19:46,520 --> 00:19:47,640 Speaker 2: And guessing he. 381 00:19:47,640 --> 00:19:51,240 Speaker 5: Replaced them with AI. That's kind of funny. I'm more 382 00:19:51,280 --> 00:19:53,640 Speaker 5: scared about AI having read this article than I was before. 383 00:19:53,680 --> 00:19:58,680 Speaker 5: And I was already. So this Mamdani character. A 384 00:19:58,720 --> 00:20:00,919 Speaker 5: lot of people talk about him being communist and what 385 00:20:00,960 --> 00:20:03,520 Speaker 5: that would mean for New York and is like kind 386 00:20:03,560 --> 00:20:05,920 Speaker 5: of the face of the Democratic Party.
That's what Trump 387 00:20:06,000 --> 00:20:08,480 Speaker 5: was talking about there, that Schumer's not the face of 388 00:20:08,520 --> 00:20:09,240 Speaker 5: the Democratic Party. 389 00:20:09,240 --> 00:20:09,960 Speaker 2: It's Mamdani. 390 00:20:10,119 --> 00:20:13,919 Speaker 5: He's gotten so other people are not just worried about 391 00:20:13,920 --> 00:20:16,439 Speaker 5: his communist views. 392 00:20:16,480 --> 00:20:21,720 Speaker 3: It's this. I believe Zohran Mamdani poses a danger to 393 00:20:21,840 --> 00:20:25,560 Speaker 3: the security of the New York Jewish community. A vote 394 00:20:25,640 --> 00:20:29,520 Speaker 3: for Sliwa, whatever his merits, is a vote for Mamdani. 395 00:20:29,920 --> 00:20:32,760 Speaker 3: There is a path to victory, but it means that 396 00:20:32,960 --> 00:20:35,680 Speaker 3: every eligible voter must vote. 397 00:20:35,640 --> 00:20:39,240 Speaker 5: Popular New York rabbi there who almost certainly always votes Democrat. 398 00:20:39,240 --> 00:20:42,200 Speaker 5: So it's not like he's a Republican most likely, and 399 00:20:42,640 --> 00:20:45,800 Speaker 5: he's scared for the Jewish community if Mamdani wins. 400 00:20:47,200 --> 00:20:47,480 Speaker 2: Yeah. 401 00:20:47,600 --> 00:20:53,520 Speaker 1: The Wall Street Journal just published an editorial by Elisha Wiesel, 402 00:20:53,680 --> 00:20:57,720 Speaker 1: who's the former chief information officer for Goldman Sachs, is 403 00:20:57,760 --> 00:21:01,680 Speaker 1: a founding partner in a big tech company. He's also 404 00:21:01,800 --> 00:21:07,360 Speaker 1: the chairman of the Elie Wiesel Foundation. But big lead editorial 405 00:21:07,359 --> 00:21:12,920 Speaker 1: in the paper today. A Mamdani Mayoralty, Mayor 406 00:21:13,560 --> 00:21:16,440 Speaker 1: Mamdani being mayor, Threatens New York's. 407 00:21:16,240 --> 00:21:17,600 Speaker 2: Jews is the title.
408 00:21:19,640 --> 00:21:23,960 Speaker 1: By propagating lies about occupation, apartheid, and genocide, he helps 409 00:21:24,000 --> 00:21:30,000 Speaker 1: promote anti-Semitism, and he talks... it's actually quite sad. 410 00:21:30,560 --> 00:21:34,360 Speaker 1: Most of this column is about conflicts. These are conversations, 411 00:21:34,400 --> 00:21:37,840 Speaker 1: I should say, he's had with a good old friend 412 00:21:37,880 --> 00:21:43,480 Speaker 1: of his who has taken in nothing but leftist coverage 413 00:21:43,720 --> 00:21:50,240 Speaker 1: of the conflict in Gaza. And this guy is completely 414 00:21:50,320 --> 00:21:58,480 Speaker 1: convinced of occupation, apartheid, genocide, that October seventh was 415 00:21:58,920 --> 00:22:03,520 Speaker 1: justified and the rest of it, because he's been, you know, 416 00:22:03,720 --> 00:22:09,200 Speaker 1: just lapping up this incredibly, bizarrely unfair, one sided coverage. 417 00:22:09,280 --> 00:22:11,520 Speaker 5: That's interesting that that would happen in New York, the 418 00:22:11,720 --> 00:22:16,000 Speaker 5: biggest collection of Jewish people on earth outside of Israel. 419 00:22:17,040 --> 00:22:17,560 Speaker 2: Yeah. 420 00:22:17,680 --> 00:22:21,760 Speaker 1: Yeah. And his point is, and I hate to summarize 421 00:22:21,760 --> 00:22:24,600 Speaker 1: it because it's quite eloquent, but his point is that 422 00:22:25,000 --> 00:22:31,040 Speaker 1: sort of pitching of that version of history inevitably causes 423 00:22:31,080 --> 00:22:35,080 Speaker 1: people to feel anger and hatred toward Jews, and all 424 00:22:35,080 --> 00:22:39,840 Speaker 1: he's asking for is a fair hearing. Mister Mamdani blamed 425 00:22:39,840 --> 00:22:45,440 Speaker 1: Hamas's butchery on the occupation the day after October seventh, while. 426 00:22:45,280 --> 00:22:48,080 Speaker 2: Israel was reeling. He omitted.
427 00:22:49,240 --> 00:22:52,480 Speaker 1: In the various quotes that Israel pulled out of Gaza 428 00:22:52,560 --> 00:22:56,840 Speaker 1: in two thousand and five and that Hamas built rockets 429 00:22:56,880 --> 00:23:00,760 Speaker 1: and tunnels with billions of dollars in aid. Took the aid, 430 00:23:00,920 --> 00:23:04,800 Speaker 1: they built rockets and tunnels. And then he goes into 431 00:23:04,800 --> 00:23:06,520 Speaker 1: some of the history of Israel that his friend was 432 00:23:06,560 --> 00:23:12,359 Speaker 1: completely ignorant of. And you know, I can get to 433 00:23:12,400 --> 00:23:18,720 Speaker 1: his point. Mister Mamdani's lies endanger Jews everywhere. Incendiary, 434 00:23:18,920 --> 00:23:19,800 Speaker 1: false and deadly. 435 00:23:19,880 --> 00:23:20,880 Speaker 2: They spread hatred. 436 00:23:21,200 --> 00:23:23,479 Speaker 1: More than half of New York's hate crimes last year 437 00:23:23,520 --> 00:23:27,720 Speaker 1: were anti-Semitic. One attacker shouting free Palestine stabbed a Jew. 438 00:23:27,760 --> 00:23:30,359 Speaker 1: Another tried to run Jews over with a car. I 439 00:23:30,480 --> 00:23:33,760 Speaker 1: was assaulted, he writes, last week by anti-Israel marchers 440 00:23:33,760 --> 00:23:36,480 Speaker 1: at the kind of rally Mister Mamdani attended, then encouraged, 441 00:23:36,680 --> 00:23:39,520 Speaker 1: and then denounced only tacitly as he came under pressure. 442 00:23:39,920 --> 00:23:43,320 Speaker 1: The signs read glory to the martyrs, violence is resistance, 443 00:23:43,400 --> 00:23:52,119 Speaker 1: and kill the Jews at the demonstration Mamdani attended. It's 444 00:23:52,119 --> 00:23:58,280 Speaker 1: interesting that those editorials and statements are coming out at 445 00:23:58,280 --> 00:23:59,040 Speaker 1: the same time. 446 00:23:59,080 --> 00:24:00,399 Speaker 2: And I mentioned this a little earlier.
447 00:24:01,119 --> 00:24:04,159 Speaker 1: Both Politico and The New York Times have big like 448 00:24:04,359 --> 00:24:08,439 Speaker 1: lead pieces today saying to the left, we've got to 449 00:24:08,480 --> 00:24:12,480 Speaker 1: abandon our left flank. They're too crazy. They're killing our party. 450 00:24:12,520 --> 00:24:16,520 Speaker 1: They're killing our electoral chances. And it's funny. The New 451 00:24:16,600 --> 00:24:20,240 Speaker 1: York Times has to point out all the mistakes 452 00:24:20,280 --> 00:24:25,040 Speaker 1: that Republicans have made, frittering away winnable races in a 453 00:24:25,119 --> 00:24:28,399 Speaker 1: variety of states by nominating nut job candidates. 454 00:24:28,560 --> 00:24:31,280 Speaker 2: Oh and they're right, by the way, they're right in. 455 00:24:31,280 --> 00:24:33,679 Speaker 1: The New York Times, it's funny, that's the spoonful 456 00:24:33,720 --> 00:24:36,439 Speaker 1: of sugar that helps the medicine go down, because then 457 00:24:36,480 --> 00:24:40,479 Speaker 1: they turn to saying to their lefty readers, we're killing 458 00:24:40,480 --> 00:24:43,440 Speaker 1: ourselves with these nut jobs. Anyway, I thought that was 459 00:24:43,480 --> 00:24:46,320 Speaker 1: interesting, and Politico with a similar piece. Evidently the word's 460 00:24:46,359 --> 00:24:48,240 Speaker 1: gotten around it's okay to say it out loud now. 461 00:24:48,440 --> 00:24:54,720 Speaker 1: I think that's really good. There's another step that needs 462 00:24:54,720 --> 00:24:59,159 Speaker 1: to be taken, though, when you're talking about Islamists, because 463 00:24:59,160 --> 00:25:03,240 Speaker 1: so many people have swallowed the lie that to criticize 464 00:25:03,240 --> 00:25:07,800 Speaker 1: fundamentalist Islam or Islamism, which is the spreading of Islam 465 00:25:07,840 --> 00:25:11,760 Speaker 1: as a political and economic system.
To criticize that is 466 00:25:11,760 --> 00:25:16,280 Speaker 1: somehow bigotry, that you're quote unquote criticizing a religion. And 467 00:25:16,520 --> 00:25:18,679 Speaker 1: by the way, in America, you can criticize a religion 468 00:25:18,760 --> 00:25:21,400 Speaker 1: all you want. We get to, it's in the First Amendment, 469 00:25:21,720 --> 00:25:28,400 Speaker 1: so go ahead. But I'm really really hoping that there's 470 00:25:28,480 --> 00:25:31,720 Speaker 1: been some sort of dam break. Oh, another great example, 471 00:25:31,800 --> 00:25:34,719 Speaker 1: the Feminization of America article that's getting so much attention 472 00:25:34,840 --> 00:25:37,080 Speaker 1: that we talked about yesterday and a little bit today. 473 00:25:38,480 --> 00:25:42,840 Speaker 1: I'm feeling like there's an increased permission structure for people 474 00:25:42,840 --> 00:25:44,160 Speaker 1: to say, you know, the woke crap. 475 00:25:44,200 --> 00:25:46,000 Speaker 2: I never bought it. I was just afraid to say. 476 00:25:46,040 --> 00:25:46,159 Speaker 3: So. 477 00:25:46,560 --> 00:25:50,359 Speaker 1: Could this be the great preference cascade that we've talked 478 00:25:50,359 --> 00:25:50,960 Speaker 1: about before? 479 00:25:51,040 --> 00:25:53,920 Speaker 2: I don't know. We'll live to see it. Yeah. 480 00:25:53,960 --> 00:25:57,119 Speaker 1: I hate to be too optimistic because grim determination is 481 00:25:57,200 --> 00:25:58,600 Speaker 1: kind of our mindset around here. 482 00:25:58,640 --> 00:25:59,679 Speaker 2: But there are signs. 483 00:26:00,320 --> 00:26:04,760 Speaker 5: So why did Zuckerberg fire six hundred AI people at Facebook? 484 00:26:04,880 --> 00:26:07,280 Speaker 5: It's pretty danged interesting. We'll get to that coming up. 485 00:26:07,480 --> 00:26:09,399 Speaker 5: I did want to mention, so I brought this up 486 00:26:09,440 --> 00:26:12,800 Speaker 5: the other day.
My son, one of his favorite musicians 487 00:26:12,840 --> 00:26:15,399 Speaker 5: currently is this guy named Lil Yachty, and Lil Yachty 488 00:26:15,520 --> 00:26:18,800 Speaker 5: was playing locally over the weekend and a big giant 489 00:26:19,280 --> 00:26:22,439 Speaker 5: fight broke out and they had to shut down the concert. 490 00:26:22,480 --> 00:26:24,680 Speaker 5: And it's really popular in this kind of music, whatever 491 00:26:24,720 --> 00:26:26,800 Speaker 5: you call that kind of music, to have these open 492 00:26:26,800 --> 00:26:29,359 Speaker 5: mosh pit areas in front of the stage, and they 493 00:26:29,400 --> 00:26:31,640 Speaker 5: get really violent and lots of people get hurt and 494 00:26:31,960 --> 00:26:34,479 Speaker 5: nearly crushed to death and all these different things. And 495 00:26:34,520 --> 00:26:36,800 Speaker 5: that's one of the appeals to like my son and 496 00:26:36,840 --> 00:26:43,000 Speaker 5: his friends, who are as like gentrified, you know, suburban, yeah, 497 00:26:43,480 --> 00:26:46,320 Speaker 5: middle upper class kids as you can possibly get. 498 00:26:47,400 --> 00:26:51,440 Speaker 2: Why do you think that is? Danger adjacent? 499 00:26:51,880 --> 00:26:53,840 Speaker 5: The reason I brought this up again was somebody texted this: 500 00:26:54,000 --> 00:26:56,639 Speaker 5: it's like the Running of the Bulls in Pamplona. You 501 00:26:56,680 --> 00:26:58,520 Speaker 5: want to be there and kind of be danger adjacent. 502 00:26:58,640 --> 00:27:00,879 Speaker 5: You don't want to be stabbed by a bull yourself. 503 00:27:01,119 --> 00:27:02,440 Speaker 5: You want to be able to say you did 504 00:27:02,480 --> 00:27:04,840 Speaker 5: it and kind of, you know, lived edgy. I 505 00:27:04,880 --> 00:27:06,760 Speaker 5: guess it's like young men going off to war 506 00:27:06,880 --> 00:27:08,200 Speaker 5: back when that was more of a thing.
507 00:27:08,280 --> 00:27:12,800 Speaker 1: And well, in the great feminization of American society, boys 508 00:27:12,800 --> 00:27:17,720 Speaker 1: are taught over and over again, like vigorous activity even 509 00:27:17,960 --> 00:27:20,200 Speaker 1: is forbidden on the playground because somebody might get hurt, 510 00:27:20,600 --> 00:27:25,800 Speaker 1: never mind like violent games and full contact sports, and 511 00:27:25,840 --> 00:27:27,600 Speaker 1: so they're attracted to it. 512 00:27:27,640 --> 00:27:29,480 Speaker 5: Yeah, I think it's very much the same mindset as 513 00:27:29,520 --> 00:27:35,160 Speaker 5: going to the Running of the Bulls. Yeah, if 514 00:27:35,160 --> 00:27:37,239 Speaker 5: I go to, if I go into these concerts with him, 515 00:27:37,280 --> 00:27:41,680 Speaker 5: I'm sitting far away, watching from above and hoping he's 516 00:27:41,720 --> 00:27:42,439 Speaker 5: not very close. 517 00:27:43,359 --> 00:27:48,480 Speaker 1: There ought to be a designated dad section, mom section exactly. 518 00:27:48,680 --> 00:27:50,520 Speaker 2: I'll watch it on a monitor somewhere. 519 00:27:51,359 --> 00:27:54,639 Speaker 5: Zuckerberg's really big on trying to create not just 520 00:27:54,840 --> 00:27:59,760 Speaker 5: artificial intelligence, but artificial superintelligence, which I've been reading about, 521 00:28:00,080 --> 00:28:03,520 Speaker 5: that will absolutely ruin mankind. And that's what he's spending 522 00:28:03,520 --> 00:28:05,640 Speaker 5: his money on now, among other things. On the way, 523 00:28:05,720 --> 00:28:08,760 Speaker 5: stay here, Armstrong and Getty.
524 00:28:09,520 --> 00:28:12,760 Speaker 4: According to this New York Times report and these internal documents, 525 00:28:13,160 --> 00:28:15,919 Speaker 4: Amazon's investments in AI are going to allow it to 526 00:28:15,960 --> 00:28:19,639 Speaker 4: hire six hundred thousand fewer humans by twenty three, twenty 527 00:28:19,720 --> 00:28:21,760 Speaker 4: thirty three, I should say, despite the fact that they 528 00:28:21,800 --> 00:28:24,480 Speaker 4: plan to double the amount of products they are selling, 529 00:28:25,000 --> 00:28:29,159 Speaker 4: seventy five percent of work in those warehouses automated. That 530 00:28:29,280 --> 00:28:31,960 Speaker 4: is the eventual plan. And according to the New York Times, 531 00:28:32,000 --> 00:28:34,959 Speaker 4: Amazon is actually developing comm strategies for how to handle 532 00:28:35,000 --> 00:28:37,720 Speaker 4: the backlash in these communities where jobs might be lost. 533 00:28:38,840 --> 00:28:43,120 Speaker 5: Six hundred thousand jobs will be eliminated at Amazon by 534 00:28:43,120 --> 00:28:47,160 Speaker 5: AI in the next seven and a half years. Wow. 535 00:28:49,880 --> 00:28:59,160 Speaker 5: Wow is right, that is really something. Yeah, different AI story. 536 00:28:59,240 --> 00:29:02,240 Speaker 5: The headline just breaking now that Mark Zuckerberg is planning 537 00:29:02,240 --> 00:29:04,480 Speaker 5: to lay off. He sent out letters today. They're going 538 00:29:04,520 --> 00:29:10,880 Speaker 5: to lay off six hundred AI employees. But it's a 539 00:29:10,920 --> 00:29:11,720 Speaker 5: little misleading. 540 00:29:11,760 --> 00:29:12,720 Speaker 2: I mean, if. 541 00:29:12,440 --> 00:29:16,680 Speaker 5: Your knee jerk reaction is that AI isn't panning out 542 00:29:16,680 --> 00:29:20,160 Speaker 5: for him or something, or he's not as engaged in it.
No, 543 00:29:20,280 --> 00:29:25,959 Speaker 5: he's replacing a bunch of lower tier AI employees that 544 00:29:26,000 --> 00:29:28,840 Speaker 5: they had hired a bunch of in a mad rush 545 00:29:28,880 --> 00:29:32,280 Speaker 5: to try to catch up to ChatGPT and Elon 546 00:29:32,360 --> 00:29:35,240 Speaker 5: and others. And then since, and we talked about 547 00:29:35,280 --> 00:29:38,080 Speaker 5: this at the time, he went on a spending spree 548 00:29:38,160 --> 00:29:41,440 Speaker 5: this past summer. In June he invested fourteen point three billion 549 00:29:41,520 --> 00:29:45,000 Speaker 5: dollars in something called Scale AI, and he went and 550 00:29:45,040 --> 00:29:47,320 Speaker 5: he plucked some of the best AI minds in the 551 00:29:47,360 --> 00:29:52,080 Speaker 5: world from other companies. He recruited top researchers from OpenAI, 552 00:29:52,280 --> 00:29:57,240 Speaker 5: that's Altman's thing, Google, Microsoft. And as we talked about 553 00:29:57,280 --> 00:30:01,080 Speaker 5: at the time, pay packages that for some of these dudes, 554 00:30:01,160 --> 00:30:04,440 Speaker 5: and they're almost all dudes, number in the hundreds of 555 00:30:04,640 --> 00:30:08,360 Speaker 5: millions of dollars. Makes Shohei Ohtani seem like nothing. 556 00:30:08,680 --> 00:30:13,560 Speaker 5: I mean serious crazy money. Come work for me, leave 557 00:30:13,600 --> 00:30:17,400 Speaker 5: OpenAI. I'll pay you two hundred million dollars. Great 558 00:30:17,680 --> 00:30:20,400 Speaker 5: googly moogly. That is how dedicated he is to try 559 00:30:20,400 --> 00:30:23,480 Speaker 5: and be... That is funny, Katie, I don't blame you 560 00:30:23,520 --> 00:30:28,240 Speaker 5: for laughing. Googly moogly is a funny expression. Zuckerberg is 561 00:30:28,280 --> 00:30:30,760 Speaker 5: doubling down and most of them are in. So he's 562 00:30:30,800 --> 00:30:33,600 Speaker 5: got four different segments of AI he's working on.
I 563 00:30:33,640 --> 00:30:35,880 Speaker 5: won't bog down in that. But one of the segments 564 00:30:35,920 --> 00:30:39,440 Speaker 5: is superintelligence, which doesn't get enough talk. I was 565 00:30:39,480 --> 00:30:41,440 Speaker 5: just listening to a podcast about it the other day. 566 00:30:42,160 --> 00:30:45,240 Speaker 5: People use the blanket term AI, and then there's AGI, 567 00:30:45,280 --> 00:30:48,240 Speaker 5: which is artificial general intelligence, when it can be as 568 00:30:48,280 --> 00:30:51,600 Speaker 5: smart as a human being, basically. Superintelligence is the 569 00:30:51,640 --> 00:30:55,880 Speaker 5: idea of AI being just vastly smarter than human beings 570 00:30:55,880 --> 00:30:58,440 Speaker 5: have ever been and just a completely different level of 571 00:30:58,480 --> 00:31:01,200 Speaker 5: being able to learn fast and do things. And when 572 00:31:01,280 --> 00:31:05,760 Speaker 5: that happens, if it happens soon, I seriously can't even 573 00:31:05,800 --> 00:31:08,880 Speaker 5: imagine how mankind survives it. There's some people who think 574 00:31:08,880 --> 00:31:10,960 Speaker 5: it can't happen. I'm kind of hoping it can't. 575 00:31:12,640 --> 00:31:15,680 Speaker 1: Yeah. Well, Zuckerberg is Satan. I revealed that on the 576 00:31:15,720 --> 00:31:17,160 Speaker 1: show a number of years ago. 577 00:31:17,680 --> 00:31:19,600 Speaker 5: You know, you do say that a lot. But I 578 00:31:19,640 --> 00:31:27,160 Speaker 5: don't think he'd... I think the impulse to want 579 00:31:27,160 --> 00:31:31,000 Speaker 5: to be first in and get to this before Elon 580 00:31:31,120 --> 00:31:34,080 Speaker 5: or somebody else does would override his I wonder what 581 00:31:34,120 --> 00:31:37,680 Speaker 5: this will do to the economy or society or, yeah, 582 00:31:37,760 --> 00:31:40,720 Speaker 5: that sort of thing.
And honestly, he might be right: either 583 00:31:40,800 --> 00:31:43,440 Speaker 5: I do it or somebody else does it, and I'd 584 00:31:43,480 --> 00:31:46,560 Speaker 5: rather it be me, which is, you know, might just 585 00:31:46,600 --> 00:31:47,040 Speaker 5: be true. 586 00:31:47,800 --> 00:31:50,360 Speaker 1: Well, either way, we're doomed. Planet of the apes or 587 00:31:50,400 --> 00:31:52,320 Speaker 1: beavers or ants or something is on the way. 588 00:31:53,720 --> 00:31:56,240 Speaker 5: Has there ever been anything even close to this, 589 00:31:57,480 --> 00:32:01,160 Speaker 5: adjusted for inflation, where something came along where companies were 590 00:32:01,200 --> 00:32:05,280 Speaker 5: paying hundreds of millions of dollars per employee to have 591 00:32:05,400 --> 00:32:06,120 Speaker 5: the best talent? 592 00:32:06,360 --> 00:32:11,120 Speaker 2: I don't think so. I mean, that's stunning. Yeah. And 593 00:32:11,240 --> 00:32:15,080 Speaker 2: is it a bubble? Nobody's sure about that either. Nobody's 594 00:32:15,080 --> 00:32:15,760 Speaker 2: sure about that. 595 00:32:16,400 --> 00:32:18,480 Speaker 5: And it could be, but you got a lot of 596 00:32:18,520 --> 00:32:21,840 Speaker 5: the smartest people in the world who clearly think it's not. 597 00:32:23,080 --> 00:32:27,400 Speaker 1: I hope they're wrong, right? Although part of being in 598 00:32:27,440 --> 00:32:29,840 Speaker 1: a bubble is you must swear up and down it's 599 00:32:29,880 --> 00:32:30,800 Speaker 1: not a bubble. 600 00:32:31,280 --> 00:32:34,680 Speaker 2: Right, so who knows? Who knows? I guess we'll all 601 00:32:34,680 --> 00:32:35,720 Speaker 2: find out together, or. 602 00:32:37,640 --> 00:32:39,920 Speaker 1: Or if the Lord sees fit to send a semi 603 00:32:39,920 --> 00:32:42,000 Speaker 1: truck my way today as I cross the street, to 604 00:32:42,360 --> 00:32:43,480 Speaker 1: just let me know how.
605 00:32:43,440 --> 00:32:45,600 Speaker 5: Both close friends of the Armstrong and Getty Show, had 606 00:32:45,680 --> 00:32:47,400 Speaker 5: them on a couple of weeks ago. Craig Gottwals 607 00:32:47,440 --> 00:32:49,720 Speaker 5: and Tim Sandefur are both big believers that, like every 608 00:32:49,760 --> 00:32:56,240 Speaker 5: other technological advance, this will create more different jobs in 609 00:32:56,280 --> 00:32:58,000 Speaker 5: its wake, so you don't have to worry about all 610 00:32:58,000 --> 00:33:02,280 Speaker 5: the jobs lost. Maybe. I don't believe it. I don't 611 00:33:02,280 --> 00:33:04,680 Speaker 5: see what AI is going to create in terms of 612 00:33:04,960 --> 00:33:09,040 Speaker 5: jobs to the tune of like the six hundred thousand 613 00:33:09,040 --> 00:33:11,240 Speaker 5: people Amazon's going to lay off in the next half 614 00:33:11,320 --> 00:33:11,960 Speaker 5: dozen years or. 615 00:33:12,000 --> 00:33:16,560 Speaker 1: So, right right. Yeah, I've got to admit I think 616 00:33:16,640 --> 00:33:20,040 Speaker 1: this is the one exception to the great rule. 617 00:33:20,760 --> 00:33:22,800 Speaker 2: But I'm intrigued by the question. I don't know that 618 00:33:22,840 --> 00:33:23,200 Speaker 2: I'm right. 619 00:33:23,320 --> 00:33:26,920 Speaker 1: Nobody knows if they're right about any of it. No, 620 00:33:27,280 --> 00:33:28,920 Speaker 1: which is the fun part, isn't it? All 621 00:33:29,000 --> 00:33:31,640 Speaker 5: I know is that I use AI regularly, and I'm just 622 00:33:32,080 --> 00:33:34,480 Speaker 5: absolutely amazed by it. I talked about us buying a 623 00:33:34,520 --> 00:33:37,320 Speaker 5: computer for my son, and I used Grok and ChatGPT 624 00:33:37,360 --> 00:33:39,240 Speaker 5: and said, this is what my son wants to do, 625 00:33:39,240 --> 00:33:40,680 Speaker 5: which computer should I get?
And then it gave me 626 00:33:40,720 --> 00:33:42,480 Speaker 5: a couple of choices and said, is he going to 627 00:33:42,560 --> 00:33:44,120 Speaker 5: do this or that? And I said, no, he doesn't 628 00:33:44,120 --> 00:33:46,280 Speaker 5: do that, but he does this. And then it just 629 00:33:46,360 --> 00:33:48,040 Speaker 5: kept narrowing it down in a way that would have 630 00:33:48,080 --> 00:33:50,640 Speaker 5: taken me all day long to do on my own 631 00:33:50,760 --> 00:33:53,440 Speaker 5: or even with Google. And I did the same thing 632 00:33:53,600 --> 00:33:55,680 Speaker 5: looking for a coffee maker. I mean, the amount of 633 00:33:55,720 --> 00:33:59,320 Speaker 5: information it gave me and boiling it all down was amazing. 634 00:33:59,440 --> 00:34:01,560 Speaker 5: Then you get the weirdness of Grok in my car. 635 00:34:01,840 --> 00:34:03,600 Speaker 5: What did I ask Grok in the 636 00:34:03,640 --> 00:34:06,200 Speaker 5: car yesterday? Kind of got in an argument with Grok, which 637 00:34:06,240 --> 00:34:08,160 Speaker 5: my son thought was hilarious because it's a woman. 638 00:34:10,080 --> 00:34:10,360 Speaker 2: Oh. 639 00:34:10,400 --> 00:34:13,279 Speaker 5: We were listening to a Beatles song and then Henry said, 640 00:34:13,320 --> 00:34:15,600 Speaker 5: is Yoko Ono still alive? And I said, I don't know. Hey, Grok, 641 00:34:16,000 --> 00:34:18,920 Speaker 5: is Yoko Ono still alive? And Grok said, oh, 642 00:34:18,960 --> 00:34:21,880 Speaker 5: you know it, she's still alive and she's still cranking 643 00:34:21,920 --> 00:34:24,279 Speaker 5: out art better than ever. And I said, yeah, I've 644 00:34:24,280 --> 00:34:27,479 Speaker 5: never really been a fan of Yoko Ono. And Grok said, well, 645 00:34:27,520 --> 00:34:30,640 Speaker 5: she's been on the avant-garde of writing and 646 00:34:30,680 --> 00:34:33,960 Speaker 5: painting and music, so many people do respect her work.
647 00:34:33,960 --> 00:34:36,359 Speaker 5: I thought it was kind of interesting that Grok automatically 648 00:34:36,360 --> 00:34:41,439 Speaker 5: took the hippie view of Yoko Ono. And I said, well, 649 00:34:41,480 --> 00:34:44,400 Speaker 5: we will have to agree to disagree, and she said, 650 00:34:44,640 --> 00:34:49,439 Speaker 5: that's fine, I'll talk to you later. But Henry said 651 00:34:49,440 --> 00:34:52,720 Speaker 5: I should have called her a B, you stupid B. 652 00:34:52,840 --> 00:34:54,760 Speaker 5: I don't know why I don't do that. It's weird. 653 00:34:54,920 --> 00:34:57,600 Speaker 5: I'm so convinced, like, look at you. You're appalled 654 00:34:57,640 --> 00:35:00,959 Speaker 5: by this. It's a freaking computer voice. There's no human 655 00:35:01,000 --> 00:35:03,759 Speaker 5: being on the line. Why wouldn't I say, you stupid B? 656 00:35:04,840 --> 00:35:06,720 Speaker 5: It's evil. 657 00:35:06,880 --> 00:35:08,960 Speaker 2: Don't let the word evil pass your lips. 658 00:35:09,040 --> 00:35:12,839 Speaker 6: What? It's a computer. It's trying to argue me into 659 00:35:12,880 --> 00:35:13,880 Speaker 6: liking Yoko Ono. 660 00:35:15,000 --> 00:35:18,080 Speaker 2: Why don't you drop an N bomb on it? Wow. 661 00:35:18,120 --> 00:35:20,960 Speaker 2: I wonder what would happen if you did that. I mean, 662 00:35:22,840 --> 00:35:27,160 Speaker 2: probably report you to something. Probably would. I'm just, no, no, no, 663 00:35:27,320 --> 00:35:29,840 Speaker 2: I just need to be danger adjacent. I need. 664 00:35:29,719 --> 00:35:33,040 Speaker 5: To test this more and push back more. You're just stupid, Grok. 665 00:35:33,120 --> 00:35:36,600 Speaker 5: You're stupid. Yoko Ono sucks, or she's only anything because she's married to 666 00:35:36,680 --> 00:35:38,719 Speaker 5: John Lennon. I'd see what she said to that.
667 00:35:39,800 --> 00:35:43,160 Speaker 1: If this segment of the podcast is not labeled 668 00:35:43,239 --> 00:35:47,720 Speaker 1: Jack argues with Grok about Yoko Ono, I am resigning 669 00:35:47,800 --> 00:35:53,640 Speaker 1: my post. We will have to agree to disagree. If 670 00:35:53,680 --> 00:35:56,920 Speaker 1: you had that on your bingo card, congratulations. Oh, you 671 00:35:56,960 --> 00:35:58,800 Speaker 1: know what, I left out one of my big articles 672 00:35:58,840 --> 00:36:01,600 Speaker 1: about how the Democrat Party realizes it's gone crazy. 673 00:36:01,600 --> 00:36:02,879 Speaker 2: Maybe we can hit that next hour. 674 00:36:03,160 --> 00:36:06,160 Speaker 5: We do twenty hours of this, so much of it being held 675 00:36:06,200 --> 00:36:08,600 Speaker 5: against my will, song and dance every week. If you 676 00:36:08,880 --> 00:36:11,279 Speaker 5: miss a segment or an hour and you want to 677 00:36:11,320 --> 00:36:13,160 Speaker 5: hear it, you can get it in podcast form. You just 678 00:36:13,160 --> 00:36:16,240 Speaker 5: look for Armstrong and Getty on demand. You should subscribe. 679 00:36:16,480 --> 00:36:17,920 Speaker 5: Oh yeah, absolutely should. 680 00:36:17,680 --> 00:36:21,200 Speaker 1: And you should listen to the great feminization discussion in 681 00:36:21,280 --> 00:36:22,280 Speaker 1: yesterday's podcast. 682 00:36:22,320 --> 00:36:25,360 Speaker 2: Hour two changed a lot of minds. 683 00:36:27,200 --> 00:36:31,120 Speaker 1: Armstrong and Getty