1 00:00:09,760 --> 00:00:14,040 Speaker 1: Broadcasting live from the Abraham Lincoln Radio Studio the George 2 00:00:14,120 --> 00:00:18,160 Speaker 1: Washington Broadcast Center. Jack Armstrong, Joe Getty. 3 00:00:18,079 --> 00:00:20,560 Speaker 2: Armstrong and Getty and Hee. 4 00:00:23,280 --> 00:00:25,639 Speaker 3: Armstrong, Caddy Strong. 5 00:00:31,720 --> 00:00:32,199 Speaker 4: Welcome. 6 00:00:32,360 --> 00:00:34,640 Speaker 3: We are off this week, so you're gonna hear some 7 00:00:34,880 --> 00:00:37,960 Speaker 3: best-of replays of the Armstrong and Getty Show. You're 8 00:00:37,960 --> 00:00:39,560 Speaker 3: gonna love them. They're gonna be here. I'm gonna be 9 00:00:39,600 --> 00:00:41,440 Speaker 3: at home, sitting in my car and listening to the 10 00:00:41,520 --> 00:00:43,080 Speaker 3: radio while you do so. 11 00:00:43,479 --> 00:00:47,080 Speaker 5: While you're enjoying yourselves this week, why not hit Armstrong 12 00:00:47,159 --> 00:00:49,159 Speaker 5: and Getty dot com and pick up an A and 13 00:00:49,240 --> 00:00:51,519 Speaker 5: G T-shirt or hat for your favorite A and G fan, 14 00:00:51,600 --> 00:00:53,760 Speaker 5: including the Hot Dogs are Dogs. 15 00:00:53,840 --> 00:00:56,680 Speaker 3: Yeah, our Black Friday special is the same price as every 16 00:00:56,680 --> 00:00:59,480 Speaker 3: other day. So now enjoy the Armstrong and Getty replay 17 00:00:59,520 --> 00:01:02,320 Speaker 3: after this. What are you doing there? Just chilling? 18 00:01:02,360 --> 00:01:03,240 Speaker 2: You're ready to help? 19 00:01:04,000 --> 00:01:05,679 Speaker 3: Hey, Optimus, you know where I can get a Coke? 20 00:01:08,040 --> 00:01:08,400 Speaker 2: Sorry? 21 00:01:08,600 --> 00:01:13,319 Speaker 6: I don't, and I don't have real-time info, but 22 00:01:13,400 --> 00:01:14,840 Speaker 6: I can take you to the kitchen if you want 23 00:01:14,840 --> 00:01:15,880 Speaker 6: to check for a Coke there.
24 00:01:16,120 --> 00:01:18,679 Speaker 3: Oh yeah, that'd be great. Go, yes, let's do that 25 00:01:20,880 --> 00:01:28,920 Speaker 3: and then it just stands there. Let's go. Awesome. 26 00:01:29,040 --> 00:01:33,720 Speaker 3: Head to the kitchen. That's okay, okay, go. I think 27 00:01:33,720 --> 00:01:35,039 Speaker 3: it's, I think we need to give it a. 28 00:01:34,920 --> 00:01:35,480 Speaker 6: Bit more. 29 00:01:37,400 --> 00:01:41,120 Speaker 3: Okay. So that's the voice obviously of Elon right there, who 30 00:01:41,120 --> 00:01:42,720 Speaker 3: said we need to give it more room. They were 31 00:01:42,760 --> 00:01:44,880 Speaker 3: standing too close. I guess you can take that down, Michael. 32 00:01:45,040 --> 00:01:48,200 Speaker 3: They're standing next to Optimus as they're all going to 33 00:01:48,200 --> 00:01:51,200 Speaker 3: go to the kitchen to get a Coke, and Optimus, 34 00:01:51,200 --> 00:01:53,680 Speaker 3: who was just standing there looking at him, I guess, 35 00:01:54,280 --> 00:01:55,960 Speaker 3: and Elon said, I think we need to back up 36 00:01:55,960 --> 00:01:58,360 Speaker 3: a little bit. Anyway. My takeaway from that video was 37 00:01:58,600 --> 00:02:02,560 Speaker 3: we ain't even close yet. We're not even close to 38 00:02:02,760 --> 00:02:06,680 Speaker 3: robots taking over yet. Now, it's moving pretty fast. Maybe 39 00:02:06,720 --> 00:02:10,320 Speaker 3: it'll be exponentially better in a year. I'm sure it 40 00:02:10,360 --> 00:02:13,760 Speaker 3: will be. But the fact that Elon has got a 41 00:02:13,960 --> 00:02:19,079 Speaker 3: trillion-dollar incentive pay package now from Tesla, and he's 42 00:02:19,080 --> 00:02:22,840 Speaker 3: focusing mostly on Optimus, the AI robot, more than the 43 00:02:22,880 --> 00:02:26,640 Speaker 3: electric cars, I don't know. It seems like we're a 44 00:02:26,639 --> 00:02:29,600 Speaker 3: long way away. Where do I get a Coke?
I 45 00:02:29,639 --> 00:02:33,639 Speaker 3: don't know where to get one, and it stops, gets hung 46 00:02:34,639 --> 00:02:38,320 Speaker 3: up in sort of a glitch, has no idea, and then it 47 00:02:38,400 --> 00:02:41,080 Speaker 3: just stands there. It's, I thought it would be further 48 00:02:41,120 --> 00:02:41,640 Speaker 3: along than that. 49 00:02:41,720 --> 00:02:44,800 Speaker 5: Didn't you, wait a minute. I just googled where do 50 00:02:44,880 --> 00:02:49,160 Speaker 5: people keep Cokes? It suggested the refrigerator, which is often 51 00:02:49,240 --> 00:02:55,040 Speaker 5: in a human kitchen. Let's go to the kitchen. All right, 52 00:02:55,120 --> 00:02:56,920 Speaker 5: let's go. You go first. 53 00:02:57,400 --> 00:02:59,240 Speaker 3: I'm not trying to come off as a guy who 54 00:02:59,360 --> 00:03:01,839 Speaker 3: mocks technology thinking it will never happen, because I'm sure 55 00:03:01,840 --> 00:03:04,560 Speaker 3: it will be a thing eventually. But it's not as 56 00:03:04,600 --> 00:03:06,840 Speaker 3: close as I thought. But didn't you think that Optimus 57 00:03:06,919 --> 00:03:08,240 Speaker 3: robot would be more impressive than that? 58 00:03:10,960 --> 00:03:11,680 Speaker 2: Yeah? 59 00:03:11,760 --> 00:03:14,799 Speaker 5: Yeah, certainly before you trotted it out to do what 60 00:03:14,840 --> 00:03:18,120 Speaker 5: they just did, right right, right, right right. You know, 61 00:03:18,160 --> 00:03:21,120 Speaker 5: I'm reminded of Elon trotting out the Cybertruck for 62 00:03:21,160 --> 00:03:23,800 Speaker 5: the first time and saying, and the windows cannot be shattered. 63 00:03:23,800 --> 00:03:30,680 Speaker 3: Boom, he shatters the window. So this article, we got 64 00:03:30,680 --> 00:03:32,480 Speaker 3: a couple of AI stories for you.
This article in 65 00:03:32,520 --> 00:03:35,680 Speaker 3: the Wall Street Journal today about China's push to catch 66 00:03:35,760 --> 00:03:40,040 Speaker 3: up with and surpass the United States is flipping troubling. 67 00:03:41,240 --> 00:03:45,200 Speaker 3: For instance, this paragraph: the escalating AI race is drawing 68 00:03:45,200 --> 00:03:49,000 Speaker 3: comparisons with the Cold War and the great scientific and 69 00:03:49,040 --> 00:03:52,280 Speaker 3: technological clashes that characterized it. It is likely to be 70 00:03:52,520 --> 00:03:56,960 Speaker 3: at least as consequential. The AI race between us and 71 00:03:57,080 --> 00:04:00,400 Speaker 3: China is going to be at least as consequential as 72 00:04:00,440 --> 00:04:02,920 Speaker 3: the Cold War between us and the Soviet Union, if 73 00:04:02,960 --> 00:04:05,600 Speaker 3: you're old enough to have lived through that. Holy crap. 74 00:04:08,840 --> 00:04:12,400 Speaker 3: China realized that AI was going 75 00:04:12,400 --> 00:04:15,400 Speaker 3: to be the next big thing, maybe the next big 76 00:04:15,440 --> 00:04:18,480 Speaker 3: only thing on planet Earth, and it was way behind 77 00:04:18,560 --> 00:04:21,360 Speaker 3: OpenAI, Google, all the American companies that were doing 78 00:04:21,360 --> 00:04:23,719 Speaker 3: so well, and then decided we got to do something. 79 00:04:24,000 --> 00:04:26,560 Speaker 3: And they've done a whole-of-nation effort to try 80 00:04:26,560 --> 00:04:28,560 Speaker 3: to catch up and poured a ton of money into 81 00:04:28,560 --> 00:04:36,640 Speaker 3: it and relaxed all kinds of regulations, which is highly 82 00:04:36,640 --> 00:04:39,880 Speaker 3: troubling, and even silenced concerns. 83 00:04:41,120 --> 00:04:43,240 Speaker 5: You know, I was just, hey, quit talking about safety 84 00:04:43,400 --> 00:04:46,320 Speaker 5: and what's best for humanity. We don't have time.
I 85 00:04:46,400 --> 00:04:49,080 Speaker 5: mentioned my favorite quote in the article is from JD 86 00:04:49,240 --> 00:04:53,400 Speaker 5: Vance, and he argued this in February: the AI future 87 00:04:53,480 --> 00:04:56,799 Speaker 5: is not going to be won by hand-wringing about safety. 88 00:04:58,279 --> 00:05:02,560 Speaker 3: Well, he's right. I understand what he's saying. What he's 89 00:05:02,600 --> 00:05:07,000 Speaker 3: wanting to say is China and Russia, mostly China, because 90 00:05:07,080 --> 00:05:09,640 Speaker 3: China's got the money to put into this, China's gonna 91 00:05:09,640 --> 00:05:11,520 Speaker 3: do whatever the hell they want. And if they beat 92 00:05:11,600 --> 00:05:13,880 Speaker 3: us to the punch on this, it ain't gonna make 93 00:05:13,880 --> 00:05:17,040 Speaker 3: any difference that we tried to be ethical and safe 94 00:05:17,080 --> 00:05:18,880 Speaker 3: about it, it ain't gonna make any difference. 95 00:05:19,800 --> 00:05:21,920 Speaker 5: And I'm certainly the wrong guy to ask this question, 96 00:05:21,960 --> 00:05:25,520 Speaker 5: but I find myself wondering, can their AI essentially crush 97 00:05:25,600 --> 00:05:28,719 Speaker 5: our AI if it gets to, you know, whatever critical 98 00:05:28,760 --> 00:05:33,200 Speaker 5: stage first? It can mess with our efforts and our 99 00:05:33,240 --> 00:05:36,839 Speaker 5: programs and databases and the rest of it to the 100 00:05:36,880 --> 00:05:39,159 Speaker 5: point that it blows ours up. 101 00:05:39,960 --> 00:05:42,920 Speaker 3: Yeah, it could be some sort of like on-purpose 102 00:05:42,960 --> 00:05:45,159 Speaker 3: effort like that. But I take in a ton of 103 00:05:45,200 --> 00:05:47,719 Speaker 3: AI information, reading and listening to podcasts with the smartest 104 00:05:47,760 --> 00:05:50,040 Speaker 3: people in the world talking about this.
The more likely 105 00:05:50,080 --> 00:05:54,600 Speaker 3: concern is, without any attempt whatsoever to ethically control it, 106 00:05:54,600 --> 00:05:56,880 Speaker 3: it just gets loose on its own and gets into 107 00:05:56,880 --> 00:05:59,240 Speaker 3: computers and travels around the world and just kind of 108 00:05:59,279 --> 00:06:01,480 Speaker 3: does its own thing. And then the genie is 109 00:06:01,480 --> 00:06:01,800 Speaker 3: out of. 110 00:06:01,720 --> 00:06:05,880 Speaker 5: The bottle, which is pretty much inevitable. How do we 111 00:06:05,960 --> 00:06:08,720 Speaker 5: prevent that though? I mean, even if we're first, oh, 112 00:06:08,760 --> 00:06:15,600 Speaker 5: we can't, okay, never mind, having your organs harvested. I mean, 113 00:06:15,600 --> 00:06:17,280 Speaker 5: because even if we beat them to the punch by 114 00:06:17,320 --> 00:06:19,760 Speaker 5: five years, when they catch up five years later, unless 115 00:06:19,800 --> 00:06:22,360 Speaker 5: our AI can trump their AI, they will unleash it. 116 00:06:22,440 --> 00:06:25,040 Speaker 2: And the hell you were speaking of, I suppose. 117 00:06:24,760 --> 00:06:27,000 Speaker 3: My big, first of all, that paragraph about the Cold War 118 00:06:27,040 --> 00:06:31,479 Speaker 3: I find just like bone-chilling. I don't, I don't 119 00:06:31,600 --> 00:06:38,920 Speaker 3: feel like the population is taking this like the challenge 120 00:06:39,000 --> 00:06:43,720 Speaker 3: that it is, the way the Cold War was.
I mean, 121 00:06:43,720 --> 00:06:47,360 Speaker 3: my dad grew up hiding underneath his desk in rural 122 00:06:47,440 --> 00:06:51,719 Speaker 3: Iowa in case the Russians dropped the bomb, but it 123 00:06:51,760 --> 00:06:55,200 Speaker 3: was on their radar that we were in a, you know, 124 00:06:55,560 --> 00:06:59,320 Speaker 3: fight to the death with a foe that was close 125 00:06:59,440 --> 00:07:01,120 Speaker 3: enough to where he had to worry about it. 126 00:07:01,240 --> 00:07:03,159 Speaker 3: I don't feel like people feel that way about China 127 00:07:03,200 --> 00:07:05,520 Speaker 3: and AI. The average person doesn't have any idea any 128 00:07:05,520 --> 00:07:10,720 Speaker 3: of this is happening. No, no, which is troubling, I think. 129 00:07:11,680 --> 00:07:16,760 Speaker 3: My, uh, I guess we're better off that way. 130 00:07:16,960 --> 00:07:18,600 Speaker 5: I mean, because if we spend all of our time 131 00:07:18,720 --> 00:07:22,880 Speaker 5: terrified of our AI overlords, that's no way 132 00:07:22,920 --> 00:07:25,480 Speaker 5: to live. Instead, you'll be going about your business. One 133 00:07:25,560 --> 00:07:28,240 Speaker 5: day you'll turn around, there's a robot behind you. You'll think, wow, 134 00:07:28,280 --> 00:07:30,600 Speaker 5: that's weird. Then it'll sever your head. I mean, it's 135 00:07:30,640 --> 00:07:33,760 Speaker 5: just like that, and you won't have suffered the fear. 136 00:07:33,880 --> 00:07:37,080 Speaker 3: Hard to imagine. No, go ahead. But what are you gonna do? Yeah, 137 00:07:37,280 --> 00:07:38,960 Speaker 3: I suppose, worrying about it, and not much you can 138 00:07:39,080 --> 00:07:41,920 Speaker 3: do about it.
I was gonna say, as a guy 139 00:07:41,960 --> 00:07:44,520 Speaker 3: who cares about his money, I worry about the economy 140 00:07:44,560 --> 00:07:46,680 Speaker 3: and what's gonna happen and whether or not this is 141 00:07:46,720 --> 00:07:48,880 Speaker 3: all a bubble and it's going to completely collapse, and 142 00:07:48,960 --> 00:07:52,800 Speaker 3: it is chip companies trading money with AI companies back 143 00:07:52,800 --> 00:07:54,760 Speaker 3: and forth and investing in each other, and it could bust 144 00:07:54,880 --> 00:07:56,440 Speaker 3: and it doesn't turn out to be what they said. 145 00:07:56,640 --> 00:08:00,080 Speaker 3: Well, that leads to one of the other stories I've 146 00:08:00,240 --> 00:08:06,800 Speaker 3: got for today, right here. Yann LeCun 147 00:08:07,080 --> 00:08:11,560 Speaker 3: is Meta's chief AI scientist, the top guy working on 148 00:08:11,680 --> 00:08:17,000 Speaker 3: AI for Zuckerberg, who has spent tens of billions of dollars. 149 00:08:17,080 --> 00:08:19,360 Speaker 3: I think he spent one hundred billion dollars on this project. 150 00:08:19,600 --> 00:08:22,880 Speaker 3: His lead scientist is leaving and starting his own company. 151 00:08:24,040 --> 00:08:27,880 Speaker 3: All of these people, including the Chinese, can't all be wrong, 152 00:08:28,040 --> 00:08:30,320 Speaker 3: can they? That it turns into a dot-com bubble 153 00:08:30,320 --> 00:08:31,960 Speaker 3: where it's like, oh, I guess AI is not going 154 00:08:31,960 --> 00:08:33,600 Speaker 3: to be profitable or do anything. 155 00:08:33,679 --> 00:08:36,280 Speaker 2: So never mind, gosh, I wouldn't think so. You wouldn't. 156 00:08:35,960 --> 00:08:39,640 Speaker 3: Think Elon and Zuckerberg and China and everybody else could 157 00:08:39,640 --> 00:08:42,599 Speaker 3: be wrong about this.
So that's what leads me to 158 00:08:42,640 --> 00:08:44,760 Speaker 3: believe that this is going to be a thing unfolding 159 00:08:44,760 --> 00:08:47,920 Speaker 3: in front of our eyes at some point and then 160 00:08:47,960 --> 00:08:51,640 Speaker 3: looming behind us and severing our heads. So you say 161 00:08:51,640 --> 00:08:54,040 Speaker 3: that all the time, which is funny. But do you, 162 00:08:54,160 --> 00:08:58,079 Speaker 3: do you have a real-world sense of bad things 163 00:08:58,120 --> 00:08:58,959 Speaker 3: that AI could do? 164 00:09:01,160 --> 00:09:01,400 Speaker 4: Well. 165 00:09:01,559 --> 00:09:04,559 Speaker 5: It goes back to the commonly spoken theme of AI 166 00:09:04,720 --> 00:09:08,880 Speaker 5: decides the only thing impeding it is human beings, or 167 00:09:09,520 --> 00:09:13,040 Speaker 5: the only thing impeding the planet being at peak health 168 00:09:13,520 --> 00:09:14,360 Speaker 5: is human beings. 169 00:09:15,080 --> 00:09:17,880 Speaker 2: I mean, those are the two classic reasons why they would 170 00:09:17,880 --> 00:09:18,680 Speaker 2: sever our heads. 171 00:09:19,920 --> 00:09:23,280 Speaker 3: And then even if they don't do that, what if 172 00:09:23,320 --> 00:09:26,200 Speaker 3: it wipes out seventy five percent of jobs? 173 00:09:26,600 --> 00:09:28,240 Speaker 2: Well right, right? Yeah? 174 00:09:28,280 --> 00:09:31,240 Speaker 5: And then political turmoil and revolution in the streets, et cetera, 175 00:09:31,240 --> 00:09:35,360 Speaker 5: et cetera, while you're fighting robots. Great, oh yeah, no kidding. Wow. 176 00:09:36,160 --> 00:09:38,200 Speaker 5: So a couple more quick AI notes. I thought this 177 00:09:38,280 --> 00:09:42,000 Speaker 5: was really interesting. The Wall Street Journal reporting that Anthropic, 178 00:09:42,440 --> 00:09:46,480 Speaker 5: which is the company behind Claude, expects to break even 179 00:09:46,520 --> 00:09:51,439 Speaker 5: for the first time in twenty twenty eight.
By contrast, 180 00:09:51,640 --> 00:09:56,920 Speaker 5: OpenAI, the ChatGPT folks, they're forecasting their 181 00:09:56,960 --> 00:10:00,720 Speaker 5: operating losses that year, twenty twenty eight, will be about 182 00:10:00,760 --> 00:10:05,640 Speaker 5: seventy four billion dollars. They will lose seventy four billion 183 00:10:05,679 --> 00:10:09,200 Speaker 5: dollars in twenty twenty eight, or roughly three quarters of revenue, 184 00:10:09,240 --> 00:10:12,720 Speaker 5: thanks to ballooning spending on computing costs. They're 185 00:10:13,120 --> 00:10:16,320 Speaker 5: gonna burn through roughly fourteen times as 186 00:10:16,400 --> 00:10:19,800 Speaker 5: much cash as Anthropic before turning a profit in twenty thirty. 187 00:10:20,640 --> 00:10:23,520 Speaker 5: But certainly don't take my word for it. Go through the 188 00:10:23,520 --> 00:10:25,439 Speaker 5: Wall Street Journal and invest carefully. 189 00:10:25,920 --> 00:10:28,400 Speaker 3: Yeah, and then you've got this story of Amazon, that 190 00:10:28,640 --> 00:10:30,920 Speaker 3: never made any money and was losing money like crazy, 191 00:10:30,960 --> 00:10:32,760 Speaker 3: and I remember all the jokes about it never turning 192 00:10:32,760 --> 00:10:34,720 Speaker 3: a profit and everything like that, and it obviously came to 193 00:10:34,840 --> 00:10:39,480 Speaker 3: dominate the landscape in so many different ways eventually. 194 00:10:40,559 --> 00:10:43,880 Speaker 2: And then, I'm sorry, Michael, what did you say to us? Oh? 195 00:10:44,000 --> 00:10:46,920 Speaker 5: Price pick, yeah, just a second. Coming up, one 196 00:10:46,920 --> 00:10:50,280 Speaker 5: more AI note from a website I had never heard of, 197 00:10:50,520 --> 00:10:54,480 Speaker 5: sent to us by alert listener Hillbilly: SavingCountryMusic dot 198 00:10:54,480 --> 00:10:59,080 Speaker 5: com. The headline is AI Songs Top Billboard Chart:
Why 199 00:10:59,120 --> 00:11:03,080 Speaker 5: We Need Transparency Now. All right, so I 200 00:11:03,400 --> 00:11:06,880 Speaker 5: just opened this up again. Thank you, Hillbilly, for sending 201 00:11:06,920 --> 00:11:10,600 Speaker 5: this along. There's an alarmingly low sense of urgency, they 202 00:11:10,600 --> 00:11:13,520 Speaker 5: write, about a rapidly developing dilemma that threatens to absolutely 203 00:11:13,600 --> 00:11:15,760 Speaker 5: eviscerate everything we know and love about music in a 204 00:11:15,760 --> 00:11:17,960 Speaker 5: matter of months. We're talking about AI, of course, but 205 00:11:17,960 --> 00:11:20,160 Speaker 5: it feels almost embarrassing and trite at this point to even 206 00:11:20,200 --> 00:11:22,720 Speaker 5: bring it up in such a breathless context, in part 207 00:11:22,760 --> 00:11:25,080 Speaker 5: because we all have an inherent sense of how catastrophic 208 00:11:25,120 --> 00:11:27,320 Speaker 5: AI is going to be for the human creators and 209 00:11:27,400 --> 00:11:29,280 Speaker 5: how inevitable its impacts ultimately are. 210 00:11:29,559 --> 00:11:29,720 Speaker 6: You know? 211 00:11:29,840 --> 00:11:32,400 Speaker 5: Hillbilly mentioned that the guy who wrote this is a 212 00:11:32,480 --> 00:11:35,480 Speaker 5: terrific writer, and he is. Wow, that's some, some good 213 00:11:35,520 --> 00:11:38,520 Speaker 5: writing. Anyway, what's his name? 214 00:11:39,080 --> 00:11:39,480 Speaker 2: I don't know? 215 00:11:40,000 --> 00:11:44,120 Speaker 5: But there's the picture of AI-generated artist Breaking Rust. 216 00:11:45,400 --> 00:11:48,679 Speaker 5: It's a little too perfect, country-looking bearded guy in 217 00:11:48,720 --> 00:11:50,640 Speaker 5: a cowboy hat and all. 218 00:11:51,440 --> 00:11:54,000 Speaker 3: Uh, handsome, rugged yet sensitive? 219 00:11:54,960 --> 00:11:58,280 Speaker 5: Yes, how'd you know? You must have seen this picture, Papa?
220 00:11:58,320 --> 00:12:01,440 Speaker 5: But do we expect Congress to address this existential crisis 221 00:12:01,440 --> 00:12:05,240 Speaker 5: facing human creators? They're saying we should do something about it, 222 00:12:05,280 --> 00:12:08,840 Speaker 5: attempt to install some guardrails and guideposts, and expend at 223 00:12:08,920 --> 00:12:10,959 Speaker 5: least a modicum of effort to at least make sure 224 00:12:11,000 --> 00:12:13,240 Speaker 5: the public is aware of, wait, what is AI and 225 00:12:13,280 --> 00:12:13,920 Speaker 5: what is not. 226 00:12:14,160 --> 00:12:16,760 Speaker 3: Yeah, there's some effort by lots of people that you 227 00:12:16,840 --> 00:12:20,240 Speaker 3: have to declare something an AI creation. Do you think 228 00:12:20,280 --> 00:12:23,920 Speaker 3: that makes any difference? You dig a song, oh, 229 00:12:23,920 --> 00:12:26,360 Speaker 3: it's AI, well, never mind then? I don't, right, I either 230 00:12:26,480 --> 00:12:28,079 Speaker 3: like it or I don't like it. Do you think 231 00:12:28,120 --> 00:12:29,679 Speaker 3: I'd like it more if it turns out it's a 232 00:12:29,760 --> 00:12:34,000 Speaker 3: human? Maybe, maybe I find out it's some thirty four 233 00:12:34,080 --> 00:12:37,840 Speaker 3: year old former drug addict who was in jail, you know, 234 00:12:37,840 --> 00:12:40,760 Speaker 3: I'm thinking of a, what's the guy, Jelly Roll type 235 00:12:40,760 --> 00:12:43,240 Speaker 3: of story or something like that that hooks you. 236 00:12:44,000 --> 00:12:48,559 Speaker 5: M yeah, because the sentiment in the song seems much 237 00:12:48,559 --> 00:12:51,760 Speaker 5: more real. You've asked the key question. That's a super 238 00:12:51,760 --> 00:12:55,760 Speaker 5: interesting question. Will people still enjoy it?
This guy's advocating 239 00:12:56,120 --> 00:12:58,520 Speaker 5: any piece of music made by AI, or even partially 240 00:12:58,640 --> 00:13:00,760 Speaker 5: made by AI, must be declared as such to the 241 00:13:00,800 --> 00:13:03,880 Speaker 5: public, period. So like, if you use AI to clean 242 00:13:03,960 --> 00:13:06,240 Speaker 5: up the bassline, I don't know, because your bass player's 243 00:13:06,280 --> 00:13:07,080 Speaker 5: drunk or something. 244 00:13:10,640 --> 00:13:14,599 Speaker 2: I don't know, because evidently this Breaking 245 00:13:14,320 --> 00:13:19,120 Speaker 5: Rust song Walk My Walk topped the Billboard Country Digital 246 00:13:19,200 --> 00:13:20,559 Speaker 5: Song Sales Chart. 247 00:13:21,160 --> 00:13:23,720 Speaker 2: An AI track was the number one song in country. 248 00:13:23,760 --> 00:13:25,280 Speaker 3: I'm going to listen to that during the break. 249 00:13:26,960 --> 00:13:27,520 Speaker 2: Please do. 250 00:13:27,559 --> 00:13:29,920 Speaker 3: We can't play it for copyright reasons, but I'm going 251 00:13:29,960 --> 00:13:31,520 Speaker 3: to listen to it during the break, see what I think. 252 00:13:32,240 --> 00:13:34,520 Speaker 2: Yeah, uh yeah. 253 00:13:34,320 --> 00:13:35,880 Speaker 3: We're headed into a weird world. 254 00:13:36,040 --> 00:13:38,400 Speaker 2: Yeah, yeah, and everybody's guessing. 255 00:13:38,679 --> 00:13:40,560 Speaker 3: But the people with lots of money, like the 256 00:13:40,640 --> 00:13:43,160 Speaker 3: richest people on earth, are guessing that it's going to 257 00:13:43,160 --> 00:13:46,280 Speaker 3: be a big deal and going to be profitable. I 258 00:13:46,280 --> 00:13:48,560 Speaker 3: don't know if any super wealthy people are saying, nah, 259 00:13:48,600 --> 00:13:52,200 Speaker 3: this is overhyped. I'll be in the woods if you 260 00:13:52,280 --> 00:13:57,280 Speaker 3: need me. Man. Well, good luck, the AI robot
that 261 00:13:57,520 --> 00:13:59,440 Speaker 3: currently can't find a Coke will be able to find 262 00:13:59,440 --> 00:14:01,280 Speaker 3: you in the woods to chop off your head for 263 00:14:01,320 --> 00:14:01,960 Speaker 3: whatever reason. 264 00:14:02,440 --> 00:14:06,120 Speaker 5: Look, honey, look at that squirrel, zeros and ones flashed 265 00:14:06,160 --> 00:14:06,760 Speaker 5: in his eyes. 266 00:14:07,000 --> 00:14:08,199 Speaker 2: It sends down about it? 267 00:14:08,440 --> 00:14:13,880 Speaker 4: Yeah, right, like the Armstrong and Getty show. Yeah, more 268 00:14:14,160 --> 00:14:21,120 Speaker 4: podcasts and our hot links. 269 00:14:20,080 --> 00:14:21,840 Speaker 1: The Armstrong and Getty Show. 270 00:14:25,800 --> 00:14:27,560 Speaker 2: So I, uh, for various reasons, 271 00:14:27,600 --> 00:14:31,240 Speaker 5: my doctor hit me with a prednisone prescription and it's 272 00:14:31,640 --> 00:14:34,240 Speaker 5: totally screwed up my sleep. Last night I was awake 273 00:14:34,320 --> 00:14:37,280 Speaker 5: for two solid hours, my mind just racing in the 274 00:14:37,280 --> 00:14:38,360 Speaker 5: middle of the night last night. 275 00:14:38,400 --> 00:14:41,600 Speaker 3: Hey damn, yeah, no doubt. Oh my gosh. Yeah, didn't it 276 00:14:41,600 --> 00:14:43,440 Speaker 3: make you ravenously hungry? That's what I always 277 00:14:43,440 --> 00:14:44,280 Speaker 3: had with prednisone. 278 00:14:44,760 --> 00:14:44,840 Speaker 6: No. 279 00:14:44,960 --> 00:14:47,000 Speaker 2: Actually, my stomach's kind of upset right now. 280 00:14:47,000 --> 00:14:51,920 Speaker 5: But anyway. So two things occupied my two hours of 281 00:14:52,800 --> 00:14:56,239 Speaker 5: like high-speed thinking.
Number one, I wrote, and rewrote, 282 00:14:56,400 --> 00:15:01,560 Speaker 5: and like combed through and perfected the first class of 283 00:15:01,600 --> 00:15:07,360 Speaker 5: a college class in basic economics that focuses on how 284 00:15:07,840 --> 00:15:11,840 Speaker 5: laws and regulations and prices cause people to change their 285 00:15:11,880 --> 00:15:16,320 Speaker 5: behavior. Logical Economics, I was going to call it. So 286 00:15:16,920 --> 00:15:19,520 Speaker 5: when you're trying to sleep, you couldn't sleep, so you 287 00:15:19,560 --> 00:15:24,120 Speaker 5: came up with a curriculum for a one hundred level 288 00:15:24,120 --> 00:15:27,200 Speaker 5: college class in economics that I will never teach. That's correct. 289 00:15:27,360 --> 00:15:28,120 Speaker 5: That's relaxing. 290 00:15:28,720 --> 00:15:29,000 Speaker 2: Yes. 291 00:15:29,760 --> 00:15:32,000 Speaker 5: The second thing I did when that was perfected was 292 00:15:32,200 --> 00:15:36,040 Speaker 5: I was working on the speech. You poor teachers, and 293 00:15:36,080 --> 00:15:38,880 Speaker 5: I'm thinking mostly of Californians, but in blue states everywhere, 294 00:15:38,920 --> 00:15:41,440 Speaker 5: and if your school district does this sort of stuff, 295 00:15:41,520 --> 00:15:44,240 Speaker 5: you poor teachers who are writing us emails this week saying, yeah, 296 00:15:44,280 --> 00:15:47,720 Speaker 5: we're in our mandatory DEI training right now. Guys, this 297 00:15:47,760 --> 00:15:51,280 Speaker 5: stuff is not dead. It is still here. And they're 298 00:15:51,320 --> 00:15:54,480 Speaker 5: being humiliated and told that getting your work done on 299 00:15:54,560 --> 00:16:00,240 Speaker 5: time is white supremacy, being diligent is white supremacy, and 300 00:16:00,320 --> 00:16:03,160 Speaker 5: all of that just utter garbage.
And so I was 301 00:16:03,280 --> 00:16:05,840 Speaker 5: like writing a speech for you that you could stand 302 00:16:05,880 --> 00:16:07,760 Speaker 5: up and say, excuse me, I hate to interrupt this, 303 00:16:07,880 --> 00:16:10,280 Speaker 5: but this is racist garbage, and it's wasting all of 304 00:16:10,320 --> 00:16:13,280 Speaker 5: our time and worse, and here's why, exactly, and then 305 00:16:13,320 --> 00:16:16,840 Speaker 5: you would explain to them briefly that DEI does not 306 00:16:16,880 --> 00:16:19,920 Speaker 5: have anything to do with racial harmony or anything. 307 00:16:19,960 --> 00:16:22,880 Speaker 2: It's a tool of capture, it's a tool of takeover. 308 00:16:23,720 --> 00:16:26,400 Speaker 5: If you have three people up for the job of 309 00:16:26,480 --> 00:16:29,040 Speaker 5: boss and you can say one of those guys is 310 00:16:29,040 --> 00:16:32,280 Speaker 5: a racist, he will not get that authority, he will 311 00:16:32,280 --> 00:16:34,320 Speaker 5: not get that money, he will not get the good stuff. 312 00:16:34,640 --> 00:16:37,920 Speaker 5: And if you start calling everybody a racist who you 313 00:16:37,960 --> 00:16:40,600 Speaker 5: don't want to have power, and everybody's like afraid to 314 00:16:40,600 --> 00:16:43,680 Speaker 5: say, that doesn't seem like racism to me, then you're 315 00:16:43,800 --> 00:16:48,520 Speaker 5: in power. DEI needs to end wherever it exists, everywhere 316 00:16:48,520 --> 00:16:50,000 Speaker 5: in America today. 317 00:16:50,200 --> 00:16:51,520 Speaker 2: It has nothing to do with race. 318 00:16:51,760 --> 00:16:54,960 Speaker 5: It is a Marxist technique to capture institutions. 319 00:16:55,080 --> 00:16:56,440 Speaker 2: Armstrong and Getty. 320 00:16:56,560 --> 00:17:03,960 Speaker 7: Christmas shopping. Sometimes it can feel like you're just buying 321 00:17:04,640 --> 00:17:08,960 Speaker 7: a bunch of random stuff.
Get focused and spend your 322 00:17:09,119 --> 00:17:15,520 Speaker 7: money right. We've got the perfect gift for that special person 323 00:17:15,760 --> 00:17:21,480 Speaker 7: in your life. The Armstrong and Getty Superstore. Shop 324 00:17:21,640 --> 00:17:31,880 Speaker 7: now, Armstrong and Getty dot com. 325 00:17:33,040 --> 00:17:43,000 Speaker 1: Armstrong and Getty, Jack Armstrong and Joe Getty, The 326 00:17:43,200 --> 00:17:44,600 Speaker 1: Armstrong and Getty Show. 327 00:17:45,400 --> 00:17:48,560 Speaker 3: The last show. 328 00:17:51,560 --> 00:17:54,639 Speaker 2: At seventy five, both men and women 329 00:17:54,600 --> 00:17:58,200 Speaker 6: fall off a cliff. At seventy five, at a population level, 330 00:17:58,320 --> 00:18:01,880 Speaker 6: it's unmistakable what happens at the age of seventy five. 331 00:18:02,520 --> 00:18:05,240 Speaker 6: That's what we're up against. That's what I'm thinking about 332 00:18:05,280 --> 00:18:08,560 Speaker 6: in the practice, is how do I create an escape 333 00:18:08,640 --> 00:18:11,720 Speaker 6: velocity that gets somebody another fifteen years there. 334 00:18:13,000 --> 00:18:16,199 Speaker 3: Doctor Peter Attia, however he pronounces his name, was 335 00:18:16,240 --> 00:18:19,480 Speaker 3: on Sixty Minutes on Sunday night. He's very well known 336 00:18:19,520 --> 00:18:22,399 Speaker 3: for some of you because he's written some books that 337 00:18:22,440 --> 00:18:26,480 Speaker 3: have sold millions and millions of copies about how to get 338 00:18:26,520 --> 00:18:30,480 Speaker 3: a few more enjoyable years out of your life. 339 00:18:30,920 --> 00:18:34,640 Speaker 5: Right, not just prolonging life, but prolonging quality years. 340 00:18:35,080 --> 00:18:39,679 Speaker 3: I immediately put my hand over my watch and my 341 00:18:39,760 --> 00:18:43,000 Speaker 3: wallet when anybody ever starts talking about health stuff.
342 00:18:42,720 --> 00:18:44,080 Speaker 2: Because there's just a lot of. 343 00:18:45,480 --> 00:18:48,120 Speaker 3: You know, very few people ever want to come out 344 00:18:48,119 --> 00:18:49,520 Speaker 3: and just say, well, you got to eat less and 345 00:18:49,600 --> 00:18:52,160 Speaker 3: exercise more, would be the main way to lose weight, 346 00:18:52,200 --> 00:18:54,679 Speaker 3: as opposed to, I've got a new way that nobody's 347 00:18:54,720 --> 00:18:55,840 Speaker 3: ever thought of before. 348 00:18:56,400 --> 00:19:00,400 Speaker 5: You eat only things that match these paint chips. 349 00:19:02,480 --> 00:19:05,480 Speaker 3: It's called beige eating, only eat things that are beige, 350 00:19:06,080 --> 00:19:11,399 Speaker 3: the monochromatic diet. But this guy, I thought that was interesting. 351 00:19:11,400 --> 00:19:14,000 Speaker 3: Of course, your mileage may vary. It happens a little 352 00:19:14,000 --> 00:19:16,040 Speaker 3: earlier for some people, a little later, like my dad, 353 00:19:16,080 --> 00:19:19,760 Speaker 3: for some people, but in general, at seventy five you 354 00:19:19,840 --> 00:19:22,080 Speaker 3: just go off a cliff, and he wants to extend 355 00:19:22,080 --> 00:19:25,080 Speaker 3: that out a little further. Not your life span, but 356 00:19:25,119 --> 00:19:28,240 Speaker 3: the years that you can actually do stuff and enjoy yourself. Right. 357 00:19:29,320 --> 00:19:32,920 Speaker 5: Yeah, I'll hold my takeaway from that segment back. 358 00:19:32,960 --> 00:19:34,320 Speaker 2: I don't want to steal anybody's thunder. 359 00:19:34,400 --> 00:19:37,840 Speaker 3: Here's Norah O'Donnell from Sixty Minutes talking to the doctor. 360 00:19:37,800 --> 00:19:42,439 Speaker 8: So is your goal to minimize or essentially erase that 361 00:19:42,720 --> 00:19:43,640 Speaker 8: marginal decade? 362 00:19:43,960 --> 00:19:46,240 Speaker 6: The marginal decade's not going anywhere.
We will all have 363 00:19:46,280 --> 00:19:48,560 Speaker 6: a final decade of life. My goal is to make 364 00:19:48,680 --> 00:19:52,080 Speaker 6: the marginal decade as enjoyable as possible. The way I 365 00:19:52,119 --> 00:19:54,760 Speaker 6: explain it to my patients is that last ten to 366 00:19:54,800 --> 00:19:56,960 Speaker 6: fifteen of your years, if you don't do anything about it, 367 00:19:57,320 --> 00:19:59,600 Speaker 6: you will fall to a level of about fifty percent 368 00:19:59,640 --> 00:20:01,960 Speaker 6: of your total capacity, cognitively, physically. 369 00:20:02,280 --> 00:20:04,159 Speaker 2: And people hear that, you're like, I don't want to 370 00:20:04,160 --> 00:20:04,720 Speaker 2: be that. 371 00:20:04,400 --> 00:20:06,320 Speaker 8: That's not how I want to spend the last decade 372 00:20:06,320 --> 00:20:06,959 Speaker 8: of my life. 373 00:20:07,080 --> 00:20:09,000 Speaker 6: A lot of people respond that way as though they're 374 00:20:09,000 --> 00:20:12,000 Speaker 6: hearing this for the first time, although if you ask them, 375 00:20:12,440 --> 00:20:15,880 Speaker 6: haven't you seen people in this state, they'll say, well, yeah, 376 00:20:15,880 --> 00:20:16,440 Speaker 6: I guess. 377 00:20:16,200 --> 00:20:19,680 Speaker 2: I have right, likely their own relatives, sure, their own parents. 378 00:20:19,720 --> 00:20:23,520 Speaker 5: Even. Well, I'm fairly functional on my best day. So 379 00:20:24,080 --> 00:20:25,840 Speaker 5: you know, fifty percent of that? Oof. 380 00:20:26,760 --> 00:20:30,159 Speaker 3: That's getting down there. Yeah, yikes. That is a 381 00:20:30,200 --> 00:20:35,120 Speaker 3: funny aspect of human beings, that even though most 382 00:20:35,119 --> 00:20:39,080 Speaker 3: of us experience it with family members or whatever, we 383 00:20:39,440 --> 00:20:41,800 Speaker 3: feel like it's not going to happen to us or something.
384 00:20:41,840 --> 00:20:44,240 Speaker 5: It's yes, yeah, I don't. I don't have that old 385 00:20:44,280 --> 00:20:46,920 Speaker 5: guy attitude. I'm not going to be somewhat stooped over 386 00:20:47,000 --> 00:20:48,399 Speaker 5: and complain about arthritis, right. 387 00:20:48,920 --> 00:20:51,359 Speaker 3: Or, and this is a slightly different topic, but 388 00:20:51,400 --> 00:20:53,359 Speaker 3: not completely. I was talking to a guy last night 389 00:20:53,359 --> 00:20:55,919 Speaker 3: who's a friend of mine, and he's dealing with his 390 00:20:56,080 --> 00:20:59,359 Speaker 3: mom's final years and how they're going to deal with it, 391 00:20:59,400 --> 00:21:02,280 Speaker 3: and just that stat that I always think about, where 392 00:21:02,320 --> 00:21:04,720 Speaker 3: like ninety eight percent of people say they want to 393 00:21:04,720 --> 00:21:07,680 Speaker 3: die at home, and ninety eight percent of people don't die 394 00:21:07,680 --> 00:21:12,359 Speaker 3: at home, right, for a variety of reasons. But anyway, 395 00:21:12,600 --> 00:21:13,919 Speaker 3: let's hear a little more of this before we 396 00:21:14,000 --> 00:21:14,679 Speaker 3: discuss. Go on. 397 00:21:15,040 --> 00:21:17,399 Speaker 6: I think this is the neglected part of medical testing, 398 00:21:18,000 --> 00:21:18,919 Speaker 6: is how fit are you? 399 00:21:19,359 --> 00:21:20,119 Speaker 2: How strong are you? 400 00:21:20,240 --> 00:21:20,880 Speaker 3: All right? 401 00:21:20,920 --> 00:21:23,000 Speaker 2: How well do you move, hands up overhead? 402 00:21:23,160 --> 00:21:26,359 Speaker 6: And in many ways these tests are even more predictive 403 00:21:26,440 --> 00:21:29,600 Speaker 6: of how long you're going to live than what I might. 404 00:21:29,440 --> 00:21:30,360 Speaker 3: Get out of your blood work. 405 00:21:30,440 --> 00:21:31,240 Speaker 2: How do we know that?
406 00:21:31,400 --> 00:21:34,280 Speaker 6: The data are pretty clear when you look at things 407 00:21:34,320 --> 00:21:37,080 Speaker 6: like cardiorespiratory fitness, when you look at muscle mass, 408 00:21:37,119 --> 00:21:39,800 Speaker 6: when you look at strength, they have a much higher 409 00:21:39,840 --> 00:21:43,800 Speaker 6: association than things like even cholesterol and blood pressure. 410 00:21:44,680 --> 00:21:49,639 Speaker 8: You think anyone, whether they're forty five or sixty five, 411 00:21:50,040 --> 00:21:54,280 Speaker 8: should be training like athletes, not for the Olympics, but 412 00:21:54,400 --> 00:21:55,879 Speaker 8: essentially for advanced age? 413 00:21:56,359 --> 00:22:01,720 Speaker 3: Absolutely. Life is a sport. Sort of that idea. 414 00:22:02,800 --> 00:22:06,520 Speaker 5: I didn't hear anything that he said that contradicted things 415 00:22:06,520 --> 00:22:09,399 Speaker 5: that I've learned through the years about how to age well, 416 00:22:10,160 --> 00:22:14,200 Speaker 5: having to do with strength and flexibility and various measures. 417 00:22:15,520 --> 00:22:17,840 Speaker 5: The guy just struck me as a really high-end, 418 00:22:18,040 --> 00:22:22,360 Speaker 5: extremely thorough personal trainer with a medical background. 419 00:22:22,480 --> 00:22:26,720 Speaker 2: I hear a lot of faux medical crap, and. 420 00:22:26,640 --> 00:22:29,199 Speaker 5: That seemed to be just, I mean, essentially, he's going 421 00:22:29,280 --> 00:22:31,840 Speaker 5: to turn you into an Olympic athlete. Some of the 422 00:22:31,880 --> 00:22:34,440 Speaker 5: exercise regimens and tests they were going to do and all, 423 00:22:34,960 --> 00:22:37,520 Speaker 5: how could they not help you live longer? It looked 424 00:22:37,520 --> 00:22:39,240 Speaker 5: like an enormous amount of work. 425 00:22:40,000 --> 00:22:42,399 Speaker 3: Yeah.
One of the reviews of one of his books: 426 00:22:42,480 --> 00:22:46,280 Speaker 3: his work has exploded in popularity amid the longevity boom, 427 00:22:46,680 --> 00:22:49,680 Speaker 3: but he stresses evidence over hype, urging people to train 428 00:22:49,840 --> 00:22:53,719 Speaker 3: seriously for vibrant old age rather than chasing quick fixes, 429 00:22:53,760 --> 00:22:56,159 Speaker 3: which was what I was talking about earlier. The crawl 430 00:22:56,200 --> 00:22:59,159 Speaker 3: like a dog exercise or whatever, the hot new things. 431 00:23:00,440 --> 00:23:03,840 Speaker 5: Yeah, and I know a couple of people who fly 432 00:23:04,119 --> 00:23:08,520 Speaker 5: to like South American capitals to get various infusions. You 433 00:23:08,600 --> 00:23:12,160 Speaker 5: know people who do that, Yes, infusions of what? One 434 00:23:12,160 --> 00:23:16,639 Speaker 5: guy in particular, I can't remember exactly how they put it, blood 435 00:23:16,640 --> 00:23:18,320 Speaker 5: of a thirteen year old, blood of a ten year 436 00:23:18,320 --> 00:23:19,720 Speaker 5: old, serum, whatever it is. 437 00:23:21,320 --> 00:23:24,320 Speaker 2: The plasma of the jaguar. I don't know exactly. 438 00:23:24,400 --> 00:23:27,840 Speaker 3: You know people who fly to other countries, yes, to 439 00:23:27,880 --> 00:23:28,800 Speaker 3: get infusions. 440 00:23:29,160 --> 00:23:32,679 Speaker 5: The guy is super into the anti-aging thing, 441 00:23:33,400 --> 00:23:36,280 Speaker 5: unlike, you know, Doctor Attia, who again struck me 442 00:23:36,320 --> 00:23:38,200 Speaker 5: as a lot more legit than ninety percent of what 443 00:23:38,240 --> 00:23:39,199 Speaker 5: I hear about this stuff. 444 00:23:39,400 --> 00:23:41,360 Speaker 2: But yeah, he's super into it. He believes that. He's 445 00:23:41,359 --> 00:23:42,040 Speaker 2: a very smart guy.
446 00:23:42,080 --> 00:23:44,919 Speaker 5: He's been very successful, although you know, one sort of 447 00:23:44,920 --> 00:23:47,160 Speaker 5: success doesn't necessarily translate. 448 00:23:46,720 --> 00:23:49,040 Speaker 2: To good judgment in another way. But he's a lovely guy. 449 00:23:49,040 --> 00:23:50,800 Speaker 3: But if you got a lot of money, what are 450 00:23:50,800 --> 00:23:52,800 Speaker 3: you going to spend your money on that's better than 451 00:23:53,520 --> 00:23:56,639 Speaker 3: trying to have more enjoyable years of your life? And well, 452 00:23:56,680 --> 00:23:58,840 Speaker 3: you know, if there's a five percent chance that works, 453 00:23:59,119 --> 00:24:00,600 Speaker 3: if you got a lot of money, it makes more 454 00:24:00,640 --> 00:24:04,159 Speaker 3: sense to do that than that couch which we 455 00:24:04,200 --> 00:24:05,600 Speaker 3: were talking about earlier, exactly. 456 00:24:05,840 --> 00:24:08,880 Speaker 5: You were talking about this four hundred and fifty six 457 00:24:08,960 --> 00:24:12,639 Speaker 5: thousand dollar chair and a... what was the name of that again? 458 00:24:14,280 --> 00:24:16,560 Speaker 5: You can take the boy out of the working class, 459 00:24:16,600 --> 00:24:17,960 Speaker 5: but you can't take the... And oh, here it 460 00:24:18,040 --> 00:24:23,160 Speaker 5: is, a Jean Royère type sofa for a million dollars. 461 00:24:24,359 --> 00:24:28,719 Speaker 3: It makes sense to me completely that we are designed 462 00:24:28,760 --> 00:24:34,399 Speaker 3: to continue to work our minds and bodies, and then 463 00:24:34,440 --> 00:24:37,680 Speaker 3: as soon as we stop, they just decide, well, I guess, 464 00:24:38,000 --> 00:24:38,920 Speaker 3: I guess we're done. 465 00:24:39,640 --> 00:24:40,880 Speaker 2: Right right? 466 00:24:41,280 --> 00:24:43,480 Speaker 5: I think it just it helps to remind yourself over 467 00:24:43,520 --> 00:24:46,879 Speaker 5: and over again. You are an animal.
You are a 468 00:24:46,920 --> 00:24:51,520 Speaker 5: biological being, no different than beavers. Beavers don't have long, 469 00:24:51,640 --> 00:24:55,760 Speaker 5: happy retirements. Look at my teeth, they... they said, 470 00:24:55,920 --> 00:24:58,440 Speaker 5: are those dentures? Beaver dentures? 471 00:24:58,560 --> 00:25:01,280 Speaker 2: They look totally natural. Well, they don't. Look at you. 472 00:25:01,359 --> 00:25:04,840 Speaker 5: I brought down a sapling yesterday. Good man, you gotta 473 00:25:04,840 --> 00:25:09,000 Speaker 5: stay in shape. No, but the invention of "I 474 00:25:09,000 --> 00:25:12,240 Speaker 5: have reproduced, I have raised my young, I have functioned 475 00:25:12,280 --> 00:25:15,160 Speaker 5: as the village elder, and now I'm just, I play 476 00:25:15,240 --> 00:25:18,440 Speaker 5: golf and I watch movies," that doesn't exist in the 477 00:25:18,480 --> 00:25:22,399 Speaker 5: animal world. We invented that, right? So you got to 478 00:25:22,400 --> 00:25:25,440 Speaker 5: think about, all right, what is every other beast doing? Well, 479 00:25:25,480 --> 00:25:28,119 Speaker 5: it's alive, it stays active. You've got to stay active. 480 00:25:28,200 --> 00:25:29,400 Speaker 5: I believe that absolutely. 481 00:25:29,440 --> 00:25:31,760 Speaker 3: I don't want to be annoying works-out guy. So 482 00:25:31,800 --> 00:25:33,600 Speaker 3: I probably shouldn't say this very often, but I 483 00:25:33,640 --> 00:25:35,440 Speaker 3: go to the gym every single day, and I feel better. 484 00:25:35,520 --> 00:25:37,359 Speaker 3: I feel better now than I felt when I was forty. 485 00:25:37,400 --> 00:25:40,960 Speaker 3: I'm twenty years older than that. I am way more, 486 00:25:42,880 --> 00:25:47,640 Speaker 3: in every way, feel better from exercising regularly. I hope 487 00:25:47,680 --> 00:25:49,639 Speaker 3: I can keep it up. I probably can't.
Like I 488 00:25:49,680 --> 00:25:51,520 Speaker 3: said earlier, I'm doing it only out of vanity. 489 00:25:52,119 --> 00:25:52,840 Speaker 2: But it can't hurt. 490 00:25:53,200 --> 00:25:55,600 Speaker 3: No, it can't hurt. I'm just worried about when, you know, 491 00:25:56,359 --> 00:25:59,440 Speaker 3: like I said, I finally get a mate, I'll just, 492 00:26:00,000 --> 00:26:05,040 Speaker 3: I'll become a, you know, gym schmym, huh, let's see, I 493 00:26:05,080 --> 00:26:08,640 Speaker 3: watch TV. That's all the way over on the other 494 00:26:08,760 --> 00:26:12,840 Speaker 3: side of living. I'm gonna immediately go from going to 495 00:26:12,880 --> 00:26:15,719 Speaker 3: the gym every day to watching TV eating bowls of pudding, 496 00:26:15,880 --> 00:26:18,440 Speaker 3: and the transition will be very seamless. 497 00:26:18,840 --> 00:26:21,920 Speaker 2: Riding around the grocery store on a Rascal. 498 00:26:22,000 --> 00:26:24,440 Speaker 3: Right exactly. Didn't I used to see you at the gym, 499 00:26:24,560 --> 00:26:26,320 Speaker 3: maybe? Ah, by the way? 500 00:26:27,880 --> 00:26:28,160 Speaker 2: Wow? 501 00:26:28,880 --> 00:26:32,920 Speaker 3: Wow, No, I'm predicting my future there. Your new beau 502 00:26:33,000 --> 00:26:35,280 Speaker 3: is going to sue you for fraud. Anyway, I thought 503 00:26:35,320 --> 00:26:38,240 Speaker 3: that was, uh, I thought that was a damn interesting story. 504 00:26:38,440 --> 00:26:42,160 Speaker 3: Just the idea of looking at extending the enjoyable years, 505 00:26:43,600 --> 00:26:46,920 Speaker 3: the whole "your body and mind drop off fifty 506 00:26:47,040 --> 00:26:49,560 Speaker 3: percent after seventy five." 507 00:26:49,680 --> 00:26:50,160 Speaker 6: Wow.
508 00:26:50,880 --> 00:26:54,080 Speaker 5: And, and in a similar vein, the one medical story 509 00:26:54,160 --> 00:26:57,280 Speaker 5: that's caught my ear lately and really made an impression 510 00:26:57,880 --> 00:27:01,560 Speaker 5: is that study after study is coming out saying staying 511 00:27:01,640 --> 00:27:06,040 Speaker 5: physically active is good for your brain. Yeah, prevents Alzheimer's 512 00:27:06,040 --> 00:27:08,879 Speaker 5: and that sort of thing. It's just, it's undeniable. 513 00:27:09,119 --> 00:27:11,199 Speaker 3: Jerry Seinfeld's big on that. He says that's why he 514 00:27:11,240 --> 00:27:14,239 Speaker 3: lifts weights, because of all the studies that show 515 00:27:14,320 --> 00:27:17,879 Speaker 3: what it does for your brain. So that's good too, 516 00:27:17,880 --> 00:27:20,399 Speaker 3: because my brain... I got it. I'm so damn lazy. 517 00:27:20,400 --> 00:27:21,880 Speaker 2: I gotta get it. 518 00:27:21,960 --> 00:27:26,040 Speaker 4: The Armstrong and Getty Show. More Jack, more Joe, podcasts and 519 00:27:26,200 --> 00:27:35,359 Speaker 4: our hot links. Jack Armstrong and Joe Getty, The Armstrong 520 00:27:35,400 --> 00:27:36,200 Speaker 4: and Getty Show. 521 00:27:40,600 --> 00:27:43,639 Speaker 3: We mentioned one stupid fitness craze. We got a real 522 00:27:43,640 --> 00:27:47,119 Speaker 3: fitness craze that I just came across yesterday. It's been 523 00:27:47,119 --> 00:27:48,480 Speaker 3: around for a while, but I want to talk about 524 00:27:48,480 --> 00:27:52,760 Speaker 3: it more. Hick. I think it's Hicks, the hick. I'll 525 00:27:52,760 --> 00:27:54,679 Speaker 3: find it to tell. It's a, it's an acronym. We'll 526 00:27:54,680 --> 00:27:55,960 Speaker 3: get to that coming up next segment. 527 00:27:56,520 --> 00:27:57,600 Speaker 2: Okay. So a.
528 00:27:57,560 --> 00:28:02,959 Speaker 5: Couple of examples of DEI madness and just proof of 529 00:28:03,000 --> 00:28:05,840 Speaker 5: what Jack and I and James Lindsay and others have 530 00:28:06,160 --> 00:28:09,359 Speaker 5: been saying for the longest time. The DEI thing, it's 531 00:28:09,400 --> 00:28:11,480 Speaker 5: a tool of takeover. It has nothing to do with 532 00:28:11,520 --> 00:28:13,800 Speaker 5: what it claims it is. Here are a couple examples. 533 00:28:13,800 --> 00:28:17,000 Speaker 5: This is just great stuff from the editors of the 534 00:28:17,080 --> 00:28:23,159 Speaker 5: National Review. They're talking about how academics try to have 535 00:28:23,240 --> 00:28:26,000 Speaker 5: it both ways. They claim to support diversity and robust 536 00:28:26,000 --> 00:28:28,960 Speaker 5: debate on campus, right, but then they exclude any views 537 00:28:29,040 --> 00:28:32,760 Speaker 5: that challenge the left-wing orthodoxy. And now an influential 538 00:28:32,800 --> 00:28:37,840 Speaker 5: academic publication is abandoning all pretense. The American Association of 539 00:28:37,960 --> 00:28:41,920 Speaker 5: University Professors, who in their very bylaws state that they 540 00:28:41,960 --> 00:28:45,880 Speaker 5: aim to champion academic freedom, advance shared governance, and organize 541 00:28:45,920 --> 00:28:47,040 Speaker 5: all faculty. 542 00:28:46,800 --> 00:28:48,640 Speaker 2: To promote education for the common good. 543 00:28:49,080 --> 00:28:53,320 Speaker 5: Well, their magazine just published an article, Seven Theses Against 544 00:28:53,440 --> 00:28:58,040 Speaker 5: Viewpoint Diversity, by this lady professor at Johns Hopkins. 545 00:28:58,120 --> 00:29:01,400 Speaker 3: Wow, just flat out coming out against viewpoint diversity. 546 00:29:01,640 --> 00:29:01,840 Speaker 2: Wow.
547 00:29:02,600 --> 00:29:06,000 Speaker 5: And she's the president of her university's AAUP chapter, the 548 00:29:06,080 --> 00:29:10,320 Speaker 5: American Association of University Professors. She is at Johns Hopkins, 549 00:29:10,360 --> 00:29:13,520 Speaker 5: which is a prominent university, so she's a super heavyweight 550 00:29:13,600 --> 00:29:14,360 Speaker 5: among professors. 551 00:29:14,400 --> 00:29:16,560 Speaker 3: I like Johns. I feel like Hopkins is a bit 552 00:29:16,640 --> 00:29:18,360 Speaker 3: much anyway. 553 00:29:18,480 --> 00:29:21,480 Speaker 5: So there's been plenty of criticism of her article. But 554 00:29:21,680 --> 00:29:25,760 Speaker 5: in an attempt to defend the article, the AAUP responded 555 00:29:25,760 --> 00:29:29,920 Speaker 5: with the following quote: Fascism generally doesn't do great under 556 00:29:30,000 --> 00:29:33,600 Speaker 5: peer review, but perhaps it's the intellectual values of academia, 557 00:29:33,640 --> 00:29:37,400 Speaker 5: which emphasize critical inquiry and challenge traditional norms, that may 558 00:29:37,440 --> 00:29:40,840 Speaker 5: be inherently less appealing to those with a more conservative worldview. 559 00:29:41,440 --> 00:29:48,200 Speaker 5: In other words, academia skews way left because fascism doesn't 560 00:29:48,240 --> 00:29:52,600 Speaker 5: survive intellectual scrutiny. So they're saying anyone who isn't sufficiently 561 00:29:52,600 --> 00:29:55,719 Speaker 5: progressive and woke is a Nazi who needs to be 562 00:29:55,800 --> 00:30:00,640 Speaker 5: ejected from higher education. As the National Review guys say, 563 00:30:00,640 --> 00:30:03,680 Speaker 5: could there be a more stark confirmation of the public's 564 00:30:03,720 --> 00:30:08,240 Speaker 5: perception of universities as ideological hubs unaware of their own internal, 565 00:30:08,360 --> 00:30:09,520 Speaker 5: you know, one-sidedness.
566 00:30:09,520 --> 00:30:11,000 Speaker 2: They use a lot of fancy words, but. 567 00:30:13,040 --> 00:30:17,640 Speaker 5: It is absolutely stunning, coming out and saying no, we 568 00:30:17,680 --> 00:30:21,200 Speaker 5: don't have intellectual diversity, because anybody who doesn't agree with us. 569 00:30:21,160 --> 00:30:23,560 Speaker 2: Is a fascist. Yeah. 570 00:30:23,600 --> 00:30:26,040 Speaker 3: I butted up against this fairly recently in a way 571 00:30:26,080 --> 00:30:29,040 Speaker 3: I don't want to get into. But the crowd that 572 00:30:30,880 --> 00:30:37,880 Speaker 3: actually equates the term conservative with something incredibly evil. Like 573 00:30:37,960 --> 00:30:41,880 Speaker 3: it's not a political philosophy that's always existed and always 574 00:30:41,920 --> 00:30:46,120 Speaker 3: will exist and needs to exist, that's in tension with, you know, 575 00:30:46,400 --> 00:30:51,200 Speaker 3: the other side, whatever you want to call it. It's 576 00:30:51,320 --> 00:30:53,600 Speaker 3: just, it's flat out evil and should be given no 577 00:30:53,720 --> 00:30:56,640 Speaker 3: air whatsoever, which is nuts. 578 00:30:56,960 --> 00:30:57,440 Speaker 2: Yeah, it is. 579 00:30:57,840 --> 00:31:01,360 Speaker 5: It is divorced from all reality, all wisdom. I would 580 00:31:01,440 --> 00:31:06,200 Speaker 5: never say that everything should be to the right 581 00:31:06,440 --> 00:31:08,880 Speaker 5: forever and never looked at in any other. 582 00:31:08,760 --> 00:31:11,120 Speaker 3: Way. I think that'd be, that makes you a 583 00:31:11,160 --> 00:31:12,200 Speaker 3: weird sort of person. 584 00:31:13,080 --> 00:31:16,440 Speaker 5: Yeah, it's an aspect of radical leftism and, uh, 585 00:31:16,800 --> 00:31:20,400 Speaker 5: revolutionary movements in general. They are utterly intolerant of dissent. 586 00:31:21,080 --> 00:31:22,280 Speaker 5: I mean, that's kind of common.
587 00:31:22,600 --> 00:31:24,880 Speaker 3: How do you ever come to, how do you ever 588 00:31:24,920 --> 00:31:26,640 Speaker 3: come to the conclusion, though, maybe it's just the 589 00:31:26,640 --> 00:31:32,360 Speaker 3: way I'm built, that you're right about everything, always right, always? 590 00:31:32,400 --> 00:31:33,400 Speaker 3: How do you get that way? 591 00:31:33,920 --> 00:31:35,240 Speaker 2: I don't know. I don't know. 592 00:31:35,640 --> 00:31:38,120 Speaker 5: And the idea that, I mean, the idea that there 593 00:31:38,280 --> 00:31:41,600 Speaker 5: is a Marxist professor who believes everything to the right 594 00:31:41,640 --> 00:31:44,960 Speaker 5: of Marxism is fascism. The fact that there's a professor 595 00:31:44,960 --> 00:31:46,640 Speaker 5: who believes that, I'd think. 596 00:31:46,680 --> 00:31:48,560 Speaker 2: All right, you know, that's fine, takes all kinds. 597 00:31:48,640 --> 00:31:52,440 Speaker 5: But the idea that there's nothing but that at American 598 00:31:52,520 --> 00:31:54,960 Speaker 5: universities, or those who are not that are terrified to 599 00:31:55,000 --> 00:31:58,680 Speaker 5: even speak, that's, I mean, folks, that's a serious, 600 00:31:59,040 --> 00:32:03,600 Speaker 5: serious problem. And now they're just saying it openly, the 601 00:32:03,720 --> 00:32:07,000 Speaker 5: lead organization for professors saying, no, we don't want diversity 602 00:32:07,040 --> 00:32:09,040 Speaker 5: of opinion because you're fascists. 603 00:32:09,800 --> 00:32:10,160 Speaker 2: Anyway. 604 00:32:10,240 --> 00:32:13,320 Speaker 5: On a much much lighter note, Real Clear Politics had 605 00:32:13,360 --> 00:32:18,200 Speaker 5: an article out lately about how Biden's Secret Service was 606 00:32:18,280 --> 00:32:22,760 Speaker 5: so steeped in diversity, equity and inclusion, DEI, practices that 607 00:32:22,840 --> 00:32:26,240 Speaker 5: it could hardly function.
Boy, did we see any examples 608 00:32:26,280 --> 00:32:31,880 Speaker 5: of that during Biden's reign? I don't know, teenage or 609 00:32:31,880 --> 00:32:35,040 Speaker 5: twenty year old jackass ne'er-do-wells getting clear shots at 610 00:32:35,040 --> 00:32:39,880 Speaker 5: the president, for instance. Anyway, Real Clear Politics noted the 611 00:32:39,920 --> 00:32:43,680 Speaker 5: former Secret Service director Kimberly Cheatle had pushed an initiative 612 00:32:43,760 --> 00:32:47,720 Speaker 5: under Biden, quote, to make the federal government a DEI 613 00:32:47,880 --> 00:32:52,520 Speaker 5: model for the nation. So that lady who was going to 614 00:32:52,600 --> 00:32:55,120 Speaker 5: lead the Secret Service wanted the whole federal government to 615 00:32:55,160 --> 00:32:59,080 Speaker 5: be a model of DEI, which means more radical progressives. 616 00:32:59,120 --> 00:33:02,440 Speaker 2: It's not diversity, it's more progressives. Anyway. 617 00:33:03,240 --> 00:33:05,640 Speaker 5: She resigned from her post in July of twenty four after 618 00:33:05,640 --> 00:33:08,720 Speaker 5: being unable or unwilling to answer questions from lawmakers on 619 00:33:08,760 --> 00:33:11,800 Speaker 5: how the agency failed to prevent the assassination attempt against Trump 620 00:33:11,800 --> 00:33:14,760 Speaker 5: as he spoke in Butler, Pennsylvania, that we all remember. 621 00:33:15,920 --> 00:33:18,719 Speaker 5: And keeping in mind, Real Clear Politics is a pretty 622 00:33:18,760 --> 00:33:23,240 Speaker 5: down-the-middle-ish, typical slightly left journalism outfit. They 623 00:33:23,280 --> 00:33:26,040 Speaker 5: offered further insight into the DEI push at the agency 624 00:33:26,120 --> 00:33:30,479 Speaker 5: during Biden's years, highlighting a reported overweight female agent who 625 00:33:30,560 --> 00:33:33,880 Speaker 5: was also a plus sized model. And I read from 626 00:33:33,960 --> 00:33:37,600 Speaker 5: Real Clear Politics.
Under Cheatle's leadership, DEI had become so 627 00:33:37,720 --> 00:33:41,840 Speaker 5: normalized that an overweight female agent who never passed her 628 00:33:41,840 --> 00:33:45,760 Speaker 5: physical fitness tests was not only retained on the staff, 629 00:33:45,880 --> 00:33:48,920 Speaker 5: she was allowed to moonlight as a model. The agent, 630 00:33:48,960 --> 00:33:51,239 Speaker 5: who was featured in a magazine profile, traded on her 631 00:33:51,280 --> 00:33:53,959 Speaker 5: job in federal law enforcement and hinted at her Secret 632 00:33:53,960 --> 00:33:59,360 Speaker 5: Service position in a photoshoot labeled undercover but never underdressed. God. 633 00:33:59,800 --> 00:34:03,240 Speaker 5: The female agent, who bills herself as a nationally published 634 00:34:03,280 --> 00:34:07,360 Speaker 5: curve model, plus sized fashion and fitness influencer, and body 635 00:34:07,440 --> 00:34:08,840 Speaker 5: positivity advocate. 636 00:34:09,040 --> 00:34:13,120 Speaker 3: I feel like, even if she could do the physical 637 00:34:13,120 --> 00:34:15,759 Speaker 3: fitness stuff, that's not the sort of person you want 638 00:34:15,800 --> 00:34:20,240 Speaker 3: as a Secret Service agent, who's a model influencer type. 639 00:34:20,320 --> 00:34:25,640 Speaker 3: That just seems like a weird personality type to be 640 00:34:25,680 --> 00:34:26,560 Speaker 3: in the Secret Service. 641 00:34:27,040 --> 00:34:30,440 Speaker 5: What other side hustles are allowed for Secret Service agents? 642 00:34:32,000 --> 00:34:34,040 Speaker 3: Yeah? Could you be, I mean, a strip club DJ? Or 643 00:34:34,520 --> 00:34:37,160 Speaker 3: is there any limit to the sort of... right? 644 00:34:37,239 --> 00:34:39,759 Speaker 5: Could you write, like, angry editorials for, I don't know, 645 00:34:39,800 --> 00:34:40,960 Speaker 5: Breitbart or something like that? 646 00:34:41,000 --> 00:34:43,680 Speaker 2: I don't know, I don't know. Anyway, here's a little more.
647 00:34:44,880 --> 00:34:49,000 Speaker 5: The gal, the big gal who never passed her test 648 00:34:49,120 --> 00:34:52,600 Speaker 5: at all, was assigned to protect Kamala Harris's stepdaughter Ella 649 00:34:52,719 --> 00:34:55,600 Speaker 5: Emhoff in New York. After several failed attempts to pass 650 00:34:55,600 --> 00:34:58,120 Speaker 5: a physical fitness test, the agent was placed in the 651 00:34:58,120 --> 00:35:01,640 Speaker 5: Special Services Division, which had support functions for the agency, 652 00:35:01,719 --> 00:35:04,480 Speaker 5: including the maintenance of the armored vehicle fleet and the 653 00:35:04,520 --> 00:35:07,040 Speaker 5: screening of mail and packages for the White House Complex, 654 00:35:07,239 --> 00:35:10,239 Speaker 5: according to four sources in the Secret Service community. 655 00:35:10,239 --> 00:35:12,680 Speaker 3: You know, there'd be so many people that want those jobs 656 00:35:12,680 --> 00:35:15,480 Speaker 3: in the Secret Service. There'd be so many candidates. And 657 00:35:15,520 --> 00:35:17,880 Speaker 3: you go ahead and let somebody who can't pass the 658 00:35:17,920 --> 00:35:22,239 Speaker 3: physical fitness test be on the team. I mean, that's 659 00:35:22,280 --> 00:35:27,120 Speaker 3: so weak, right? Next. Right, you got two shots. Next. Sorry, 660 00:35:27,160 --> 00:35:28,600 Speaker 3: you can't do all the push-ups or sit-ups or 661 00:35:28,600 --> 00:35:29,239 Speaker 3: whatever you had to do. 662 00:35:30,120 --> 00:35:34,279 Speaker 5: Go back to being a nationally published curve model and 663 00:35:34,440 --> 00:35:38,480 Speaker 5: influencer and body positivity activist. And then they go into 664 00:35:38,520 --> 00:35:40,959 Speaker 5: several other scandals during the Biden years. 665 00:35:40,960 --> 00:35:44,160 Speaker 2: But you get the idea. DEI is insidious.
666 00:35:45,120 --> 00:35:50,040 Speaker 5: It's a tool of Marxist takeover, and it devalues every 667 00:35:50,680 --> 00:35:53,839 Speaker 5: person of color, minority or woman or whatever who gets 668 00:35:53,840 --> 00:35:57,560 Speaker 5: a gig, because they're suspected of being a DEI hire. 669 00:35:57,800 --> 00:35:58,480 Speaker 3: It's awful. 670 00:35:59,040 --> 00:36:02,840 Speaker 5: End it everywhere now. Ready? Three, two, one, end it. The 671 00:36:03,040 --> 00:36:03,520 Speaker 5: Armstrong 672 00:36:03,560 --> 00:36:06,880 Speaker 4: and Getty Show. Yeah, more Jack, more Joe, podcasts and 673 00:36:07,040 --> 00:36:09,560 Speaker 4: our hot links at Armstrong and Getty dot com.