1 00:00:01,800 --> 00:00:06,120 Speaker 1: Broadcasting live from the Abraham Lincoln Radio Studio, the George 2 00:00:06,160 --> 00:00:10,240 Speaker 1: Washington Broadcast Center, Jack Armstrong and Joe Getty. 3 00:00:10,119 --> 00:00:17,200 Speaker 2: Armstrong and Getty and he Armstrong and Getty. 4 00:00:23,600 --> 00:00:25,760 Speaker 3: The way I look at artificial intelligence, it's both like 5 00:00:25,840 --> 00:00:27,760 Speaker 3: it can be a threat for sure, and it can 6 00:00:27,840 --> 00:00:31,000 Speaker 3: be a massive opportunity. What I don't understand is when 7 00:00:31,000 --> 00:00:34,479 Speaker 3: I walk around this city or you talk to lawmakers, why 8 00:00:34,280 --> 00:00:35,879 Speaker 4: people aren't obsessed with this. 9 00:00:36,040 --> 00:00:39,360 Speaker 3: It is going to reorder society over the next decade, 10 00:00:39,400 --> 00:00:42,200 Speaker 3: and almost any person who studied it at any level 11 00:00:42,240 --> 00:00:45,440 Speaker 3: has come to that exact same conclusion. I sometimes joke 12 00:00:45,520 --> 00:00:47,480 Speaker 3: to Mikey and others that I feel like I'm living 13 00:00:47,520 --> 00:00:50,440 Speaker 3: in a simulation where you see so clearly where the 14 00:00:50,440 --> 00:00:53,440 Speaker 3: world's going over the next five years, and yet Washington 15 00:00:53,520 --> 00:00:55,480 Speaker 3: pays very little attention to it.
16 00:00:55,760 --> 00:00:57,720 Speaker 5: Jim VandeHei is getting a lot of attention for 17 00:00:57,800 --> 00:01:00,920 Speaker 5: a piece he wrote for Axios. There's a 18 00:01:00,920 --> 00:01:03,279 Speaker 5: big one in The New Yorker also in which they 19 00:01:03,320 --> 00:01:07,160 Speaker 5: have two AI experts, one arguing it's going to change 20 00:01:07,160 --> 00:01:10,679 Speaker 5: society more than anything ever has and very soon, and 21 00:01:10,760 --> 00:01:14,600 Speaker 5: another expert arguing that no, it's way, way, way over 22 00:01:14,680 --> 00:01:17,920 Speaker 5: blown and any of the effects are years off. I 23 00:01:17,959 --> 00:01:21,039 Speaker 5: hope that that guy's right, although I tend to 24 00:01:21,040 --> 00:01:23,560 Speaker 5: think the other guy's right, that it's happening sooner and 25 00:01:23,600 --> 00:01:24,440 Speaker 5: gonna be a big deal. 26 00:01:24,840 --> 00:01:29,360 Speaker 1: I've been reading Jim VandeHei's piece in which, among 27 00:01:29,400 --> 00:01:32,120 Speaker 1: other things, he interviews Dario Amodei. Is that 28 00:01:32,160 --> 00:01:34,920 Speaker 1: how you're pronouncing it? He's the CEO of Anthropic, one of 29 00:01:34,959 --> 00:01:38,160 Speaker 1: the most powerful AI companies in the world, and it 30 00:01:38,200 --> 00:01:40,440 Speaker 1: occurred to me the perfect metaphor. I don't know why 31 00:01:40,440 --> 00:01:43,720 Speaker 1: I see the world in metaphors. I do, but it's 32 00:01:43,760 --> 00:01:44,440 Speaker 1: as if. 33 00:01:44,440 --> 00:01:47,760 Speaker 5: Doctor Frankenstein, you're still seeing the world in metaphors, is 34 00:01:47,880 --> 00:01:54,920 Speaker 5: like an ox needing something. It's like if doctor Frankenstein 35 00:01:56,200 --> 00:01:59,480 Speaker 5: was designing his monster, but it was going to be 36 00:01:59,520 --> 00:02:03,920 Speaker 5: like fifty feet tall.
But doctor Frankenstein was perfectly sane 37 00:02:04,480 --> 00:02:07,680 Speaker 5: and was saying, now, reanimating a corpse is going to 38 00:02:07,720 --> 00:02:12,320 Speaker 5: be an incredible scientific breakthrough, but it could rampage down 39 00:02:12,360 --> 00:02:15,520 Speaker 5: to and through the village and kill many, many people, 40 00:02:15,600 --> 00:02:18,680 Speaker 5: and we're really not sure which. So Igor, if you 41 00:02:18,720 --> 00:02:22,760 Speaker 5: would fire up the lightning positron, please, let's get this 42 00:02:22,880 --> 00:02:33,600 Speaker 5: started. Anyway, thank you, dear. More with Jim VandeHei. 43 00:02:35,360 --> 00:02:38,560 Speaker 3: Just pay attention, get way more familiar with this technology. 44 00:02:38,760 --> 00:02:41,520 Speaker 3: If you are about to be a graduate, figure out: 45 00:02:41,360 --> 00:02:43,240 Speaker 4: are you in the right field? If you are in 46 00:02:43,280 --> 00:02:44,200 Speaker 4: the right field, 47 00:02:44,000 --> 00:02:46,600 Speaker 3: how would you utilize this technology for it to be 48 00:02:46,600 --> 00:02:49,720 Speaker 3: a force multiplier of the work that you do? How 49 00:02:49,760 --> 00:02:51,880 Speaker 3: can you help it ten x, two x, whatever the 50 00:02:51,919 --> 00:02:55,239 Speaker 3: number is, your productivity, so that you can do 51 00:02:55,400 --> 00:02:58,880 Speaker 3: really creative, interesting things? And I think if society prepares 52 00:02:58,919 --> 00:03:01,880 Speaker 3: for it, if companies prepare for it, it doesn't have 53 00:03:01,880 --> 00:03:03,760 Speaker 4: to be a massive upheaval. 54 00:03:04,639 --> 00:03:06,800 Speaker 5: Again, I hope he's right. I just I just don't know. 55 00:03:06,880 --> 00:03:09,440 Speaker 5: It's it's not It's not like, you know, accept the 56 00:03:09,440 --> 00:03:12,160 Speaker 5: fact that the car is coming.
Learn how to fix 57 00:03:12,200 --> 00:03:15,920 Speaker 5: a car because nobody's gonna need horseshoes anymore, right? I mean, 58 00:03:15,960 --> 00:03:20,200 Speaker 5: you know, made sense, makes sense. I'm not sure if 59 00:03:20,200 --> 00:03:23,639 Speaker 5: it works that way. The, you know, the the people 60 00:03:23,680 --> 00:03:26,280 Speaker 5: on the extreme end of all the jobs AI can take, 61 00:03:27,120 --> 00:03:31,760 Speaker 5: if they're right, there are gonna be so many millions 62 00:03:31,760 --> 00:03:35,040 Speaker 5: of people out of work. There's there's no like, well, 63 00:03:35,240 --> 00:03:37,080 Speaker 5: I guess if you're young, pick a career that AI 64 00:03:37,160 --> 00:03:39,200 Speaker 5: can enhance. But, well, there's gonna be a lot of 65 00:03:39,240 --> 00:03:41,760 Speaker 5: guessing involved in that. Let me set the table a 66 00:03:41,760 --> 00:03:44,720 Speaker 5: little bit, then we can discuss. So, Mr. Amodei, 67 00:03:44,880 --> 00:03:48,280 Speaker 5: and again, forgive me if I'm mispronouncing it, the CEO 68 00:03:48,320 --> 00:03:51,760 Speaker 5: of Anthropic, talking to Jim VandeHei, has a blunt, 69 00:03:51,800 --> 00:03:54,040 Speaker 5: scary warning for the US government and all of us. 70 00:03:54,360 --> 00:03:57,680 Speaker 5: AI could wipe out half of all entry level white 71 00:03:57,760 --> 00:04:01,560 Speaker 5: collar jobs and spike unemployment ten to twenty percent 72 00:04:01,680 --> 00:04:05,040 Speaker 5: in the next one to five years. Five percent of 73 00:04:05,160 --> 00:04:08,760 Speaker 5: jobs wiped out. Interesting. No, wait, you said. 74 00:04:08,560 --> 00:04:13,360 Speaker 1: Half. Half of all entry level white collar jobs 75 00:04:13,560 --> 00:04:16,680 Speaker 1: and ten to twenty percent unemployment in general in the 76 00:04:16,720 --> 00:04:20,520 Speaker 1: next one to five years. Amodei said AI
Companies 77 00:04:20,520 --> 00:04:23,640 Speaker 1: and government need to stop sugarcoating what's coming. This is 78 00:04:23,720 --> 00:04:28,839 Speaker 1: doctor Frankenstein, I'm picturing, standing there on the drawbridge of 79 00:04:28,880 --> 00:04:32,560 Speaker 1: his moat saying to the villagers with the pitchforks and 80 00:04:32,600 --> 00:04:36,680 Speaker 1: the torches, look, guys, you gotta be ready in case 81 00:04:36,720 --> 00:04:39,320 Speaker 1: the monster goes on a rampage. 82 00:04:39,440 --> 00:04:43,000 Speaker 5: Well, the first clip we played from Jim VandeHei makes 83 00:04:43,040 --> 00:04:46,120 Speaker 5: the most, well, it's not hard to explain, makes the 84 00:04:46,120 --> 00:04:48,599 Speaker 5: most sense to me. How is this such a giant 85 00:04:48,600 --> 00:04:51,880 Speaker 5: topic everywhere but in Washington, D.C.? I wonder if 86 00:04:51,920 --> 00:04:53,960 Speaker 5: it has anything to do with the average person in 87 00:04:54,000 --> 00:04:55,560 Speaker 5: any leadership position 88 00:04:55,240 --> 00:04:56,280 Speaker 2: being eighty years old. 89 00:04:57,120 --> 00:04:57,719 Speaker 4: I think so. 90 00:04:58,160 --> 00:05:01,800 Speaker 1: And also there are so many unknowns it's difficult to 91 00:05:01,839 --> 00:05:05,719 Speaker 1: know what to do legislatively, particularly if, for instance, I 92 00:05:05,760 --> 00:05:09,919 Speaker 1: don't know, your legislature is utterly cowardly and afraid of 93 00:05:09,960 --> 00:05:12,680 Speaker 1: doing anything lest they do the wrong thing, so they 94 00:05:12,720 --> 00:05:15,680 Speaker 1: just let the executive branch do everything. But let me 95 00:05:15,680 --> 00:05:19,720 Speaker 1: finish the thought. Amodei said AI companies and government need 96 00:05:19,760 --> 00:05:22,839 Speaker 1: to stop sugarcoating what's coming.
I'm sorry, what's coming? The 97 00:05:23,040 --> 00:05:29,800 Speaker 1: possible mass elimination of jobs across technology, finance, law, consulting, 98 00:05:29,960 --> 00:05:33,599 Speaker 1: and other white collar professions, especially entry level gigs. I 99 00:05:33,640 --> 00:05:35,640 Speaker 1: got a kid who just kicked butt in her first year of 100 00:05:35,680 --> 00:05:38,440 Speaker 1: law school. I'm very proud of her. But we're gonna 101 00:05:38,440 --> 00:05:41,360 Speaker 1: have a serious talk, and the practical part of me doesn't 102 00:05:41,360 --> 00:05:43,000 Speaker 1: want to say this on the air because I want 103 00:05:43,000 --> 00:05:45,360 Speaker 1: as few people to come to this realization as possible. 104 00:05:45,920 --> 00:05:49,800 Speaker 1: We're gonna have a serious talk about there's a very 105 00:05:49,800 --> 00:05:52,480 Speaker 1: good chance you're either going to be the person eliminated 106 00:05:52,560 --> 00:05:55,440 Speaker 1: by AI or you're going to be the person who's 107 00:05:55,480 --> 00:05:59,440 Speaker 1: become so adept at using it, you're the one 108 00:05:59,480 --> 00:06:03,480 Speaker 1: that is kept on. So get good at 109 00:06:03,480 --> 00:06:03,919 Speaker 1: this stuff. 110 00:06:04,120 --> 00:06:08,040 Speaker 5: That's exactly what VandeHei was saying. Accept the fact 111 00:06:08,080 --> 00:06:10,039 Speaker 5: that it's going to be here, and figure out a 112 00:06:10,080 --> 00:06:10,520 Speaker 5: way to 113 00:06:12,360 --> 00:06:14,880 Speaker 2: use AI. So you're one of the people that can 114 00:06:15,839 --> 00:06:16,840 Speaker 2: continue to make a living. 115 00:06:17,880 --> 00:06:20,640 Speaker 1: Yeah, so, Amodei says he's speaking out in hopes of 116 00:06:20,720 --> 00:06:24,400 Speaker 1: jarring government and fellow AI companies into preparing and protecting 117 00:06:24,440 --> 00:06:29,440 Speaker 1: the nation. Hardly anyone is paying attention.
Lawmakers don't get 118 00:06:29,440 --> 00:06:32,320 Speaker 1: it or don't believe it. CEOs are afraid to talk 119 00:06:32,360 --> 00:06:35,240 Speaker 1: about it. Many workers won't realize the risks posed by 120 00:06:35,240 --> 00:06:40,280 Speaker 1: the possible job apocalypse until after it hits. And by 121 00:06:40,279 --> 00:06:42,599 Speaker 1: the way, if you're a newish listener to the show, 122 00:06:42,600 --> 00:06:46,479 Speaker 1: thank you for being here, first of all. Secondly, we're 123 00:06:46,520 --> 00:06:50,359 Speaker 1: not clickbait hyperbole. We try very hard to stay away 124 00:06:50,360 --> 00:06:52,760 Speaker 1: from trying to scare you all the time or making 125 00:06:52,800 --> 00:06:54,960 Speaker 1: you angry all the time. There are plenty of reasons 126 00:06:54,960 --> 00:06:57,000 Speaker 1: to be scared and or angry. We just try not 127 00:06:57,040 --> 00:07:01,400 Speaker 1: to go over the top. I don't think it's hyperbole 128 00:07:01,600 --> 00:07:04,520 Speaker 1: at all to, you know, have the weight on the 129 00:07:04,560 --> 00:07:06,719 Speaker 1: balls of your feet and your feet at shoulder width, 130 00:07:06,839 --> 00:07:10,160 Speaker 1: leaning forward at the waist, getting ready, for this could 131 00:07:10,160 --> 00:07:12,240 Speaker 1: be very, very disruptive. 132 00:07:12,960 --> 00:07:16,640 Speaker 5: Well, I've read several of the leading books on 133 00:07:16,680 --> 00:07:20,160 Speaker 5: the topic, Life three point zero by Max Tegmark, which 134 00:07:20,200 --> 00:07:23,600 Speaker 5: is fantastic, and then a newer one from Nick Bostrom, 135 00:07:23,640 --> 00:07:26,360 Speaker 5: who wrote an AI book that got tons of attention 136 00:07:26,400 --> 00:07:29,480 Speaker 5: several years ago. But this Deep Utopia book, which he 137 00:07:29,520 --> 00:07:34,880 Speaker 5: gets really into.
If it does what some people claim 138 00:07:34,920 --> 00:07:40,360 Speaker 5: it's going to do, what the hell is mankind going 139 00:07:40,440 --> 00:07:42,800 Speaker 5: to do? What is society going to be? How is humanity? 140 00:07:43,000 --> 00:07:46,000 Speaker 5: Even, even if you come up with a way for 141 00:07:46,080 --> 00:07:49,600 Speaker 5: people to have food and shelter, what do human beings 142 00:07:49,680 --> 00:07:54,040 Speaker 5: become if they don't have to work? It's 143 00:07:54,080 --> 00:07:57,640 Speaker 5: an experiment that's never been run before really, other than 144 00:07:57,720 --> 00:07:58,880 Speaker 5: like the super rich. 145 00:07:58,720 --> 00:08:00,160 Speaker 2: And you see how they often end up. 146 00:08:01,280 --> 00:08:03,640 Speaker 1: Give us the ten second version of you working on 147 00:08:03,680 --> 00:08:07,920 Speaker 1: the song yesterday after this commercial or right now, no, 148 00:08:08,320 --> 00:08:11,360 Speaker 1: right right now? Okay. So I'm not ordering you to do anything, 149 00:08:11,400 --> 00:08:13,440 Speaker 1: but I think it would fit in well, and then 150 00:08:13,520 --> 00:08:17,040 Speaker 1: coming up after the commercial, Jack, reference the hey, don't worry, 151 00:08:17,120 --> 00:08:18,600 Speaker 1: it's gonna be okay argument. 152 00:08:18,800 --> 00:08:20,600 Speaker 5: So I was working on this song yesterday, I'd come 153 00:08:20,680 --> 00:08:22,320 Speaker 5: up with these lyrics in my head, and then I 154 00:08:22,400 --> 00:08:24,880 Speaker 5: picked out a key I could sing in, some chord 155 00:08:24,960 --> 00:08:27,440 Speaker 5: structure and stuff like that. The working title is Joe's 156 00:08:27,480 --> 00:08:31,600 Speaker 5: an A-Hole. But it's very tuneful, very nice. 157 00:08:31,680 --> 00:08:36,960 Speaker 2: I'd like to hear that song. But I thought, you know, 158 00:08:36,960 --> 00:08:38,319 Speaker 2: I need a bridge for this song.
159 00:08:38,600 --> 00:08:41,600 Speaker 5: And I almost feel guilty for even having done this, 160 00:08:41,720 --> 00:08:43,000 Speaker 5: but I thought it'd be kind of fun to see 161 00:08:43,000 --> 00:08:45,559 Speaker 5: what it did. I went to ChatGPT and I said, 162 00:08:47,720 --> 00:08:50,920 Speaker 5: what's a good bridge for like a country style song? 163 00:08:51,640 --> 00:08:54,960 Speaker 5: And it presented me like with five different options immediately, 164 00:08:55,000 --> 00:08:57,520 Speaker 5: by the way, of if you want something that's kind 165 00:08:57,559 --> 00:09:00,840 Speaker 5: of uplifting, go with this. If you want something that 166 00:09:01,040 --> 00:09:04,640 Speaker 5: is good for setting up a more poignant ending, go 167 00:09:04,720 --> 00:09:06,560 Speaker 5: with this. And it just gave me the chords, and 168 00:09:06,600 --> 00:09:10,000 Speaker 5: it'll give you all the chord structures for doing it. 169 00:09:10,120 --> 00:09:13,200 Speaker 5: This is the most popular currently in like country pop, 170 00:09:13,840 --> 00:09:17,000 Speaker 5: and it was. I mean, you wouldn't even need to know, you 171 00:09:17,040 --> 00:09:18,200 Speaker 5: didn't know anything about 172 00:09:18,120 --> 00:09:21,520 Speaker 1: music, to craft a song that way. So instead of 173 00:09:21,559 --> 00:09:24,160 Speaker 1: experimenting and struggling and listening to it and trying to 174 00:09:24,160 --> 00:09:26,439 Speaker 1: figure out how do I get back to the verse 175 00:09:26,559 --> 00:09:29,880 Speaker 1: chords and that's a little uncomfortable, and then finally, after, 176 00:09:29,960 --> 00:09:32,080 Speaker 1: you know, minutes or hours of effort, coming up with 177 00:09:32,120 --> 00:09:34,240 Speaker 1: something you're proud of. No, the computer just told you 178 00:09:34,280 --> 00:09:38,440 Speaker 1: what to do. How does that change songwriting?
I've got 179 00:09:38,440 --> 00:09:41,200 Speaker 1: to admit I heard that and thought, wow, that's a cool tool. 180 00:09:41,240 --> 00:09:43,199 Speaker 1: But I also thought, what the hell's the point? Right? 181 00:09:43,400 --> 00:09:46,079 Speaker 5: Yeah, that's what I felt like too. I haven't used 182 00:09:46,080 --> 00:09:48,160 Speaker 5: one of them because I thought, do I want that? 183 00:09:48,679 --> 00:09:50,480 Speaker 5: Do I want it to not be something I came 184 00:09:50,559 --> 00:09:50,880 Speaker 5: up with? 185 00:09:51,080 --> 00:09:52,040 Speaker 4: What am I doing here? 186 00:09:52,840 --> 00:09:53,600 Speaker 1: Right right? 187 00:09:53,640 --> 00:09:55,400 Speaker 5: Why don't I just go to the AI and say, write a 188 00:09:55,440 --> 00:09:58,640 Speaker 5: song about this topic, and then, okay, now produce it? 189 00:09:58,800 --> 00:10:00,880 Speaker 2: Okay, now I'll listen to it. Well, that was fun. 190 00:10:01,800 --> 00:10:03,559 Speaker 2: What am I doing coming up? 191 00:10:04,040 --> 00:10:07,720 Speaker 1: The it'll be fine argument, so stay tuned for that after a word 192 00:10:07,760 --> 00:10:09,880 Speaker 1: from our friends at Trust and Will. If you're not 193 00:10:10,160 --> 00:10:13,440 Speaker 1: familiar with the need for a trust and will, I 194 00:10:13,480 --> 00:10:15,880 Speaker 1: hate to say it, but someday you will not be 195 00:10:15,960 --> 00:10:18,000 Speaker 1: here and your assets will go to the people you 196 00:10:18,040 --> 00:10:20,680 Speaker 1: care about. And if you don't have a trust and 197 00:10:20,840 --> 00:10:23,359 Speaker 1: or will, it's going to go through a long, tortuous 198 00:10:23,920 --> 00:10:29,520 Speaker 1: government bureaucracy process known as probate. Don't do this, friends, 199 00:10:29,600 --> 00:10:31,720 Speaker 1: especially because Trust and Will makes it so easy and 200 00:10:31,800 --> 00:10:33,360 Speaker 1: affordable to get your act together.
201 00:10:33,480 --> 00:10:38,000 Speaker 5: Yeah, you die unexpectedly today, like sometimes happens, all kinds 202 00:10:38,040 --> 00:10:41,840 Speaker 5: of never ending legal battles and the state deciding what happens. 203 00:10:42,320 --> 00:10:43,880 Speaker 2: Or you can come up with this Trust 204 00:10:43,679 --> 00:10:47,560 Speaker 5: and Will with this website, which is fantastic, starts at about 205 00:10:47,480 --> 00:10:51,440 Speaker 5: one hundred ninety nine dollars. Create and manage a custom 206 00:10:51,480 --> 00:10:53,840 Speaker 5: estate plan. You can manage your trust or will 207 00:10:53,880 --> 00:10:56,600 Speaker 5: online, and this is all legal based on wherever you 208 00:10:56,640 --> 00:10:59,679 Speaker 5: live, with their easy to use website. And by the way, 209 00:10:59,880 --> 00:11:03,920 Speaker 5: you're not on your own: live customer support, chat, phone 210 00:11:04,040 --> 00:11:06,600 Speaker 5: or email, to get this all put together. Save your 211 00:11:06,640 --> 00:11:09,240 Speaker 5: family the heartbreak. Sometimes this stuff tears families apart 212 00:11:09,280 --> 00:11:09,920 Speaker 5: after you're gone. 213 00:11:09,960 --> 00:11:11,520 Speaker 1: Don't do that. Trust and Will has made estate 214 00:11:11,559 --> 00:11:15,960 Speaker 1: planning accessible and affordable. It's really quite amazing. Secure your assets, 215 00:11:16,000 --> 00:11:18,040 Speaker 1: protect your loved ones with Trust and Will. Get twenty 216 00:11:18,040 --> 00:11:21,719 Speaker 1: percent off on your estate plan documents by visiting trustandwill 217 00:11:21,840 --> 00:11:26,640 Speaker 1: dot com slash armstrong. That's Trust and Will dot com 218 00:11:26,640 --> 00:11:28,000 Speaker 1: slash armstrong. 219 00:11:28,160 --> 00:11:29,959 Speaker 5: So we'll have to get to the optimistic side of 220 00:11:30,000 --> 00:11:33,880 Speaker 5: the AI thing next.
But just on the music thing, 221 00:11:33,960 --> 00:11:37,640 Speaker 5: as you mentioned it, 222 00:11:36,160 --> 00:11:38,480 Speaker 2: what is the point, art wise? 223 00:11:38,760 --> 00:11:44,440 Speaker 5: Certainly, Hemingway famously said a writer writes for himself and others. 224 00:11:44,440 --> 00:11:46,400 Speaker 5: I mean, but you're doing it a lot for yourself. 225 00:11:46,600 --> 00:11:50,080 Speaker 5: And the reason I sat down to craft a song yesterday, 226 00:11:50,080 --> 00:11:52,080 Speaker 5: which I hadn't done in a long time, is I 227 00:11:52,120 --> 00:11:55,160 Speaker 5: was in emotional turmoil about a certain thing, and all 228 00:11:55,200 --> 00:11:57,160 Speaker 5: these thoughts were in my head and I just was 229 00:11:57,240 --> 00:12:00,080 Speaker 5: compelled to write a song about it. Well, if I 230 00:12:00,120 --> 00:12:04,800 Speaker 5: have AI do it, doesn't that negate the whole artistic, 231 00:12:05,720 --> 00:12:09,080 Speaker 5: you know, getting your feelings out of you onto paper 232 00:12:09,200 --> 00:12:11,120 Speaker 5: or music or painting or whatever? 233 00:12:11,920 --> 00:12:15,720 Speaker 1: Well, right, to summarize again, what are we doing here? 234 00:12:16,200 --> 00:12:16,880 Speaker 2: Exactly. 235 00:12:17,280 --> 00:12:19,080 Speaker 5: So if I was gonna have it write the bridge, 236 00:12:19,120 --> 00:12:21,679 Speaker 5: well, write the whole damn thing, write the lyrics, produce it, 237 00:12:21,960 --> 00:12:24,800 Speaker 5: you know, tell me in five minutes what to listen to. 238 00:12:24,760 --> 00:12:26,480 Speaker 1: And I'll listen to it in five seconds. 239 00:12:26,559 --> 00:12:27,280 Speaker 2: Five seconds. 240 00:12:27,400 --> 00:12:29,160 Speaker 5: Then I'll listen to it and think, well, now I 241 00:12:29,240 --> 00:12:31,760 Speaker 5: feel like I've expunged my demons. 242 00:12:31,800 --> 00:12:33,199 Speaker 2: I'll get to go about my life.
243 00:12:33,400 --> 00:12:34,680 Speaker 1: What well? 244 00:12:34,760 --> 00:12:34,960 Speaker 4: Right? 245 00:12:35,200 --> 00:12:38,920 Speaker 1: And so the great conundrum is, okay, AI is going 246 00:12:39,000 --> 00:12:41,400 Speaker 1: to free us all up to become poets and songwriters, 247 00:12:41,679 --> 00:12:42,640 Speaker 1: which the AI 248 00:12:42,440 --> 00:12:46,480 Speaker 5: will do. So, right, excellent point. And then there's also 249 00:12:46,600 --> 00:12:49,160 Speaker 5: the what if it's way better than what I would 250 00:12:49,160 --> 00:12:50,719 Speaker 5: have come up with, which it probably would be. 251 00:12:51,200 --> 00:12:52,520 Speaker 1: I'd say that's immaterial. 252 00:12:52,559 --> 00:12:55,080 Speaker 2: To me? Oh, it is, but who wants that? 253 00:12:57,280 --> 00:13:01,199 Speaker 1: It's very discouraging to worry about it. The positive, 254 00:13:01,280 --> 00:13:21,320 Speaker 1: optimistic argument is coming up in moments. Stay with us, friends. 255 00:13:14,120 --> 00:13:16,760 Speaker 5: We're talking about AI. Both Axios and The New Yorker 256 00:13:16,840 --> 00:13:19,640 Speaker 5: now have big pieces on this. Is this going to 257 00:13:21,240 --> 00:13:24,640 Speaker 5: change the world in ways we can't even imagine, 258 00:13:24,679 --> 00:13:27,839 Speaker 5: maybe for the worse, or is it not going to 259 00:13:27,880 --> 00:13:31,560 Speaker 5: be that big a deal? I certainly don't know. 260 00:13:31,559 --> 00:13:32,559 Speaker 5: I'm hoping for the latter. 261 00:13:32,640 --> 00:13:36,040 Speaker 1: Actually, well, and the guys who are designing all this 262 00:13:36,040 --> 00:13:38,840 Speaker 1: stuff don't know either, which is scary. Back to 263 00:13:38,880 --> 00:13:43,520 Speaker 1: my doctor Frankenstein metaphor.
But I will say, they're 264 00:13:43,559 --> 00:13:46,640 Speaker 1: admitting the monster might rampage through the village and kill everybody. 265 00:13:46,720 --> 00:13:49,800 Speaker 1: It's impossible that, well, it's not impossible. 266 00:13:49,840 --> 00:13:53,719 Speaker 5: But obviously Google, Elon, all kinds of people think it's 267 00:13:53,720 --> 00:13:54,520 Speaker 5: going to be a big deal. 268 00:13:54,559 --> 00:13:55,079 Speaker 2: Bill Gates. 269 00:13:55,080 --> 00:14:00,000 Speaker 5: They're spending hundreds of millions of dollars, billions of dollars, on 270 00:14:00,400 --> 00:14:01,920 Speaker 5: this because they think it's going to be that big 271 00:14:01,920 --> 00:14:02,200 Speaker 5: a deal. 272 00:14:03,080 --> 00:14:06,080 Speaker 1: Rich Lowry in the National Review has expressed the counter 273 00:14:06,280 --> 00:14:09,120 Speaker 1: argument that many fine folks have expressed through 274 00:14:09,120 --> 00:14:14,640 Speaker 1: the years and certainly worth considering. Rich writes, ChatGPT 275 00:14:14,840 --> 00:14:16,920 Speaker 1: is coming for your job. That's the fear about the 276 00:14:17,000 --> 00:14:21,880 Speaker 1: rapid advances in artificial intelligence. He mentions the headline 277 00:14:21,960 --> 00:14:26,080 Speaker 1: in Axios that warned of a white collar bloodbath. The CEO of 278 00:14:26,120 --> 00:14:28,400 Speaker 1: Anthropic we've been quoting in the last segment told the 279 00:14:28,440 --> 00:14:32,200 Speaker 1: publication AI could destroy up to half, well, destroy half 280 00:14:32,240 --> 00:14:34,600 Speaker 1: of all entry level white collar jobs in the next 281 00:14:34,640 --> 00:14:37,520 Speaker 1: one to five years, that's like right around the corner, 282 00:14:37,640 --> 00:14:40,160 Speaker 1: and drive the unemployment rate up to ten to twenty percent, 283 00:14:40,200 --> 00:14:43,320 Speaker 1: or roughly Great Depression levels.
This sounds dire, but we've 284 00:14:43,320 --> 00:14:46,440 Speaker 1: been here before. In the thirties, John Maynard Keynes thought 285 00:14:46,440 --> 00:14:49,560 Speaker 1: that labor saving devices were, quote, outrunning the pace 286 00:14:49,600 --> 00:14:52,120 Speaker 1: at which we can find new uses for labor. And 287 00:14:52,280 --> 00:14:54,760 Speaker 1: lots thought the same thing in the sixties when John F. 288 00:14:54,840 --> 00:14:58,880 Speaker 1: Kennedy warned that, quote, the automation problem is as important 289 00:14:58,920 --> 00:15:02,440 Speaker 1: as any we face. And in our era too, if 290 00:15:02,440 --> 00:15:06,280 Speaker 1: a prediction has been consistently wrong, it doesn't necessarily mean 291 00:15:06,320 --> 00:15:08,680 Speaker 1: that it will forever be wrong. Still, we shouldn't have 292 00:15:08,760 --> 00:15:13,840 Speaker 1: much confidence in the same alarmism repeated for the same reasons. 293 00:15:13,640 --> 00:15:16,560 Speaker 1: Then he gives a few examples to push back. 294 00:15:16,600 --> 00:15:18,760 Speaker 5: A lot of people have, and I hope it absolutely 295 00:15:19,720 --> 00:15:22,880 Speaker 5: holds true for AI. It has been true in the past, 296 00:15:23,240 --> 00:15:27,360 Speaker 5: but, God, AI could reach into so many different areas 297 00:15:27,360 --> 00:15:28,800 Speaker 5: of life at the same time. 298 00:15:29,800 --> 00:15:30,640 Speaker 2: A little more of 299 00:15:30,480 --> 00:15:34,840 Speaker 1: Rich's argument, and then perhaps we'll counter it. If technological 300 00:15:34,840 --> 00:15:37,200 Speaker 1: advance was really a net killer of jobs, the labor 301 00:15:37,240 --> 00:15:39,400 Speaker 1: market should have been in decline since the invention of 302 00:15:39,400 --> 00:15:42,480 Speaker 1: the wheel. Instead, we live in a time of technological marvels, 303 00:15:42,480 --> 00:15:45,360 Speaker 1: and the unemployment rate is four point two percent.
Rob 304 00:15:45,400 --> 00:15:48,520 Speaker 1: Atkinson of the Information Technology and Innovation Foundation points 305 00:15:48,520 --> 00:15:50,720 Speaker 1: out that the average unemployment rate in the US has 306 00:15:50,760 --> 00:15:53,760 Speaker 1: not changed much over the last century, despite an increase 307 00:15:53,800 --> 00:16:00,440 Speaker 1: in productivity by almost ten times. Technology increases productivity, driving 308 00:16:00,480 --> 00:16:02,800 Speaker 1: down costs and making it possible to invest and spend 309 00:16:02,800 --> 00:16:05,400 Speaker 1: on other things, creating new jobs that replace the old. 310 00:16:05,760 --> 00:16:08,400 Speaker 1: This is the process of society becoming wealthier, and it's 311 00:16:08,440 --> 00:16:11,840 Speaker 1: why nations that innovate are better off than those that don't. 312 00:16:13,200 --> 00:16:17,160 Speaker 1: Good, powerful argument. The rise of personal computers collapsed the 313 00:16:17,160 --> 00:16:20,440 Speaker 1: demand for typists and word processors. These positions were often 314 00:16:20,480 --> 00:16:23,320 Speaker 1: held by women. Did this decimate the economic prospects of 315 00:16:23,320 --> 00:16:26,680 Speaker 1: women in America? No, they got different and frequently better jobs. 316 00:16:27,360 --> 00:16:32,800 Speaker 1: He goes into some bookkeeping and accounting parallels. It's a 317 00:16:33,040 --> 00:16:36,760 Speaker 1: good, solid argument. The only thing I would throw 318 00:16:36,840 --> 00:16:40,560 Speaker 1: at Rich is, and we've seen this in a lot of 319 00:16:40,720 --> 00:16:46,480 Speaker 1: different aspects of the modern age, there are two questions, 320 00:16:46,920 --> 00:16:50,840 Speaker 1: two challenges: the amount of change and the pace of change.
Right, 321 00:16:52,520 --> 00:16:57,400 Speaker 1: and it could be that, given the rapidity with which 322 00:16:57,600 --> 00:17:00,160 Speaker 1: it'll drown us all or beat us to death like 323 00:17:00,200 --> 00:17:03,760 Speaker 1: Frankenstein's monster in my metaphor, it's gonna be so fast 324 00:17:03,920 --> 00:17:09,720 Speaker 1: and so huge, the jarring adjustment period is going to 325 00:17:09,800 --> 00:17:12,480 Speaker 1: be very ugly and volatile. 326 00:17:13,280 --> 00:17:16,240 Speaker 5: Yeah, we're being pretty loose and fast with 327 00:17:16,320 --> 00:17:20,760 Speaker 5: the term AI. Also, it's all about AGI, artificial 328 00:17:20,760 --> 00:17:24,120 Speaker 5: general intelligence, whether that happens or not. And I've read 329 00:17:24,200 --> 00:17:26,320 Speaker 5: enough of these books to know there are people that 330 00:17:26,440 --> 00:17:28,520 Speaker 5: think it will never happen. There are people that think 331 00:17:28,560 --> 00:17:30,400 Speaker 5: we're twenty years away. There are people that think we're 332 00:17:30,400 --> 00:17:34,120 Speaker 5: five years away. But if artificial general intelligence happens, where 333 00:17:34,119 --> 00:17:37,800 Speaker 5: computers are as smart as human beings, smarter than human beings, 334 00:17:37,800 --> 00:17:42,320 Speaker 5: can learn everything instantly, then anything that you 335 00:17:42,440 --> 00:17:46,800 Speaker 5: come up with that human beings can do, anything that 336 00:17:47,040 --> 00:17:51,320 Speaker 5: comes off of the technological advancement, can also 337 00:17:51,080 --> 00:17:53,520 Speaker 2: be done by artificial general intelligence. 338 00:17:53,760 --> 00:17:55,719 Speaker 1: Yeah, that's too much for a person like me to 339 00:17:55,720 --> 00:17:59,439 Speaker 1: even contemplate. I can't.
I'm stuck in the halfway between 340 00:17:59,520 --> 00:18:03,720 Speaker 1: period where, you know, eighty percent of lawyers will 341 00:18:03,760 --> 00:18:08,320 Speaker 1: be replaced by an app. That's going to be dislocating 342 00:18:08,400 --> 00:18:11,320 Speaker 1: enough. Your thing? Please, I don't even want to think 343 00:18:11,160 --> 00:18:14,800 Speaker 5: about it. If AGI happens, I can't imagine how that doesn't doom 344 00:18:14,880 --> 00:18:15,480 Speaker 5: society. 345 00:18:15,480 --> 00:18:16,720 Speaker 2: But anyway, who knows. 346 00:18:18,880 --> 00:18:25,080 Speaker 1: Armstrong and Getty. So hey, before we get started, you 347 00:18:25,119 --> 00:18:26,960 Speaker 1: ought to talk about what we were just talking about 348 00:18:27,000 --> 00:18:29,400 Speaker 1: off the air. On the air, to whatever extent you can, 349 00:18:31,520 --> 00:18:34,240 Speaker 1: you're good at that. I just at some point, the 350 00:18:34,240 --> 00:18:40,000 Speaker 1: theme being people are way crazier than you were told 351 00:18:40,040 --> 00:18:43,240 Speaker 1: as a kid. As a kid, you think, you know, 352 00:18:43,320 --> 00:18:45,560 Speaker 1: grown ups have it more or less buttoned up, and 353 00:18:45,800 --> 00:18:47,720 Speaker 1: they can at least perceive reality. 354 00:18:47,880 --> 00:18:50,800 Speaker 5: A lot of people cannot. They live in a world 355 00:18:50,800 --> 00:18:53,720 Speaker 5: of delusion. Especially... I never thought when I was younger 356 00:18:53,760 --> 00:18:57,720 Speaker 5: that rich, smart people could be this dumb. 357 00:18:59,320 --> 00:19:02,520 Speaker 1: Right. Okay, well, too good. You have a particular skill 358 00:19:02,560 --> 00:19:04,200 Speaker 1: set that makes you a lot of money. That doesn't 359 00:19:04,200 --> 00:19:06,679 Speaker 1: mean you have generalized intelligence or wisdom, and that's correct. 360 00:19:06,720 --> 00:19:09,080 Speaker 2: I will tell that story, uh in a little bit. 
361 00:19:09,119 --> 00:19:12,199 Speaker 5: But first, executive producer Mike Hanson, who rarely comes on 362 00:19:12,280 --> 00:19:13,640 Speaker 5: the air, why are you here? 363 00:19:13,680 --> 00:19:18,080 Speaker 2: Hanson? I was inspired, really, by our AI conversation. 364 00:19:18,160 --> 00:19:20,080 Speaker 6: Yeah, I mean, you guys are talking about trying to 365 00:19:20,119 --> 00:19:22,080 Speaker 6: form up a song, and, you see, I've been picking 366 00:19:22,160 --> 00:19:25,160 Speaker 6: up my guitar more lately as well. I don't really 367 00:19:25,160 --> 00:19:27,600 Speaker 6: concern myself with a lot of, you know, song structure 368 00:19:27,600 --> 00:19:29,280 Speaker 6: and the bridge, and I just kind of fiddle around. 369 00:19:29,359 --> 00:19:32,320 Speaker 2: But I bang around. I make a 370 00:19:32,320 --> 00:19:33,840 Speaker 2: lot of noise. But I was inspired. 371 00:19:34,600 --> 00:19:38,040 Speaker 6: So I turned to my AI tool, thinking I should craft 372 00:19:38,480 --> 00:19:42,640 Speaker 6: a tune this morning. Uh, so I quickly jotted down 373 00:19:42,640 --> 00:19:44,520 Speaker 6: some lyrics and I wrote all the lyrics on this. 374 00:19:44,560 --> 00:19:46,400 Speaker 6: I didn't ask it to help me. I wrote them 375 00:19:46,440 --> 00:19:48,360 Speaker 6: in about twenty seconds. 376 00:19:50,760 --> 00:19:52,360 Speaker 1: Way to put in the time. 377 00:19:52,440 --> 00:19:55,520 Speaker 6: Well, and a lot of great songs were inspired. I was inspired. 378 00:19:55,640 --> 00:19:58,120 Speaker 6: I mean, come on, don't tell me about my art. 379 00:19:58,520 --> 00:19:59,239 Speaker 4: This was my art. 380 00:19:59,440 --> 00:20:01,000 Speaker 2: So anyway, he came up with a song. 381 00:20:10,760 --> 00:20:18,040 Speaker 7: He gets up every morning, he's dedicated to his craft, 382 00:20:19,960 --> 00:20:26,800 Speaker 7: his hallmark, his fairness, and everyone loves to hear him. 
383 00:20:26,960 --> 00:20:27,280 Speaker 5: Leader. 384 00:20:28,920 --> 00:20:35,399 Speaker 7: He's a master of metaphor and a wonderful wordsmith. 385 00:20:35,960 --> 00:20:36,639 Speaker 2: Tell me why not. 386 00:20:37,240 --> 00:20:39,880 Speaker 1: He's got kind of a... but don't ask him 387 00:20:39,560 --> 00:20:43,760 Speaker 7: to work too hard, 'cause of leisure 388 00:20:44,760 --> 00:20:46,000 Speaker 2: he's the king. 389 00:20:49,680 --> 00:20:56,719 Speaker 4: Jokey Joe Getty, that was awesome. 390 00:20:56,800 --> 00:21:04,040 Speaker 2: This is my kind of music, dude. Joe Getty, phone 391 00:21:04,119 --> 00:21:08,879 Speaker 2: up with the light on, swaying back and forth. 392 00:21:10,119 --> 00:21:11,960 Speaker 5: This is the sort of song that was on Hee 393 00:21:12,160 --> 00:21:17,200 Speaker 5: Haw or the Glenn Campbell Show every Saturday night when 394 00:21:17,200 --> 00:21:17,920 Speaker 5: I was a kid. 395 00:21:18,080 --> 00:21:21,840 Speaker 1: Or if you're an alt country nineties guy like me, 396 00:21:22,040 --> 00:21:24,520 Speaker 1: it was just revved up a little bit and the 397 00:21:24,600 --> 00:21:28,040 Speaker 1: electric guitars and, yeah, I love that music. Because of leisure, 398 00:21:28,560 --> 00:21:32,960 Speaker 1: the king. I am a man who craves leisure. Yes, 399 00:21:33,960 --> 00:21:35,320 Speaker 1: as I've made clear through the years. 400 00:21:35,440 --> 00:21:38,600 Speaker 2: Oh my god. The fact that you can do that 401 00:21:38,920 --> 00:21:41,400 Speaker 2: and like, how long did it take you total? I'm 402 00:21:41,400 --> 00:21:43,160 Speaker 2: telling you it was twenty seconds. 403 00:21:44,160 --> 00:21:47,960 Speaker 1: That is amazing, amusing and horrifying. 
404 00:21:48,160 --> 00:21:51,560 Speaker 5: Yeah, there are people that used to make six figure 405 00:21:51,680 --> 00:21:55,199 Speaker 5: salaries, back when that really meant something, doing that for 406 00:21:55,359 --> 00:21:57,200 Speaker 5: radio shows back in the day. 407 00:21:59,000 --> 00:22:01,640 Speaker 1: Oh, never mind, just doing that, right? 408 00:22:02,880 --> 00:22:05,040 Speaker 2: Yeah, so I gotta put together a Rush 409 00:22:04,920 --> 00:22:08,880 Speaker 1: Limbaugh, you know, the famous song parodies and stuff everybody enjoyed. Yeah, 410 00:22:08,880 --> 00:22:09,720 Speaker 1: go ahead. And I've got to 411 00:22:09,720 --> 00:22:15,840 Speaker 6: put together a Jack song now too. I've got some ideas. 412 00:22:15,840 --> 00:22:17,600 Speaker 6: If something comes to the top of your mind right now, 413 00:22:17,640 --> 00:22:20,000 Speaker 6: just spit it out, and I'll try to incorporate 414 00:22:20,040 --> 00:22:21,679 Speaker 6: that into my brilliance. 415 00:22:22,600 --> 00:22:25,520 Speaker 1: Maybe we can, maybe we can take suggestions from listeners. 416 00:22:26,359 --> 00:22:26,800 Speaker 2: I like that. 417 00:22:26,880 --> 00:22:32,359 Speaker 5: Let's open the text line. In fact, let's take live calls. Okay, 418 00:22:32,520 --> 00:22:34,119 Speaker 5: So you work on that and we'll get to that 419 00:22:34,160 --> 00:22:37,440 Speaker 5: maybe in hour four. So back to my story. 420 00:22:38,200 --> 00:22:41,920 Speaker 5: I was looking at the Twitter feed and the TV 421 00:22:42,000 --> 00:22:44,160 Speaker 5: and everything like that, and I expressed to Joe during 422 00:22:44,160 --> 00:22:47,199 Speaker 5: the commercial break, I don't quite get the hatred for 423 00:22:48,000 --> 00:22:50,760 Speaker 5: Elon Musk. I was looking at a friend of ours 424 00:22:50,800 --> 00:22:52,280 Speaker 5: and his tweet. 425 00:22:52,560 --> 00:22:54,120 Speaker 2: I don't get the hatred for Elon Musk. 
426 00:22:54,160 --> 00:22:57,920 Speaker 5: I can fully understand, like if you think DOGE didn't 427 00:22:57,960 --> 00:23:00,080 Speaker 5: do enough, or you thought these cuts were 428 00:23:00,119 --> 00:23:01,760 Speaker 2: wrong or whatever it is. 429 00:23:01,680 --> 00:23:04,840 Speaker 5: But I don't get the, like, you know, deep hatred 430 00:23:04,920 --> 00:23:11,240 Speaker 5: for him or whatever. But I do know a couple 431 00:23:11,320 --> 00:23:15,359 Speaker 5: that's leaving the country, the whole family, there are children involved. 432 00:23:15,680 --> 00:23:18,400 Speaker 5: They're leaving the country because they don't want to live 433 00:23:18,440 --> 00:23:21,119 Speaker 5: in Trump and Elon's America. 434 00:23:22,040 --> 00:23:23,359 Speaker 2: And... what? 435 00:23:24,400 --> 00:23:27,400 Speaker 5: The straw that broke the camel's back was Elon's Nazi 436 00:23:27,400 --> 00:23:32,080 Speaker 5: salute at that convention, or whatever he did, whatever 437 00:23:32,080 --> 00:23:32,560 Speaker 5: he said. 438 00:23:32,359 --> 00:23:34,520 Speaker 1: When the autistic fellow waved at the crowd and his 439 00:23:34,640 --> 00:23:36,480 Speaker 1: arm was out like that for a fraction of a second. 440 00:23:36,560 --> 00:23:40,600 Speaker 1: Yes. They have believed and internalized that that 441 00:23:40,800 --> 00:23:43,600 Speaker 1: was an actual Nazi salute, and Nazism is coming to 442 00:23:43,640 --> 00:23:44,639 Speaker 1: America. 443 00:23:45,240 --> 00:23:47,720 Speaker 5: I didn't have the first hand conversation with him, but 444 00:23:47,800 --> 00:23:51,000 Speaker 5: I've talked to the person that did, and yes, they 445 00:23:51,119 --> 00:23:55,919 Speaker 5: can't believe that you don't realize that was obviously a 446 00:23:56,000 --> 00:23:56,840 Speaker 5: Nazi salute. 447 00:23:58,359 --> 00:24:02,320 Speaker 1: To me... I would like to hear more. 
And therefore, 448 00:24:03,000 --> 00:24:06,840 Speaker 1: Kristallnacht is coming. It's the left that's abusing Jews. 449 00:24:06,840 --> 00:24:07,800 Speaker 1: But anyway, back to you. 450 00:24:08,920 --> 00:24:11,160 Speaker 5: Right, I think that's so dumb. I can't even... it's 451 00:24:11,160 --> 00:24:14,240 Speaker 5: hard to even engage in conversation. This is a very 452 00:24:14,280 --> 00:24:18,240 Speaker 5: wealthy couple, by the way, incredibly successful, both of them, 453 00:24:18,359 --> 00:24:26,320 Speaker 5: like really successful, and I don't... What 454 00:24:26,240 --> 00:24:29,840 Speaker 4: do you do with that? How could you possibly think that? 455 00:24:31,359 --> 00:24:33,320 Speaker 1: I would like to spend the rest of my life 456 00:24:33,440 --> 00:24:38,800 Speaker 1: studying, in between long periods of leisure, obviously, but studying 457 00:24:40,119 --> 00:24:46,639 Speaker 1: the relationship, and lack of relationship, between intelligence, skill at 458 00:24:46,680 --> 00:24:53,200 Speaker 1: a particular skill set, and the concept of wisdom, because 459 00:24:53,240 --> 00:24:58,560 Speaker 1: they can be completely unrelated. I mean, we have all known... 460 00:24:58,920 --> 00:25:01,200 Speaker 1: I mean the classic example, I guess, would be the brilliant, 461 00:25:01,200 --> 00:25:04,800 Speaker 1: brilliant musician who is just unspeakably talented. 462 00:25:05,160 --> 00:25:06,320 Speaker 2: But these people are 463 00:25:06,119 --> 00:25:09,240 Speaker 1: replaced by AI but cannot manage their life at all 464 00:25:09,320 --> 00:25:11,440 Speaker 1: and dies of drug addiction and is miserable. 465 00:25:11,440 --> 00:25:14,160 Speaker 5: And I understand anybody in the world of art more 466 00:25:14,480 --> 00:25:16,960 Speaker 5: or less being that way as a musician or whatever. 467 00:25:16,960 --> 00:25:17,600 Speaker 2: It's so different. 
468 00:25:17,640 --> 00:25:23,119 Speaker 5: But these are like business people, one scientist adjacent, the 469 00:25:23,119 --> 00:25:26,240 Speaker 5: other one just flat out business person, like, like 470 00:25:27,720 --> 00:25:29,840 Speaker 2: so successful, and I just, I can't. 471 00:25:30,320 --> 00:25:35,200 Speaker 5: So you think Elon is secretly a Nazi and slipped, 472 00:25:35,920 --> 00:25:39,040 Speaker 5: or you think that was the signal to start something, 473 00:25:39,200 --> 00:25:42,320 Speaker 5: or... I can't even imagine what the scenario 474 00:25:42,400 --> 00:25:42,960 Speaker 5: would be. 475 00:25:43,880 --> 00:25:47,480 Speaker 1: Yeah, yeah, I find this so interesting. 476 00:25:48,600 --> 00:25:51,439 Speaker 2: And you would pick up your family and move across 477 00:25:51,480 --> 00:25:52,240 Speaker 2: the ocean. 478 00:25:51,920 --> 00:25:52,480 Speaker 4: Because of it? 479 00:25:53,200 --> 00:25:55,640 Speaker 1: Yeah, yeah, nuts. 480 00:25:55,560 --> 00:25:56,520 Speaker 4: You're nuts. 481 00:25:58,359 --> 00:25:59,960 Speaker 1: On the other hand, you told me where they're going 482 00:26:00,040 --> 00:26:04,560 Speaker 1: and it's a pleasant place. Yeah, yeah. That's... again, I 483 00:26:04,840 --> 00:26:08,159 Speaker 1: have a million comments. I just, I'm so blown away. 484 00:26:09,480 --> 00:26:13,520 Speaker 1: And it takes money to indulge that sort of move. 485 00:26:14,160 --> 00:26:18,600 Speaker 1: I mean, that's part of it, because Joe Schmoe, you know, 486 00:26:18,720 --> 00:26:20,600 Speaker 1: struggling to get by with his wife and two little 487 00:26:20,640 --> 00:26:25,000 Speaker 1: kids might want desperately to go to Uruguay, for instance, 488 00:26:25,320 --> 00:26:27,360 Speaker 1: where I've been convinced if I ever go somewhere, that's 489 00:26:27,359 --> 00:26:33,240 Speaker 1: where I'm going. 
You're sure? No, I'm half... I'm like 490 00:26:33,440 --> 00:26:38,000 Speaker 1: sixty percent sure. But I was hearing about how interesting 491 00:26:38,200 --> 00:26:42,040 Speaker 1: a country that is, and a little more American in terms 492 00:26:42,040 --> 00:26:44,680 Speaker 1: of freedom and that sort of thing than, sometimes, America. 493 00:26:44,720 --> 00:26:47,320 Speaker 1: But you can only indulge that impulse if you've got 494 00:26:47,359 --> 00:26:48,760 Speaker 1: a hell of a lot of money or you're willing 495 00:26:48,800 --> 00:26:52,239 Speaker 1: to live extremely modestly. So anyway, that's why celebrities are 496 00:26:52,240 --> 00:26:56,360 Speaker 1: always doing it or talking about doing it, because they can. Well. 497 00:26:56,400 --> 00:27:01,879 Speaker 5: And this family, I assume, is surrounded by people that 498 00:27:01,960 --> 00:27:04,440 Speaker 5: say I understand, or that's a good idea, or we're 499 00:27:04,440 --> 00:27:04,920 Speaker 5: going to do 500 00:27:04,880 --> 00:27:07,360 Speaker 2: that too, which is so crazy. 501 00:27:08,240 --> 00:27:11,480 Speaker 1: Yeah, I am so fascinated by the idea of getting 502 00:27:11,560 --> 00:27:14,800 Speaker 1: in, of understanding how other people's psyches work, which, you know, 503 00:27:14,840 --> 00:27:18,720 Speaker 1: you never can fully. But the sort of person who 504 00:27:18,800 --> 00:27:23,320 Speaker 1: processes all of the inputs of life, everything they see, 505 00:27:23,359 --> 00:27:28,400 Speaker 1: they hear, they perceive, through an emotional lens, and then 506 00:27:28,440 --> 00:27:34,399 Speaker 1: they structure it quote unquote logically to make it sound 507 00:27:34,440 --> 00:27:38,560 Speaker 1: like it makes sense. But it's all emotionalism, and you 508 00:27:38,640 --> 00:27:42,119 Speaker 1: can't talk them out of it. It's like the classic example is 509 00:27:43,320 --> 00:27:45,320 Speaker 1: I don't care what you say. 
That's what I believe, 510 00:27:46,359 --> 00:27:49,399 Speaker 1: you're stating. Wait a minute, I have presented rock solid, 511 00:27:49,440 --> 00:27:53,840 Speaker 1: irrefutable evidence of your conclusion being wrong, but you are 512 00:27:53,880 --> 00:27:56,399 Speaker 1: shouting at me that you have no interest in it. 513 00:27:57,080 --> 00:28:02,359 Speaker 1: That sort of person's psyche operates way differently than mine, 514 00:28:02,359 --> 00:28:05,280 Speaker 1: for instance, and I think my way is better, but 515 00:28:05,320 --> 00:28:08,720 Speaker 1: they probably think exactly the same thing. I just don't 516 00:28:08,720 --> 00:28:10,679 Speaker 1: want them in charge of anything. So I've said a 517 00:28:10,680 --> 00:28:14,199 Speaker 1: million times, I love songwriters and poets and dreamers. We 518 00:28:14,240 --> 00:28:17,400 Speaker 1: would be lost without them. I don't want them in charge. 519 00:28:19,680 --> 00:28:22,359 Speaker 5: I was wondering the other day if anybody has recently 520 00:28:22,440 --> 00:28:26,080 Speaker 5: set the record for doing backflips while on fire. Good 521 00:28:26,119 --> 00:28:29,879 Speaker 5: news on that front, and maybe that song about me 522 00:28:29,920 --> 00:28:31,000 Speaker 5: in hour four or so. 523 00:28:31,400 --> 00:28:34,879 Speaker 1: And what else? Plus coming up, let's meet some of 524 00:28:34,920 --> 00:28:38,080 Speaker 1: the young Columbia radicals. They were in court the other day. 525 00:28:38,200 --> 00:28:44,240 Speaker 5: Oh boy, I have a guess. 526 00:28:50,560 --> 00:28:51,240 Speaker 2: I was freezing. 527 00:28:51,320 --> 00:28:53,880 Speaker 8: I am wearing a couple of layers that are soaked 528 00:28:53,880 --> 00:28:55,680 Speaker 8: in a gel. I'm put in a fridge for twenty 529 00:28:55,680 --> 00:28:58,520 Speaker 8: four hours, so it helps protect me from the fire 530 00:28:58,800 --> 00:29:02,400 Speaker 8: while I'm on fire. So we're seven seconds in. 
I've 531 00:29:02,440 --> 00:29:04,440 Speaker 8: done two backflips, so I'm on a good trot to beat 532 00:29:04,440 --> 00:29:06,400 Speaker 8: the previous record, and I need to move quickly. The 533 00:29:06,400 --> 00:29:08,880 Speaker 8: oxygen's burning up around me. If I don't keep moving, 534 00:29:09,040 --> 00:29:11,760 Speaker 8: I could find it very difficult to breathe or find 535 00:29:11,800 --> 00:29:13,920 Speaker 8: the heat a bit too overwhelming. There are parts that 536 00:29:13,960 --> 00:29:16,640 Speaker 8: are actually falling off what I'm wearing and burning. My 537 00:29:16,840 --> 00:29:20,360 Speaker 8: successful record title was seven backflips in thirty seconds 538 00:29:20,440 --> 00:29:22,920 Speaker 8: while engaging in a full body burn. It's incredible what 539 00:29:23,040 --> 00:29:23,920 Speaker 8: humans can achieve. 540 00:29:24,800 --> 00:29:26,760 Speaker 5: All right. So there's a British man who has set 541 00:29:26,800 --> 00:29:29,720 Speaker 5: the new record for doing backflips while on fire. 542 00:29:31,000 --> 00:29:34,560 Speaker 1: Sounded Irish to me, but back to you. All right, 543 00:29:34,880 --> 00:29:37,200 Speaker 1: any other backflipping-on-fire news? 544 00:29:37,280 --> 00:29:39,560 Speaker 2: Wait a minute, there isn't any. Well. 545 00:29:39,480 --> 00:29:42,240 Speaker 5: What's the record for doing backflips while on fire if 546 00:29:42,280 --> 00:29:49,840 Speaker 5: you're... if you have red hair? So I don't know 547 00:29:49,840 --> 00:29:50,560 Speaker 5: if you follow this. 548 00:29:50,680 --> 00:29:53,080 Speaker 1: A group of keffiyeh-clad students descended on 549 00:29:53,120 --> 00:29:56,760 Speaker 1: Manhattan Criminal Court on Wednesday to face formal arraignment 550 00:29:56,880 --> 00:30:00,000 Speaker 1: for their roles in the violent takeover of Columbia University's 551 00:30:00,240 --> 00:30:03,280 Speaker 1: Butler Library. 
You remember the video at the time, 552 00:30:03,440 --> 00:30:09,800 Speaker 1: the damage, the injuries... They're demanding that the charges 553 00:30:09,880 --> 00:30:13,520 Speaker 1: be dismissed because they're peace activists and they set up 554 00:30:13,840 --> 00:30:17,280 Speaker 1: a mere teach-in in the library. You may also recall 555 00:30:17,360 --> 00:30:21,680 Speaker 1: that the mob injured two security officials, damaged bookshelves, 556 00:30:21,840 --> 00:30:26,440 Speaker 1: distributed pamphlets praising Hamas, and renamed the library after a terrorist. 557 00:30:26,640 --> 00:30:29,760 Speaker 1: They were led by Columbia University Apartheid Divest, a 558 00:30:29,760 --> 00:30:33,040 Speaker 1: notorious antisemitic group. It wants the Ivy League school 559 00:30:33,040 --> 00:30:34,960 Speaker 1: to cut all ties with Israel and chants We want 560 00:30:34,960 --> 00:30:38,480 Speaker 1: divestment now, among other things, like globalize the intifada, 561 00:30:38,480 --> 00:30:41,600 Speaker 1: which means kill Jews wherever you find them. So who 562 00:30:41,680 --> 00:30:45,160 Speaker 1: are these, the cream of the crop 563 00:30:45,200 --> 00:30:46,840 Speaker 1: of American youth, yesterday? 564 00:30:46,920 --> 00:30:50,920 Speaker 5: Globalize the intifada is exactly what that guy chanted after 565 00:30:51,000 --> 00:30:55,240 Speaker 5: he killed those two nice people who worked at the embassy. 566 00:30:55,320 --> 00:30:58,800 Speaker 1: One of whom was a Christian. Yes, yes, globalize 567 00:30:58,840 --> 00:31:04,880 Speaker 1: the intifada, indeed. So who are these brave, angry revolutionaries 568 00:31:04,960 --> 00:31:07,720 Speaker 1: rising up from the gutter to challenge the man, Jack? 569 00:31:08,240 --> 00:31:11,280 Speaker 1: Let's meet some of them, like Emma Biswus, who grew 570 00:31:11,360 --> 00:31:13,560 Speaker 1: up in a life of luxury. 
She led the robotics 571 00:31:13,560 --> 00:31:16,160 Speaker 1: team at the Harker School, which bills itself as one 572 00:31:16,200 --> 00:31:18,760 Speaker 1: of the nation's top college prep schools, and tuition can 573 00:31:18,800 --> 00:31:21,720 Speaker 1: reach sixty five thousand dollars a year. Wow, that's right, 574 00:31:21,760 --> 00:31:24,280 Speaker 1: for a high school. While there, she interned with a 575 00:31:24,280 --> 00:31:28,000 Speaker 1: biotech company that boasts prospective Nobel Prize winners work here, 576 00:31:28,360 --> 00:31:31,800 Speaker 1: and until she left for tony Barnard College, Biswus lived 577 00:31:31,840 --> 00:31:34,480 Speaker 1: in the San Francisco Bay area in a mansion 578 00:31:34,280 --> 00:31:39,960 Speaker 5: worth six million dollars. Moving along. That's amazing how often 579 00:31:40,000 --> 00:31:40,600 Speaker 5: that's the case. 580 00:31:41,400 --> 00:31:44,600 Speaker 1: Interestingly, her dad is with a tech company that does 581 00:31:44,640 --> 00:31:47,560 Speaker 1: a lot of defense work, including with Israel. But let's 582 00:31:47,560 --> 00:31:48,520 Speaker 1: not get hung up on that. 583 00:31:48,920 --> 00:31:51,560 Speaker 5: The rich kids that don't have a purpose, because everything 584 00:31:51,600 --> 00:31:53,880 Speaker 5: has been handed to them, they need to achieve something. 585 00:31:54,280 --> 00:31:58,280 Speaker 5: I gave it away... they fall into these ideologies so easily. 586 00:31:58,760 --> 00:32:01,440 Speaker 1: But Miss Biswus was just one of eight Barnard and 587 00:32:01,440 --> 00:32:04,560 Speaker 1: Columbia students who attended swanky, prestigious private schools and were 588 00:32:04,640 --> 00:32:07,320 Speaker 1: charged with criminal trespassing, among other things. Although there was 589 00:32:07,360 --> 00:32:11,600 Speaker 1: Barnard student Luna Firefly Deerfield Cumming Shaw. 590 00:32:11,880 --> 00:32:13,000 Speaker 2: How many names is enough? 
591 00:32:13,040 --> 00:32:19,600 Speaker 1: Sweetheart? Wow. Luna Firefly Deerfield Cumming Shaw, the granddaughter of 592 00:32:19,640 --> 00:32:22,280 Speaker 1: Mark Shaw, who describes himself as John F. Kennedy's unofficial 593 00:32:22,320 --> 00:32:29,080 Speaker 1: family photographer, snapped many, many celebrities, and her grandmother 594 00:32:29,120 --> 00:32:31,280 Speaker 1: is an actress best known for a role in some 595 00:32:31,520 --> 00:32:35,280 Speaker 1: show. Anyway, Cumming Shaw attended the Putney School in Vermont, 596 00:32:35,440 --> 00:32:39,000 Speaker 1: which charges day students fifty thousand dollars a year and 597 00:32:39,120 --> 00:32:43,800 Speaker 1: boarding students over eighty thousand. Despite the price, it advertises 598 00:32:43,840 --> 00:32:48,160 Speaker 1: itself as a progressive school that considers inclusivity a 599 00:32:48,200 --> 00:32:53,080 Speaker 1: fundamental principle of the school. It sports two committees dedicated 600 00:32:53,120 --> 00:32:55,840 Speaker 1: to the cause, allowing it to remain in the forefront 601 00:32:56,080 --> 00:32:59,360 Speaker 1: of the drive for racial justice. Her mother is a 602 00:32:59,360 --> 00:33:03,480 Speaker 1: long time left wing activist, Barnard alum and Bernie Sanders supporter. 603 00:33:04,800 --> 00:33:08,959 Speaker 1: Before Marisol Rojas Cheetham was arrested, the Columbia pre law 604 00:33:09,040 --> 00:33:12,080 Speaker 1: student grew up in a Berkeley, California home valued at 605 00:33:12,120 --> 00:33:14,920 Speaker 1: two million dollars, which, honestly, in Berkeley, isn't that fancy, 606 00:33:15,120 --> 00:33:18,840 Speaker 1: and attended the Bentley School, where tuition reaches nearly sixty 607 00:33:18,920 --> 00:33:22,800 Speaker 1: thousand dollars a year, according to her LinkedIn page. 
Rojas 608 00:33:22,920 --> 00:33:25,400 Speaker 1: Cheetham was the captain of the varsity women's soccer team, 609 00:33:25,520 --> 00:33:27,880 Speaker 1: played lacrosse, was president of the student government, to name a 610 00:33:27,880 --> 00:33:31,560 Speaker 1: few of her extracurriculars, on Bentley's sprawling twelve acre 611 00:33:31,680 --> 00:33:33,000 Speaker 1: upper school campus. 612 00:33:33,160 --> 00:33:40,360 Speaker 5: Very Patty Hearst, if you're old enough to remember. Oh yeah. 613 00:33:38,360 --> 00:33:39,640 Speaker 2: There is something to that. 614 00:33:39,720 --> 00:33:44,440 Speaker 5: The whole... your life has been, you know, the pathway 615 00:33:44,480 --> 00:33:45,480 Speaker 5: has been paved for you. 616 00:33:45,520 --> 00:33:46,560 Speaker 2: It's going to be too easy. 617 00:33:46,600 --> 00:33:50,600 Speaker 5: You've got no feeling of accomplishment or making a difference 618 00:33:50,640 --> 00:33:54,360 Speaker 5: in the world, because, you know, you could go to 619 00:33:54,400 --> 00:33:56,480 Speaker 5: the fancy school and then immediately show up at a 620 00:33:56,600 --> 00:33:58,360 Speaker 5: job making a lot of money. It was handed to you, 621 00:33:58,400 --> 00:34:00,200 Speaker 5: and you just don't feel like you've done anything, 622 00:34:00,360 --> 00:34:02,840 Speaker 5: so you fall for these ideologies. I think there's got 623 00:34:02,880 --> 00:34:05,160 Speaker 5: to be something to that. Osama bin Laden was that. 624 00:34:05,880 --> 00:34:08,719 Speaker 1: Yeah? Yeah. A couple more, just to make the point. 625 00:34:08,760 --> 00:34:14,440 Speaker 1: Columbia grad student Ava Ambrose Ta Muscula e Garcia: seventy 626 00:34:14,480 --> 00:34:18,640 Speaker 1: eight thousand dollar boarding school, her parents both professors at 627 00:34:18,640 --> 00:34:21,600 Speaker 1: the University of Notre Dame, accomplished in their fields as 628 00:34:21,640 --> 00:34:25,320 Speaker 1: a novelist and an artist. 
Sophia Elizabeth Jones spent eleven 629 00:34:25,360 --> 00:34:28,320 Speaker 1: years living overseas at the American Community School in Abu Dhabi. 630 00:34:29,400 --> 00:34:34,480 Speaker 1: The honors student identifies as American with Indonesian and Canadian roots, 631 00:34:34,520 --> 00:34:38,120 Speaker 1: belongs to Barnard's class of twenty twenty seven, and writes frequently about 632 00:34:38,120 --> 00:34:42,200 Speaker 1: Palestinian issues for a Columbia student website. Then you got 633 00:34:42,480 --> 00:34:47,120 Speaker 1: Barnard student Dima Abat cozm, honored guest at Mayor Eric 634 00:34:47,160 --> 00:34:51,440 Speaker 1: Adams' Abate Hate Summit last July, a graduate of the 635 00:34:51,480 --> 00:34:55,760 Speaker 1: prestigious McDuffie School in Massachusetts, where annual boarding tuition exceeds 636 00:34:55,800 --> 00:35:00,480 Speaker 1: seventy five thousand dollars, another multimillion dollar mansion. All of 637 00:35:00,520 --> 00:35:07,320 Speaker 1: these very rich young women, yeah, and completely radicalized. 638 00:35:07,640 --> 00:35:14,759 Speaker 5: The percentage of these angry, keffiyeh-wearing, hate-spewing nut 639 00:35:14,880 --> 00:35:19,440 Speaker 5: jobs who are wealthy women, young women, is astounding. 640 00:35:19,920 --> 00:35:21,920 Speaker 1: They have no purpose in their life. 641 00:35:22,400 --> 00:35:24,520 Speaker 5: Well, and I'm sure that, you know, the teachers are 642 00:35:25,360 --> 00:35:28,640 Speaker 5: teaching them all to hate America and Israel and everything 643 00:35:28,680 --> 00:35:31,600 Speaker 5: also. And if it's a boarding school, then you're really surrounded. 644 00:35:31,760 --> 00:35:33,880 Speaker 5: I mean, you're immersed in it. 
645 00:35:34,520 --> 00:35:37,480 Speaker 1: And then you add to that young women's tendency to 646 00:35:37,800 --> 00:35:41,960 Speaker 1: really crave belonging and approval of their peers, and if 647 00:35:42,000 --> 00:35:46,640 Speaker 1: approval of their peers is dependent upon adopting this radical ideology, 648 00:35:46,760 --> 00:35:50,080 Speaker 1: you get it in spades. It really is amazing. Look 649 00:35:50,080 --> 00:35:54,120 Speaker 1: at the picture of these very, very rich girls trying 650 00:35:54,160 --> 00:35:56,839 Speaker 1: to achieve a boho radical look and the rest 651 00:35:56,880 --> 00:35:58,359 Speaker 1: of it. Old as time. 652 00:36:00,080 --> 00:36:02,200 Speaker 5: Wow, you spend all that money sending your kid to 653 00:36:02,200 --> 00:36:05,279 Speaker 5: school and they end up nut jobs. 654 00:36:05,719 --> 00:36:09,160 Speaker 1: Yeah. Yeah. Well, we have a blockbuster hour four coming up. 655 00:36:09,160 --> 00:36:11,120 Speaker 1: If you don't get hour four, or you've got to 656 00:36:11,120 --> 00:36:12,880 Speaker 1: go off and do something, that's fine, just grab it 657 00:36:12,960 --> 00:36:16,640 Speaker 1: later via podcast. You should subscribe to Armstrong and Getty 658 00:36:16,760 --> 00:36:20,080 Speaker 1: On Demand and listen to it whenever you like. 659 00:36:20,560 --> 00:36:23,319 Speaker 5: Yeah, that's pretty cool, at your own leisure. You're a 660 00:36:23,320 --> 00:36:27,120 Speaker 5: man who craves leisure, as the song says. Yes, we 661 00:36:27,239 --> 00:36:29,120 Speaker 5: got a big hour four coming. I hope you can 662 00:36:29,200 --> 00:36:29,640 Speaker 5: enjoy it. 663 00:36:30,880 --> 00:36:34,040 Speaker 7: Armstrong and Getty