Speaker 1: Broadcasting live from the Abraham Lincoln Radio Studio of the George Washington Broadcast Center. Jack Armstrong and Joe Getty. Armstrong and Getty... and he... Armstrong and Getty.
Speaker 2: And then after the war, which we won, we won it big. Without us, right now, you'd all be speaking German and a little Japanese.
Speaker 3: Perhaps that's an interesting thing for the President of the United States to say to European leaders.
Speaker 1: "If it weren't for us, you'd be speaking German." That is straight out of, like, a parody movie from yesteryear, that some American would say something like that in Europe. That's out of a comedy. By the way, let me say it again, just because I can't get it out of my mind: I watched the movie Fury over the weekend. This is a ten-year-old movie, so for, like, a dozen years I'd never seen it.
Speaker 3: Brad Pitt, tank commander, April of forty-five, the end of World War Two, when the Germans are really fighting to the last. Whoa. That is some real "this is what war is like" movie, if you want that flavor.
Speaker 3: And it's, from everything I've read, pretty accurate, I mean, about the end of World War Two.
Speaker 1: But I mean, it's... if you've...
Speaker 3: ...got a fanciful view of, like, you know, what it's like to be a...
Speaker 1: ...hero. And it's just gruesome. War is... war is awful, just awful.
Speaker 3: Even when you win. But I couldn't recommend that movie more highly, if you like that sort of thing. Anywho, to actually come out eighty-some years later and say "you'd be speaking German if it wasn't for us" is...
Speaker 1: ...a hell of a flex. Yes. So obviously Greenland, and the idea of America snatching Greenland, is on everybody's mind. And so Trump stood up at Davos and addressed that directly. Start with eighty, Michael.
Speaker 2: And the fact is, no nation or group of nations is in any position to be able to secure Greenland other than the United States. We're a great power, much greater than people even understand. I think they found that out two weeks ago in Venezuela.
Speaker 2: We saw this in World War Two, when Denmark fell to Germany after just six hours of fighting and was totally unable to defend either itself or Greenland. So the United States was then compelled, we did it, we felt an obligation to do it, to send our own forces to hold the Greenland territory. And hold it we did, at great cost and expense. They didn't have a chance of getting on it, and they tried.
Speaker 1: Denmark knows that. You got rolled by the Nazis once, you Danes, so don't be talking tough. Wow. You know, we can put as many bases there as we want. But he went on. Eighty-two, Michael.
Speaker 2: After the war, we gave Greenland back to Denmark. How stupid were we to do that?
Speaker 1: But we did it. But we gave it back. But how ungrateful are they now? Yeah? Okay, okay, all right. Jack's not going to say it, so I'll say it. The idea that the, uh, the territories of our allies that we occupied to fight the Nazis were then by rights ours, and we should have kept them, is horrific. Well, would that apply...
Speaker 1: Would that apply to France and Belgium and whatever country we went into? A good chunk of Germany? Spain and Italy? And the rest of Italy would be nice. I mean, the north: the scenery, you know, the wine, everything. Huh. Yeah, that's a... that's an interesting attitude. Yeah, okay, okay, let's see. And then... wow. Wow. We were there fighting the Nazis, and then once the Nazis are defeated, we're just going to stay here. "You should have defended yourselves." Yeah. Interesting. Again, a unique foreign policy idea from the current American president. Eighty-three, Michael. Oh, this one. He... well, go ahead, just... just play it.
Speaker 2: So now our country and the world face much greater risks than it did ever before, because of missiles, because of nuclear, because of weapons of warfare that I can't even talk about. Two weeks ago, there were weapons that nobody ever heard of. They weren't able to fire one shot at us. They said, what happened? Everything was discombobulated. They said, we've got him in our sights, press the trigger, and nothing happened. No anti-aircraft missiles went up.
Speaker 2: There was one that went up about thirty feet and crashed down right next to the people that sent it. They said, what the hell is going on? Those... those defensive systems were made by Russia and by China. So they're going to go back to the drawing boards.
Speaker 3: I guess. Well, that's pretty interesting for those of us who have been wondering how we pulled off the snatching-of-Maduro thing. So there's the president saying out loud: we used some sort of new weapon that isn't regularly talked about, I guess. Yeah.
Speaker 1: I think probably the evildoers around the planet noticed that he wasn't giving anything away. I hadn't heard anybody put that specific a description to it.
Speaker 3: Was he talking about the old brain-scrambler ray that we know we used?
Speaker 1: Or is this just, like, shutting off the power and everything? A brain scrambler and shutting off the power wouldn't cause a missile to go up two hundred feet and then fall to the earth, right? So there's something else going on. And then we'll skip eighty-four and just end it with eighty-five.
Speaker 1: Michael.
Speaker 2: ...United States alone that can protect this giant mass of land, this giant piece of ice, develop it and improve it, and make it so that it's good for Europe and safe for Europe and good for us. And that's the reason I'm seeking immediate negotiations to once again discuss the acquisition of Greenland by the United States, just as we have acquired many other territories throughout our history, as many of the European nations have.
Speaker 1: They've acquired... there's nothing wrong with it... many of them... some went in reverse, actually, if you look... some had...
Speaker 2: ...great vast wealth, great vast lands all over the world. They went in reverse. They're stuck back where they started. That happens too.
Speaker 1: So... a new age of conquest? Is that what he's...? Settler colonialism? I hate to make the idiot lefties right. Yeah. And then I guess we're still working on a clip where he says that people thought I would use force, but now I want to negotiate. No, that is... that's the headline of the day.
Speaker 3: Definitely. That he flat-out said the United States is not going to use force to take Greenland.
Speaker 1: So how long is it, Michael? Let's hear it.
Speaker 2: ...probably won't get anything unless I decided to use excessive strength and force, where we would be, frankly, unstoppable.
Speaker 1: "But I won't do that." Okay.
Speaker 2: Now everyone's saying, oh, good. That's probably the biggest statement I made, because people thought I would use force.
Speaker 1: "I don't have to use force. I don't want to use force. I won't use force."
Speaker 2: All the United States is asking for is a place called Greenland.
Speaker 3: Wow. So all that buildup, which, if I'm the Euros, I'm sitting there thinking...
Speaker 1: ...he's going to try to take Greenland. At the end of the discussion of all the fearsome weapons and stuff like that. Yeah.
Speaker 3: "You can't stop us." He then at the end throws in "I won't use force." He said he would be appreciative if the world acquiesced to our desire to take over the territory.
Speaker 1: You can say no, and we will remember. That's how he ended.
Speaker 1: You know what I would love... what opportunity I'd love is to, like, be in the same room with, uh, you know, the Republican leaders of the House and Senate, and Marco Rubio, who you always bring up, and say: what do you think of all this? Behind the scenes, there are stressed-out conversations taking place. Was he... was he ad-libbing there at the end, though? "We won't use force." Was there anybody on earth that knew he was going to say that at the end, other than him? Did he start...
Speaker 3: Did he start the speech with the intention of saying that, and then, you know, toward the end of it, decided to say that? I don't know.
Speaker 1: No... what a great question. I don't know. I'd be guessing. I think he threw that off the cuff.
Speaker 3: Now, he might have gone out there planning to say that, but I'll bet...
Speaker 1: ...nobody knew he was going to say that.
Speaker 3: Yeah, I don't think he... yeah, I don't think he sits around with speechwriters coming up with these things.
Speaker 1: Right, right. They might hack out the outline for him. But yeah. So, yeah... yeah, I don't know.
Speaker 1: So you think he looked around the room and saw those faces and was like, no, we're not going to use force? All right, Michael, or do you think he planned to say that? My guess would be...
Speaker 3: My guess would be he never intended to use force. It was just, at what point was he going to...
Speaker 1: ...say it out loud? Right.
Speaker 3: Even Trump knows that's, you know, that's too far. Michael's very happy, because Michael had been doing the calisthenics, because he was worried the draft was...
Speaker 1: ...going to come back. And we don't need strong men to take Greenland. Why don't we put you in charge of, like, getting the right-size uniforms for everybody? I'm not sure we need you, like, out on the cross-country skis with a rifle in your hand in Greenland. Yeah, I like your idea better. Yeah, logistics, I think, is a field for you, not out on the cross-country skis with a rifle. Right, right. If we're down to the crew of this show, it's... yeah, learn to speak Chinese.
Speaker 3: Yeah, once you get to us: "Greenland is putting up more of a defense than we were expecting."
Speaker 4: Yeah, you're in big trouble if you're defending... you know, depending on me.
Speaker 1: These are insane times. Completely insane.
Speaker 3: I know, when the headline all around the world is "Trump announces he won't use force to take Greenland," because everybody was concerned that he might.
Speaker 1: That's something. Hell, it's... it's horrific. I'm horrified. Eh, keeps life interesting.
Speaker 3: I was bored. I was bored with my previous life. This is interesting every single day. World War Three's this close; we're gonna go and solve it any second. Yeah, totally, these things...
Speaker 1: ...are very exciting.
Speaker 3: Ah, boy. Any thoughts on any of this? Text line: four one five, two nine five, KFTC. One of the most important owners in the NFL is pushing for adding another game to the season, to make it an eighteen-game season.
Speaker 1: Maybe some of the details on that later. Or not.
Speaker 1: Wow. Wow. Okay. As my favorite team was decimated by injuries with the current number of games, I'm not a big fan of that idea, but we will discuss. So, putting aside the madness of Trump for the moment: I feel like the Kamala Harris phenomenon was too short. We didn't have enough time to enjoy it. It was over, and it was like a perfect movie; it's just very sad when the credits run. Yeah, exactly. Josh Shapiro, in preparation for his "look, I'm a reasonable guy" presidential run, in which he will be working that lane (the very term makes me shudder, because it means presidential politics is being discussed), but he'll be competing in that lane with Rahm Emanuel. He's a Jew. Yeah, exactly. Emanuel, another Jew. Can you make it clear that you're being wry here?
Speaker 3: You sound like a Nazi. Well, the accepted wisdom seems to be that the reason Josh Shapiro wasn't the running mate is because he's Jewish, and now you've got Rahm Emanuel, Josh...
Speaker 1: ...Shapiro, both Jewish. The Democratic Party's pro-Hamas, "I'm not anti-Jew, I'm anti-Zionist, and all Jews are Zionists" attitude...
Speaker 1: Yeah, indeed, that's going to be a bit of a rub, as they say. But anyway, that factors into the story, I guess. Josh is writing the inevitable memoir as he gets ready to run for president, and according to it, Kamala Harris's VP vetting team was led by a pair: Eric Holder and Dana Remus. Now, these are the people, keep in mind, who came up with old Coach Walz as the VP choice. Is there a weaker act in American politics than old Walz? At least Gavin Newsom's a skilled liar and a capable con man. Yeah.
Speaker 3: They should have figured it out through interviews, I guess. But I remember that... that unveiling of him at the convention.
Speaker 1: He was good. He was freaking good that night. True, true. Just didn't have much depth. But so, anyway, those two geniuses: they actually asked Josh Shapiro, the governor of Pennsylvania, whether he had ever been an Israeli agent or communicated with undercover agents of Israel, since he was a Jew. Wow. "Had I been a double agent for Israel?" wrote Shapiro, who is Jewish. "Was she kidding?
Speaker 1: I told her how offensive the question was."
Speaker 3: Should have said to Eric Holder: were you ever in contact with the Black Panthers at any point?
Speaker 1: Yeah, because you're black. Yeah. And Remus, this, uh, Dana Remus, uh, answers: well, we have to ask. "Have you ever communicated with an undercover agent of Israel?" Shapiro responded: "If they were undercover, how the hell would I know?" Wow. Wow. Wow.
Speaker 3: That makes it really sound like they were overly suspicious about a Jewish running mate.
Speaker 1: "Harris's consideration of Shapiro sparked backlash from left-wing activists, who launched a No Genocide Josh campaign..."
Speaker 3: Well, I wish it would have happened. It would have been great to have had that national conversation.
Speaker 1: Oh, to expose them. Yeah, yeah, yeah. "...that maligned the governor's support for Israel and his criticism of illegal protests on college campuses that were in favor of the Hamas October seventh attacks," blah blah blah, she claimed.
Speaker 1: Now, Harris, when she picked Walz, said that anti-Semitism played, blah blah, no role in the decision or the dialogue with the vice president, blah blah blah. The memoir, though... I'm sorry, Shapiro's memoir tells a different story. The governor said Remus's questioning said a lot about some of the people around the VP. Shapiro also revealed, according to The Atlantic, that Harris asked him, quote, if he would apologize for some of his comments about protesters at the University of Pennsylvania who'd built encampments. Shapiro flatly declined. Harris defended those anti-Israel campus agitators, saying, quote, "they are showing you exactly what the human emotion should be as a response to Gaza."
Speaker 3: I forget, has Harris officially said she's not running? She said she's not running for governor in California, but she hasn't taken president off the...
Speaker 1: ...table, has she? No.
Speaker 3: No. Well, that'd be something if she and Josh were running at the same time. That would get ugly fast.
Speaker 1: Boy, if she could find somebody whose head was so soft they would contribute money to her for a presidential run, you gotta take that money.
Speaker 1: I mean, somebody that stupid, their money's gonna go somewhere. It might as well go to you. So why would she, you know, to paraphrase Trump, why would she take that off the table? They're probably gonna lose it or drop it anyway. So, because they're so dumb, you might as well give it to her, right? Wow. Eric Holder is raking in as much as twenty-three hundred dollars an hour conducting racial equity audits for corporations like Starbucks. Wow. And with his blessing, they're taking steps to promote equity. Yeah, that's like the old Jesse Jackson thing.
Speaker 3: So you hire him and pay that money, and then you're safe, because you can tell any activist that ever comes after you for anything: look, look what I did here. I hired Eric Holder and gave him twenty-two hundred dollars an hour and stuff. Right, right. It inoculates you against any criticism.
Speaker 1: Oh yeah, yeah, yeah, you buy a no-blackmail pass for that. Of course, you've been blackmailed, but, you know, it's less expensive than if you got accused publicly and had to scramble to cover.
Speaker 1: So, some damn interesting AI stuff on a lot of fronts, including Elon's mouth. Armstrong and Getty.
Speaker 3: Before we get to some really interesting AI stuff, scary AI stuff if it's true: Robert Kraft is one of the most powerful owners in all of football. He owns the New England Patriots, and if you follow the NFL at all, you know that whole story. And they may go to the Super Bowl again this year.
Speaker 1: Anyway, he... and I remember they mentioned it during one of the games... he was a season ticket holder, just a huge fan, and rich as hell too, and he decided, you know what, I like this team so much, I'm gonna buy it. I know. It'd be nice to have that sort of money to throw around. Although, you get me a couple of, you know, Cubs beers, I'm at Wrigley Field or something, I start thinking crazy stuff. I sure like his cheese. I like his cheese slices.
Speaker 3: But Robert Kraft says he and the other owners are going to propose adding another game to the currently seventeen-game season, taking away one or two preseason games.
Speaker 1: So you wouldn't actually... but they don't...
Speaker 3: The starters don't play in the preseason games, so I don't see that as a balancing act anyway. But one of the games has to be in another country, because they're... they're trying to spread this sport, because they don't understand why this can't be popular around the world like soccer is.
Speaker 1: He pointed out that of...
Speaker 3: ...the top one hundred television shows last year, the NFL accounted for ninety...
Speaker 1: ...three of them. Wait a minute. Actually, I'm not surprised by that. It's amazing, though. That is a stark statement of its importance. Yes. Yeah, wow. Anyway... wow. I agree with you. Not brave firefighters in a variety of cities around America that the show is named for, or cops or CSI guys or whatever. No: the foot-the-ball, over and over again. You know, the whole golden goose parable. But... it's too brutal a sport for that many games. But one of the great things...
Speaker 3: ...about the NFL (college football the same way, to a certain extent) is there are few enough games that every week matters. Every week matters a lot. And, you know, you become like the NBA, where you know the starters are going to sit some of these games because the season's so long, and all that sort of thing.
Speaker 1: Well, in an eighteen-game season, you might be able to lose six games in a row and still make the playoffs, right? Exactly. It's a different thing. That shouldn't happen. Yeah. So, Elon said this yesterday in an interview. He said, we are in the singularity, which we will get to the...
Speaker 3: ...definition of. And everybody's got a different definition of this, but we'll get to the Grok version of the definition, and Grok is Elon's chatbot, so it should know what Elon means by singularity. Elon said: we're in the singularity. We are at the top of the roller coaster, and we're about to go down. It blows my mind. It blows my mind multiple times a week. I'm like, wow, then two days later I'm, like, more wow. I think we'll hit AGI next year or this year: twenty twenty-six.
He 359 00:20:15,480 --> 00:20:18,440 Speaker 3: was replying to posts from AI researchers and engineers who 360 00:20:18,440 --> 00:20:22,520 Speaker 3: are describing how current AI tools, especially advanced coding agents 361 00:20:23,119 --> 00:20:27,440 Speaker 3: and various models like Claude Opus, are dramatically accelerating their work. 362 00:20:27,720 --> 00:20:29,960 Speaker 3: Things that used to take years now happen in weeks 363 00:20:30,000 --> 00:20:34,679 Speaker 3: or months. Onboarding to massive code bases takes days instead 364 00:20:34,680 --> 00:20:38,040 Speaker 3: of months. Even top experts are stunned by the emergent 365 00:20:38,080 --> 00:20:42,960 Speaker 3: capabilities recently. That's what this says, with, you know, the 366 00:20:43,000 --> 00:20:45,480 Speaker 3: caveat that we always have to throw in, that most 367 00:20:45,480 --> 00:20:48,120 Speaker 3: everybody who's talking this stuff up is asking for more 368 00:20:48,400 --> 00:20:51,400 Speaker 3: money to be invested. Doesn't mean it's not true, though. 369 00:20:52,600 --> 00:20:55,439 Speaker 3: Or even if it's two thirds true, that's a 370 00:20:55,440 --> 00:20:56,560 Speaker 3: hell of an exaggeration. 371 00:20:56,760 --> 00:21:00,399 Speaker 1: But a hell of a headline too. Yeah. So what 372 00:21:00,400 --> 00:21:03,000 Speaker 1: did Elon mean by the singularity? 373 00:21:03,080 --> 00:21:07,400 Speaker 3: It refers to the hypothetical point when artificial intelligence becomes 374 00:21:07,480 --> 00:21:12,960 Speaker 3: super intelligent, capable of improving itself recursively, like bending back 375 00:21:13,040 --> 00:21:16,480 Speaker 3: on itself.
It teaches itself to do something, then learns 376 00:21:16,480 --> 00:21:18,959 Speaker 3: from what it taught itself, and no humans involved at 377 00:21:18,960 --> 00:21:21,000 Speaker 3: that point, it just is off to the races, then 378 00:21:21,080 --> 00:21:24,000 Speaker 3: finds the flaws in what it just did, corrects 379 00:21:23,560 --> 00:21:26,200 Speaker 1: them, and over and over and over again with speed 380 00:21:26,280 --> 00:21:27,560 Speaker 1: that we can't comprehend, at 381 00:21:28,080 --> 00:21:32,080 Speaker 3: an exponential rate, an exponential speed, leading to technological progress 382 00:21:32,160 --> 00:21:37,080 Speaker 3: so rapid and profound that it becomes unpredictable and incomprehensible 383 00:21:37,119 --> 00:21:39,320 Speaker 3: to unaided humans. 384 00:21:39,600 --> 00:21:42,119 Speaker 1: And then whoops, there go your kidneys, because the enormous 385 00:21:42,119 --> 00:21:42,800 Speaker 1: thing needs your organs. 386 00:21:42,960 --> 00:21:47,000 Speaker 3: Normal forecasting for where it goes completely breaks down. It's 387 00:21:47,119 --> 00:21:51,040 Speaker 3: like an event horizon for civilization. Like, we are at 388 00:21:51,080 --> 00:21:53,440 Speaker 3: a new point now. Nobody knows what's going to happen. 389 00:21:53,560 --> 00:21:55,280 Speaker 3: Let's just sit back and watch and see what the 390 00:21:55,320 --> 00:21:58,080 Speaker 3: computers decide to do with the planet and the humans 391 00:21:58,080 --> 00:21:58,360 Speaker 3: on it. 392 00:22:00,359 --> 00:22:02,840 Speaker 1: I'm trying to think of... so this would be like 393 00:22:03,160 --> 00:22:09,439 Speaker 1: living through the... but there's nothing like it, not at 394 00:22:09,480 --> 00:22:13,280 Speaker 1: the speed anyway. I mean the invention of oxygen? No, 395 00:22:13,400 --> 00:22:15,359 Speaker 1: that was already here. We didn't have to invent it. 396 00:22:16,480 --> 00:22:17,400 Speaker 1: The development of
397 00:22:17,600 --> 00:22:21,960 Speaker 3: speech. It'd be like living through the agricultural revolution or 398 00:22:22,040 --> 00:22:24,800 Speaker 3: the industrial revolution if it were compressed 399 00:22:24,800 --> 00:22:26,159 Speaker 3: into like a minute. 400 00:22:26,760 --> 00:22:29,920 Speaker 1: Yeah, you know what, honestly, the development of speech would 401 00:22:29,920 --> 00:22:33,160 Speaker 1: be a pretty good kind of parallel, to the extent 402 00:22:33,160 --> 00:22:35,240 Speaker 1: that you can have a parallel, if, again, that was 403 00:22:35,240 --> 00:22:36,440 Speaker 1: compressed into a year's time. 404 00:22:36,800 --> 00:22:39,239 Speaker 3: So Elon's signaling he thinks that this year we will 405 00:22:39,320 --> 00:22:48,040 Speaker 3: cross the threshold into the acceleration phase and the pace 406 00:22:48,040 --> 00:22:52,760 Speaker 3: will be supersonic. He's used the term supersonic tsunami 407 00:22:53,200 --> 00:22:56,040 Speaker 3: for AI and robotics advances. I mean, it'll be 408 00:22:56,040 --> 00:22:57,600 Speaker 3: building its own robots 409 00:22:57,240 --> 00:22:59,400 Speaker 1: for whatever it decides robots need to do. I mean, 410 00:22:59,400 --> 00:23:02,160 Speaker 1: it just, it'll build... it'll have a plant, it'll buy 411 00:23:02,200 --> 00:23:02,800 Speaker 1: some land. 412 00:23:02,880 --> 00:23:04,600 Speaker 3: I mean, I've read a bunch of books where, like, the 413 00:23:05,200 --> 00:23:08,680 Speaker 3: AI might decide it's made enough money off 414 00:23:08,680 --> 00:23:11,119 Speaker 3: of what it's doing. It buys some land, it builds 415 00:23:11,119 --> 00:23:14,920 Speaker 3: a plant, it starts making robots. Yeah, there are fences 416 00:23:14,920 --> 00:23:17,480 Speaker 3: around it. You don't even know what they're building robots for, and
417 00:23:17,400 --> 00:23:19,480 Speaker 1: we're like, excuse me, excuse me, what are you doing 418 00:23:19,480 --> 00:23:23,080 Speaker 1: over there? Yeah, don't worry about none. Yeah, robots. 419 00:23:25,040 --> 00:23:27,840 Speaker 3: Oh boy. Twenty twenty six feels like the year this 420 00:23:27,920 --> 00:23:33,280 Speaker 3: becomes undeniable to more people, possibly tied to expected breakthroughs 421 00:23:33,320 --> 00:23:38,800 Speaker 3: toward AI that will exceed humans at anything, any 422 00:23:38,840 --> 00:23:41,120 Speaker 3: tasks that can be done, and then it gets to 423 00:23:42,760 --> 00:23:46,520 Speaker 1: this part about 424 00:23:45,680 --> 00:23:48,680 Speaker 3: we're gonna have to get to the universal basic income 425 00:23:49,480 --> 00:23:51,720 Speaker 3: sooner than anybody even 426 00:23:51,520 --> 00:23:55,280 Speaker 1: thought, and then the universal high income after that. That's the progression, 427 00:23:55,440 --> 00:24:00,879 Speaker 1: yeah, yeah. And then whoops, there goes your kidneys or 428 00:24:00,920 --> 00:24:03,400 Speaker 1: your eyes, for some reason it needs eyeballs. I don't 429 00:24:03,400 --> 00:24:06,640 Speaker 1: know why, but they need them anyway. Elon... now 430 00:24:06,800 --> 00:24:11,040 Speaker 1: I think they are going to take SpaceX public, which 431 00:24:11,080 --> 00:24:13,840 Speaker 1: they'd resisted for years and said, no, we're not going 432 00:24:13,840 --> 00:24:15,760 Speaker 1: to do that until we have people flying back and 433 00:24:15,760 --> 00:24:18,280 Speaker 1: forth to Mars. But now they're going to. 434 00:24:18,400 --> 00:24:20,560 Speaker 3: We have people flying back and forth to Mars? Currently 435 00:24:20,560 --> 00:24:21,280 Speaker 3: we do not, sir.
436 00:24:22,359 --> 00:24:25,760 Speaker 1: Now they're going to, partly at least because there's a 437 00:24:25,840 --> 00:24:30,600 Speaker 1: rush now to put AI data centers in space, which 438 00:24:30,640 --> 00:24:34,600 Speaker 1: has prompted some skepticism from engineers given the technical challenges 439 00:24:35,000 --> 00:24:38,800 Speaker 1: posed by building solar powered AI data centers that 440 00:24:38,840 --> 00:24:42,400 Speaker 1: are zipping around the Earth as satellites. But Musk 441 00:24:42,480 --> 00:24:44,679 Speaker 1: has become obsessed with the idea of SpaceX being the 442 00:24:44,680 --> 00:24:47,600 Speaker 1: first to do that. Such a feat would be hard to 443 00:24:47,640 --> 00:24:50,360 Speaker 1: attempt without the billions of dollars in capital an initial 444 00:24:50,359 --> 00:24:53,040 Speaker 1: public offering could develop in one, or deliver in one 445 00:24:53,040 --> 00:24:57,840 Speaker 1: fell swoop. And that's partly... it has to do with 446 00:24:57,880 --> 00:25:03,359 Speaker 1: his obsession with staying ahead of long running rival Sam Altman, 447 00:25:03,359 --> 00:25:06,760 Speaker 1: who's in charge of OpenAI. So Elon, who's now 448 00:25:07,080 --> 00:25:10,120 Speaker 1: essentially said Tesla doesn't make cars anymore, that's a sideline, 449 00:25:10,160 --> 00:25:13,119 Speaker 1: we make robots, it's all about the AI. In fact, 450 00:25:13,160 --> 00:25:16,439 Speaker 1: we're going to, and I'm paraphrasing, I'm describing, he hasn't 451 00:25:16,440 --> 00:25:19,400 Speaker 1: said this, but we're going to mortgage SpaceX to get 452 00:25:19,440 --> 00:25:28,399 Speaker 1: more AI money. So what by July. 453 00:25:28,560 --> 00:25:32,600 Speaker 3: By the way, we talked about this a little bit earlier. 454 00:25:32,600 --> 00:25:33,000 Speaker 1: So this
455 00:25:35,119 --> 00:25:38,840 Speaker 3: universal high income, as Elon calls it. So not basic 456 00:25:38,880 --> 00:25:41,919 Speaker 3: income, not, you know... that experiment's been tried in various 457 00:25:42,000 --> 00:25:46,520 Speaker 3: towns, where you're given enough money to, like, get by. 458 00:25:47,280 --> 00:25:50,040 Speaker 3: Everybody's given that much money, and it's like a social experiment. 459 00:25:50,840 --> 00:25:54,320 Speaker 3: Elon's talking about a situation, universal high income, where everybody's 460 00:25:54,400 --> 00:25:58,199 Speaker 3: making... well, he says money will become pointless because everybody 461 00:25:58,240 --> 00:26:02,040 Speaker 3: will have plenty of food, plenty of healthcare, plenty of energy. 462 00:26:03,560 --> 00:26:04,720 Speaker 1: I don't know, housing. 463 00:26:04,840 --> 00:26:10,840 Speaker 3: I mean, because I assume... where's the cutoff for this 464 00:26:10,920 --> 00:26:13,480 Speaker 3: high income? I assume some people currently are 465 00:26:13,400 --> 00:26:17,760 Speaker 1: above it and would have to take a step backwards. 466 00:26:18,320 --> 00:26:20,440 Speaker 1: I think Elon would tell you, oh, no, no, no, no, no, 467 00:26:20,480 --> 00:26:22,040 Speaker 1: everybody's going to be raised up. 468 00:26:22,000 --> 00:26:26,560 Speaker 3: Oh, precisely. Yeah, we're all going to be one percenters, yeah, 469 00:26:27,240 --> 00:26:31,879 Speaker 3: or even point one percenters.
Yeah, allegedly. As we 470 00:26:31,920 --> 00:26:38,640 Speaker 3: discussed earlier, though, by what mechanism, with what people enforcing it, 471 00:26:39,119 --> 00:26:41,240 Speaker 3: and how long will that take to set up and 472 00:26:41,400 --> 00:26:45,239 Speaker 3: actually successfully distribute that money so people aren't in the 473 00:26:45,240 --> 00:26:49,760 Speaker 3: streets fighting for their lives with no employment, no prospects 474 00:26:49,800 --> 00:26:53,080 Speaker 3: of employment, and the conventional economy broken down? 475 00:26:53,200 --> 00:26:55,520 Speaker 1: While we wait for our robot overlords to save us. 476 00:26:55,560 --> 00:27:01,000 Speaker 1: I mean, even in the most wildly optimistic prediction, what does 477 00:27:01,040 --> 00:27:02,359 Speaker 1: that interim period look like? 478 00:27:02,600 --> 00:27:04,840 Speaker 3: I have not heard a single person, and I take 479 00:27:04,880 --> 00:27:07,880 Speaker 3: in a lot of AI information, podcasts, read a lot, 480 00:27:09,240 --> 00:27:11,880 Speaker 3: I've not heard a single person talk about that part, 481 00:27:12,320 --> 00:27:16,760 Speaker 3: the mechanism for who's collecting this money. So your 482 00:27:16,880 --> 00:27:19,840 Speaker 3: AI just bought a farm and now is running it 483 00:27:19,880 --> 00:27:22,760 Speaker 3: more efficiently than anything has ever been done before and 484 00:27:22,800 --> 00:27:25,320 Speaker 3: making more money than anybody's ever made before. Okay, that 485 00:27:25,640 --> 00:27:29,040 Speaker 3: money belongs to your company, and now you're gonna give 486 00:27:29,080 --> 00:27:31,280 Speaker 3: it out to everybody. Why? I guess you're a nice guy? 487 00:27:31,640 --> 00:27:34,960 Speaker 3: I mean, I don't understand what's the forcing mechanism there, 488 00:27:35,000 --> 00:27:36,120 Speaker 3: and how much do you give out? 489 00:27:36,440 --> 00:27:38,400 Speaker 1: Yeah, we're supposed to trust them.
Oh, it'll be great 490 00:27:38,400 --> 00:27:41,160 Speaker 1: for everybody. It sounds like the sort of thing somebody 491 00:27:41,200 --> 00:27:45,520 Speaker 1: would say before they absolutely lay waste to the economy 492 00:27:45,560 --> 00:27:49,560 Speaker 1: and the populace for their own gain. I'm not sure 493 00:27:49,600 --> 00:27:53,960 Speaker 1: that's their actual purpose, or even that they know that 494 00:27:54,080 --> 00:27:57,880 Speaker 1: they're doing that. But oh, trust me, it'll be fine. 495 00:27:57,880 --> 00:28:01,320 Speaker 1: Everybody will have plenty. Yet explain to me, I 496 00:28:01,359 --> 00:28:04,080 Speaker 1: tell you what, just the two minute version: what will 497 00:28:04,119 --> 00:28:07,040 Speaker 1: those mechanisms look like? Nobody's talking about that. And again, 498 00:28:07,119 --> 00:28:09,639 Speaker 1: earlier I said, and I broached this with somebody who's, 499 00:28:09,960 --> 00:28:12,240 Speaker 1: you know, spent a lifetime looking at political systems 500 00:28:12,280 --> 00:28:16,240 Speaker 1: and how they actually work on the ground: what is 501 00:28:16,280 --> 00:28:21,920 Speaker 1: that, taxation? Is it voluntary? What are the... I mean, okay, 502 00:28:21,960 --> 00:28:24,520 Speaker 1: you're gonna have robots and computers distribute it all and 503 00:28:24,520 --> 00:28:28,360 Speaker 1: put it into everybody's bank accounts nearly instantaneously? 504 00:28:29,240 --> 00:28:32,440 Speaker 3: Well, I feel like it's always presented, and I could 505 00:28:32,480 --> 00:28:35,080 Speaker 3: be wrong here, they never get into the details, which 506 00:28:35,200 --> 00:28:38,600 Speaker 3: seems like an oversight to me.
But it always seems 507 00:28:38,600 --> 00:28:40,200 Speaker 3: like it's presented to me that it's just, like, out 508 00:28:40,200 --> 00:28:42,320 Speaker 3: of the goodness of their heart, all these big AI 509 00:28:42,400 --> 00:28:46,280 Speaker 3: companies will be so wealthy, they'll just... everybody will have plenty. Well, why? 510 00:28:46,320 --> 00:28:51,040 Speaker 3: That's not the history of mankind. Well, no, certainly the 511 00:28:51,120 --> 00:28:53,920 Speaker 3: Arab princes have more than enough money to give out to 512 00:28:54,200 --> 00:28:56,320 Speaker 3: everybody, as much as they want to, but they don't. 513 00:28:56,520 --> 00:28:59,400 Speaker 3: They give out a fair amount, but they keep most 514 00:28:59,440 --> 00:29:00,600 Speaker 3: of it for themselves. 515 00:29:00,800 --> 00:29:03,360 Speaker 1: They keep enough to keep everybody pacified, or, I'm sorry, 516 00:29:03,400 --> 00:29:07,320 Speaker 1: they give out enough to keep everybody pacified. Well, it just 517 00:29:07,360 --> 00:29:08,760 Speaker 1: occurred to me, I think one of the answers they 518 00:29:08,840 --> 00:29:11,440 Speaker 1: might give to my previous question is, Joe, you don't 519 00:29:11,480 --> 00:29:15,560 Speaker 1: understand money. We're not gonna give money to people. The 520 00:29:15,640 --> 00:29:18,560 Speaker 1: stores will be open. Just go get as much food 521 00:29:18,600 --> 00:29:21,280 Speaker 1: as you want, that's fine, and then go to the... 522 00:29:21,800 --> 00:29:23,920 Speaker 1: you know, go to the golf store and get some 523 00:29:23,960 --> 00:29:26,400 Speaker 1: new sticks if you need them or want them. The 524 00:29:26,480 --> 00:29:31,080 Speaker 1: disco tech, and dance the... pardon me, go to the 525 00:29:31,080 --> 00:29:35,000 Speaker 1: disco-teck, the dis-co-theque, and dance, because there will 526 00:29:35,040 --> 00:29:37,320 Speaker 1: be no cover charge.
Go to Disneyland and ride on 527 00:29:37,400 --> 00:29:40,320 Speaker 1: the rides. Pretty full and crowded over here? Hey, we'll have 528 00:29:40,360 --> 00:29:43,160 Speaker 1: the robots build another one, and then if that becomes 529 00:29:43,280 --> 00:29:46,000 Speaker 1: less popular, the robots will tear it down and build 530 00:29:46,000 --> 00:29:49,560 Speaker 1: you whatever you want. It's like Charlie and the 531 00:29:49,640 --> 00:29:57,600 Speaker 1: Chocolate Factory, Willy Wonka if you prefer. I... oh, 532 00:29:57,720 --> 00:30:01,640 Speaker 1: for what it's worth, now, that's the singularity, or 533 00:30:01,680 --> 00:30:04,840 Speaker 1: whatever you want to call it, the current state of things. 534 00:30:05,120 --> 00:30:08,520 Speaker 1: Wall Street Journal with a really interesting piece that says 535 00:30:08,760 --> 00:30:13,960 Speaker 1: CEOs are saying wonderful things about how much time and 536 00:30:14,000 --> 00:30:18,480 Speaker 1: money AI is saving their companies these days, but 537 00:30:18,560 --> 00:30:21,160 Speaker 1: the people who are actually doing it and doing the 538 00:30:21,200 --> 00:30:25,959 Speaker 1: work and stuff are much, much less optimistic about it, 539 00:30:26,120 --> 00:30:30,160 Speaker 1: or they have much lower numbers. That could be just 540 00:30:30,160 --> 00:30:31,880 Speaker 1: to preserve their jobs, and we don't really have time 541 00:30:31,920 --> 00:30:33,560 Speaker 1: to talk about it. Maybe we'll get into that tomorrow. 542 00:30:33,920 --> 00:30:34,480 Speaker 1: I feel like 543 00:30:34,440 --> 00:30:36,440 Speaker 3: that's the most likely thing that happens, at least in 544 00:30:36,440 --> 00:30:39,560 Speaker 3: my lifetime, is companies are able to do a lot 545 00:30:39,600 --> 00:30:42,200 Speaker 3: more with AI. They lay off a whole bunch of people.
546 00:30:42,240 --> 00:30:44,640 Speaker 3: They're just keeping their profits like companies always have, which 547 00:30:44,640 --> 00:30:47,640 Speaker 3: is fine. They get to. They're not doling out money, 548 00:30:47,640 --> 00:30:51,320 Speaker 3: there's no basic income or anything. There's just millions 549 00:30:51,320 --> 00:30:53,720 Speaker 3: and millions and millions of people that can't get a job. 550 00:30:54,360 --> 00:30:57,400 Speaker 1: Yeah, the current concentration of wealth in a few hands 551 00:30:57,440 --> 00:31:02,320 Speaker 1: will look like, you know, communism in five years. Exactly. 552 00:31:02,760 --> 00:31:09,520 Speaker 1: We will finish strong next. So what's that kid's name 553 00:31:09,520 --> 00:31:10,680 Speaker 1: that America fell in love with? 554 00:31:10,720 --> 00:31:15,560 Speaker 3: The quarterback for Indiana, and his fun personality and winning ways, Fernando. 555 00:31:16,600 --> 00:31:21,800 Speaker 1: What's his last name? Mendoza. Men-Mendoza. You know what 556 00:31:21,880 --> 00:31:22,720 Speaker 1: his big prize is? 557 00:31:23,840 --> 00:31:26,320 Speaker 3: Hey, your prize is you get to be quarterback for 558 00:31:26,360 --> 00:31:28,560 Speaker 3: the Oakland Raiders, or the Las Ve... looks. 559 00:31:29,320 --> 00:31:32,360 Speaker 1: Wait a minute, your prize? That's not a prize, that's 560 00:31:32,360 --> 00:31:34,719 Speaker 1: a punishment. I know it, I know it. 561 00:31:34,800 --> 00:31:38,720 Speaker 3: Congratulations! After all that work and luck and glory and 562 00:31:38,880 --> 00:31:41,120 Speaker 3: all the good stuff that has happened, and your cool attitude, 563 00:31:41,280 --> 00:31:41,840 Speaker 3: you get to be the 564 00:31:41,880 --> 00:31:45,959 Speaker 1: quarterback for the Raiders.
My ridiculous accent reminds me that 565 00:31:46,120 --> 00:31:49,760 Speaker 1: during the game, I saw that Dos Equis has brought back 566 00:31:49,920 --> 00:31:53,360 Speaker 1: the most interesting man in the world. He's aged a little. Yeah, 567 00:31:53,480 --> 00:31:57,280 Speaker 1: haven't we all? Yes, yes. Still a very, very interesting man, though. 568 00:31:58,000 --> 00:32:02,360 Speaker 1: Glad to see that, right? Absolutely. So 569 00:32:02,560 --> 00:32:05,760 Speaker 3: headline of the day is Trump backed off taking Greenland 570 00:32:05,760 --> 00:32:09,280 Speaker 3: by force. Hard to believe that anybody was ever taking 571 00:32:09,280 --> 00:32:10,840 Speaker 3: that seriously, but apparently lots did. 572 00:32:10,880 --> 00:32:14,680 Speaker 1: Hard to believe he ever proposed it. True. Yeah, you know, 573 00:32:14,880 --> 00:32:16,760 Speaker 1: you can't blame people for taking him seriously, I guess, 574 00:32:16,760 --> 00:32:18,840 Speaker 1: because he said it like fifty thousand times. So 575 00:32:19,320 --> 00:32:21,760 Speaker 1: there you go, there's your headline of the day. Yeah, 576 00:32:21,800 --> 00:32:26,800 Speaker 1: you know, and I've been harshly, strongly, unapologetically critical about 577 00:32:27,480 --> 00:32:31,760 Speaker 1: certain aspects of Trump's act and his foreign policy, and 578 00:32:31,920 --> 00:32:35,160 Speaker 1: I don't withdraw that at all. On the other hand, 579 00:32:35,360 --> 00:32:40,280 Speaker 1: I'm reminded every day of how fantastic the administration's been 580 00:32:40,480 --> 00:32:45,640 Speaker 1: for business compared to the Biden administration. With the exception 581 00:32:45,720 --> 00:32:47,720 Speaker 1: of the tariffs, which I think are just a bad idea, 582 00:32:48,440 --> 00:32:50,160 Speaker 1: a lot of great economic stuff, a lot of good 583 00:32:50,200 --> 00:32:54,280 Speaker 1: deregulation stuff, just absolutely wonderful.
It's just a shame that, 584 00:32:54,400 --> 00:32:57,120 Speaker 1: you know, you've got to order the entire cable package 585 00:32:57,440 --> 00:33:00,760 Speaker 1: and can't just, like, take the channels you like. People 586 00:33:00,760 --> 00:33:05,280 Speaker 1: have said that about me, and lots of human beings in general, 587 00:33:05,320 --> 00:33:06,960 Speaker 1: I think. Uh, so, 588 00:33:07,040 --> 00:33:09,880 Speaker 3: Trump was asked today, though, because the Supreme Court is 589 00:33:09,920 --> 00:33:11,400 Speaker 3: going to rule one of these days on whether or 590 00:33:11,400 --> 00:33:13,360 Speaker 3: not he's allowed to do this tariff stuff, and he 591 00:33:13,440 --> 00:33:15,760 Speaker 3: was asked today, what if the Supreme Court, 592 00:33:16,040 --> 00:33:17,320 Speaker 3: you know, says you don't have that power? 593 00:33:17,320 --> 00:33:18,600 Speaker 1: And he said, well, we'll have to figure out a 594 00:33:18,600 --> 00:33:20,040 Speaker 1: different way, which 595 00:33:19,960 --> 00:33:26,600 Speaker 3: was unusually accepting of the idea that maybe they could 596 00:33:26,680 --> 00:33:27,240 Speaker 3: decide that. 597 00:33:29,280 --> 00:33:30,840 Speaker 1: Yeah. One of his aides, I can't... one of his 598 00:33:30,880 --> 00:33:32,760 Speaker 1: top aides was out saying roughly the same thing: we'll 599 00:33:32,800 --> 00:33:34,880 Speaker 1: find a different authority under which to do it. But 600 00:33:35,120 --> 00:33:37,760 Speaker 1: you won't be able to... it'll really, really weaken it, 601 00:33:38,000 --> 00:33:54,760 Speaker 1: I wouldn't think so. Armstrong. Armstrong. Ready. Yeah, here's 602 00:33:54,760 --> 00:33:57,120 Speaker 1: your host for final thoughts, Joe Getty. Hey, let's get 603 00:33:57,120 --> 00:33:58,960 Speaker 1: a final thought from everybody on the crew to wrap 604 00:33:58,960 --> 00:34:00,520 Speaker 1: things up for the day. Wouldn't that be charming?
How 605 00:34:00,560 --> 00:34:02,800 Speaker 1: about Michael Angelo leading us off from the control room? 606 00:34:02,880 --> 00:34:05,200 Speaker 4: Michael? I'm just checking out this email I got from 607 00:34:05,280 --> 00:34:09,960 Speaker 4: a coworker. They are emptying his office for reconstruction, so he 608 00:34:10,000 --> 00:34:12,399 Speaker 4: has all this free alcohol and beer and says, come 609 00:34:12,440 --> 00:34:13,279 Speaker 4: grab all you want. 610 00:34:14,400 --> 00:34:17,040 Speaker 1: Wow. Hey, help a brother out, would you? Yeah, I'm 611 00:34:17,080 --> 00:34:18,719 Speaker 1: not a drinker, but I'm sure a lot of people 612 00:34:18,719 --> 00:34:22,920 Speaker 1: would like that. Katie Green, our esteemed newswoman. As a 613 00:34:22,960 --> 00:34:26,000 Speaker 1: final thought, Katie... how much booze does the guy have 614 00:34:26,040 --> 00:34:29,879 Speaker 1: in his office, and why? Yeah, let's get past the lede. Sorry, Katie, 615 00:34:29,920 --> 00:34:30,320 Speaker 1: go ahead. 616 00:34:30,600 --> 00:34:32,520 Speaker 4: If I'm ever in a restaurant that serves, like, a 617 00:34:32,560 --> 00:34:34,920 Speaker 4: whole fish, I'm forever going to picture them in a 618 00:34:34,960 --> 00:34:39,040 Speaker 4: sleep mask, close its 619 00:34:38,880 --> 00:34:41,799 Speaker 1: eyes out of decency, would you? Jack? 620 00:34:41,840 --> 00:34:45,719 Speaker 3: A final thought for us: it's both very exciting and 621 00:34:47,280 --> 00:34:50,880 Speaker 3: terrifying that we all could live through the most interesting 622 00:34:51,000 --> 00:34:54,160 Speaker 3: period in the history of humans in the next couple 623 00:34:54,200 --> 00:34:54,640 Speaker 3: of years. 624 00:34:55,360 --> 00:35:00,360 Speaker 1: Ever, by far. Man, we'll get to experience... we've been on 625 00:35:00,400 --> 00:35:03,279 Speaker 1: a roll too. How about the status quo? Can I 626 00:35:03,320 --> 00:35:06,160 Speaker 1: opt for the status quo?
Is that, you know, 627 00:35:06,320 --> 00:35:10,480 Speaker 1: like, choose your room on a hotel app or something? Anyway, 628 00:35:10,560 --> 00:35:14,320 Speaker 1: my final thought is also AI related. The current situation 629 00:35:14,680 --> 00:35:18,200 Speaker 1: is enunciated by a guy in Raleigh, North Carolina: 630 00:35:19,000 --> 00:35:21,680 Speaker 1: executives automatically assume AI is going to be the savior. 631 00:35:21,760 --> 00:35:23,399 Speaker 1: I can't count the number of times that I've sought 632 00:35:23,400 --> 00:35:26,040 Speaker 1: a solution for a problem, asked one of the LLMs, 633 00:35:26,160 --> 00:35:28,360 Speaker 1: large language models, and it gave me a solution to 634 00:35:28,440 --> 00:35:30,600 Speaker 1: accessibility problems that was completely wrong. 635 00:35:31,400 --> 00:35:34,560 Speaker 3: Yeah, keeping in mind that the smart people tell 636 00:35:34,600 --> 00:35:36,680 Speaker 3: you these chatbots that we're all using are like what a 637 00:35:36,800 --> 00:35:38,440 Speaker 3: web page is to the Internet. 638 00:35:38,520 --> 00:35:42,319 Speaker 1: That's not what AI is. That's just one little piece 639 00:35:42,360 --> 00:35:45,759 Speaker 1: of it. Armstrong and Getty wrapping up another grueling four 640 00:35:45,800 --> 00:35:48,200 Speaker 1: hour workday. So many people to thank, so little time. 641 00:35:48,360 --> 00:35:51,600 Speaker 1: People who'd like to buy some nice real estate in Greenland, 642 00:35:51,680 --> 00:35:54,080 Speaker 1: go to Armstrong and Getty dot com for the hot links. 643 00:35:54,080 --> 00:35:56,160 Speaker 1: Drop us a note: mailbag at Armstrong and Getty dot com. 644 00:35:56,200 --> 00:35:59,640 Speaker 3: We will see you tomorrow. God bless America. 645 00:36:00,360 --> 00:36:05,640 Speaker 1: Armstrong and Getty. Thanks for listening to the Armstrong 646 00:36:05,680 --> 00:36:08,440 Speaker 1: and Getty Show. We're done for the day, and
647 00:36:08,560 --> 00:36:12,120 Speaker 2: you won't have to pay for the podcast, guys, because 648 00:36:12,120 --> 00:36:12,600 Speaker 2: it's free. 649 00:36:13,440 --> 00:36:19,240 Speaker 1: Subscribe right now, don't miss a thing. It's called Armstrong 650 00:36:20,400 --> 00:36:21,920 Speaker 1: and Getty On Demand. 651 00:36:23,880 --> 00:36:28,399 Speaker 2: You made it right, ladders, riding a long time. Armstrong 652 00:36:28,480 --> 00:36:29,040 Speaker 2: and Getty.