Speaker 1: Broadcasting live from the Abraham Lincoln Radio Studio, the George Washington Broadcast Center — the Armstrong and Getty Show. Armstrong and Getty.
Speaker 2: And he's Armstrong and Getty.
Speaker 1: I just got off the phone with nine one one thirty seconds ago.
Speaker 2: You'll have to tell that story.
Speaker 1: Great Scott! Live from Studio C — sí, señor — a room deeper in the bowels of the Armstrong and Getty Communications Compound. And today we're toiling under the title of the show: SOTU's Your Mama? Or, let's face it, the State of the Union is kind of assed up. The president gives a speech tonight that a few million people watch, all the media talks about, and it means nothing. A cherished tradition in which the president is forced to speak to his real bosses, the people of the United States. An invaluable moment — invaluable because we never get to hear from a president. If it weren't for the State of the Union address, we'd have no idea what they think about anything. I was going to launch this argument later, but it did occur to me.
Speaker 1: I read a description — and I kind of paraphrased it — of why the SOTU is a really good idea. Okay: the president is forced at least once a year to stand up in front of the people and say, hey, here's what's up — and then give a stupid laundry list and jabber forever.
Speaker 2: But in what way does he not do that all the time? Biden didn't.
Speaker 1: That forced him out of his hidey-hole, and he came and screeched at the American people like a maniac for an hour. So we're gonna have this stupid exercise in case we ever have another president who's so shy he hides in his house.
Speaker 2: Well, there are other...
Speaker 1: Variations on the thing. But yeah, yeah, make him come out, like a...
Speaker 2: Punxsutawney Phil. It just doesn't mean anything. Anyway.
Speaker 1: We can talk more about that later. The reason I called nine one one: so I turned the corner right where I exit — and it's kind of the way it's set up with the off lane, people are driving pretty fast — and there's an old man in a wheelchair sitting right in the middle of the lane.
Speaker 2: I turned my head — right in his eyes, his big eyes — and he's just sitting there in his wheelchair.
Speaker 1: Oh my God. And I swerved around him. I called nine one one. I hope he's okay. I suppose — he appeared to be a man of the street.
Speaker 2: It's not...
Speaker 1: I mean, I'd bet all my money he's a man of the street. So it's not like somebody got loose from their home with Alzheimer's or something, or the medical transport van got confused and dropped him in the middle.
Speaker 2: It's a highway here, right?
Speaker 1: I don't think this is a guy who's a stranger to being out in the middle of the street in the dark.
Speaker 2: Yeah, the radio ranch is kind of in Bumville.
Speaker 1: It is. Because, like, there was an old Alzheimer's dude — he passed away eventually — but he lived in my neighborhood, and he'd get loose every once in a while, and it was quite a crisis.
Speaker 2: Terrible.
Speaker 1: Uh, yeah, it's terrible. But this is a guy who's used to living out...
Speaker 2: In the dark.
Speaker 2: But still, he's going to get run over.
Speaker 1: So I called nine one one and said, hey, right next to the Cane's chicken, there's like a seventy-year-old dude sitting in his wheelchair, right where he's going to get run over.
Speaker 2: Wow. I would assume the police...
Speaker 1: Are already there. There are enough cop cars around here because, as you mentioned, we don't live in, but we work in, Bumville. There are cop cars around. Can you imagine if you were some poor son of a gun — you know, reached over to turn up the radio to hear our fascinating ravings, for instance — and didn't see the fella in time, and had that death on his conscience for the rest of his life? It'd be horrible. Or her conscience. So we've established that running over an old man in a wheelchair would be bad.
Speaker 2: That's why I am here. Wisdom. Yeah, exactly, it would. It would really, really mess up your day.
Speaker 1: Yeah. "How's your day, honey?" "Pretty good overall. I ran over an old man in a wheelchair and killed him."
Speaker 2: "But then work was pretty good."
Speaker 1: "Actually, I landed the Jones account." So, exactly.
Speaker 1: You know, easy come, easy go. "Got to the gym. It's been a pretty good day all over."
Speaker 2: Sometimes.
Speaker 1: Yeah. I'm hoping that the antics at the State of the Union address are so over the top tonight that we as a nation decide we should move on. Uh, that so many Democrats don't show up — or, well, a bunch of them are not going to show up. My senator, living in California — our senator, thinnest neck you've ever seen — Adam Schiff. Adam Schiff said...
Speaker 2: He is not. Pencil-neck Schiff.
Speaker 1: He is not going to attend the State of the Union. Always very brave, very — I know, I love these brave stances of "I won't attend," like you get some sort of credit for that. They're going to the alternative — the People's State of the Union, held by some progressive group. Right, there's a couple of those, and they couldn't get their act together on one. I think, if I'm a Democrat, I would say we'd make a better point and get more attention if we could all get together on one alternate State of the Union address instead of, like, four — which there are.
Speaker 1: And then, of the people who show up, a whole bunch of them are going to walk out halfway through. They've got a plan to walk out.
Speaker 2: Oh, very dramatic. And again, the courage is amazing.
Speaker 1: So at some point he's not addressing Congress — whoever's president, he's just addressing his own party, who cheer everything he says. So then, really, what are we doing at that point? All right, here's your scenario: Trump goes after the Supreme Court — specifically John Roberts — as they sit more or less face to face right there, the Supremes up front, the president up there on the rostrum. Finger-pointing, red-faced shouting match between Trump and John Roberts — F you! F you! — and it ends with Roberts saying "life term, bitch," and then he sits down. I don't see that out of John Roberts. Okay, all right, I was just thinking out loud. Could happen. He, too, agrees you should not run over old people in wheelchairs, though. Oh yeah, yeah — that was Skrmetti versus Wheelchair, I believe, nineteen ninety-six.
Speaker 1: All right, so Trump stands up at the State of the Union address, which a few million people will watch, but everybody will hear clips of if they follow the news.
Speaker 2: Oh.
Speaker 1: I heard an interesting stat the other day — I was going to try to figure out where this came from. A learned man I was listening to said that...
Speaker 2: Roughly?
Speaker 1: I think I got the numbers right: about twenty-eight percent of people follow politics at all, and it was half that — fourteen percent — closely. Because we wonder about that all the time.
Speaker 2: Hm, that is interesting.
Speaker 1: Now, that would be enough to keep talk radio alive, because, you know, it's a big country, and if fourteen percent of people are following closely enough to tune into, you know, talk radio and people who talk about this stuff, that's a lot of people.
Speaker 1: Not a lot of the electorate, though, that's paying close attention. Which brings me to: so the president's giving his speech tonight with the lowest approval ratings he's had maybe ever, including right after January sixth, which is hard to believe. And the right track, wrong track — or, do you think things are better or worse? The CNN poll out today: sixty percent of Americans say they're worse off today than they were a year ago. Which — that's a tough one when you're president, because it doesn't always match up with reality. But people feel what they feel, and you get to feel what you feel — you can't tell people they're feeling the wrong thing. Different presidents have tried that, but it doesn't work. And on the topic of politics generally, I can't think of a time when both parties have been more screwed up and ineffective and useless. So there's a real feeling of kind of nausea about politics, I think, among Americans.
Speaker 2: I've got it. Me too.
Speaker 1: Do you think that number sounds like it makes sense?
Speaker 1: Sixty percent of people say they're worse off today than they were a year ago. And how's that even possible — like, realistically worse off? I know not everybody's in the stock market, but if you're in the stock market at all — your 401(k) or whatever — that's better by far than it was a year ago.
Speaker 2: Gas is cheaper than it was a year ago.
Speaker 1: Yeah. Rents, housing — that sort of stuff was already expensive; I don't know if it's significantly more expensive than a year ago. I was going to talk about this a little bit later on — I've got all sorts of statistical kind of artifacts, or pieces of evidence, about the state of the American economy — and I'm going to steal my own thunder.
Speaker 2: Overall, inflation has actually been tamed.
Speaker 1: It's at a decent level; it needs to come down just a little more to be in the sweet spot.
Speaker 2: But it's in pretty good shape.
Speaker 1: But the one thing you have is, the inflation of the last couple of years still makes life seem unaffordable to a lot of folks.
Speaker 1: And the other thing I often say: if inflation is high, nothing else matters. And there's still a perception that inflation is high — I mean, for instance, beef is up seventeen percent from last year. Gas is down; there are a couple of things that are down. And the other factor is uncertainty. Where there is uncertainty, practically nothing else matters, and there's never been so much uncertainty as the AI headlines kick around. The Dow dropped eight hundred and some points yesterday, based mostly on a viral report — another viral report about AI that came out. So that's why people, I think, have that negative feeling. You are right, though — you always have been — that they should mention what percent the...
Speaker 2: Indexes went down, as opposed to points.
Speaker 1: Yeah, the Dow is now at fifty thousand on a regular basis, so eight hundred is not the same as back when we were talking about it being at twenty thousand or whatever it was. Very true. Fifty thousand. We should start the show officially.
Speaker 1: But if people — you know, if people — I know I still get shocked at the grocery store and at restaurants at the prices. I still haven't adjusted to where I'm not shocked every time I go to the grocery store. It happened just yesterday: I get like four items, it'll be fifty-eight dollars. Fifty-eight dollars! I got like bananas and milk and one other thing, and it was fifty-eight dollars.
Speaker 1: Well, and Judy and I went out for Chinese the other night — our favorite Chinese place — and got Mongolian beef, which we referred to as Mongolian onions throughout the meal, because you had to hunt for any beef.
Speaker 2: There wasn't much of it.
Speaker 1: But now that I see the beef — beef, I mean, beef prices skyrocketed last year. They were insanely high, and they're up almost twenty percent.
Speaker 2: More Mongolian onions. Let's start the show. Here's the joke aging on me. Oh right, you went Chinese?
Speaker 1: Oh, think about that. Luckily no one touched my penis. Wow. I'm Jack Armstrong, he's Joe Getty. It is Tuesday, February twenty-fourth, the year twenty twenty-six.
Speaker 1: Armstrong and Getty — we approve of this program. Let's begin, then, officially, according to FCC rules and regs. Here we go. Mark?
Speaker 3: Washington's first State of the Union was eight hundred and thirty-three words. It was hardly worth throwing his wig on for. It lasted all of ten minutes. Our third president, Thomas Jefferson, stopped doing it in person altogether, as did all the presidents for the next one hundred and twelve years. They just wrote a brief report, because even they understood nobody likes a meeting that could have been done by email.
Speaker 2: Yes, that is exactly right.
Speaker 1: It's a campaign freaking speech with a list of things that will never happen. You'll never convince me that the State of the Union address is not a complete waste of time in its current configuration. Now, if they came out and said, "the population is three hundred and forty-one million, unemployment is three point two percent," you know, and just ran through the stats and walked off, I'd say, okay, that's fine.
Speaker 1: The Constitution asked for it, you did it. But in its current form — and it's been like this my entire adult life — oh my God, I just dread it.
Speaker 2: I haven't watched the last half dozen.
Speaker 1: Mark unleashed another humdinger of an argument against having the SOTU in the current form, which we'll get to a little bit later on. He's about got me convinced. Although I love democracy — I'm like you, and I think the president should answer to the people at least once a year. Fine, then send the letter, like the presidents did for a hundred and twenty years or whatever it was. That makes perfectly good sense — strongly worded letter to follow — as opposed to standing up and giving a campaign speech that half the people boo and half the people cheer. Whatever. Good Lord. Second episode of Jerry Springer. We've got Katie's headlines on the way and a bunch of other stuff.
Speaker 2: Stay here.
Speaker 1: Today is the four-year anniversary of the start of the war in Ukraine, launched by Vladimir Putin.
Speaker 2: No matter what some people say.
Speaker 1: And I can't believe that it continues on as the world just says, eh, what are you gonna do? BoJo blasted Europe the other day for not taking stronger action — maybe we can touch on that a little bit later on. Big talk from the Euros. Let's figure out who's reporting what. It's the lead story, with Katie Green. Katie?
Speaker 2: Well, that is the lead story, starting with the Ukraine war.
Speaker 4: At its four-year mark after the Russian invasion. ABC: quote, "Everything is covered with Russian bodies" — Ukraine's frontline troops reflect on four years of war. At NBC: Zelensky's public frustration grows as Putin's war enters its fifth year.
Speaker 1: And, agreeing with BoJo, the biggest part of the story to me is Europe's lack...
Speaker 2: Of interest in doing anything about it. Just stunning — other than hitting the US up.
Speaker 4: From CNN: Trump confronts his three main options on Iran, from diplomacy to trying to topple the regime.
Speaker 2: They're having a big meeting Thursday.
Speaker 1: Apparently Witkoff, Jared Kushner, and whoever the Iranian counterparts are — a last-ditch effort at diplomacy.
Speaker 4: From The Wall Street Journal: Novo Nordisk will slash US list prices for Wegovy and Ozempic by as much as half, starting next year.
Speaker 1: They had already gotten quite a bit cheaper, and starting January of next year they might be half what they are now. They're going to get to the point where you absolutely, one hundred percent can afford it.
Speaker 2: It's just whether you want to do it or not.
Speaker 1: I asked my extremely reasonable doctor about the drugs, just out of curiosity, the other day. He said, oh, I've been prescribing them for years, they do great, patients tolerate them really well, I'm a big fan. So we'll see a lot more of it, and then — yeah, especially since they're going to get so cheap. Yeah, I wonder if we're going to look like, you know, the nineteen seventies all of a sudden: everybody's going to be thin.
Speaker 4: From Fox News: Savannah Guthrie offers a one-million-dollar family reward for the recovery of Nancy Guthrie, saying, quote, "we still believe in a miracle," but acknowledging that, quote, "she may be lost."
Speaker 2: Man. The family throwing out a million dollars.
Speaker 1: I was going to talk a little bit about the coverage the last several days...
Speaker 2: Of this, which is just awful, in my opinion.
Speaker 4: But this one from the New York Post — and we've touched on this a couple of times — Chasing the dream: your wearable data may be making your sleep worse. This article is all about when tracking your sleep turns into sleep-score chasing.
Speaker 2: Yeah.
Speaker 1: I've heard about this phenomenon before. People get so freaked out about their quality of sleep, based on their device, they get worse sleep.
Speaker 4: Well, and it's like I've said before: I know I slept like crap. I don't need my phone to chime in.
Speaker 1: Well, and if you're unable to sleep because you're so worried about your quality of sleep — I mean, something's gone wrong in your life.
Speaker 2: Yeah.
Speaker 4: From Study Finds: American kids spend at least four hours a day on their screens, and parents say it's destroying the family bond.
Speaker 1: Yeah. Correct. Yes, yes, go on. That's about it.
Speaker 4: And finally, from the Babylon Bee: the US hockey team is melting down gold medals to replace missing teeth.
Speaker 1: Fox was playing some highlights of reporters — so the men's hockey team landed in Miami with a really cool water-fountain-salute thing. I don't know if you saw that water cannon salute that they use for heroes landing at the airport, but they did it for hockey. No? You didn't see that. And then they're headed to the State of the Union address — I don't know if it's all of them, or part of them, or what — but anywho, Fox was showing highlights of reporters trying to get some political answers out of them.
Speaker 2: "Did you vote for Trump? Why are you going to the White House?"
Speaker 1: You know, that sort of thing — just trying to get one of the hockey players so that they could turn the country against them. Luckily, they've been coached up enough to say, "we're not into politics, we're just excited to, you know, support our country."
Speaker 2: That sort of thing.
346 00:17:18,480 --> 00:17:20,640 Speaker 1: I just said, Hey, equipment manager, hand me my stick, 347 00:17:20,800 --> 00:17:22,480 Speaker 1: no kidding, and take your teeth out. 348 00:17:22,680 --> 00:17:24,000 Speaker 2: God, I hate those people. 349 00:17:25,480 --> 00:17:27,080 Speaker 5: That's Armstrong and Getty. 350 00:17:28,840 --> 00:17:31,920 Speaker 1: Executive producer Hanson and I were just watching the latest 351 00:17:32,280 --> 00:17:35,000 Speaker 1: Brad Pitt Tom Cruise AI 352 00:17:34,880 --> 00:17:36,120 Speaker 2: movie that got released. 353 00:17:36,119 --> 00:17:38,720 Speaker 1: It's like two or three minutes long, and I just 354 00:17:38,760 --> 00:17:43,360 Speaker 1: watched it, and it's quite, quite stunning and really entertaining. 355 00:17:43,400 --> 00:17:48,440 Speaker 1: Is Epstein involved? Oh, kind of. But I mean, 356 00:17:48,440 --> 00:17:51,560 Speaker 1: the opening minute was as compelling a "wow, 357 00:17:51,560 --> 00:17:54,800 Speaker 1: what's about to happen?" as anything I've seen recently. And 358 00:17:54,880 --> 00:17:59,359 Speaker 1: it was just utterly seamless, right, and it looked like 359 00:17:59,400 --> 00:18:02,880 Speaker 1: Tom Cruise and Brad Pitt riding in a classic Mustang 360 00:18:02,920 --> 00:18:09,120 Speaker 1: getting ready to get involved in some hijinks. Anyway, that's 361 00:18:09,240 --> 00:18:12,359 Speaker 1: interesting enough. You said earlier... So I was talking about 362 00:18:12,400 --> 00:18:17,359 Speaker 1: the unease people have about the economy that doesn't exactly 363 00:18:17,400 --> 00:18:19,119 Speaker 1: match up with the numbers in the economy. But you 364 00:18:19,119 --> 00:18:22,159 Speaker 1: thought some of it might be people freaking out about all 365 00:18:22,200 --> 00:18:24,800 Speaker 1: these headlines about AI and everything like that.
It's 366 00:18:24,800 --> 00:18:28,880 Speaker 1: got people freaked out and feeling uncomfortable no matter what happens. Yeah, 367 00:18:28,920 --> 00:18:32,600 Speaker 1: I mean, pick a sector of the economy that's feeling really 368 00:18:32,600 --> 00:18:36,760 Speaker 1: good about itself: energy and construction, maybe because of the 369 00:18:36,800 --> 00:18:40,080 Speaker 1: giant AI data centers. But white collar workers? Oh no, 370 00:18:40,119 --> 00:18:44,520 Speaker 1: they're miserable with uncertainty. Manufacturing has not really rebounded; it's 371 00:18:44,560 --> 00:18:46,000 Speaker 1: continued to decline. 372 00:18:45,600 --> 00:18:46,960 Speaker 2: Well, to that front. 373 00:18:46,960 --> 00:18:49,400 Speaker 1: Here's a little clip of Sam Altman, he's the guy 374 00:18:49,440 --> 00:18:53,919 Speaker 1: behind OpenAI, which is ChatGPT, talking about training 375 00:18:54,000 --> 00:18:56,000 Speaker 1: up AI versus actual human beings. 376 00:18:56,359 --> 00:18:59,040 Speaker 5: One of the things that is always unfair in this 377 00:18:59,119 --> 00:19:01,840 Speaker 5: comparison is people will talk about how much energy it 378 00:19:01,880 --> 00:19:05,320 Speaker 5: takes to train an AI model relative to how much 379 00:19:05,359 --> 00:19:08,240 Speaker 5: it costs a human to do one inference query. But 380 00:19:08,320 --> 00:19:10,240 Speaker 5: it also takes a lot of energy to train a human. 381 00:19:10,280 --> 00:19:12,760 Speaker 5: It takes like twenty years of life and all of 382 00:19:12,760 --> 00:19:15,680 Speaker 5: the food you eat during that time before you get smart.
383 00:19:15,760 --> 00:19:18,159 Speaker 5: And not only that, it took like the very widespread 384 00:19:18,280 --> 00:19:20,320 Speaker 5: evolution of the one hundred billion people that have ever 385 00:19:20,400 --> 00:19:23,000 Speaker 5: lived and learned not to get eaten by predators and 386 00:19:23,080 --> 00:19:25,600 Speaker 5: learned how to figure out science and whatever to produce you, 387 00:19:25,920 --> 00:19:28,479 Speaker 5: and then you took whatever you know you took. So 388 00:19:28,640 --> 00:19:32,760 Speaker 5: the fair comparison is, if you ask ChatGPT a question, how 389 00:19:32,840 --> 00:19:34,920 Speaker 5: much energy does it take, once its model is trained, 390 00:19:34,920 --> 00:19:39,480 Speaker 5: to answer that question versus a human? And probably AI 391 00:19:39,520 --> 00:19:42,000 Speaker 5: has already caught up on an energy efficiency basis measured 392 00:19:42,000 --> 00:19:42,320 Speaker 5: that way. 393 00:19:42,480 --> 00:19:46,199 Speaker 1: So that is one of your AI giants making the 394 00:19:46,320 --> 00:19:49,399 Speaker 1: argument that it's actually more energy efficient, despite the giant 395 00:19:49,520 --> 00:19:52,200 Speaker 1: power plants they're building on Earth, and Elon's going to 396 00:19:52,240 --> 00:19:57,720 Speaker 1: build in space, over time versus humans. Well, then I 397 00:19:57,920 --> 00:20:00,160 Speaker 1: and the rest of my kind will step aside. I'm 398 00:20:00,320 --> 00:20:02,800 Speaker 1: so sorry to have used so much energy. 399 00:20:02,960 --> 00:20:04,960 Speaker 2: God dang it. I was watching a little bit of 400 00:20:05,240 --> 00:20:06,280 Speaker 2: Elon over the weekend.
401 00:20:06,280 --> 00:20:09,560 Speaker 1: He was on with Joe Rogan again, and Elon was, 402 00:20:10,400 --> 00:20:13,320 Speaker 1: he was kind of worked up about a variety of people 403 00:20:13,400 --> 00:20:17,280 Speaker 1: that are big time players in the AI world that 404 00:20:17,880 --> 00:20:22,960 Speaker 1: actually, out loud, say basically what you just said: it's 405 00:20:22,960 --> 00:20:25,240 Speaker 1: time for human beings to step aside. There's a new 406 00:20:26,119 --> 00:20:29,119 Speaker 1: intelligent life form on Earth and it's their time, and 407 00:20:29,160 --> 00:20:32,920 Speaker 1: they're more efficient and they're better than us. And quoting 408 00:20:33,000 --> 00:20:35,879 Speaker 1: one guy, who Elon said needs to go 409 00:20:35,920 --> 00:20:38,199 Speaker 1: to hell, I wish I remembered who it was, quoting 410 00:20:38,200 --> 00:20:43,639 Speaker 1: one of your AI thinkers saying the perfect amount of 411 00:20:43,720 --> 00:20:47,600 Speaker 1: humans for Earth to be in its best state would 412 00:20:47,640 --> 00:20:52,080 Speaker 1: be zero. Right, which is quite a thing. Which is 413 00:20:52,160 --> 00:20:57,439 Speaker 1: quite a thing, that you're prioritizing AI continuing on production 414 00:20:57,640 --> 00:21:00,040 Speaker 1: of stuff with no human beings around. What's even the 415 00:21:00,080 --> 00:21:03,520 Speaker 1: point when there are no humans around? How productive AI 416 00:21:03,680 --> 00:21:06,080 Speaker 1: is when there are no humans? I can't wrap my 417 00:21:06,119 --> 00:21:08,879 Speaker 1: head around that being a thing that they consider for 418 00:21:08,920 --> 00:21:11,920 Speaker 1: some reason. Well, right, and then what? Well, it's good 419 00:21:11,920 --> 00:21:14,000 Speaker 1: for the earth, and the earth is our ultimate goal 420 00:21:14,200 --> 00:21:17,399 Speaker 1: or something, something.
I will never accomplish this. But if 421 00:21:17,400 --> 00:21:20,679 Speaker 1: I could accomplish one thing, it might be to change 422 00:21:20,680 --> 00:21:27,080 Speaker 1: our society so that everybody's aware of how pathetically self 423 00:21:27,160 --> 00:21:32,080 Speaker 1: serving and obvious it is to play the I'm so enlightened, 424 00:21:32,280 --> 00:21:35,840 Speaker 1: I don't like my country card. The I'm so enlightened, I 425 00:21:35,880 --> 00:21:40,040 Speaker 1: don't even like my species card. Because that passes for 426 00:21:40,800 --> 00:21:45,719 Speaker 1: enlightened when it's really the opposite. It's masturbatory, it's idiotic, 427 00:21:45,760 --> 00:21:46,679 Speaker 1: it's obvious. 428 00:21:47,560 --> 00:21:50,080 Speaker 2: Stop it. Oh, it annoys me. 429 00:21:51,600 --> 00:21:55,520 Speaker 1: Well, I agree. But if there are people in charge 430 00:21:55,520 --> 00:21:59,440 Speaker 1: of AI who have that mindset, that's... that's a problem. 431 00:22:00,240 --> 00:22:03,359 Speaker 2: They're, I don't know, hunted down like dogs? I don't know. 432 00:22:03,760 --> 00:22:07,400 Speaker 2: So you also, for instance, don't be violent against anybody. 433 00:22:07,480 --> 00:22:09,639 Speaker 1: You also brought up earlier how the stock market dropped 434 00:22:09,640 --> 00:22:12,120 Speaker 1: a whole bunch of points yesterday on what the Wall Street 435 00:22:12,160 --> 00:22:13,960 Speaker 1: Journal says is a fiction. 436 00:22:14,600 --> 00:22:17,800 Speaker 2: It's just a... it's basically hypothetical. 437 00:22:17,880 --> 00:22:20,720 Speaker 1: Yeah, this could happen with AI, and it freaked out 438 00:22:20,720 --> 00:22:23,320 Speaker 1: so many people that all kinds of different sectors of 439 00:22:23,359 --> 00:22:24,360 Speaker 1: the market reacted. 440 00:22:24,760 --> 00:22:24,920 Speaker 2: Yeah.
441 00:22:25,000 --> 00:22:28,640 Speaker 1: Big research company that I guess tech people follow unleashed 442 00:22:28,640 --> 00:22:34,080 Speaker 1: a seven-thousand-word hypothetical. Satrini Research, they're called. It 443 00:22:34,160 --> 00:22:37,720 Speaker 1: was, quote, painting a dark portrait of a future in 444 00:22:37,760 --> 00:22:40,960 Speaker 1: which technological change inspires a race to the bottom in 445 00:22:41,040 --> 00:22:47,080 Speaker 1: white collar knowledge work. Essentially, the global intelligence crisis is 446 00:22:47,200 --> 00:22:52,520 Speaker 1: about to hit. And here's what they mean. I love 447 00:22:52,560 --> 00:22:55,919 Speaker 1: this, the new, broader question: What if AI is so 448 00:22:56,040 --> 00:22:59,640 Speaker 1: bullish for the economy that it is actually bearish? Quote: 449 00:22:59,800 --> 00:23:03,680 Speaker 1: for the entirety of modern economic history, human intelligence has 450 00:23:03,720 --> 00:23:09,280 Speaker 1: been the scarce input. We are now experiencing the unwind 451 00:23:09,359 --> 00:23:10,359 Speaker 1: of that premium. 452 00:23:11,320 --> 00:23:13,439 Speaker 2: What does that mean? That means the 453 00:23:14,680 --> 00:23:22,720 Speaker 1: most valuable, rare, critical resource to economic development and activity 454 00:23:23,080 --> 00:23:25,920 Speaker 1: is the human brain, the frontal lobe. The 455 00:23:26,000 --> 00:23:31,160 Speaker 1: one thing you can't manufacture at scale, really, is really 456 00:23:31,200 --> 00:23:33,560 Speaker 1: smart human beings who know what to do. And now 457 00:23:33,560 --> 00:23:35,800 Speaker 1: we don't need that anymore. We have computers that know 458 00:23:35,840 --> 00:23:38,680 Speaker 1: what to do, and communicate 459 00:23:38,760 --> 00:23:42,400 Speaker 1: with each other and unwind problems and innovate at speeds 460 00:23:42,400 --> 00:23:45,560 Speaker 1: that we cannot even comprehend.
There's no need for human 461 00:23:45,560 --> 00:23:51,280 Speaker 1: intelligence anymore. But that sentence, that AI is so bullish on 462 00:23:52,840 --> 00:23:56,040 Speaker 2: the economy it's actually bearish. I don't fully understand that. 463 00:23:56,960 --> 00:23:57,919 Speaker 2: I guess that. 464 00:23:59,800 --> 00:24:04,280 Speaker 1: If the economy doesn't need people, how are people 465 00:24:04,359 --> 00:24:07,800 Speaker 1: going to get income? It's the question you've always posed: 466 00:24:08,160 --> 00:24:11,800 Speaker 1: How exactly is this transfer of the amazing AI wealth 467 00:24:11,840 --> 00:24:14,639 Speaker 1: to us, or we've got to eat, humans going to 468 00:24:14,680 --> 00:24:17,600 Speaker 1: take place? What does that look like? What are the mechanisms? 469 00:24:17,600 --> 00:24:20,760 Speaker 1: What does the economy look like at all? If Amazon 470 00:24:20,840 --> 00:24:26,320 Speaker 1: could do what it does way more efficiently and be 471 00:24:26,720 --> 00:24:32,960 Speaker 1: multiple times more productive but needs no human beings, maybe, 472 00:24:32,960 --> 00:24:36,040 Speaker 1: I think I do understand that sentence, what does that 473 00:24:36,080 --> 00:24:39,440 Speaker 1: do for humans in the economy? Okay, what? You're way 474 00:24:39,480 --> 00:24:41,280 Speaker 1: more productive and you're making more money than you've ever 475 00:24:41,320 --> 00:24:41,840 Speaker 1: made before. 476 00:24:42,400 --> 00:24:44,480 Speaker 2: Then who's going to give me some of that? And 477 00:24:44,600 --> 00:24:46,000 Speaker 2: why exactly? 478 00:24:46,200 --> 00:24:50,800 Speaker 1: So that is a giant, bizarre, unprecedented, terrifying question. But 479 00:24:50,840 --> 00:24:53,960 Speaker 1: we're going to put that aside because there's another giant, bizarre, 480 00:24:54,200 --> 00:24:55,320 Speaker 1: unprecedented question. 481 00:24:56,880 --> 00:24:59,120 Speaker 2: And if you've studied
482 00:24:59,000 --> 00:25:02,320 Speaker 1: economics at all, or you're just of reasonable intelligence, you 483 00:25:02,400 --> 00:25:06,399 Speaker 1: know what scarcity is. It's the idea that there aren't 484 00:25:06,440 --> 00:25:11,880 Speaker 1: an unlimited number of my house for me to just buy. Otherwise, 485 00:25:12,040 --> 00:25:14,600 Speaker 1: why would I pay what I paid for my house 486 00:25:14,800 --> 00:25:18,679 Speaker 1: or a car or a blender or whatever? You know, 487 00:25:18,720 --> 00:25:23,120 Speaker 1: supply meeting demand. It's a function of scarcity. What would 488 00:25:23,160 --> 00:25:26,200 Speaker 1: an economy look like where there is no scarcity? 489 00:25:28,440 --> 00:25:30,359 Speaker 2: I want, I want a Rolex. 490 00:25:31,040 --> 00:25:33,320 Speaker 1: I'm going to throw it out tomorrow, but I need 491 00:25:33,359 --> 00:25:35,880 Speaker 1: a blue one to go with my suit, because I'm 492 00:25:35,920 --> 00:25:39,480 Speaker 1: going to some function in my Ferrari, that I may 493 00:25:39,520 --> 00:25:43,200 Speaker 1: set fire to, because Ferraris are free and I get 494 00:25:43,240 --> 00:25:46,240 Speaker 1: all of my, I get zillions of dollars wired to 495 00:25:46,240 --> 00:25:49,960 Speaker 1: me from AI Incorporated, and blah blah. How the f 496 00:25:50,119 --> 00:25:52,840 Speaker 1: would that economy even function? What does that look like? 497 00:25:52,920 --> 00:25:56,680 Speaker 1: Nobody has any idea. No. And I'm surprised more people 498 00:25:56,720 --> 00:25:59,040 Speaker 1: don't ask that question. I would love to be able 499 00:25:59,119 --> 00:26:01,440 Speaker 1: to talk to Elon about that, when he's, when he's saying 500 00:26:01,480 --> 00:26:04,080 Speaker 1: you don't need to save for retirement in the future. 501 00:26:04,400 --> 00:26:06,400 Speaker 2: Everything will be provided for.
502 00:26:06,240 --> 00:26:10,680 Speaker 1: You. By whom, and at what level, and who determines 503 00:26:10,760 --> 00:26:16,000 Speaker 1: that level? Why would these, I assume the companies, are 504 00:26:16,000 --> 00:26:18,840 Speaker 1: making all this money, why would they give 505 00:26:18,880 --> 00:26:23,080 Speaker 1: a whole bunch of it to just random Americans out 506 00:26:23,080 --> 00:26:26,000 Speaker 1: of the goodness of their hearts? Is the government gonna 507 00:26:26,040 --> 00:26:28,960 Speaker 1: compel them? How much do they have to give back? 508 00:26:29,560 --> 00:26:32,720 Speaker 1: Enough for me to be okay, or well off, or 509 00:26:32,720 --> 00:26:33,359 Speaker 1: super rich? 510 00:26:33,880 --> 00:26:34,400 Speaker 2: And why? 511 00:26:35,440 --> 00:26:39,400 Speaker 1: And what if somebody who has the most awesome of weapons, 512 00:26:39,760 --> 00:26:43,600 Speaker 1: whether they follow the ideology of a stupid effing German, 513 00:26:44,280 --> 00:26:48,919 Speaker 1: Karl Marx, or some, you know, seventh century religion, decides, no, 514 00:26:48,920 --> 00:26:49,439 Speaker 1: we're not going to 515 00:26:49,480 --> 00:26:50,119 Speaker 2: let that happen. 516 00:26:50,600 --> 00:26:53,440 Speaker 1: I mean, it's just, if somebody understands that... because I 517 00:26:53,800 --> 00:26:58,280 Speaker 1: feel like I must be missing something. I'm of average 518 00:26:58,320 --> 00:27:01,639 Speaker 1: intelligence, and I don't... Gavin Newsom, you're not better 519 00:27:01,680 --> 00:27:02,120 Speaker 1: than us. 520 00:27:02,119 --> 00:27:04,280 Speaker 2: But I don't hear anybody ever address this. 521 00:27:05,119 --> 00:27:10,640 Speaker 1: Who's collecting all this money? Is it the 522 00:27:10,720 --> 00:27:15,719 Speaker 1: companies, the government? Or... I don't get it, and 523 00:27:15,720 --> 00:27:23,280 Speaker 1: how it gets spread out to people. I'm fascinated myself.
524 00:27:23,520 --> 00:27:26,600 Speaker 1: What does an economy with no scarcity look like? 525 00:27:29,119 --> 00:27:31,320 Speaker 2: How do people behave? Where do they live? 526 00:27:31,800 --> 00:27:35,280 Speaker 1: Well, if there's no value in a, you know, Ferrari, Rolex, 527 00:27:35,320 --> 00:27:39,240 Speaker 1: but, being even more reasonable, if there's no value in 528 00:27:39,280 --> 00:27:41,040 Speaker 1: an F-150 pickup, you can make them 529 00:27:41,080 --> 00:27:44,280 Speaker 1: just practically for free because of the way AI works. Sure. 530 00:27:45,240 --> 00:27:47,199 Speaker 1: What motivates you to make them? How do they have 531 00:27:47,280 --> 00:27:49,520 Speaker 1: any value? And why would 532 00:27:49,560 --> 00:27:54,600 Speaker 1: the company make them? Nobody at the company needs any money. No. Well, 533 00:27:54,640 --> 00:27:57,560 Speaker 1: the computers and the robots presumably would just keep making 534 00:27:57,600 --> 00:27:58,760 Speaker 1: them because we told them to. 535 00:28:00,400 --> 00:28:01,560 Speaker 2: What about real estate? 536 00:28:02,600 --> 00:28:06,359 Speaker 1: If anybody can build a five thousand square foot house 537 00:28:06,800 --> 00:28:10,160 Speaker 1: in Malibu overlooking the ocean, and it's just a question 538 00:28:10,200 --> 00:28:12,399 Speaker 1: of who gets there first, then what happens? Force of 539 00:28:12,520 --> 00:28:16,240 Speaker 1: arms to get that land? Yeah, the biggest problem is 540 00:28:16,240 --> 00:28:19,560 Speaker 1: going to be housing, because all climates aren't the same, 541 00:28:19,600 --> 00:28:22,080 Speaker 1: all views aren't the same, all that sort of stuff. 542 00:28:22,160 --> 00:28:23,800 Speaker 2: So people are still early.
543 00:28:24,160 --> 00:28:25,840 Speaker 1: But then we're going to die out, because nobody's having 544 00:28:25,840 --> 00:28:30,639 Speaker 1: babies, and with no purpose for your life and no scarcity, 545 00:28:30,760 --> 00:28:32,760 Speaker 1: no want, nobody's going to have any babies. 546 00:28:33,960 --> 00:28:34,680 Speaker 2: If this is. 547 00:28:35,080 --> 00:28:39,920 Speaker 1: It's really pretty interesting that some company wrote a fanciful, 548 00:28:40,080 --> 00:28:43,400 Speaker 1: well-what-if-this-happens essay, and the stock market, 549 00:28:43,480 --> 00:28:46,240 Speaker 1: with a whole bunch of different sectors of the economy, 550 00:28:45,840 --> 00:28:48,480 Speaker 2: went, whoa, what if that did happen? Holy crap! 551 00:28:48,680 --> 00:28:51,320 Speaker 1: And the stock market went down a lot over that, 552 00:28:51,800 --> 00:28:54,040 Speaker 1: over basically somebody just saying at a 553 00:28:53,960 --> 00:28:58,360 Speaker 2: bar, God, what if this happened? Right? Yeah. Is that 554 00:28:58,400 --> 00:29:00,000 Speaker 2: where we are? Apparently that's where we are. 555 00:29:00,680 --> 00:29:05,000 Speaker 1: Listen, if future generations revere me as some sort of prophet, 556 00:29:05,480 --> 00:29:07,440 Speaker 1: do me a favor on the statues: slim me down 557 00:29:07,480 --> 00:29:10,240 Speaker 1: a little bit. That would be great. But I'm telling 558 00:29:10,280 --> 00:29:13,280 Speaker 1: you, AI is the apple from the tree of knowledge 559 00:29:13,280 --> 00:29:16,360 Speaker 1: in the Book of Genesis. Well, I agree with that. 560 00:29:16,640 --> 00:29:20,240 Speaker 1: But before we get to the culmination of 561 00:29:20,280 --> 00:29:22,720 Speaker 1: this, whatever it's going to be, us dying out, I 562 00:29:22,800 --> 00:29:31,800 Speaker 1: guess, what the hell?
Has there ever been a time 563 00:29:31,840 --> 00:29:35,280 Speaker 1: in human history where a parent like me, who's got 564 00:29:35,320 --> 00:29:37,440 Speaker 1: a couple of kids, a fourteen year old and a sixteen year 565 00:29:37,440 --> 00:29:38,760 Speaker 1: old, doesn't 566 00:29:38,360 --> 00:29:41,400 Speaker 2: have the slightest idea 567 00:29:42,320 --> 00:29:45,840 Speaker 1: how they should prepare those kids for the coming world? 568 00:29:46,360 --> 00:29:48,760 Speaker 2: No is the answer to that question. For most of history, 569 00:29:49,160 --> 00:29:52,520 Speaker 1: not only did you have some idea, you had the 570 00:29:52,680 --> 00:29:55,160 Speaker 1: entire idea. You're going to work in the field, just 571 00:29:55,200 --> 00:29:56,960 Speaker 1: like I did and your grandpa did in the last 572 00:29:57,000 --> 00:30:00,360 Speaker 1: eighteen generations, and the next eighteen generations, we're all going 573 00:30:00,440 --> 00:30:03,560 Speaker 1: to do the same thing. Or basically a factory 574 00:30:03,600 --> 00:30:06,120 Speaker 1: or whatever the hell it was. But the idea, even 575 00:30:06,200 --> 00:30:08,280 Speaker 1: in the midst of the twentieth century, which had 576 00:30:08,320 --> 00:30:11,160 Speaker 1: a lot of change, in the middle of a World War, 577 00:30:11,800 --> 00:30:16,520 Speaker 1: there's more certainty. Oh, completely. And the idea: if you 578 00:30:16,560 --> 00:30:18,560 Speaker 1: get this kind of education or these kinds of skills, 579 00:30:18,560 --> 00:30:20,520 Speaker 1: you could go out, and you know that that industry 580 00:30:20,560 --> 00:30:23,880 Speaker 1: isn't going to go away anytime soon. All industries might 581 00:30:23,920 --> 00:30:26,880 Speaker 1: go away in the next five years. 582 00:30:27,440 --> 00:30:28,960 Speaker 2: Whoops, I have no idea.
583 00:30:29,200 --> 00:30:32,040 Speaker 1: It's the first time that And you know, kids saying 584 00:30:32,480 --> 00:30:34,760 Speaker 1: I'm never going to use this about whatever they're learning 585 00:30:34,760 --> 00:30:37,120 Speaker 1: in the school might be completely true. 586 00:30:38,280 --> 00:30:40,040 Speaker 2: Yeah, a little wisiness is. 587 00:30:41,960 --> 00:30:45,360 Speaker 1: I'm still gonna yell at him. But back to the 588 00:30:45,360 --> 00:30:49,120 Speaker 1: whole biblical theme. If a tree falls on Sam Altman 589 00:30:49,280 --> 00:30:52,320 Speaker 1: and then all of a sudden, Markus Zuckerberg has a 590 00:30:52,320 --> 00:30:55,200 Speaker 1: fatal heart attack, although God's back in the smiting. 591 00:30:54,960 --> 00:31:00,360 Speaker 2: Business, Yeah, no, kidding, just saying, thank God. Huh today 592 00:31:00,400 --> 00:31:02,840 Speaker 2: tomorrow before it's too late. We got mailbag on the way, 593 00:31:02,880 --> 00:31:09,760 Speaker 2: a lot of other stuff. Stay here a little bit later. 594 00:31:09,800 --> 00:31:11,840 Speaker 1: We're going to feature a woman who fell completely in 595 00:31:11,880 --> 00:31:16,080 Speaker 1: love with an AI chatbot. So we talked about that yesterday, 596 00:31:16,960 --> 00:31:20,280 Speaker 1: and that couldn't be more disturbing. I just saw a 597 00:31:20,320 --> 00:31:23,560 Speaker 1: headline that Defense Secretary Pete Hegzeth is a meeting with 598 00:31:23,680 --> 00:31:27,120 Speaker 1: the Anthropic CEO. Anthropic is quawed, one of my favorite 599 00:31:27,160 --> 00:31:30,200 Speaker 1: chatbots to talk over military AI use. 600 00:31:30,320 --> 00:31:32,640 Speaker 2: Thank God, because that is the future of everything. 601 00:31:33,600 --> 00:31:36,320 Speaker 1: They're having a bit of a tussle the Pentagon and 602 00:31:36,440 --> 00:31:38,320 Speaker 1: Silicon Valley interested in that. 603 00:31:38,400 --> 00:31:39,280 Speaker 2: We will talk about that. 
604 00:31:39,520 --> 00:31:41,880 Speaker 1: Here is the longest-ever freedom-loving quote of the 605 00:31:41,960 --> 00:31:46,480 Speaker 1: day, from Justice Neil Gorsuch, from the recent tariff case. 606 00:31:47,960 --> 00:31:49,880 Speaker 1: For those who think that it is important for the 607 00:31:49,960 --> 00:31:52,400 Speaker 1: nation to impose more tariffs, I understand that today's decision 608 00:31:52,440 --> 00:31:54,560 Speaker 1: will be disappointing. All I can offer them is that 609 00:31:54,600 --> 00:31:57,640 Speaker 1: most major decisions affecting the rights and responsibilities of the 610 00:31:57,680 --> 00:32:01,120 Speaker 1: American people, including the duty to pay taxes and tariffs, 611 00:32:01,280 --> 00:32:04,680 Speaker 1: are funneled through the legislative process for a reason. Yes, 612 00:32:04,800 --> 00:32:06,960 Speaker 1: legislating can be hard and take time. And yes, it 613 00:32:06,960 --> 00:32:10,440 Speaker 1: can be tempting to bypass Congress when some pressing problem arises. 614 00:32:10,600 --> 00:32:13,280 Speaker 1: But the deliberative nature of the legislative process was the 615 00:32:13,320 --> 00:32:16,360 Speaker 1: whole point of its design. Through that process, the nation 616 00:32:16,440 --> 00:32:19,560 Speaker 1: can tap the combined wisdom of the people's elected representatives, 617 00:32:19,840 --> 00:32:24,520 Speaker 1: not just that of one faction or man. There, deliberation tempers impulse 618 00:32:24,560 --> 00:32:28,680 Speaker 1: and compromise hammers disagreements into workable solutions. And because laws 619 00:32:28,760 --> 00:32:32,280 Speaker 1: must earn such broad support to survive the legislative process, 620 00:32:32,520 --> 00:32:35,760 Speaker 1: they tend to endure, allowing ordinary people to plan their 621 00:32:35,760 --> 00:32:38,120 Speaker 1: lives in ways they cannot when the rules shift from 622 00:32:38,160 --> 00:32:41,160 Speaker 1: day to day.
In all, the legislative process helps ensure 623 00:32:41,200 --> 00:32:42,640 Speaker 1: each of us has a stake in the laws that 624 00:32:42,720 --> 00:32:45,480 Speaker 1: govern us and in the nation's future. For some today, 625 00:32:45,520 --> 00:32:48,200 Speaker 1: the weight of those virtues is apparent. For others, it 626 00:32:48,240 --> 00:32:51,040 Speaker 1: may not seem so obvious. But if history is any guide, 627 00:32:51,080 --> 00:32:53,360 Speaker 1: the tables will turn, and the day will come when 628 00:32:53,360 --> 00:32:57,160 Speaker 1: those disappointed by today's result will appreciate the legislative process 629 00:32:57,360 --> 00:33:00,240 Speaker 1: for the bulwark of liberty it is. It was a 630 00:33:00,280 --> 00:33:02,920 Speaker 1: little bit of a long way of saying, oh yeah, 631 00:33:03,040 --> 00:33:07,680 Speaker 1: you want Gavin Newsom to have this power? Well summarized. Mailbag! 632 00:33:08,880 --> 00:33:13,080 Speaker 1: Drop us a note: mailbag at Armstrong and Getty dot com. 633 00:33:13,280 --> 00:33:15,640 Speaker 1: JT in Livermore points out, and I love this: 634 00:33:15,800 --> 00:33:19,120 Speaker 1: David Hughes of the American Olympic Hockey 635 00:33:19,160 --> 00:33:23,520 Speaker 1: Team literally bleeds red, his blood; white, his teeth; and 636 00:33:23,640 --> 00:33:26,120 Speaker 1: blue, his bruises, as he declares his 637 00:33:26,080 --> 00:33:29,840 Speaker 2: love for America. Love that. 638 00:33:30,400 --> 00:33:32,560 Speaker 1: Can I say, you shout questions at him: Did you vote 639 00:33:32,560 --> 00:33:34,720 Speaker 1: for Trump? Why are you going to the White House? 640 00:33:34,920 --> 00:33:40,760 Speaker 1: Aren't you embarrassed about... In Minnesota?
This is from Kevin, 641 00:33:40,880 --> 00:33:45,640 Speaker 1: courtesy of the Babylon Bee, commenting on Gavin Newsom's speech 642 00:33:45,680 --> 00:33:48,840 Speaker 1: in front of a heavily black audience, saying, I'm just 643 00:33:48,960 --> 00:33:51,800 Speaker 1: like you: I'm dumb, I can't pass tests. Oh my god, 644 00:33:53,200 --> 00:33:56,760 Speaker 1: I love this. The headline is, Gavin Newsom wows black 645 00:33:56,800 --> 00:33:58,000 Speaker 1: audience by putting some 646 00:33:57,920 --> 00:34:01,080 Speaker 2: hot sauce in his purple drank. Wow. 647 00:34:01,200 --> 00:34:03,280 Speaker 1: Then Kevin says, I sent this because I was really 648 00:34:03,320 --> 00:34:05,160 Speaker 1: hoping to hear you, Joe, say 649 00:34:05,880 --> 00:34:08,359 Speaker 2: purple drank. Yeow. There you go. 650 00:34:10,719 --> 00:34:13,320 Speaker 1: Sean listened to the Armstrong and Getty One More Thing podcast, 651 00:34:13,640 --> 00:34:17,600 Speaker 1: in which Jack was discussing his experience flopping down on 652 00:34:17,640 --> 00:34:20,560 Speaker 1: a forty thousand dollar bed, then a one 653 00:34:20,680 --> 00:34:24,640 Speaker 1: hundred thousand dollar mattress, then a four hundred thousand dollar 654 00:34:24,640 --> 00:34:27,239 Speaker 1: mattress set, passing up the chance to flop down on 655 00:34:27,239 --> 00:34:30,520 Speaker 1: an eight hundred thousand dollar set. And Sean points out a 656 00:34:30,520 --> 00:34:34,520 Speaker 1: couple of things. Number one, Hästens means the horse. Yes, 657 00:34:35,320 --> 00:34:37,560 Speaker 1: my Swedish salesman explained that to me. He used to 658 00:34:37,560 --> 00:34:41,799 Speaker 1: work at IKEA. His favorite phrase in Swedish was ingen ko på isen, 659 00:34:42,160 --> 00:34:44,080 Speaker 1: which means no cow on the ice. 660 00:34:44,560 --> 00:34:45,560 Speaker 2: What a great saying.
661 00:34:45,800 --> 00:34:48,320 Speaker 1: Obviously, in a rural society, a cow on a frozen 662 00:34:48,360 --> 00:34:51,560 Speaker 1: lake is a monumental problem. Do nothing and possibly lose it, 663 00:34:51,680 --> 00:34:53,680 Speaker 1: or try to save it and possibly drown yourself and 664 00:34:53,760 --> 00:34:58,680 Speaker 1: the cow. No cow on a frozen lake. Man, words 665 00:34:58,719 --> 00:35:00,799 Speaker 1: to live by. Oh yeah, I'm telling them. 666 00:35:00,880 --> 00:35:02,200 Speaker 2: Yeah. Let's see, what is this? 667 00:35:03,120 --> 00:35:06,759 Speaker 1: John wonders why we haven't heard much about this. The 668 00:35:06,960 --> 00:35:11,440 Speaker 1: Grand Ayatollah Naser Makarem Shirazi, one of Iran's top Shia clerics, has 669 00:35:11,440 --> 00:35:15,680 Speaker 1: put out a fatwa on Trump and others in the administration, 670 00:35:16,280 --> 00:35:19,279 Speaker 1: calling on all Muslims worldwide to kill them and to 671 00:35:19,280 --> 00:35:21,040 Speaker 1: spend their lives trying to hunt them down. 672 00:35:21,239 --> 00:35:22,560 Speaker 2: That's a pretty good reason for war. 673 00:35:23,880 --> 00:35:28,840 Speaker 1: Oh boy. Yeah, State of the Union address tonight. I 674 00:35:28,840 --> 00:35:31,759 Speaker 1: wonder how much Trump's going to do selling the war 675 00:35:31,840 --> 00:35:33,840 Speaker 1: with Iran to the American people. I don't know how 676 00:35:33,920 --> 00:35:35,799 Speaker 1: much of the speech is going to be that. Got 677 00:35:35,840 --> 00:35:37,440 Speaker 1: a lot more on the way. Stay here and get