1 00:00:01,840 --> 00:00:06,120 Speaker 1: Broadcasting live from the Abraham Lincoln Radio Studio, the George 2 00:00:06,200 --> 00:00:10,120 Speaker 1: Washington Broadcast Center, Jack Armstrong and Joe Getty. 3 00:00:10,160 --> 00:00:24,960 Speaker 2: Armstrong and Getty. Armstrong and Getty. According to the 4 00:00:24,880 --> 00:00:28,720 Speaker 3: White House staffers, while visiting Donald Trump in the Oval Office, 5 00:00:28,760 --> 00:00:33,040 Speaker 3: Elon Musk's four-year-old needed a diaper change. Luckily, 6 00:00:33,120 --> 00:00:35,519 Speaker 3: there was plenty left over from the previous tenant. 7 00:00:39,560 --> 00:00:42,360 Speaker 2: Yeah, was that a ten? I asked for a ten. Was 8 00:00:42,360 --> 00:00:42,680 Speaker 2: that joke a... 9 00:00:42,720 --> 00:00:42,879 Speaker 1: Ten? 10 00:00:42,920 --> 00:00:44,400 Speaker 2: That was joke ten? Huh? 11 00:00:44,400 --> 00:00:48,240 Speaker 1: Okay, my joke ten? That joke was a ten out of ten. 12 00:00:48,440 --> 00:00:50,800 Speaker 1: My "Joe Biden was incontinent" is the point? 13 00:00:51,159 --> 00:00:54,080 Speaker 2: My joke ten is Bill Maher. Kanye West is getting 14 00:00:54,080 --> 00:00:55,840 Speaker 2: a divorce. Is that joke a ten? 15 00:00:56,160 --> 00:00:56,440 Speaker 1: Okay? 16 00:00:57,240 --> 00:00:58,520 Speaker 2: Well, here, I have that joke. 17 00:00:59,520 --> 00:00:59,840 Speaker 1: That's it. 18 00:01:00,160 --> 00:01:03,080 Speaker 4: Marriage is over. Bianca Censori and Kanye West have called 19 00:01:03,080 --> 00:01:10,559 Speaker 4: it quits. I... I just hope this doesn't make Kanye 20 00:01:10,640 --> 00:01:20,640 Speaker 4: do something stupid. Yeah, apparently it was... it was a 21 00:01:20,680 --> 00:01:23,320 Speaker 4: while in coming. Can I try to be sensitive? 22 00:01:23,360 --> 00:01:25,440 Speaker 4: You said to her, it's not you, it's not me, 23 00:01:25,720 --> 00:01:26,480 Speaker 4: it's the Jews. 24 00:01:29,520 --> 00:01:34,120 Speaker 2: Wow, that's pretty funny. It's not you, it's not me, 25 00:01:34,319 --> 00:01:35,280 Speaker 2: it's the Jews. 26 00:01:36,360 --> 00:01:40,080 Speaker 1: Kanye... It's so crazy. I hadn't heard that they're getting 27 00:01:40,120 --> 00:01:40,640 Speaker 1: a divorce. 28 00:01:40,800 --> 00:01:45,760 Speaker 2: So a week after she's naked, looking like a hostage 29 00:01:45,959 --> 00:01:49,200 Speaker 2: at the Grammys, they're getting a divorce. Think that played a 30 00:01:49,280 --> 00:01:49,960 Speaker 2: role in any way? 31 00:01:50,680 --> 00:01:54,280 Speaker 1: Yeah. Plus I heard something: she's not down with the 32 00:01:54,320 --> 00:01:58,680 Speaker 1: anti-Semitism at all. And I hadn't understood exactly how 33 00:01:58,960 --> 00:02:01,600 Speaker 1: that quote unquote Super Bowl ad of his got on, 34 00:02:02,360 --> 00:02:05,080 Speaker 1: where he rambles and says, go to my website, then 35 00:02:05,120 --> 00:02:08,280 Speaker 1: the only product on his website is a swastika T-shirt. 36 00:02:08,720 --> 00:02:12,680 Speaker 1: He had an ad agency buy a local Super Bowl 37 00:02:12,720 --> 00:02:16,480 Speaker 1: spot in every market he could around America, so big 38 00:02:16,520 --> 00:02:19,280 Speaker 1: Fox didn't air it. It was just on, I can't 39 00:02:19,320 --> 00:02:23,359 Speaker 1: remember, many dozens of local markets around America, that crazy 40 00:02:23,400 --> 00:02:27,040 Speaker 1: ass commercial of his. And he's... man, is he headed 41 00:02:27,040 --> 00:02:27,840 Speaker 1: for a big crack-up?
42 00:02:28,040 --> 00:02:30,280 Speaker 2: Well, with what goal? Or do you just think he's 43 00:02:30,320 --> 00:02:31,280 Speaker 2: actually nuts? 44 00:02:31,600 --> 00:02:34,000 Speaker 1: Oh, he's got to just be nuts. I mean, it 45 00:02:34,040 --> 00:02:37,680 Speaker 1: could be he's an obsessive anti-Semite in the vein 46 00:02:37,760 --> 00:02:39,240 Speaker 1: of a certain Chancellor of Germany. 47 00:02:39,280 --> 00:02:43,640 Speaker 2: But, well, I don't think that's clearly nuts. I mean, 48 00:02:43,680 --> 00:02:45,280 Speaker 2: I could be wrong, but I don't think that's it. 49 00:02:45,360 --> 00:02:49,440 Speaker 2: I don't... He wants... he wants people to be mad 50 00:02:49,480 --> 00:02:55,080 Speaker 2: at him. He's got some sort of childish, upset, "doing 51 00:02:55,200 --> 00:02:57,560 Speaker 2: things to make people not like him so he can 52 00:02:57,560 --> 00:03:00,560 Speaker 2: say people don't like him" thing going. I mean, like 53 00:03:00,639 --> 00:03:04,519 Speaker 2: his tweet about... his tweet about "I turned down three 54 00:03:04,600 --> 00:03:06,799 Speaker 2: Make-A-Wish kids in wheelchairs this week." I mean, 55 00:03:06,800 --> 00:03:10,400 Speaker 2: he just... he just wants people to be upset with 56 00:03:10,480 --> 00:03:15,239 Speaker 2: him for some sad reason, and that he hasn't killed 57 00:03:15,280 --> 00:03:18,120 Speaker 2: himself already is shocking to me. I think that's going 58 00:03:18,160 --> 00:03:19,600 Speaker 2: to be a headline in my lifetime. 59 00:03:19,639 --> 00:03:22,520 Speaker 1: But as usual, when you're talking about something like this, 60 00:03:22,680 --> 00:03:25,680 Speaker 1: trying to come up with a rational explanation for crazy, 61 00:03:25,760 --> 00:03:26,760 Speaker 1: you just... you can't. 62 00:03:26,880 --> 00:03:29,760 Speaker 2: He might be the richest mentally ill person in the world, 63 00:03:30,240 --> 00:03:32,920 Speaker 2: and can then, you know, kind of get away with it. 64 00:03:33,720 --> 00:03:35,280 Speaker 1: Yeah, I was just thinking the same thing. I mean, 65 00:03:35,320 --> 00:03:37,680 Speaker 1: if he were not independently wealthy, he would be living 66 00:03:37,720 --> 00:03:42,200 Speaker 1: in a box somewhere, absolutely. Just... yeah, yeah, too bad, folks. 67 00:03:42,280 --> 00:03:45,120 Speaker 1: Be grateful for your mental health, to the extent that 68 00:03:45,200 --> 00:03:48,640 Speaker 1: you have any. Ah. So, a headful of AI-related 69 00:03:48,680 --> 00:03:52,280 Speaker 1: stories that I found very interesting as we move further 70 00:03:52,360 --> 00:03:54,880 Speaker 1: and further into this world. I've actually been using it 71 00:03:55,040 --> 00:03:58,560 Speaker 1: a bit more in the last week, partly because Apple updated 72 00:03:58,600 --> 00:04:01,640 Speaker 1: their software, and I've had a couple of searches that 73 00:04:02,560 --> 00:04:05,560 Speaker 1: needed to use ChatGPT to help with, and 74 00:04:05,880 --> 00:04:09,800 Speaker 1: it's been quite impressive, really, so I'm feeling more and 75 00:04:09,880 --> 00:04:13,840 Speaker 1: more like at least I understand how some of it works.
76 00:04:13,840 --> 00:04:19,040 Speaker 1: But anyway, story number one: a 77 00:04:19,120 --> 00:04:21,919 Speaker 1: couple of universities got together and did a series of experiments 78 00:04:22,240 --> 00:04:27,320 Speaker 1: where they had volunteers pose as drone 79 00:04:27,360 --> 00:04:31,839 Speaker 1: pilots making kill or no kill, shoot or no shoot decisions. 80 00:04:32,680 --> 00:04:36,640 Speaker 1: And they would, you know, up the ante by showing 81 00:04:36,680 --> 00:04:41,240 Speaker 1: them pictures of civilians hurt and killed by drone strikes 82 00:04:41,279 --> 00:04:43,520 Speaker 1: and buildings damaged and stuff like that. They made it 83 00:04:43,560 --> 00:04:47,320 Speaker 1: as impactful as they could in a research setting. And, 84 00:04:48,080 --> 00:04:50,039 Speaker 1: you know, it's too bad this article is written 85 00:04:50,080 --> 00:04:52,680 Speaker 1: with way too much editorial, so I'll just summarize it 86 00:04:52,720 --> 00:04:59,200 Speaker 1: for you. People seemed very willing to overcome their own thinking, 87 00:04:59,560 --> 00:05:02,920 Speaker 1: or overturn their own thinking, if the AI system 88 00:05:03,320 --> 00:05:08,160 Speaker 1: said, I disagree, even if it had been made infinitely 89 00:05:08,200 --> 00:05:11,880 Speaker 1: clear to them that the AI system is unsophisticated and 90 00:05:11,960 --> 00:05:16,720 Speaker 1: it makes mistakes. But they seemed in large numbers to 91 00:05:16,880 --> 00:05:21,359 Speaker 1: defer to the computer. And amusingly, they tried it with, 92 00:05:21,520 --> 00:05:24,680 Speaker 1: like, a primitive robot saying "I would go ahead and 93 00:05:24,760 --> 00:05:27,960 Speaker 1: shoot" at them, and then turning very robotically. Then they 94 00:05:28,040 --> 00:05:31,280 Speaker 1: used a more sophisticated humanoid-like robot, and then 95 00:05:31,320 --> 00:05:33,919 Speaker 1: they just did a simple computer interface, to see if that 96 00:05:34,080 --> 00:05:35,400 Speaker 1: changed it, and it did a little bit. 97 00:05:35,480 --> 00:05:39,080 Speaker 2: So the human psychology of this AI side is interesting. 98 00:05:39,160 --> 00:05:43,640 Speaker 2: So it's like the famous trolley car problem. You 99 00:05:43,680 --> 00:05:48,080 Speaker 2: will say go ahead and run over the person 100 00:05:48,440 --> 00:05:52,240 Speaker 2: if something tells you to, if a supposed expert tells you 101 00:05:52,160 --> 00:05:55,760 Speaker 1: to. Yeah, even if somebody makes clear to you, hey, 102 00:05:55,760 --> 00:05:59,720 Speaker 1: this expert is wrong a fair amount. And the way 103 00:05:59,760 --> 00:06:02,320 Speaker 1: this experiment worked was they would show them pictures very 104 00:06:02,400 --> 00:06:07,359 Speaker 1: quickly of their target, and that often included the symbol 105 00:06:07,480 --> 00:06:11,279 Speaker 1: of either your side or the other side, either the 106 00:06:11,400 --> 00:06:16,200 Speaker 1: Ukrainian flag or the Russian Z, for instance, and 107 00:06:16,240 --> 00:06:19,440 Speaker 1: say shoot or don't shoot. And even though they'd programmed 108 00:06:19,440 --> 00:06:22,520 Speaker 1: the AI to be random in its decisions, so it 109 00:06:22,560 --> 00:06:25,040 Speaker 1: was wrong as much as it was right, the people 110 00:06:25,080 --> 00:06:27,240 Speaker 1: would defer to it in large numbers. So anyway, I 111 00:06:27,360 --> 00:06:31,000 Speaker 1: just thought that was kind of interesting.
We tend... well, 112 00:06:31,040 --> 00:06:33,840 Speaker 1: certain people tend to be very easily led, whether it's 113 00:06:33,880 --> 00:06:37,040 Speaker 1: a human being or a technical beast like AI. 114 00:06:37,160 --> 00:06:39,920 Speaker 2: Well, I guess that explains the whole, not on 115 00:06:39,960 --> 00:06:42,280 Speaker 2: purpose to get back to Germans and Nazi things, but 116 00:06:42,320 --> 00:06:45,440 Speaker 2: the whole, the whole "I was just following orders" thing. 117 00:06:45,720 --> 00:06:48,680 Speaker 2: I guess that fits in with that. You're willing 118 00:06:48,720 --> 00:06:53,600 Speaker 2: to do lots of things if somebody tells you to, yeah, 119 00:06:54,080 --> 00:06:57,000 Speaker 2: an expert tells you to, yeah. 120 00:06:57,440 --> 00:07:00,279 Speaker 1: To refer to a very famous parable, I guess it 121 00:07:00,360 --> 00:07:04,640 Speaker 1: is, about how among human beings there are sheep, there 122 00:07:04,640 --> 00:07:08,560 Speaker 1: are wolves, there are sheepdogs, and the masses of 123 00:07:08,600 --> 00:07:11,880 Speaker 1: folks are a little sheepy. And, you know, you could 124 00:07:11,880 --> 00:07:14,280 Speaker 1: be insulting about that and call them sheeple and stuff 125 00:07:14,280 --> 00:07:16,600 Speaker 1: like that, and I have in the past. But if 126 00:07:16,600 --> 00:07:19,240 Speaker 1: that is the way most people are, at some point 127 00:07:19,320 --> 00:07:21,960 Speaker 1: you have to realize that and be a realist about it. 128 00:07:22,000 --> 00:07:25,160 Speaker 1: The Founding Fathers were. That's why they designed so many checks 129 00:07:25,160 --> 00:07:28,760 Speaker 1: and balances against somebody seizing dictatorial powers, because a lot 130 00:07:28,800 --> 00:07:31,720 Speaker 1: of sheeple want a dictator, because it's simple and they 131 00:07:31,760 --> 00:07:34,840 Speaker 1: know what they're supposed to do in a dictatorship. Anyway, 132 00:07:35,200 --> 00:07:40,600 Speaker 1: another quick artificial intelligence story. New study shows that 133 00:07:40,640 --> 00:07:45,000 Speaker 1: overreliance on artificial intelligence to perform certain tasks reduces critical thinking. 134 00:07:45,880 --> 00:07:49,320 Speaker 1: Study conducted by Carnegie Mellon and Microsoft, who are working 135 00:07:49,360 --> 00:07:51,640 Speaker 1: as hard as they can to get AI going, found 136 00:07:51,640 --> 00:07:54,640 Speaker 1: that people who use AI regularly for basic routine tasks 137 00:07:54,680 --> 00:07:59,440 Speaker 1: will lose their ability for complex critical thinking. From social 138 00:07:59,440 --> 00:08:01,720 Speaker 1: workers to people who write code for a living, the 139 00:08:01,760 --> 00:08:05,160 Speaker 1: professionals surveyed were all asked to share real-life examples, 140 00:08:05,200 --> 00:08:07,440 Speaker 1: and... blah blah blah. The methodology is not that 141 00:08:07,480 --> 00:08:11,120 Speaker 1: interesting and it's long. But of course, if, you know, 142 00:08:11,200 --> 00:08:14,880 Speaker 1: if I have a machine to shovel snow for me, 143 00:08:15,040 --> 00:08:17,600 Speaker 1: my snow shoveling muscles are going to get less strong. 144 00:08:17,680 --> 00:08:19,160 Speaker 1: And it's true intellectually too. 145 00:08:19,160 --> 00:08:21,120 Speaker 2: Well, I don't know if this fits in with my example 146 00:08:21,160 --> 00:08:23,240 Speaker 2: of it. I'm still the only person I've heard say this. 147 00:08:23,840 --> 00:08:25,040 Speaker 2: I want to get it...
I want to say it 148 00:08:25,040 --> 00:08:26,920 Speaker 2: again because I want credit for it when it becomes 149 00:08:26,960 --> 00:08:28,880 Speaker 2: a big thing, because I do think it's going to 150 00:08:28,960 --> 00:08:33,080 Speaker 2: be a big thing. I'm a worse driver now because 151 00:08:33,120 --> 00:08:37,240 Speaker 2: of the automatic driving in my Tesla. When I'm in 152 00:08:37,320 --> 00:08:41,160 Speaker 2: my car without it, I'm not as... I just don't 153 00:08:41,160 --> 00:08:43,480 Speaker 2: pay as good attention, because I got in the habit 154 00:08:43,520 --> 00:08:46,600 Speaker 2: of something paying attention for me. Yeah. And I think 155 00:08:46,640 --> 00:08:50,120 Speaker 2: that's why I had my motorcycle wreck, because I'm just... 156 00:08:51,040 --> 00:08:55,040 Speaker 2: it happens so quickly. My brain got retrained, 157 00:08:55,040 --> 00:08:57,760 Speaker 2: when I'm driving, that I don't need to pay attention. 158 00:08:57,840 --> 00:08:59,320 Speaker 2: There's a computer paying attention for me. 159 00:09:00,400 --> 00:09:02,840 Speaker 1: Yeah. I think that's undeniable, and that's going to become 160 00:09:02,880 --> 00:09:06,880 Speaker 1: a real problem nationwide. There aren't enough people driving cars 161 00:09:06,920 --> 00:09:10,319 Speaker 1: that have that technology yet, I guess. But similar sort 162 00:09:10,360 --> 00:09:10,560 Speaker 1: of thing. 163 00:09:10,600 --> 00:09:14,880 Speaker 2: When something else can do it for you, you atrophy amazingly quickly, 164 00:09:15,559 --> 00:09:17,120 Speaker 2: apparently. 165 00:09:17,000 --> 00:09:19,720 Speaker 1: All of us have had the experience of, like, spacing off behind the wheel, 166 00:09:19,760 --> 00:09:21,920 Speaker 1: and you'd, like, startle and you think, oh god, no, 167 00:09:21,920 --> 00:09:23,880 Speaker 1: I got to pay attention. I gotta pay attention. If 168 00:09:23,920 --> 00:09:27,160 Speaker 1: you systematically get trained, no, it's okay to space off... ah. Yeah, 169 00:09:27,160 --> 00:09:30,400 Speaker 1: that's going to change your mental approach, of course. Yeah. 170 00:09:30,440 --> 00:09:34,560 Speaker 1: So a couple of quick words from the summary of 171 00:09:34,640 --> 00:09:37,880 Speaker 1: this article, and then we will get to the funny 172 00:09:38,000 --> 00:09:41,600 Speaker 1: AI story, or at least ironic. Researchers wrote in the 173 00:09:41,600 --> 00:09:44,560 Speaker 1: paper that AI can result in the deterioration of cognitive 174 00:09:44,559 --> 00:09:46,040 Speaker 1: faculties that ought to be preserved. 175 00:09:46,080 --> 00:09:46,440 Speaker 2: Quote. 176 00:09:46,559 --> 00:09:49,400 Speaker 1: A key irony of automation is that by mechanizing routine 177 00:09:49,400 --> 00:09:52,880 Speaker 1: tasks and leaving exception handling to the human user, you 178 00:09:52,960 --> 00:09:55,680 Speaker 1: deprive the user of the routine opportunities to practice their 179 00:09:55,760 --> 00:09:59,480 Speaker 1: judgment and strengthen their cognitive musculature, leaving them atrophied and 180 00:09:59,559 --> 00:10:01,960 Speaker 1: unprepared when the exceptions do arise. 181 00:10:02,120 --> 00:10:04,600 Speaker 2: I think that's exactly what happened with the driving. It's 182 00:10:04,679 --> 00:10:08,080 Speaker 2: just... it happens with everything, apparently, that we use AI for.
183 00:10:08,640 --> 00:10:11,960 Speaker 1: Wow, this reminds me of that internal study that Facebook 184 00:10:12,400 --> 00:10:14,840 Speaker 1: did that said, hey, this stuff's addictive and it'll wreck 185 00:10:14,880 --> 00:10:17,760 Speaker 1: your minds, don't let your kids do this... anyway, let's 186 00:10:17,840 --> 00:10:21,640 Speaker 1: keep marketing to the suckers. Anyway. That's striking, especially because 187 00:10:21,679 --> 00:10:25,560 Speaker 1: Microsoft is involved. So, a quick word again from our 188 00:10:25,600 --> 00:10:28,560 Speaker 1: friends at PrizePicks, right, Michael? Isn't that right? Yeah. 189 00:10:28,600 --> 00:10:31,840 Speaker 1: And then we'll get back to a rather funny, ironic, 190 00:10:31,920 --> 00:10:35,679 Speaker 1: bitterly amusing story about AI. PrizePicks: you can now 191 00:10:35,679 --> 00:10:38,000 Speaker 1: win up to a thousand times your money on PrizePicks. 192 00:10:38,080 --> 00:10:40,000 Speaker 1: It's the best way to get sports action in more than 193 00:10:40,040 --> 00:10:43,440 Speaker 1: thirty states, including California, Texas, Georgia, and Florida. 194 00:10:43,559 --> 00:10:46,000 Speaker 2: So, football season's over. Now is the time to 195 00:10:46,120 --> 00:10:49,400 Speaker 2: win real money this basketball season with PrizePicks as 196 00:10:49,440 --> 00:10:51,640 Speaker 2: it starts to heat up, and you should download the 197 00:10:51,679 --> 00:10:54,480 Speaker 2: PrizePicks app and check all the different options today 198 00:10:54,880 --> 00:10:56,520 Speaker 2: to go more or less in a whole bunch of 199 00:10:56,559 --> 00:11:00,000 Speaker 2: different categories. And if you sign up today, you get fifty 200 00:11:00,080 --> 00:11:01,960 Speaker 2: dollars instantly when you play five. You don't even need 201 00:11:01,960 --> 00:11:04,080 Speaker 2: to win to receive the fifty-dollar bonus. It's guaranteed. 202 00:11:04,640 --> 00:11:06,920 Speaker 1: And as today is Tuesday, it's worth pointing out that 203 00:11:06,960 --> 00:11:09,959 Speaker 1: PrizePicks offers weekly promotions that can lead to big payoffs, 204 00:11:10,000 --> 00:11:13,240 Speaker 1: like Taco Tuesday, where they discount select player projections up 205 00:11:13,280 --> 00:11:15,240 Speaker 1: to twenty-five percent to provide even more value for 206 00:11:15,320 --> 00:11:17,959 Speaker 1: your lineups. Download the PrizePicks app today, use the 207 00:11:17,960 --> 00:11:20,319 Speaker 1: code Armstrong to get fifty dollars instantly after you play 208 00:11:20,360 --> 00:11:24,040 Speaker 1: your first five bucks. Again, that's automatic. Download PrizePicks. 209 00:11:24,120 --> 00:11:29,800 Speaker 1: Use that code Armstrong. PrizePicks: run your game. So finally, 210 00:11:29,840 --> 00:11:33,880 Speaker 1: this goes out to folks on both coasts and in 211 00:11:33,920 --> 00:11:36,199 Speaker 1: the center of the country, whoever's had to make an 212 00:11:36,200 --> 00:11:40,640 Speaker 1: insurance claim. Oh, that reminds me, later on in the show, 213 00:11:40,679 --> 00:11:42,440 Speaker 1: maybe in hour four... wait, what? 214 00:11:42,440 --> 00:11:43,120 Speaker 2: You do four hours? 215 00:11:43,200 --> 00:11:44,439 Speaker 1: Yes, we do. And if you can't get it, you 216 00:11:44,440 --> 00:11:47,320 Speaker 1: can grab it later via podcast, Armstrong and Getty on Demand. 217 00:11:47,480 --> 00:11:51,360 Speaker 1: You probably ought to subscribe.
Anyway, we're gonna have a 218 00:11:51,400 --> 00:11:55,720 Speaker 1: feature about California versus Florida on policy and homeowners insurance. 219 00:11:55,720 --> 00:12:00,000 Speaker 1: It is quite a striking difference. Anyway. Allstate Insurance 220 00:12:02,240 --> 00:12:07,480 Speaker 1: was using AI models to write responses to letters and 221 00:12:07,600 --> 00:12:11,200 Speaker 1: emails, because they get zillions of them from customers, 222 00:12:13,200 --> 00:12:17,120 Speaker 1: and Allstate is discovering these soulless generative AI models, 223 00:12:17,120 --> 00:12:20,359 Speaker 1: made up entirely of data and code, are more empathetic 224 00:12:20,679 --> 00:12:22,840 Speaker 1: than many of its human representatives. 225 00:12:22,960 --> 00:12:23,520 Speaker 2: Wow. 226 00:12:24,320 --> 00:12:26,880 Speaker 1: The insurer said that during the often frustrating back and 227 00:12:26,920 --> 00:12:29,880 Speaker 1: forth between customers and claim reps after a claim is filed, 228 00:12:30,120 --> 00:12:32,600 Speaker 1: nearly all of the Allstate emails are now generated 229 00:12:32,640 --> 00:12:35,320 Speaker 1: by AI, and that, as a result, they are less 230 00:12:35,360 --> 00:12:40,880 Speaker 1: accusatory and jargony and more empathetic than the humans. 231 00:12:41,840 --> 00:12:42,800 Speaker 2: That's interesting. 232 00:12:43,320 --> 00:12:48,280 Speaker 1: Allstate is using OpenAI's GPT model, for what it's worth. Quote, 233 00:12:48,360 --> 00:12:50,160 Speaker 1: when these emails used to go out, even though we 234 00:12:50,200 --> 00:12:52,520 Speaker 1: had standards and so on, they would include a lot 235 00:12:52,559 --> 00:12:56,280 Speaker 1: of insurance jargon. They weren't very empathetic, claims... claims 236 00:12:56,280 --> 00:12:59,160 Speaker 1: agents would get frustrated, and so it wasn't necessarily great 237 00:12:59,200 --> 00:13:01,960 Speaker 1: communications, said their chief information officer. 238 00:13:03,559 --> 00:13:06,960 Speaker 2: That's interesting. So have you found yourself doing this now 239 00:13:07,040 --> 00:13:10,160 Speaker 2: that Google has the little AI summary at the top 240 00:13:10,160 --> 00:13:13,199 Speaker 2: whenever you search anything? I've been using that a lot, 241 00:13:13,360 --> 00:13:16,120 Speaker 2: almost exclusively, and I don't scroll down 242 00:13:16,520 --> 00:13:18,679 Speaker 2: to check other things. I go with the little AI 243 00:13:18,800 --> 00:13:22,199 Speaker 2: summary and assume it's right. I guess. Far superior? 244 00:13:22,640 --> 00:13:24,640 Speaker 1: Yeah, but, you know, I'm not sure I'd say I 245 00:13:24,679 --> 00:13:28,160 Speaker 1: assume it's right, but it's... it's helpful. It's what I 246 00:13:28,200 --> 00:13:30,680 Speaker 1: was looking for way, way, way more often than the 247 00:13:31,000 --> 00:13:32,080 Speaker 1: useless just-Google results. 248 00:13:32,120 --> 00:13:35,160 Speaker 2: So yeah, it's succinct, and... yeah, that's something, and 249 00:13:35,200 --> 00:13:38,560 Speaker 2: it'll get a lot better, I assume. Also, with 250 00:13:38,600 --> 00:13:41,280 Speaker 2: all the dangers of it choosing what it wants you 251 00:13:41,320 --> 00:13:44,600 Speaker 2: to see inherent to the problem. We got a lot 252 00:13:44,640 --> 00:13:46,880 Speaker 2: more on the way. Stay here. 253 00:13:48,760 --> 00:13:50,440 Speaker 5: And when you're trying to make that decision, do I 254 00:13:50,480 --> 00:13:53,920 Speaker 5: feel safe flying?
And I do, very much so. There's 255 00:13:53,960 --> 00:13:57,160 Speaker 5: no trend. There's nothing that ties these accidents together. So, 256 00:13:57,520 --> 00:14:00,640 Speaker 5: using critical thinking, you can say for yourself, do all 257 00:14:00,640 --> 00:14:03,760 Speaker 5: of these things indicate some kind of trend in aircraft accidents? 258 00:14:03,960 --> 00:14:07,720 Speaker 5: And to me, as an experienced investigator and being involved 259 00:14:07,760 --> 00:14:10,520 Speaker 5: in aviation as a mechanic and as a safety inspector 260 00:14:10,520 --> 00:14:13,520 Speaker 5: for the FAA, I'm perfectly... I feel perfectly safe doing this. 261 00:14:13,559 --> 00:14:16,559 Speaker 5: I don't see anything tying these together that says, oh, 262 00:14:16,600 --> 00:14:19,160 Speaker 5: the FAA's bad or the system is a problem. So 263 00:14:19,200 --> 00:14:21,880 Speaker 5: you got that plane that landed in Canada upside down, 264 00:14:21,880 --> 00:14:22,800 Speaker 5: which it's not supposed to do. 265 00:14:22,840 --> 00:14:25,480 Speaker 2: Somebody should have told the pilot. And I had a 266 00:14:25,520 --> 00:14:27,520 Speaker 2: number of people in my real life, because this is 267 00:14:27,680 --> 00:14:30,480 Speaker 2: three in a couple of weeks, say, what's going on 268 00:14:30,560 --> 00:14:34,280 Speaker 2: with flying? And that guy's saying, nothing's going on with flying. 269 00:14:34,640 --> 00:14:35,520 Speaker 2: So there you go. 270 00:14:36,360 --> 00:14:39,000 Speaker 1: Yeah, yeah, that's a real test of our ability to 271 00:14:39,000 --> 00:14:42,440 Speaker 1: be rational as human beings. Since it just keeps popping up, 272 00:14:42,880 --> 00:14:44,960 Speaker 1: you start to think maybe something's going on. 273 00:14:45,440 --> 00:14:49,880 Speaker 2: Right. We mentioned this last week; a little follow-up 274 00:14:50,000 --> 00:14:53,000 Speaker 2: on the asteroid that might hit the Earth in twenty 275 00:14:53,080 --> 00:14:56,920 Speaker 2: thirty-two. So they have spotted an asteroid way out 276 00:14:56,920 --> 00:15:00,880 Speaker 2: there in outer space that is headed toward Earth, 277 00:15:00,960 --> 00:15:06,160 Speaker 2: but it's so far away that it's difficult to nail 278 00:15:06,200 --> 00:15:08,640 Speaker 2: down exactly where it's going to pass by. I mean, 279 00:15:08,880 --> 00:15:13,680 Speaker 2: just imagine: the tiniest bit of being off in measuring 280 00:15:13,720 --> 00:15:17,720 Speaker 2: it from that far away would, over that many gazillion 281 00:15:17,760 --> 00:15:20,280 Speaker 2: miles, turn into a pretty wide space. But it has 282 00:15:20,320 --> 00:15:20,680 Speaker 2: a watch. 283 00:15:20,720 --> 00:15:22,680 Speaker 1: Don't look up. I know how the process works. 284 00:15:22,760 --> 00:15:25,640 Speaker 2: It has a two percent chance of hitting Earth in 285 00:15:25,720 --> 00:15:27,760 Speaker 2: twenty thirty-two, and as you said the other day, 286 00:15:27,960 --> 00:15:31,720 Speaker 2: a one-in-fifty chance ain't nothing. One in fifty 287 00:15:31,880 --> 00:15:34,680 Speaker 2: of it hitting Earth. So this particular asteroid is 288 00:15:34,760 --> 00:15:38,920 Speaker 2: forty to ninety meters wide, they do know that, which 289 00:15:39,000 --> 00:15:40,960 Speaker 2: is, what, the length of a football field? 290 00:15:42,880 --> 00:15:43,680 Speaker 1: Yeah, roughly.
291 00:15:46,560 --> 00:15:51,200 Speaker 2: The thing that hit in the Yucatán Peninsula that wiped 292 00:15:51,240 --> 00:15:54,520 Speaker 2: out the dinosaurs was several kilometers, so it was much bigger 293 00:15:54,560 --> 00:15:57,360 Speaker 2: than that. So this isn't a wipes-out-Earth, wipes- 294 00:15:57,400 --> 00:16:00,320 Speaker 2: out-human-beings sort of asteroid. But it would 295 00:16:00,320 --> 00:16:01,960 Speaker 2: be a heck of a big deal wherever it hit, 296 00:16:03,040 --> 00:16:07,280 Speaker 2: a really big deal. And I don't know, do we currently 297 00:16:07,320 --> 00:16:11,200 Speaker 2: have the technology to intercept that if, as it 298 00:16:11,200 --> 00:16:13,800 Speaker 2: gets closer... because that's seven years from now. As it 299 00:16:13,800 --> 00:16:17,520 Speaker 2: gets closer, I'm sure the one in fifty will 300 00:16:17,520 --> 00:16:19,680 Speaker 2: get narrowed down. Just the math 301 00:16:19,720 --> 00:16:22,280 Speaker 2: will get easier. And if they can determine, look, we 302 00:16:22,360 --> 00:16:24,400 Speaker 2: got like an eighty percent chance it's going to hit Earth, 303 00:16:25,000 --> 00:16:28,280 Speaker 2: we got to intercept it, right, with our modern technology. 304 00:16:28,760 --> 00:16:31,440 Speaker 1: I would think we'd be able to nudge it successfully 305 00:16:31,520 --> 00:16:33,520 Speaker 1: enough that it would miss the Earth. But I've read 306 00:16:33,640 --> 00:16:36,480 Speaker 1: scientists talking about this. How, if it was, say, aimed 307 00:16:36,520 --> 00:16:40,160 Speaker 1: right in the middle of the Pacific Ocean, and 308 00:16:40,280 --> 00:16:42,160 Speaker 1: you came along as NASA and said, now we think 309 00:16:42,200 --> 00:16:43,840 Speaker 1: we can nudge it so it doesn't hit 310 00:16:43,840 --> 00:16:47,400 Speaker 1: the Earth completely, we'll be nudging it eastward a long way, 311 00:16:47,640 --> 00:16:50,840 Speaker 1: and all the countries eastward... not a long way, like, 312 00:16:50,880 --> 00:16:54,160 Speaker 1: you know, whatever. Actually, that would be the United States. 313 00:16:54,320 --> 00:16:55,720 Speaker 2: We'll say, nudge it into China. 314 00:16:56,920 --> 00:16:58,960 Speaker 1: Let's go with westward for the sake of the argument. 315 00:16:59,120 --> 00:17:00,880 Speaker 1: So we say, no, we're going to nudge it so far 316 00:17:00,960 --> 00:17:02,880 Speaker 1: west it doesn't even hit the Earth. And all the 317 00:17:02,920 --> 00:17:05,879 Speaker 1: countries of the west, China, India, you know, Russia, Europe, 318 00:17:05,880 --> 00:17:08,120 Speaker 1: are like, no, you're not nudging it west. How about 319 00:17:08,119 --> 00:17:08,760 Speaker 1: you nudge it east? 320 00:17:08,840 --> 00:17:11,920 Speaker 2: So China sends up their spaceship, we send up our spaceship, 321 00:17:11,920 --> 00:17:13,760 Speaker 2: and we're on both sides of the asteroid trying to 322 00:17:13,760 --> 00:17:14,800 Speaker 2: push it different directions. 323 00:17:15,119 --> 00:17:21,800 Speaker 1: Oh no, wrestling in space. It's gonna be an interesting... 324 00:17:21,840 --> 00:17:24,200 Speaker 1: meteor of death, come do your work. 325 00:17:24,520 --> 00:17:26,639 Speaker 2: And, what... how much time have I got, Michael? I've 326 00:17:26,640 --> 00:17:30,399 Speaker 2: got my biggest complaint of the weekend. Starbucks has changed 327 00:17:30,400 --> 00:17:33,280 Speaker 2: their lids, and I'm outraged.
They got rid of their 328 00:17:33,320 --> 00:17:35,760 Speaker 2: plastic lids to be good for the Earth, and they've 329 00:17:35,800 --> 00:17:38,760 Speaker 2: got these paper lids now that you can't put a 330 00:17:38,800 --> 00:17:40,359 Speaker 2: stopper in. You used to be able to put a 331 00:17:40,400 --> 00:17:42,159 Speaker 2: little plastic stopper in, so it could ride in your 332 00:17:42,160 --> 00:17:44,440 Speaker 2: car and it doesn't spill. You can't do that anymore 333 00:17:44,440 --> 00:17:49,200 Speaker 2: with their dumb paper lids. So that's the... I emailed 334 00:17:49,200 --> 00:17:51,359 Speaker 2: several people in Ukraine and told them how upset I was. 335 00:17:52,000 --> 00:17:54,199 Speaker 1: Have Trump step in. He got rid of the paper straws. 336 00:17:54,240 --> 00:17:56,160 Speaker 1: Maybe he can get rid of this garbage. Starbucks, 337 00:17:56,200 --> 00:17:57,760 Speaker 2: I hate your paper lids. You suck. 338 00:17:58,440 --> 00:18:00,800 Speaker 1: Make coffee great again, exactly. 339 00:18:02,280 --> 00:18:05,719 Speaker 2: Some really disturbing news about the budget so far this 340 00:18:05,760 --> 00:18:08,439 Speaker 2: fiscal year that everybody should be paying attention to, but 341 00:18:08,520 --> 00:18:10,760 Speaker 2: nobody is. Among other things we can talk about. I 342 00:18:10,800 --> 00:18:11,640 Speaker 2: hope you can stay 343 00:18:11,440 --> 00:18:18,680 Speaker 1: here. Armstrong and Getty. People are really scared. 344 00:18:19,240 --> 00:18:22,200 Speaker 6: I think that, you know, twelve days ago, people knew 345 00:18:22,200 --> 00:18:24,680 Speaker 6: where their next paycheck was coming from. They knew how 346 00:18:24,680 --> 00:18:26,400 Speaker 6: they were going to pay for their kids' day care, 347 00:18:27,000 --> 00:18:31,280 Speaker 6: their medical bills, and then all gone overnight. 348 00:18:31,600 --> 00:18:34,000 Speaker 2: I'm trying to make it clear that I'm not happy 349 00:18:34,200 --> 00:18:37,399 Speaker 2: that this person is wondering where they're going to get 350 00:18:37,400 --> 00:18:39,760 Speaker 2: their next paycheck or how they're going to survive, because, 351 00:18:39,800 --> 00:18:41,840 Speaker 2: you know, most of us have been there at 352 00:18:41,840 --> 00:18:44,200 Speaker 2: some point in our lives. But the fact that these 353 00:18:44,240 --> 00:18:48,480 Speaker 2: government workers are unaware that the private sector lives this 354 00:18:48,520 --> 00:18:51,119 Speaker 2: way all the time is amazing to me. 355 00:18:51,720 --> 00:18:53,720 Speaker 1: Yeah, they're in a different kind of bubble than the 356 00:18:53,760 --> 00:18:55,560 Speaker 1: one we usually talk about. I would call it an 357 00:18:55,600 --> 00:18:59,960 Speaker 1: employment reality bubble. They don't understand that the vast majority of 358 00:19:00,080 --> 00:19:04,359 Speaker 1: the world strikes that bargain: yeah, our production, our skill, 359 00:19:04,440 --> 00:19:07,119 Speaker 1: our value to the company will be high enough that 360 00:19:07,160 --> 00:19:09,680 Speaker 1: they keep us, until it's not. Then we got to 361 00:19:09,680 --> 00:19:10,440 Speaker 1: find another gig. 362 00:19:11,800 --> 00:19:16,320 Speaker 2: So let's play the intro from Sixty Minutes on this 363 00:19:16,440 --> 00:19:19,760 Speaker 2: story last night. This was their first story about DOGE 364 00:19:19,800 --> 00:19:22,360 Speaker 2: cutting USAID and all that sort of stuff yesterday.
365 00:19:22,080 --> 00:19:27,400 Speaker 1: And enjoy the foreboding language used by Scott Pelley. 366 00:19:27,440 --> 00:19:30,720 Speaker 7: It's too soon to tell how serious President Trump is 367 00:19:30,800 --> 00:19:35,200 Speaker 7: in defiance of the Constitution. In his first twenty-eight days, 368 00:19:35,240 --> 00:19:39,080 Speaker 7: he signed an order to nullify birthright citizenship for some, 369 00:19:39,800 --> 00:19:43,720 Speaker 7: a right guaranteed by the Fourteenth Amendment, and he has 370 00:19:43,840 --> 00:19:49,080 Speaker 7: closed agencies and frozen spending that Congress mandated by law. 371 00:19:49,920 --> 00:19:52,960 Speaker 7: Lower courts are holding up many of the president's priorities, 372 00:19:53,280 --> 00:19:56,240 Speaker 7: but nothing has risen to the Supreme Court, where these 373 00:19:56,280 --> 00:20:01,600 Speaker 7: battles over presidential power could rewrite history. Presidents often push 374 00:20:01,720 --> 00:20:06,680 Speaker 7: limits, FDR's New Deal, for example, and voters in this 375 00:20:06,760 --> 00:20:11,159 Speaker 7: last election wanted change, but the scope and speed of 376 00:20:11,200 --> 00:20:16,359 Speaker 7: Trump's reach for power may be unprecedented. One example is 377 00:20:16,359 --> 00:20:20,560 Speaker 7: a sixty-three-year-old agency created by Congress, codified 378 00:20:20,720 --> 00:20:25,439 Speaker 7: in law, and eviscerated by Trump in a matter of days. 379 00:20:26,480 --> 00:20:31,040 Speaker 2: Oh, it's an interesting reach for power that you're trying 380 00:20:31,080 --> 00:20:32,200 Speaker 2: to shrink the government. 381 00:20:32,880 --> 00:20:36,280 Speaker 1: Oh yeah, yeah. And the idea that the Fourteenth Amendment 382 00:20:36,320 --> 00:20:39,240 Speaker 1: thing is clearly unconstitutional, and he knows it and everybody 383 00:20:39,280 --> 00:20:43,159 Speaker 1: knows it. And how shutting down USAID temporarily while you 384 00:20:43,240 --> 00:20:46,840 Speaker 1: reassess how it projects American power abroad is somehow just 385 00:20:46,880 --> 00:20:50,639 Speaker 1: an unforgivable and horrific sin against humanity. And it's just 386 00:20:50,720 --> 00:20:53,959 Speaker 1: so intellectually dishonest. But again, I've given up on network 387 00:20:54,000 --> 00:20:55,520 Speaker 1: news and CBS. It's a joke. 388 00:20:55,720 --> 00:20:58,360 Speaker 2: Like I always say, I almost never miss an episode 389 00:20:58,359 --> 00:21:00,480 Speaker 2: of Sixty Minutes. But I must have missed the one 390 00:21:01,040 --> 00:21:04,639 Speaker 2: in which Scott Pelley said, no one's exactly sure how 391 00:21:04,720 --> 00:21:07,879 Speaker 2: much Joe Biden's willing to challenge the Constitution, but the 392 00:21:07,920 --> 00:21:12,560 Speaker 2: Supreme Court ruled he can't do away with student loans; however, 393 00:21:12,680 --> 00:21:15,760 Speaker 2: he seems hell-bent on doing it despite blah blah blah. 394 00:21:15,880 --> 00:21:17,360 Speaker 2: I must have missed that episode. 395 00:21:17,640 --> 00:21:20,840 Speaker 1: In fact, he said the Supreme Court was illegitimate for 396 00:21:21,119 --> 00:21:24,840 Speaker 1: enforcing the Constitution and vowed that he would do it anyway.
397 00:21:25,200 --> 00:21:29,240 Speaker 1: No less an unprecedented reach for power... no less an authority 398 00:21:29,280 --> 00:21:32,560 Speaker 1: than Nancy Pelosi has said multiple times the President doesn't 399 00:21:32,560 --> 00:21:36,080 Speaker 1: have the authority to cancel student loans, but he's doing 400 00:21:36,119 --> 00:21:36,720 Speaker 1: it anyway. 401 00:21:36,920 --> 00:21:38,439 Speaker 2: Yeah, I must have missed that episode. 402 00:21:38,920 --> 00:21:42,040 Speaker 1: They're partisan hacks anyway. Forget about their past as 403 00:21:42,280 --> 00:21:43,160 Speaker 1: serious journalists. 404 00:21:43,160 --> 00:21:45,199 Speaker 2: It's gone. I wanted to hear more from this poor 405 00:21:46,200 --> 00:21:49,919 Speaker 2: Christina girl who got fired at her government job the 406 00:21:49,960 --> 00:21:53,320 Speaker 1: other day and they had to leave the building. 407 00:21:53,680 --> 00:21:57,040 Speaker 6: And these are folks who had decades and decades of 408 00:21:57,040 --> 00:22:02,680 Speaker 6: public service, serving USAID across administrations, from, you know, George 409 00:22:02,680 --> 00:22:05,560 Speaker 6: Bush to Obama to the first Trump administration, and they 410 00:22:06,440 --> 00:22:08,080 Speaker 6: were never able to walk back in the building again. 411 00:22:08,960 --> 00:22:13,480 Speaker 2: No way. You'd been there for decades doing a good job. 412 00:22:13,760 --> 00:22:16,680 Speaker 2: You got fired and you weren't let back in the building. 413 00:22:16,880 --> 00:22:18,520 Speaker 2: I've never heard of that happening before. 414 00:22:19,720 --> 00:22:22,280 Speaker 1: Thank you for describing precisely how it's going to be 415 00:22:22,560 --> 00:22:27,320 Speaker 1: as wasteful, out of control, ridiculous government programs get reined in. 416 00:22:27,680 --> 00:22:29,760 Speaker 1: That is what it's going to look like. Yeah, and 417 00:22:29,800 --> 00:22:32,320 Speaker 1: it's unfortunate, but it must be done. Have you seen 418 00:22:32,359 --> 00:22:33,880 Speaker 1: the deficit lately, sweetheart? 419 00:22:33,960 --> 00:22:38,359 Speaker 2: Speaking of that: alarming report shows US drowning in red ink. 420 00:22:39,600 --> 00:22:42,879 Speaker 2: The New York Post threw in "as Democrats block spending cuts," 421 00:22:42,920 --> 00:22:47,320 Speaker 2: which is perfectly fine. So this is the first four 422 00:22:47,359 --> 00:22:51,560 Speaker 2: months of the fiscal year. The federal government has a, 423 00:22:51,960 --> 00:22:54,800 Speaker 2: what they call mind-numbing, and it is, eight hundred 424 00:22:54,840 --> 00:23:00,439 Speaker 2: and thirty-eight billion dollar shortfall four months into the 425 00:23:00,440 --> 00:23:01,120 Speaker 2: fiscal year. 426 00:23:02,119 --> 00:23:02,639 Speaker 1: Wow. 427 00:23:03,920 --> 00:23:07,359 Speaker 2: Per the nonpartisan Congressional Budget Office, that eight hundred 428 00:23:07,359 --> 00:23:10,480 Speaker 2: and thirty-eight billion dollar hole is fifteen percent higher 429 00:23:10,520 --> 00:23:16,119 Speaker 2: than last year's first four-month gap, which was horrifying 430 00:23:16,160 --> 00:23:21,520 Speaker 2: and talked about then. But it's fifteen percent more now. 431 00:23:22,680 --> 00:23:26,520 Speaker 2: Last month, the CBO projected nearly two trillion dollars 432 00:23:26,560 --> 00:23:30,480 Speaker 2: in overspending this year.
I remember the first time we 433 00:23:30,520 --> 00:23:32,479 Speaker 2: spent more than a trillion. Was that the beginning of 434 00:23:32,520 --> 00:23:35,440 Speaker 2: COVID, or was that oh-eight, when we had the big 435 00:23:35,480 --> 00:23:39,159 Speaker 2: financial meltdown? But anyway, the first time we spent... we had 436 00:23:39,440 --> 00:23:43,320 Speaker 2: a yearly budget deficit of a trillion, it was just, 437 00:23:43,440 --> 00:23:46,800 Speaker 2: oh my god, we can't... blah. Now it's two trillion regularly. 438 00:23:47,040 --> 00:23:49,200 Speaker 2: That's just what we do now. That's... that's just where we... 439 00:23:49,160 --> 00:23:51,359 Speaker 1: In peacetime, with a strong economy, that's correct. 440 00:23:51,440 --> 00:23:53,720 Speaker 2: Yeah, yeah. That would be about six point two percent 441 00:23:53,800 --> 00:23:58,560 Speaker 2: of gross domestic product. It was three point eight percent 442 00:23:58,720 --> 00:24:04,400 Speaker 2: on average over the last fifty years. Now it's six 443 00:24:04,440 --> 00:24:07,760 Speaker 2: point two percent. That would be added, of course, to 444 00:24:07,840 --> 00:24:16,080 Speaker 2: our current thirty-six trillion dollar debt. And in just 445 00:24:16,160 --> 00:24:18,120 Speaker 2: a couple of years, the nation's debt... this has been 446 00:24:18,160 --> 00:24:20,280 Speaker 2: threatened for a long time. The math is simple. It's 447 00:24:20,280 --> 00:24:22,760 Speaker 2: out there on the horizon, headed our direction. I mean, 448 00:24:22,760 --> 00:24:26,360 Speaker 2: it's as inevitable as the sunrise, but we all 449 00:24:26,359 --> 00:24:28,920 Speaker 2: pretend it's not or something. In just a few years, 450 00:24:28,920 --> 00:24:31,680 Speaker 2: the nation's debt-to-GDP ratio will exceed its World 451 00:24:31,760 --> 00:24:35,840 Speaker 2: War Two peak, which is amazing in peacetime. What 452 00:24:35,880 --> 00:24:40,320 Speaker 2: are we gonna do if there is a similar catastrophe, 453 00:24:40,359 --> 00:24:42,720 Speaker 2: like World War Two, which could easily happen with China? 454 00:24:43,080 --> 00:24:46,679 Speaker 2: Could easily happen. Sure, a combination of China and Russia, and 455 00:24:46,720 --> 00:24:49,520 Speaker 2: we and Europe are fighting them; absolutely could happen. But 456 00:24:49,600 --> 00:24:53,640 Speaker 2: we're already spending like we did during World War Two. 457 00:24:53,760 --> 00:24:56,240 Speaker 2: What happens then? Well, maybe you lose is what happens. 458 00:24:56,760 --> 00:25:00,760 Speaker 1: I just like how any cut is identified as a horror, 459 00:25:00,960 --> 00:25:05,239 Speaker 1: particularly by the left these days in the media, because it 460 00:25:05,320 --> 00:25:12,200 Speaker 1: will take away the... the... the excuse used for spending. 461 00:25:12,240 --> 00:25:17,399 Speaker 1: It will be, "This is money budgeted for 462 00:25:17,520 --> 00:25:21,919 Speaker 1: starving pregnant women, so you're making pregnant women starve." Here's the 463 00:25:21,960 --> 00:25:28,840 Speaker 1: deal with government spending. In case you needed a group of people 464 00:25:28,840 --> 00:25:31,120 Speaker 1: that people could really have some compassion for, I think 465 00:25:31,160 --> 00:25:35,720 Speaker 1: you chose all right there, exactly. So they spend zillions of 466 00:25:35,720 --> 00:25:38,439 Speaker 1: dollars on God knows what, but they call it spending 467 00:23:38,440 --> 00:27:41,240 Speaker 1: for starving pregnant women.
Therefore, when you try to cut it, 468 00:27:41,320 --> 00:27:44,840 Speaker 1: having identified it as wasteful and ridiculous, and one penny 469 00:27:44,840 --> 00:27:47,800 Speaker 1: of every dollar goes to a hungry pregnant gal, they 470 00:27:48,200 --> 00:27:51,480 Speaker 1: screech about it, the scam being that every dollar that's 471 00:27:51,480 --> 00:27:54,760 Speaker 1: spent by government is given a noble-sounding name. There's 472 00:27:54,840 --> 00:27:58,000 Speaker 1: no... there's no Department of Flushing Money Down the Toilet, 473 00:27:58,359 --> 00:28:02,160 Speaker 1: even though, you know... actually, more appropriately, it would 474 00:28:02,160 --> 00:28:05,080 Speaker 1: be the Department of Handing Out Enormous Gobs of Cash 475 00:28:05,080 --> 00:28:09,600 Speaker 1: to Our Cronies. Nobody actually calls itself that, but that's 476 00:28:09,440 --> 00:28:11,399 Speaker 2: what it is. Well, as I've heard a number of 477 00:28:11,400 --> 00:28:16,080 Speaker 2: people point out, there should be more asking Democrats, what 478 00:28:16,280 --> 00:28:18,800 Speaker 2: would you cut? I assume you believe we've got to 479 00:28:18,840 --> 00:28:21,280 Speaker 2: cut spending, since we're thirty-six trillion dollars in debt; what 480 00:28:21,560 --> 00:28:23,080 Speaker 2: would you cut? Of course, the answer is always going 481 00:28:23,160 --> 00:28:25,320 Speaker 2: to be raise taxes. The billionaires aren't paying their fair share, 482 00:28:25,480 --> 00:28:26,439 Speaker 2: and that's where you go with that. 483 00:28:26,840 --> 00:28:28,560 Speaker 1: The rich, they need to pay their fair share. Yeah. 484 00:28:28,840 --> 00:28:32,520 Speaker 1: Interestingly enough, I thought this was a great perspective written 485 00:28:32,520 --> 00:28:37,399 Speaker 1: by a fellow by the name of Eugene Kontorovich. Kontorovich... 486 00:28:37,760 --> 00:28:41,040 Speaker 1: how many o's are enough, Eugene? Anyway, he's talking about 487 00:28:41,040 --> 00:28:44,400 Speaker 1: the billions and billions of dollars that the US spends 488 00:28:44,480 --> 00:28:48,840 Speaker 1: through the UN for what he refers to as 489 00:28:48,880 --> 00:28:53,280 Speaker 1: the global deep state, in the Wall Street Journal. This 490 00:28:53,359 --> 00:28:56,520 Speaker 1: is not some sort of fevered website or whatever. He mentions, 491 00:28:56,520 --> 00:28:58,959 Speaker 1: in fiscal twenty-two, the US government provided more than 492 00:28:59,000 --> 00:29:01,960 Speaker 1: twenty-one billion dollars to one hundred and seventy-nine 493 00:29:02,040 --> 00:29:06,040 Speaker 1: international organizations and multilateral entities. It's four hundred and fifty 494 00:29:06,080 --> 00:29:09,040 Speaker 1: pages of a State Department report just listing them. And 495 00:29:09,080 --> 00:29:12,600 Speaker 1: that's on top of the direct foreign aid that went 496 00:29:12,640 --> 00:29:16,520 Speaker 1: to radical progressive causes via the USAID that everybody's been 497 00:29:16,560 --> 00:29:18,959 Speaker 1: talking about. So this is twenty-one billion dollars on 498 00:29:19,080 --> 00:29:22,200 Speaker 1: top of all that. Even the most... and his headline, 499 00:29:22,320 --> 00:29:26,159 Speaker 1: which is worth mentioning, is: now, let's defund the UN. 500 00:29:27,480 --> 00:29:30,399 Speaker 1: But he says even the most innocuous-sounding international organizations 501 00:29:30,400 --> 00:29:36,240 Speaker 1: have institutionalized woke ideology.
Nearly every UN-affiliated organization seeks 502 00:29:36,280 --> 00:29:39,800 Speaker 1: to make climate change, of course, and/or gender issues, 503 00:29:39,880 --> 00:29:44,200 Speaker 1: including transgenderism, an integral part of their work. Not "if 504 00:29:44,200 --> 00:29:46,800 Speaker 1: they get around to it"; it should be all about 505 00:29:46,800 --> 00:29:52,560 Speaker 1: that. DEI offices abound. The International Organization for Migration lists 506 00:29:52,600 --> 00:29:58,600 Speaker 1: among its central areas of activity gender equality, environmental sustainability, 507 00:29:58,800 --> 00:30:05,000 Speaker 1: and reducing global inequalities. The UN Commission on Human Rights 508 00:30:05,040 --> 00:30:09,040 Speaker 1: promotes a variety of transgender propaganda programs, such as helping 509 00:30:09,080 --> 00:30:13,680 Speaker 1: Nepalese LGBTQIP-plus-minus-to-the-power-of-three writers tell 510 00:30:13,720 --> 00:30:17,440 Speaker 1: their own story. Then, you know, you can get into 511 00:30:17,480 --> 00:30:19,840 Speaker 1: abortion and all sorts of stuff like that. But yeah, 512 00:30:20,000 --> 00:30:25,320 Speaker 1: virtually everything the UN does, as he said, sounds 513 00:30:25,400 --> 00:30:32,480 Speaker 1: like a taxpayer-funded sixteen nineteen project. Defund the UN: 514 00:30:32,600 --> 00:30:37,520 Speaker 1: that's the next step, I wonder. I wonder, because Trump 515 00:30:37,560 --> 00:30:40,840 Speaker 1: and company have made threatening noises about that. I would 516 00:30:40,920 --> 00:30:44,000 Speaker 1: love to see him send shockwaves, like the ridiculous 517 00:30:44,080 --> 00:30:47,880 Speaker 1: Gaza-a-Lago plan: we'll take over Gaza and kick all 518 00:30:47,880 --> 00:30:49,840 Speaker 1: the people out and turn it into a resort. It 519 00:30:49,920 --> 00:30:52,040 Speaker 1: mobilized the Arab world to say, eh, eh, 520 00:30:52,400 --> 00:30:54,280 Speaker 1: wait a minute, maybe we can find a real solution 521 00:30:54,360 --> 00:30:56,360 Speaker 1: to this. I would love to say, no, we're not 522 00:30:56,400 --> 00:30:59,719 Speaker 1: going to fund the UN anymore at all, beginning next week, 523 00:31:00,200 --> 00:31:04,400 Speaker 1: and see what reforms suddenly shake out. 524 00:31:04,440 --> 00:31:07,920 Speaker 2: Well, one of Elon's tweets over the weekend about things he's 525 00:31:07,920 --> 00:31:11,680 Speaker 2: cutting... this is one of the... this is an interesting 526 00:31:11,720 --> 00:31:14,400 Speaker 2: thing that has happened. Of course, one of the great 527 00:31:14,400 --> 00:31:17,680 Speaker 2: things about Elon running Twitter and being in charge 528 00:31:17,680 --> 00:31:19,480 Speaker 2: of this is how he can get the news out 529 00:31:19,840 --> 00:31:23,920 Speaker 2: in a way that some retired senator you've 530 00:31:23,960 --> 00:31:27,200 Speaker 2: never heard of, part of some blue-ribbon commission... nobody'd 531 00:31:27,200 --> 00:31:29,880 Speaker 2: be paying any attention to him whatsoever. But Elon tweeted 532 00:31:29,920 --> 00:31:33,400 Speaker 2: out "funding for racist baby training has been canceled," and 533 00:31:33,480 --> 00:31:38,080 Speaker 2: it's got the list of how they, uh, how you 534 00:31:38,120 --> 00:31:42,000 Speaker 2: can train your kid to not be racist starting at 535 00:31:42,040 --> 00:31:45,600 Speaker 2: three months old.
Babies are born racist and you have 536 00:29:45,680 --> 00:29:48,000 Speaker 2: to train it out of them starting at three months. 537 00:29:48,040 --> 00:29:51,040 Speaker 2: And it actually has the information here: what you do 538 00:29:51,120 --> 00:29:53,000 Speaker 2: for your baby at three months, at nine months, at 539 00:29:53,000 --> 00:29:54,880 Speaker 2: two years, at three years, and at five years to 540 00:29:54,920 --> 00:29:56,480 Speaker 2: make sure your kid is not racist. 541 00:29:57,880 --> 00:30:01,360 Speaker 1: So babies have a natural predilection to respond more favorably 542 00:30:01,400 --> 00:30:04,080 Speaker 1: to faces that look like their own. Right, the way 543 00:30:04,200 --> 00:30:06,720 Speaker 1: we were made by God or nature, and you can't train 544 00:30:06,760 --> 00:30:07,840 Speaker 1: them out of it, according to 545 00:30:07,640 --> 00:30:11,240 Speaker 2: your way of presenting it. Or, babies are born racist. 546 00:30:13,800 --> 00:30:16,560 Speaker 2: You people are effing nuts. How... but not a lot 547 00:30:16,600 --> 00:30:18,640 Speaker 2: of you are spending your time trying to train your 548 00:30:18,640 --> 00:30:21,400 Speaker 2: baby out of racism at three months. There's a lot 549 00:30:21,440 --> 00:30:22,760 Speaker 2: of other things you should do for your baby. 550 00:30:22,760 --> 00:30:26,840 Speaker 1: But all right, you know how every single couple in 551 00:30:26,920 --> 00:30:30,880 Speaker 1: advertising is multiracial? Sure. Every single family. 552 00:30:30,720 --> 00:30:34,600 Speaker 2: Even though you know practically none in your real life, right, exactly. 553 00:30:34,640 --> 00:30:39,400 Speaker 1: The actual statistics on that are really interesting. Oh really? 554 00:30:39,440 --> 00:30:42,160 Speaker 1: I'd like to hear that. And highly politically incorrect. It'll 555 00:30:42,160 --> 00:30:44,680 Speaker 1: probably end our career to even observe the truths. You 556 00:30:44,720 --> 00:30:46,640 Speaker 1: can't observe the truth, like we were talking about in 557 00:30:46,680 --> 00:30:49,280 Speaker 1: Germany before. If you insult anybody or hurt their feelings, 558 00:30:49,280 --> 00:30:52,880 Speaker 1: you're a criminal. I test drove the Tesla Cyber- 559 00:30:53,080 --> 00:30:53,960 Speaker 1: truck the other day. 560 00:30:54,000 --> 00:30:57,560 Speaker 2: I wanted to talk about that, because it's a controversial vehicle. Also, 561 00:30:57,640 --> 00:30:59,280 Speaker 2: my kids say I have a new dent in my 562 00:30:59,360 --> 00:31:03,280 Speaker 2: head that I should be aware of. I'm a shaven- 563 00:31:03,400 --> 00:31:05,480 Speaker 2: headed man, if you've never seen me before, and you 564 00:31:05,480 --> 00:31:08,840 Speaker 2: can see my entire spherical head. My kids say 565 00:31:08,880 --> 00:31:11,880 Speaker 2: I have a new dent. I should go to a phrenologist. 566 00:31:12,000 --> 00:31:13,320 Speaker 2: I don't know who you see about 567 00:31:13,040 --> 00:31:19,400 Speaker 1: that... a phrenologist? Because I don't think that's a thing. I thought, well, yes, 568 00:31:19,560 --> 00:31:21,440 Speaker 1: but I'm not sure it's a legitimate science. 569 00:31:21,640 --> 00:31:23,720 Speaker 2: Will insurance cover it? Anyway, we got a lot more 570 00:31:23,720 --> 00:31:25,880 Speaker 2: on the way. Stay here. 571 00:31:28,760 --> 00:31:31,440 Speaker 1: Cheeburger, cheeburger, cheeburger. 572 00:31:31,280 --> 00:31:34,880 Speaker 2: Cheeburger, right, yes, right from the seventies.
No, I misplaced fourteen 573 00:32:34,920 --> 00:32:39,200 Speaker 2: burger... well, Drunk Uncle. I'm sorry, but I think 574 00:32:39,200 --> 00:32:40,320 Speaker 2: he might be too drunk. 575 00:32:40,600 --> 00:32:45,880 Speaker 1: Yeah, "I tried to tackle Paul Simon, but I missed. And 576 00:32:45,960 --> 00:32:49,200 Speaker 1: now Sabrina Carpenter's dad drunk. 577 00:32:49,280 --> 00:32:54,719 Speaker 2: I'm going girl at a party, everyone." So, I'm, uh... 578 00:32:55,200 --> 00:32:57,440 Speaker 2: we're big fans of Drunk Uncle at my household, so 579 00:32:57,440 --> 00:32:59,520 Speaker 2: I liked seeing him get a spot on the 580 00:32:59,520 --> 00:33:02,800 Speaker 2: fifty-year anniversary. There's so many Drunk Uncle jokes. My favorite part, though, 581 00:33:02,800 --> 00:33:04,920 Speaker 2: is he looks down at his drink and says, "Not 582 00:33:05,080 --> 00:33:08,120 Speaker 2: my Captain America." 583 00:33:08,680 --> 00:33:14,120 Speaker 1: Wow, very funny. I'm not so obsessively political I can't 584 00:33:14,240 --> 00:33:16,960 Speaker 1: enjoy that. But how about... let's do a little "drunk 585 00:33:17,400 --> 00:33:21,720 Speaker 1: kid home from college suddenly knows everything" character. Wouldn't 586 00:33:21,760 --> 00:33:22,600 Speaker 1: that be charming? 587 00:33:23,280 --> 00:33:25,040 Speaker 2: I think they've had that over the years a few times. 588 00:33:25,120 --> 00:33:25,760 Speaker 1: Oh, good, good. 589 00:33:29,080 --> 00:33:32,640 Speaker 2: One international update that's not exactly funny, so rough transition. 590 00:33:32,680 --> 00:33:34,520 Speaker 2: Two quick things. So they got the big meeting in 591 00:33:34,520 --> 00:33:38,520 Speaker 2: Saudi Arabia right now, Marco Rubio, our Secretary of State, 592 00:33:38,800 --> 00:33:41,960 Speaker 2: meeting with Lavrov, basically the Secretary of State of Russia, 593 00:33:42,120 --> 00:33:44,120 Speaker 2: trying to work out a deal or start the talks 594 00:33:44,200 --> 00:33:48,200 Speaker 2: or whatever. Zelensky isn't there. Anyway, this breaking news. All 595 00:33:48,240 --> 00:33:51,160 Speaker 2: you need to know about this is the United States 596 00:33:51,240 --> 00:33:56,800 Speaker 2: asked Russia for a moratorium on strikes against Ukraine's energy infrastructure. 597 00:33:57,160 --> 00:34:00,960 Speaker 2: Lavrov rejected that. He said, because Russia has never 598 00:34:01,000 --> 00:34:03,560 Speaker 2: targeted Ukrainian energy infrastructure in the first place. 599 00:34:03,880 --> 00:34:05,680 Speaker 1: Ah, that's a good one, Sergey. 600 00:34:06,160 --> 00:34:08,480 Speaker 2: That's not a good start to the talks if you're 601 00:34:08,520 --> 00:34:13,120 Speaker 2: just going to deny all reality. And then this horrific 602 00:34:13,360 --> 00:34:16,960 Speaker 2: news that came out. Hamas says the youngest hostages, ages 603 00:34:17,000 --> 00:34:19,520 Speaker 2: two and five, and their mom are dead, and their 604 00:34:19,560 --> 00:34:22,240 Speaker 2: bodies will be sent back to Israel this week. So 605 00:34:22,320 --> 00:34:24,680 Speaker 2: that... one of the reasons Israel has been holding out 606 00:34:24,680 --> 00:34:26,840 Speaker 2: this whole time and putting up with a lot of 607 00:34:26,920 --> 00:34:29,480 Speaker 2: crap from Hamas is hoping to get back those two 608 00:34:29,480 --> 00:34:31,960 Speaker 2: little kids and their mom. But they're all dead, probably 609 00:34:32,000 --> 00:34:35,080 Speaker 2: have been for a while. How did those little kids die, Hamas?
610 00:33:35,280 --> 00:33:36,040 Speaker 2: What happened there? 611 00:33:37,200 --> 00:33:42,800 Speaker 1: Yeah, no kidding. Monsters. So, a complete change of topic here. 612 00:33:43,160 --> 00:33:45,560 Speaker 1: I said we would do this, so let's squeeze it in. 613 00:33:45,680 --> 00:33:50,280 Speaker 1: I found this very interesting. It's statistics on spouses' races 614 00:33:50,320 --> 00:33:55,440 Speaker 1: for newlyweds. It goes through... it's not up to date, 615 00:33:55,560 --> 00:33:58,200 Speaker 1: so I suspect some of these numbers have changed a 616 00:33:58,360 --> 00:34:02,480 Speaker 1: little bit, because interracial marriage has become more common in 617 00:34:02,480 --> 00:34:04,760 Speaker 1: the last four or five years, which aren't covered. 618 00:34:05,120 --> 00:34:06,400 Speaker 1: And I will tell you this just for the sake 619 00:34:06,440 --> 00:34:08,000 Speaker 1: of the argument: I don't care who you marry. I 620 00:34:08,000 --> 00:34:09,600 Speaker 1: hope you care for each other and are good for 621 00:34:09,640 --> 00:34:11,480 Speaker 1: each other, and if you have kids, you raise them 622 00:34:11,480 --> 00:34:13,640 Speaker 1: the best you can and you have a happy life together. 623 00:34:13,800 --> 00:34:16,440 Speaker 2: I couldn't give a crap. I don't care either. I 624 00:34:16,480 --> 00:34:18,480 Speaker 2: just think it's interesting human nature that it tends not 625 00:34:18,560 --> 00:34:19,359 Speaker 2: to happen. 626 00:34:20,040 --> 00:34:23,600 Speaker 1: Right, indeed. And to deny that that's okay, or natural, 627 00:34:23,760 --> 00:34:26,239 Speaker 1: or birds of a feather flock together, is just... 628 00:34:27,000 --> 00:34:29,360 Speaker 1: it shows the depth of how unrealistic the DEI crowd is. 629 00:34:29,360 --> 00:34:33,600 Speaker 1: Of course, they're... they're Marxists. And also, it's not 630 00:34:33,680 --> 00:34:36,280 Speaker 1: just us who are annoyed by, or have noticed, 631 00:34:36,280 --> 00:34:41,360 Speaker 1: that every couple portrayed on TV, especially in ads, 632 00:34:41,600 --> 00:34:45,160 Speaker 1: has to be multiracial now. I get they think, you know, 633 00:34:45,320 --> 00:34:47,040 Speaker 1: we got to show a black person, so a black 634 00:34:47,040 --> 00:34:50,040 Speaker 1: person will want to buy this brand of wheelbarrow or 635 00:34:50,040 --> 00:34:54,160 Speaker 1: a car or cereal too, which I think is probably 636 00:34:54,200 --> 00:34:57,840 Speaker 1: not necessary. But anyway, I thought this is interesting. The 637 00:34:57,920 --> 00:35:02,120 Speaker 1: most recent statistics they have: white males marry white females 638 00:35:02,840 --> 00:35:05,600 Speaker 1: zero point eight nine eight of 639 00:35,640 --> 00:35:09,799 Speaker 1: the time, so ninety percent of the time. It's slightly... 640 00:35:10,680 --> 00:35:13,759 Speaker 1: slightly more, ninety and a half percent, that white 641 00:35:13,760 --> 00:35:17,040 Speaker 1: females marry white males. Then there are a mix of 642 00:35:17,120 --> 00:35:21,080 Speaker 1: other ethnicities they might marry, and this chart is 643 00:35:21,200 --> 00:35:23,440 Speaker 1: quite interesting. It's a little challenging to read, but it 644 00:35:23,440 --> 00:35:30,600 Speaker 1: has Black, Cambodian, Chinese, Filipino, Hispanic, Hmong, Indian, Japanese, Korean, Laotian, other, Pakistani, Vietnamese, 645 00:35:30,600 --> 00:35:34,200 Speaker 1: all sorts of different things. Blah blah blah.
One interesting 646 00:35:34,400 --> 00:35:38,000 Speaker 1: departure from the tendency, and this is... well, I don't 647 00:35:38,040 --> 00:35:41,560 Speaker 1: know what to make of it... is Japanese females marry 648 00:35:41,680 --> 00:35:44,080 Speaker 1: white guys forty-one percent of the time. 649 00:35:44,680 --> 00:35:47,560 Speaker 2: But how often do black... 650 00:35:47,320 --> 00:35:51,080 Speaker 1: Filipinos, it's about forty-four percent. Of course, you know, 651 00:35:51,480 --> 00:35:52,759 Speaker 1: the whole what's white, what's not... 652 00:35:52,840 --> 00:35:55,400 Speaker 2: How often do you have a black-white marriage, as 653 00:35:55,520 --> 00:35:57,239 Speaker 2: is portrayed in ads, like it's half the 654 00:35:57,280 --> 00:36:01,480 Speaker 1: time? Among black females, it appears to be about 655 00:36:01,719 --> 00:36:06,600 Speaker 1: six and a half percent. Black males marry white women 656 00:36:06,680 --> 00:36:08,400 Speaker 1: about fourteen 657 00:36:07,960 --> 00:36:12,200 Speaker 2: percent. But meanwhile, in detergent commercials, it's more 658 00:36:12,280 --> 00:36:13,440 Speaker 2: like ninety-five percent. 659 00:36:13,920 --> 00:36:14,800 Speaker 1: Yeah, that's correct. 660 00:36:15,000 --> 00:36:17,160 Speaker 2: For some reason. We do four hours. If you miss 661 00:36:17,160 --> 00:36:19,760 Speaker 2: an hour, get the podcast, Armstrong and Getty on Demand. 662 00:36:22,719 --> 00:36:23,920 Speaker 1: Armstrong and Getty