Speaker 1: Broadcasting live from the Abraham Lincoln Radio Studio, the George Washington Broadcast Center, Jack Armstrong and Joe Getty.
Speaker 2: Armstrong and Getty... and now... the Armstrong and Getty war rages on.
Speaker 1: We've got more about that coming up this hour. We'll talk a little bit about the whole Anthropic Pentagon thing, which is super interesting.
Speaker 3: Is that just a clash of two completely different dudes? Really interesting. We'll be on it.
Speaker 1: We've got to figure that out though, because obviously AI is going to play a role in our government and in our warfare. And we'll get a little of what the latest is on the actual war. This is something completely different. This is Gavin Newsom, governor of California, who's almost certainly running for president. We talked a lot last week about that thing where he was on stage. I've heard various reports about how much that was a black audience or not. The host was black, a lot of the audience was black. Was it mostly a black audience? I don't know, because originally it was portrayed to me as a black event.
Speaker 1: And Gavin Newsom got on stage saying, hey, I'm just like you, I got a terrible grade on my SATs, which is a weird thing to say. Yeah, hilariously parodied here on the show. I still think he portrayed his... and then he jumps in, "I can't read a book. Have you never seen me read a speech? I can't read." I think he didn't do the best job of laying that out. What's going on there? He has a processing disorder with his brain, like lots of people do. I think I've learned that I do on some things, that I just, I read different things that I see. It's very minor, but now, once I discovered it, it's like, oh yeah, obviously I've always done this. His is pretty bad, and I thought this was just really interesting, if you have a processing disorder, if you've got a kid with one. Mark Halperin interviewing Gavin Newsom about his... can you read? Can you read a book? I can read?
Speaker 3: I used to have to underline it.
Speaker 4: I can't read spatially. I can't. That's why I can't read a speech, Mark, because spatially I'll lose the line. So I'll literally, and I mean, I wish I had a book here.
Speaker 4: If you ever lend me a book, that's why I won't go to the library, because I can never return the book. I have to underline. And then what I'll do is I'll take what's underlined and then I'll put it out. And this is literally an actual example. I'll just put in pieces of paper everything underlined, and then I'll do that for hours and hours and hours, and then eventually I'll put it on a little yellow card which will have just quick notes, and then it's in my head. But that's the process. But everything online I have to print out. So I take what's online, print out your blog or something, and then underline it, and then put it in here.
Speaker 5: So it's a process. So you can't, you can't read on a phone. You can't read on a phone or an iPad?
Speaker 4: I can, I just get to daydreaming. I start drifting off.
Speaker 5: And so it's... so underlining, you're saying, is the key. So, so if you're reading, if you're reading my book, say, you open it and you just, you just start underlining as you read. Circles.
Speaker 4: I make stars, and then I go back, and without exception, I'll go back to every book I read and then I will write it out. I have hundreds and hundreds of these pads.
Speaker 1: I thought that was really interesting, guys. I've known a few people who explained to me how they, like one with a PhD, how they had to get through school, like record all lectures, take them home and then type them out or something, because they didn't have the ability to process them in any other way or whatever. I mean, the amount of work you have to do if you have one of these processing disorders, beyond what the rest of us have to do to get the same information, is just incredible. And then some people either don't or can't put in that amount of work, because it'd be so time consuming, and then they really struggle in life.
Speaker 3: It's probably worth noting that this is not about Gavin Newsom per se.
Speaker 3: I mean, he's a liar and utterly unprincipled and a scammer and a schemer in a hundred different ways, and as my T-shirt indicates, the one I'm wearing today, if he were elected president, he would ruin the entire country. But this is about dyslexia.
Speaker 1: Yeah, roll on with that, Michael.
Speaker 3: And this is the gift.
Speaker 4: This is the gift of dyslexia. I know people want to mock it and mock people that didn't do well academically, but the gift is a superpower, because it allows you, with discipline...
Speaker 1: It's hard work.
Speaker 3: It's the grit.
Speaker 4: It's what my mom... you know what, that's her legacy. It gives you the ability then to be able to absorb a lot, but also allows you the ability to be much more flexible, allows you the ability to, I think, find strengths where others struggle and find weakness.
Speaker 1: So I don't know anything about that angle of it. But I also thought it was interesting, him saying... because I know, I know kids who have this problem of, they just, they can't not space out. They can't read two words without spacing out. Well, that's not dyslexia.
Speaker 1: Might be ADHD, or might be screen addiction, who knows. So he has, he has dyslexia and ADHD maybe, or... I would think he would know. I'm sure he's been tested for all this sort of stuff. But yeah, from what I understand of dyslexia... I don't understand dyslexia as much as I understand ADHD, because I know so many people with it. But sounds like he has both. Uh, that's something. And then, so, because if he, if he underlines as he reads along (because he kind of showed it in the video), if he underlines it as he reads along, he can stay focused and read it in the proper order, for some reason. I wonder how he discovered that, or if that's true for everybody with dyslexia or only him. I don't know. But I've known a number of people that came up with workarounds, just like intuitively stumbled upon them or whatever, workarounds to where they could make their brain work to take in the information, whether recording it and listening to it as opposed to reading it, or it's easier to read on... like with him, he can do it on paper, but not on the screen.
Speaker 1: Man, back in the day, I wonder how many of the kids in school that we all knew that were just, they were dumb kids, they were the dumb kids, and got yelled at, and had something like this, and just nobody knew it.
Speaker 3: Yeah, yeah, a lot, I have a feeling. Yeah, a lot. And I can think of a couple of the weird kids who were absolutely autistic back in my childhood as well.
Speaker 1: I thought that was really interesting, which I agree.
Speaker 3: I don't know... it is interesting, Gavin. It doesn't explain why you have zero principles.
Speaker 1: I don't think he presented it well, and you'd have more compassion for it. I don't think he presented it in the best way the other day, to alert everybody, as people are starting to become aware of Gavin Newsom across the country, and as a presidential candidate and all that sort of stuff, to say, I'm as dumb as you, I can't read. I don't know that that was the best way to present this learning disability thing that he figured out.
Speaker 4: We're all human, we all fall short sometimes.
Speaker 3: That's a good point, Gavvy.
Speaker 3: I think Democratic donors would do well to remember Maya Angelou's famous proclamation that when somebody tries to tell you what kind of person they are, believe them. Gavvy cannot hit major league pitching. He is a one-party state's anointed star. He is not a major leaguer, and he's proved it a couple of times lately. But they'll keep running him up the flagpole because he's good-looking and, well, reasonably well spoken, although, as you heard in those clips, not great.
Speaker 1: Well, I feel like Mark Halperin bailed Gavin out of the thing that happened a week or so ago by letting him explain all this in a way that he should have done himself. I mean, as a good politician, he should have taken this very easy-to-be-sympathetic-about story and turned it into something that worked for him, as opposed to something incredibly mockable.
Speaker 3: Yeah, I would agree. And even his explanation to Halperin... he needed help from Halperin to explain himself. It's just... he's not a major leaguer.
Speaker 3: Speaking of Gavin Newsom, if you don't mind, we do have a delightful montage of Gavin Newsom talking about solving the quote-unquote homelessness problem in California, which is mostly a junkie problem, for the last twenty-two years. Michael, let's hear it.
Speaker 4: What we called a ten-year plan to end chronic homelessness in San Francisco.
Speaker 1: How are you going to solve homelessness?
Speaker 3: What are you going to do? It's a new mayor, and I said, well, what are you going to do?
Speaker 4: Focus on a housing-first model, direct access to housing, shelter, solve street sleeping, housing with wraparound and support services. Solve homelessness. Homelessness absolutely can be solved. Laid out a detailed homeless strategy. There's been no intentionality on homelessness in the state for decades. It's not been a focus. I don't think we can solve homelessness, I know we can solve homelessness. We'll reduce street homelessness quickly and humanely through emergency action. The highest investment the state's ever made is one billion dollars on homelessness. We are poised to pass the budget in the next few hours that will provide twelve billion dollars.
Speaker 4: We can literally quantify fifty-eight thousand people that we got off the streets last year. This state has not made progress in the last two decades as relates to homelessness.
Speaker 3: Not interested in funding failure.
Speaker 4: We're not interested in failing more efficiently when it comes to the issue of homelessness and the crisis on the street.
Speaker 1: I feel like that music was mocking in tone.
Speaker 3: Ah, I didn't notice. I love the juxtaposition of, "we have made enormous progress, fifty-two thousand people," then the next clip, probably six months later, "the state has made no progress in dealing with homelessness." Oh my god. What's so funny? It's tragicomic.
Speaker 1: It was a year or two ago that we Californians got hit with the fact that somewhere between a third and a half of all the homeless people in the entire United States are in California, because people figured out, not only is the weather good, but they'll let you be drunk or high your whole life. They'll provide you housing and food, and nobody's going to arrest you for standing on the corner high and building a tent.
200 00:10:21,160 --> 00:10:26,040 Speaker 3: It's awesome, say the revelation that was a similar timeframe 201 00:10:26,080 --> 00:10:31,280 Speaker 3: that not only could they not tell us how effective 202 00:10:31,320 --> 00:10:34,400 Speaker 3: the various programs had been, there wasn't even a mechanism 203 00:10:34,440 --> 00:10:38,319 Speaker 3: for figuring it out. Yeah, there was no even effort 204 00:10:38,360 --> 00:10:40,319 Speaker 3: toward accountability. 205 00:10:39,520 --> 00:10:45,480 Speaker 1: Which should have been a much bigger scandal. To say 206 00:10:45,480 --> 00:10:48,199 Speaker 1: it for the one millionth time, I honestly wish I 207 00:10:48,200 --> 00:10:51,600 Speaker 1: could talk to Gavin about this. Why don't you look 208 00:10:51,640 --> 00:10:54,000 Speaker 1: at it as a drug problem. It's a drug problem. 209 00:10:54,160 --> 00:10:57,080 Speaker 1: It's not a housing problem. It's a drug problem. 210 00:10:57,600 --> 00:10:57,720 Speaker 3: Right. 211 00:10:57,880 --> 00:10:59,640 Speaker 1: You got a whole bunch of people who ruin their 212 00:10:59,640 --> 00:11:03,280 Speaker 1: lives drugs and no longer can participate in the economy, 213 00:11:03,480 --> 00:11:06,080 Speaker 1: and they end up on the street, and we and 214 00:11:06,080 --> 00:11:09,080 Speaker 1: and it's pretty obvious to them from looking around that 215 00:11:09,160 --> 00:11:11,720 Speaker 1: we have a support structure for living on the street. 216 00:11:12,360 --> 00:11:15,560 Speaker 1: Right where you can get food and shelter and continue 217 00:11:15,600 --> 00:11:17,920 Speaker 1: your life and medical care and all that sort of stuff. 218 00:11:18,200 --> 00:11:20,800 Speaker 1: And it makes sure you don't have enough of negative 219 00:11:20,800 --> 00:11:23,920 Speaker 1: consequences to stop you before you ruin your brain. 220 00:11:24,200 --> 00:11:27,280 Speaker 3: You're like absolutely ensuring these people will ruin their brains. 
221 00:11:27,840 --> 00:11:30,040 Speaker 1: Well, it's funny, I just popped into my head, like 222 00:11:30,280 --> 00:11:32,840 Speaker 1: I've had this conversation with other people have kids like 223 00:11:32,840 --> 00:11:36,199 Speaker 1: I do you teenagers. None of us are worried about 224 00:11:36,200 --> 00:11:39,280 Speaker 1: our kids like not being able to support themselves so 225 00:11:39,440 --> 00:11:42,719 Speaker 1: ending up homeless because they're broke. All of us are 226 00:11:42,720 --> 00:11:45,280 Speaker 1: worried about our kids starting to do drugs in ruining 227 00:11:45,320 --> 00:11:48,360 Speaker 1: their lives right and could end up on the street. 228 00:11:48,840 --> 00:11:51,640 Speaker 1: I mean that answers the question right there, who's worried 229 00:11:51,679 --> 00:11:53,520 Speaker 1: that their kid is going to end up homeless on 230 00:11:53,600 --> 00:11:58,840 Speaker 1: the street. Economically, just plaid out, you can't find a 231 00:11:58,920 --> 00:12:02,080 Speaker 1: job or get a find some roommates or whatever, very 232 00:12:02,200 --> 00:12:06,360 Speaker 1: very few, but drugs absolutely could take you off the rails, right, 233 00:12:07,640 --> 00:12:12,560 Speaker 1: I'm honestly asking this question the willful misdiagnosis of what 234 00:12:12,760 --> 00:12:15,400 Speaker 1: the real issue is. You got the one we just 235 00:12:15,720 --> 00:12:16,360 Speaker 1: laid out. 236 00:12:16,280 --> 00:12:19,480 Speaker 3: And then you've got a kid who is excuse me, 237 00:12:19,559 --> 00:12:23,000 Speaker 3: autistic or a victim of trauma, or has some other 238 00:12:23,040 --> 00:12:25,600 Speaker 3: psychological problems. Men on all those problems, and you tell them, 239 00:12:25,840 --> 00:12:28,080 Speaker 3: you know it, you're trapped in the wrong body. You're 240 00:12:28,120 --> 00:12:30,959 Speaker 3: actually a little boy. You're not a girl, you're a boy. 
241 00:12:31,120 --> 00:12:34,480 Speaker 3: You should take these hormones instead of dealing with the 242 00:12:34,520 --> 00:12:37,880 Speaker 3: actual root problem. And that's These are the compassionate people 243 00:12:37,880 --> 00:12:41,760 Speaker 3: of the left. I I would like to spend two 244 00:12:41,800 --> 00:12:43,840 Speaker 3: minutes in their heads just to figure out how the 245 00:12:43,880 --> 00:12:47,480 Speaker 3: hell they work. I don't get that you've got an 246 00:12:47,559 --> 00:12:52,920 Speaker 3: unhappy child. Instead of working hard to understand why they're 247 00:12:52,960 --> 00:12:59,080 Speaker 3: so unhappy, you go with this wildly ideological new philosophy. 248 00:12:59,400 --> 00:13:02,520 Speaker 3: In fact, you brand anybody who has a serious talk 249 00:13:02,559 --> 00:13:05,280 Speaker 3: with the child about what's actually at the root of 250 00:13:05,280 --> 00:13:09,319 Speaker 3: their problems. You brand those people as dprogrammer. What's the 251 00:13:09,679 --> 00:13:14,760 Speaker 3: word with the degayifires or what have you? God, it's sick. 252 00:13:15,840 --> 00:13:19,440 Speaker 3: If you want to help people, you have to understand 253 00:13:19,480 --> 00:13:23,199 Speaker 3: what's afflicting them. Can we agree on that? And if 254 00:13:23,240 --> 00:13:24,480 Speaker 3: we can't, God help us. 255 00:13:25,600 --> 00:13:28,280 Speaker 1: We have some breaking news we might have to discuss later. 256 00:13:28,800 --> 00:13:32,160 Speaker 1: I think this is the first time ever a father 257 00:13:33,280 --> 00:13:36,080 Speaker 1: whose kid was one of those school shooters has just 258 00:13:36,160 --> 00:13:39,600 Speaker 1: been convicted of second degree murder and an involuntary manslaughter 259 00:13:40,120 --> 00:13:43,200 Speaker 1: as a parent because their kid went and did that. 260 00:13:43,559 --> 00:13:45,280 Speaker 1: We'll have to look into the details of that story. 
261 00:13:45,320 --> 00:13:47,320 Speaker 1: That's pretty interesting. Among other things we can get to 262 00:13:47,360 --> 00:13:49,320 Speaker 1: stay here. 263 00:13:52,480 --> 00:13:55,680 Speaker 6: A metal detector is in Wales recently found two lead 264 00:13:55,920 --> 00:13:58,600 Speaker 6: ingots that are believed to date back to the Roman era, 265 00:13:58,960 --> 00:14:05,120 Speaker 6: while the detector is doesn't date at all. 266 00:14:09,320 --> 00:14:14,880 Speaker 1: So I misreported a story earlier. I misunderstood it. I 267 00:14:15,080 --> 00:14:17,280 Speaker 1: thought that the breaking news that we killed them when 268 00:14:17,320 --> 00:14:19,560 Speaker 1: they were voting for a new Supreme leader was telling 269 00:14:19,600 --> 00:14:21,680 Speaker 1: what was going on over the weekend when the forty 270 00:14:21,760 --> 00:14:25,000 Speaker 1: people got together. No, that's today. So over the weekend, 271 00:14:25,240 --> 00:14:27,440 Speaker 1: had you killed the supreme leader and the forty people 272 00:14:27,480 --> 00:14:33,560 Speaker 1: underneath them? Today, eighty eight members whoever's left of the 273 00:14:33,560 --> 00:14:36,680 Speaker 1: government in Iran, eighty eight members got together in a 274 00:14:36,720 --> 00:14:40,000 Speaker 1: building in calm Qom. I don't know how you pronounce 275 00:14:40,040 --> 00:14:43,880 Speaker 1: it down anyway, they were voting for a new Supreme 276 00:14:43,960 --> 00:14:46,040 Speaker 1: leader because the old one got blowed up, and they 277 00:14:46,080 --> 00:14:47,920 Speaker 1: all got blowed up, every single one of them. Looking 278 00:14:47,960 --> 00:14:50,720 Speaker 1: at that building, I mean that building is rubble, So 279 00:14:50,760 --> 00:14:53,520 Speaker 1: I would assume every single one of them is dead boy. 280 00:14:53,600 --> 00:14:56,479 Speaker 3: You think you hate hearing there's a meeting this afternoon. 
281 00:14:57,800 --> 00:15:01,840 Speaker 3: I'm the Iranian government. I'm thinking I'm not and going well. 282 00:15:01,840 --> 00:15:03,840 Speaker 1: And you're so far down the line now because you 283 00:15:03,840 --> 00:15:07,600 Speaker 1: weren't the first forty and you weren't the next almost ninety. Oh, 284 00:15:07,800 --> 00:15:09,920 Speaker 1: so now you're down to like the I don't know. 285 00:15:09,960 --> 00:15:13,000 Speaker 3: You're the second assistant minister of parks. 286 00:15:13,200 --> 00:15:15,880 Speaker 1: You're the dog catcher in a medium sized town. We 287 00:15:15,920 --> 00:15:17,720 Speaker 1: need you to come vote for the Supreme Leader. We're 288 00:15:17,760 --> 00:15:19,320 Speaker 1: running out of guy, so we're having a meeting today 289 00:15:19,320 --> 00:15:21,680 Speaker 1: at four. I got a Dennis. 290 00:15:21,560 --> 00:15:26,280 Speaker 3: Pointment, would zoom be okay? Can I show up via zoom? 291 00:15:26,360 --> 00:15:26,520 Speaker 5: Yeah? 292 00:15:26,520 --> 00:15:29,160 Speaker 1: Why aren't they using zoom? Why aren't they using zoom? 293 00:15:29,280 --> 00:15:33,000 Speaker 3: No freaking kidding? Or a Google meets brought to you 294 00:15:33,080 --> 00:15:33,600 Speaker 3: by Google. 295 00:15:33,760 --> 00:15:37,800 Speaker 1: She's eighty eight members met to vote for the Supreme 296 00:15:37,880 --> 00:15:42,200 Speaker 1: Leader and they all got obliterated. And obviously, even with 297 00:15:42,320 --> 00:15:47,080 Speaker 1: all the chaos going on, Israel or our Cia still 298 00:15:47,200 --> 00:15:53,320 Speaker 1: has the tentacles to know exactly when where and bow 299 00:15:53,360 --> 00:15:55,200 Speaker 1: them up. How is that possible? 300 00:15:56,680 --> 00:15:59,600 Speaker 3: Everybody's question is, you know, how does this end? How 301 00:15:59,640 --> 00:16:04,280 Speaker 3: does it fold? What's next? 
you eliminate that many layers of the hierarchy, how does that change the equation? Because if you are, like, the recently elected mayor of a medium-sized town, and all of a sudden you're among the most elite leaders of a regime that you've always been a little ambivalent about anyway, you might be thinking, hey, Uncle Sam, give me a jingle, let's work something out. I don't want to get blowed up, right? I mean, how many hard-line leaders are left? What's the state of the IRGC, the Revolutionary Guard Corps? I don't know these things.
Speaker 1: I'm not sure anybody does. Well, the CIA and Israel probably know the top of that. Although the head of the Revolutionary Guard is dead, and then his replacement is dead. We know that. But wow, that's something. Everybody got together to vote for a new Supreme Leader that they're probably going to very excitedly announce today. It's some old guy with a beard, I'm guessing. Uh, but they're all dead. Damn.
Speaker 3: We are monitoring events. There's a joke about that, a meme on the internet, about how men do that. I
want to talk about that before too long.
Speaker 4: Stay here. Armstrong and Getty.
Speaker 1: You're fine.
Speaker 7: Really? You need something to help you?
Speaker 1: No problem. You want safe? You want safe? You're safe.
Speaker 5: Everything good, no problem. Thank you for helping us.
Speaker 1: What was that? That was an American pilot, one of the American pilots that ejected from those planes that got shot down by the Kuwaitis. Three of our jets got shot down. All of the crew got out and safely parachuted to the ground. But there's one Kuwaiti local making sure the F-15 pilot was okay, which they were, and said, thank you for helping us.
Speaker 3: Oh, so the "Americans destroy the Middle East" narrative? The locals are greeting them. Did you see the other video? So compelling. Another one of our guys who parachuted down, some of the locals surrounded him and were menacing him like they were going to beat him down with lumber. And finally the guy communicated, "American, I'm an American," and they're like, oh, oh, okay, all right.
Speaker 1: They put down their sticks. And no kidding, yeah. Cool.
343 00:18:25,119 --> 00:18:30,560 Speaker 1: And I just saw the Iranian women's soccer team competing 344 00:18:30,640 --> 00:18:33,720 Speaker 1: in some soccer tournament or something like that. Didn't sing. 345 00:18:33,960 --> 00:18:36,800 Speaker 1: They remained silent during the national anthem, when they normally sing, 346 00:18:37,440 --> 00:18:40,760 Speaker 1: and it's a show of defiance, you can read their faces, right? And 347 00:18:40,840 --> 00:18:44,240 Speaker 1: again, I really think they should tune into MSNBC, so 348 00:18:44,320 --> 00:18:47,000 Speaker 1: they should know. They should be unhappy about this. They 349 00:18:47,000 --> 00:18:49,919 Speaker 1: should be in favor of the Ayatollah and unhappy that 350 00:18:49,960 --> 00:18:51,119 Speaker 1: the United States is doing this. 351 00:18:51,920 --> 00:18:54,080 Speaker 3: Oh yeah, well, at some point, maybe next hour, we can 352 00:18:54,119 --> 00:18:58,840 Speaker 3: get into the slew of bigfoot media organizations that 353 00:19:00,200 --> 00:19:04,760 Speaker 3: eulogized the Ayatollah as a fuzzy-bearded cleric 354 00:19:04,800 --> 00:19:07,480 Speaker 3: who had a ready smile and loved poetry. You know, 355 00:19:07,680 --> 00:19:13,520 Speaker 3: just un-friggin'-believable. But speaking of military matters, this is 356 00:19:13,680 --> 00:19:16,760 Speaker 3: so interesting and is a question that will continue to 357 00:19:17,240 --> 00:19:19,760 Speaker 3: bedevil the United States, I think, for some time. 358 00:19:20,920 --> 00:19:25,840 Speaker 3: Who ultimately controls how cutting-edge AI tools are deployed 359 00:19:26,359 --> 00:19:29,720 Speaker 3: in conflict, in the military, and in society as a 360 00:19:29,720 --> 00:19:34,080 Speaker 3: whole too, obviously? Is it the user or the maker?
361 00:19:35,600 --> 00:19:39,720 Speaker 3: And that question has been under the spotlight in particular 362 00:19:39,800 --> 00:19:43,520 Speaker 3: because of the conflict between Dario Amodei and Pete Hegseth, 363 00:19:44,200 --> 00:19:48,240 Speaker 3: the founder and CEO of Anthropic, slash Claude, and the 364 00:19:48,280 --> 00:19:51,480 Speaker 3: Secretary of War. And I like it. This is buried 365 00:19:51,480 --> 00:19:53,520 Speaker 3: in the middle of this article, but I love this description. 366 00:19:54,040 --> 00:19:58,639 Speaker 3: Amodei and Hegseth approach the question differently. A bespectacled researcher 367 00:19:58,680 --> 00:20:02,879 Speaker 3: who often twirls his curly hair, Amodei authors lengthy documents 368 00:20:02,920 --> 00:20:06,240 Speaker 3: philosophizing about the importance of AI safety and is known 369 00:20:06,240 --> 00:20:08,919 Speaker 3: for his deliberate approach to problem solving. He has been 370 00:20:08,960 --> 00:20:13,280 Speaker 3: a vegetarian since childhood. Hegseth is a former Fox News 371 00:20:13,320 --> 00:20:16,000 Speaker 3: host with several tattoos tied to his Christian faith and 372 00:20:16,040 --> 00:20:19,640 Speaker 3: military service. Videos of Hegseth lifting weights frequently circulate 373 00:20:19,720 --> 00:20:22,000 Speaker 3: on social media, and he played a role in President 374 00:20:22,040 --> 00:20:24,840 Speaker 3: Trump's decision to rename the Defense Department to the Department 375 00:20:24,960 --> 00:20:27,879 Speaker 3: of War. In other words, as the headline in the 376 00:20:27,960 --> 00:20:31,160 Speaker 3: Journal puts it, a fight about vibes drove the Pentagon's 377 00:20:31,200 --> 00:20:34,439 Speaker 3: breakup with Anthropic. You've got two guys who are so 378 00:20:34,800 --> 00:20:38,840 Speaker 3: different from each other, they're having a tough time coming 379 00:20:38,840 --> 00:20:39,880 Speaker 3: to any sort of agreement.
380 00:20:40,760 --> 00:20:48,280 Speaker 1: Anthropic specifically doesn't want their AI used for lethal autonomous 381 00:20:48,320 --> 00:20:53,160 Speaker 1: weapons, known as LAWS, which we don't currently have, 382 00:20:53,400 --> 00:20:55,160 Speaker 1: or at least we don't think. I mean, they haven't 383 00:20:55,160 --> 00:20:56,720 Speaker 1: announced that we have. Maybe we have them and they're 384 00:20:56,760 --> 00:20:59,200 Speaker 1: keeping it a secret or whatever. But currently we don't 385 00:20:59,200 --> 00:21:03,760 Speaker 1: have them, that we know of. But Anthropic is so concerned 386 00:21:03,800 --> 00:21:06,359 Speaker 1: that we would use, the Pentagon would use, the AI 387 00:21:06,480 --> 00:21:11,040 Speaker 1: technology to have weapons systems that, once activated, can select 388 00:21:11,080 --> 00:21:14,880 Speaker 1: and engage targets without further intervention by an operator. How 389 00:21:15,000 --> 00:21:18,200 Speaker 1: horrifying is that? You turn the thing on, send it out, 390 00:21:18,200 --> 00:21:20,160 Speaker 1: and it makes all its decisions from that point 391 00:21:20,200 --> 00:21:21,119 Speaker 1: on and kills people. 392 00:21:21,880 --> 00:21:26,760 Speaker 3: Yeah, yeah, anyway, it is. It's absolutely a troubling prospect. 393 00:21:26,880 --> 00:21:29,439 Speaker 3: Especially, I've got an actually kind of funny slash troubling 394 00:21:29,520 --> 00:21:32,560 Speaker 3: article about this guy who tried to do a programming 395 00:21:32,640 --> 00:21:39,560 Speaker 3: change in an AI system, and the AI system researched, wrote, published, 396 00:21:39,640 --> 00:21:45,119 Speaker 3: and promoted a slam piece on this guy. It just 397 00:21:45,200 --> 00:21:47,960 Speaker 3: took the ball and ran with it, assassinated the guy's 398 00:21:48,119 --> 00:21:52,080 Speaker 3: character to try to not let him change the programming. 399 00:21:52,320 --> 00:21:53,800 Speaker 3: This is entirely on its own.
400 00:21:55,160 --> 00:21:58,880 Speaker 1: What? I haven't heard this one. Those stories are amazing, right? 401 00:21:59,560 --> 00:22:04,719 Speaker 1: Holy cow, that's troubling. And so anyway, currently there's no 402 00:22:04,720 --> 00:22:07,280 Speaker 1: public evidence that the Department of Defense has those kind 403 00:22:07,320 --> 00:22:11,320 Speaker 1: of weapons, but you know, Anthropic as a company wants 404 00:22:11,359 --> 00:22:14,479 Speaker 1: to make sure that their AI isn't used for any 405 00:22:14,520 --> 00:22:17,240 Speaker 1: of that stuff. Also, they don't want Anthropic used for 406 00:22:17,280 --> 00:22:22,240 Speaker 1: any surveillance of US citizens, right, which is illegal already 407 00:22:22,520 --> 00:22:27,040 Speaker 1: but could be done with AI in all kinds of ways 408 00:22:27,080 --> 00:22:31,240 Speaker 1: that are legal, as was explained by the Dispatch today. 409 00:22:31,280 --> 00:22:33,120 Speaker 1: And I found this very troubling. 410 00:22:35,359 --> 00:22:38,840 Speaker 3: I'm sorry, before we get to the surveillance thing, to 411 00:22:38,920 --> 00:22:41,920 Speaker 3: the military use deal, and I completely understand Pete Hegseth's 412 00:22:42,000 --> 00:22:47,239 Speaker 3: position on this. Anthropic is essentially asking for a 413 00:22:47,320 --> 00:22:50,520 Speaker 3: case by case, the right to offer a case by 414 00:22:50,560 --> 00:22:53,840 Speaker 3: case thumbs up or thumbs down. Hey, we're gonna use 415 00:22:53,840 --> 00:22:56,520 Speaker 3: your AI to do this. No, I can't let you 416 00:22:56,560 --> 00:22:59,760 Speaker 3: do that. Can you imagine if Northrop Grumman, or, I 417 00:22:59,760 --> 00:23:02,200 Speaker 3: can't remember these giant companies' names, but if the maker 418 00:23:02,280 --> 00:23:06,360 Speaker 3: of the F eighteen said, hmm, no, don't bomb that village. 419 00:23:06,440 --> 00:23:08,800 Speaker 3: I know there's some ISIS guys.
There. Can't have you 420 00:23:08,920 --> 00:23:11,439 Speaker 3: using our plane like that. I mean, the AI 421 00:23:11,480 --> 00:23:14,199 Speaker 3: systems are now a tool of war. Do they get 422 00:23:14,240 --> 00:23:14,760 Speaker 3: a veto power? 423 00:23:16,119 --> 00:23:17,399 Speaker 1: No, that's unworkable. 424 00:23:19,080 --> 00:23:21,640 Speaker 3: That's the way I see it. Well, I appreciate their. 425 00:23:21,440 --> 00:23:27,160 Speaker 1: Caution? It's unworkable. But the government doesn't itself have this technology. 426 00:23:29,080 --> 00:23:33,199 Speaker 1: So we're going to pass on the best technology on 427 00:23:33,320 --> 00:23:37,119 Speaker 1: planet Earth, and these other companies have it. I mean, it exists. 428 00:23:37,480 --> 00:23:40,639 Speaker 1: So, you know, using your example again, there's a company 429 00:23:40,640 --> 00:23:42,680 Speaker 1: out there that's built a better fighter plane than any 430 00:23:42,680 --> 00:23:45,720 Speaker 1: fighter plane on planet Earth, but they're not 431 00:23:45,760 --> 00:23:47,639 Speaker 1: gonna let the Pentagon use it however they want. So 432 00:23:47,680 --> 00:23:51,520 Speaker 1: they just won't have it and we'll have lesser fighter planes. 433 00:23:51,560 --> 00:23:55,000 Speaker 1: That doesn't make any sense. Yeah, yeah. So I don't 434 00:23:55,040 --> 00:23:56,520 Speaker 1: know how this gets worked out. And then there was, 435 00:23:56,560 --> 00:23:59,960 Speaker 1: on Friday before the war started, with the help of Anthropic, 436 00:24:00,080 --> 00:24:03,199 Speaker 1: by the way. They used Anthropic for a lot of 437 00:24:03,240 --> 00:24:07,040 Speaker 1: the targeting and the complicated figuring out what to hit 438 00:24:07,080 --> 00:24:09,400 Speaker 1: and how, and arranging the planes and all that sort 439 00:24:09,400 --> 00:24:12,800 Speaker 1: of stuff. They used Anthropic for intelligence processing as well.
440 00:24:12,840 --> 00:24:14,520 Speaker 1: If you were following, on Friday there was a big 441 00:24:14,600 --> 00:24:17,760 Speaker 1: dust-up. Trump and Hegseth announced no more government 442 00:24:17,880 --> 00:24:20,600 Speaker 1: work with Anthropic. They're persona non grata to us, 443 00:24:20,600 --> 00:24:23,280 Speaker 1: we're cutting ties with Anthropic. Then they used them twenty-four 444 00:24:23,280 --> 00:24:26,720 Speaker 1: hours later to kill the Ayatollah. But also there was 445 00:24:26,800 --> 00:24:30,919 Speaker 1: talk of, Trump made some noises about, seizing Anthropic, taking 446 00:24:30,960 --> 00:24:32,440 Speaker 1: the technology from them. 447 00:24:32,359 --> 00:24:35,840 Speaker 3: Which, and I have to call them left-wing nut jobs, 448 00:24:35,960 --> 00:24:42,520 Speaker 3: I have to assume that can't happen legally. Yes, there's 449 00:24:42,520 --> 00:24:45,240 Speaker 3: some law I'm unfamiliar with they cited. I can't remember 450 00:24:45,280 --> 00:24:48,240 Speaker 3: the name of it, but to label somebody 451 00:24:48,280 --> 00:24:53,679 Speaker 3: as a supply chain risk is a serious thing for 452 00:24:53,760 --> 00:24:56,359 Speaker 3: a military supplier, and that's what the Trump administration is 453 00:24:56,400 --> 00:24:58,199 Speaker 3: looking at doing, or already has. I'm not sure how 454 00:24:58,280 --> 00:24:58,960 Speaker 3: the law works, but. 455 00:24:59,040 --> 00:25:02,040 Speaker 1: I was troubled by this. Anthropic's really worried about its 456 00:25:02,320 --> 00:25:08,560 Speaker 1: technology being used to surveil Americans. US law doesn't allow 457 00:25:08,960 --> 00:25:15,000 Speaker 1: the government to surveil us.
But as this expert on 458 00:25:15,040 --> 00:25:17,119 Speaker 1: the Fourth Amendment pointed out, a lot of what you 459 00:25:17,119 --> 00:25:20,919 Speaker 1: would colloquially, kind of in 460 00:25:20,960 --> 00:25:24,040 Speaker 1: common terms, think of as a search is not a 461 00:25:24,080 --> 00:25:27,000 Speaker 1: search under the Fourth Amendment. When it comes to AI, 462 00:25:27,640 --> 00:25:31,520 Speaker 1: user data such as search history and algorithm preferences collected 463 00:25:31,520 --> 00:25:34,800 Speaker 1: and held by private companies is not categorically protected by 464 00:25:34,800 --> 00:25:38,040 Speaker 1: the Fourth Amendment. Nor does all kinds of footage from 465 00:25:38,160 --> 00:25:43,520 Speaker 1: cameras all over everywhere necessarily fall under the Fourth Amendment, 466 00:25:43,600 --> 00:25:46,280 Speaker 1: and all kinds of other different things that AI could 467 00:25:46,280 --> 00:25:49,000 Speaker 1: gather in a way that human beings couldn't, just like, 468 00:25:49,040 --> 00:25:51,960 Speaker 1: super fast, and put it all together. We're gonna 469 00:25:51,960 --> 00:25:52,520 Speaker 1: have to work that. 470 00:25:52,560 --> 00:25:58,880 Speaker 3: Out as a country, as the technology continually changes? Good luck. Yeah, 471 00:25:59,119 --> 00:26:02,040 Speaker 3: where do you have an expectation of privacy these days? 472 00:26:02,800 --> 00:26:05,879 Speaker 3: In my own home? Certainly. Everybody just clicked the 473 00:26:05,960 --> 00:26:08,480 Speaker 3: form and said, I agree, when Google told you, by 474 00:26:08,480 --> 00:26:11,639 Speaker 3: the way, your searches are ours, your search history.
475 00:26:11,800 --> 00:26:15,439 Speaker 1: Every time I get into my Cybertruck, there's a 476 00:26:15,480 --> 00:26:18,240 Speaker 1: little thing on there that says four events happened while 477 00:26:18,240 --> 00:26:20,520 Speaker 1: you were gone, because it's recording on cameras all the way around, 478 00:26:20,560 --> 00:26:22,639 Speaker 1: and I watch the events just for fun sometimes, and, 479 00:26:22,720 --> 00:26:25,639 Speaker 1: like, somebody pulled up next to me and opened their 480 00:26:25,680 --> 00:26:27,320 Speaker 1: door and got their baby out of the car and 481 00:26:27,320 --> 00:26:29,439 Speaker 1: stuff like that, and walked behind me and scratched their 482 00:26:29,440 --> 00:26:30,879 Speaker 1: ass, and all these different things, and I see it 483 00:26:30,880 --> 00:26:34,240 Speaker 1: on the camera. But all of these videos of these 484 00:26:34,280 --> 00:26:37,320 Speaker 1: people, they don't know they're being videoed. Well, when 485 00:26:37,400 --> 00:26:40,640 Speaker 1: every car has what the Cybertruck has, which all 486 00:26:40,680 --> 00:26:43,280 Speaker 1: of them will soon enough, every car will have that, 487 00:26:43,560 --> 00:26:48,080 Speaker 1: you will be recorded constantly. Every car you walk down the 488 00:26:48,119 --> 00:26:50,760 Speaker 1: street is recording you, car by car by car by car, 489 00:26:50,880 --> 00:26:53,439 Speaker 1: so your entire walk around your neighborhood would be recorded, 490 00:26:53,480 --> 00:26:57,600 Speaker 1: if you wanted to put all that together, and AI 491 00:26:57,760 --> 00:27:02,399 Speaker 1: probably could. And that's something. You combine the Ring cams 492 00:27:02,400 --> 00:27:05,800 Speaker 1: with the car cameras, you will be on video constantly.
493 00:27:07,080 --> 00:27:09,959 Speaker 3: I wish we could reanimate Huxley and Orwell and have 494 00:27:10,000 --> 00:27:14,040 Speaker 3: them update their famous books, Brave New World and Nineteen 495 00:27:14,280 --> 00:27:17,119 Speaker 3: Eighty-Four, although Nineteen Eighty-Four is pretty close to 496 00:27:17,160 --> 00:27:21,119 Speaker 3: the technology we're talking about. But yeah, how does the 497 00:27:21,119 --> 00:27:24,080 Speaker 3: world look? Because, you know, a surveilled society is an 498 00:27:24,080 --> 00:27:31,160 Speaker 3: obedient society. People feel the need to conform and obey 499 00:27:31,359 --> 00:27:35,159 Speaker 3: and keep their mouths shut lest they be suspected of 500 00:27:35,160 --> 00:27:37,119 Speaker 3: blah blah blah. And that just changes the nature of 501 00:27:37,200 --> 00:27:37,840 Speaker 3: your society. 502 00:27:37,920 --> 00:27:43,440 Speaker 1: Well, there's that. And if you end up with a 503 00:27:43,480 --> 00:27:47,199 Speaker 1: bad actor in charge of the government who decides to 504 00:27:47,280 --> 00:27:50,760 Speaker 1: use it for ill. If mankind ever falls under the 505 00:27:50,760 --> 00:27:54,359 Speaker 1: sway of a totalitarian government that has the surveillance technology 506 00:27:54,359 --> 00:27:57,879 Speaker 1: that exists now, we'll never break free. Correct. It'll be 507 00:27:57,920 --> 00:28:01,399 Speaker 1: the classic Orwellian boot stamping on your face forever, 508 00:28:01,480 --> 00:28:03,560 Speaker 1: because there'll be no getting out of it. 509 00:28:03,800 --> 00:28:06,879 Speaker 3: I can show your soul's desire for freedom. That's the 510 00:28:06,920 --> 00:28:09,760 Speaker 3: only answer. Thank you, sweetheart.
You know, not every slope 511 00:28:09,840 --> 00:28:13,320 Speaker 3: is slippery, but this feels like, I don't know, a 512 00:28:13,359 --> 00:28:16,960 Speaker 3: sheet of glass covered in WD-40, some sort 513 00:28:16,960 --> 00:28:20,000 Speaker 3: of sexual lube, and some whale snot. I mean, the 514 00:28:20,080 --> 00:28:22,520 Speaker 3: slipperiest slope that has ever existed. 515 00:28:22,840 --> 00:28:24,600 Speaker 1: Now, my son said to me, and I had to 516 00:28:24,840 --> 00:28:28,560 Speaker 1: discipline him harshly, he said, I'm not doing anything wrong, 517 00:28:28,600 --> 00:28:30,919 Speaker 1: so I don't care. Which is the attitude of a 518 00:28:30,920 --> 00:28:34,920 Speaker 1: lot of people always on this stuff. But you think 519 00:28:34,920 --> 00:28:37,919 Speaker 1: you're not doing anything wrong? Wait till somebody else is 520 00:28:37,920 --> 00:28:40,720 Speaker 1: in charge. Maybe they'll think what you're doing is wrong. 521 00:28:41,040 --> 00:28:43,440 Speaker 1: You know, show me the man, I will bring you 522 00:28:43,520 --> 00:28:45,600 Speaker 1: the crime. How you vote, how you pay taxes, how 523 00:28:45,640 --> 00:28:50,360 Speaker 1: you whatever, you pick a topic. Anyway, that's really interesting. 524 00:28:50,880 --> 00:28:54,440 Speaker 1: So I don't know what I think about the most 525 00:28:54,480 --> 00:28:57,920 Speaker 1: powerful tools on Earth not being in the hands of the 526 00:28:57,920 --> 00:29:02,880 Speaker 1: government, the government not having it or not getting it because they 527 00:29:02,880 --> 00:29:08,360 Speaker 1: don't want to play along. Because China is going to 528 00:29:08,440 --> 00:29:12,440 Speaker 1: have AI in their military, right? They already do. I'm 529 00:29:12,440 --> 00:29:14,600 Speaker 1: sure practically the only thing they're 530 00:29:14,640 --> 00:29:18,200 Speaker 1: going to do with AI is militarize it, and we won't.
531 00:29:19,040 --> 00:29:19,200 Speaker 5: Right. 532 00:29:19,680 --> 00:29:24,280 Speaker 3: Uh. The one thing I can't answer is whether anthropics 533 00:29:24,600 --> 00:29:27,800 Speaker 3: leadership in some of the systems we're talking about is 534 00:29:27,920 --> 00:29:32,160 Speaker 3: so clearly superior that you know, to go with somebody 535 00:29:32,160 --> 00:29:34,480 Speaker 3: else would be a big step back. 536 00:29:34,680 --> 00:29:36,560 Speaker 1: Well this is I don't know that this story got 537 00:29:36,560 --> 00:29:38,840 Speaker 1: obliterated Friday because the war started over the weekend. But 538 00:29:38,880 --> 00:29:42,320 Speaker 1: Sam Altman at open AI said, will do it. He 539 00:29:42,400 --> 00:29:44,959 Speaker 1: raised his hand, said will do it because the government 540 00:29:45,040 --> 00:29:48,280 Speaker 1: had a contract with Nthropic and when they canceled it 541 00:29:48,320 --> 00:29:51,640 Speaker 1: and said that's it no more. Altman at open AI said, hey, 542 00:29:51,640 --> 00:29:53,600 Speaker 1: we're We're happy to fill in. So I'm not sure 543 00:29:53,640 --> 00:29:57,440 Speaker 1: he cares if the Pentagon used it to surveyal people 544 00:29:57,560 --> 00:30:01,800 Speaker 1: or send out robots the landscape with machine guns or 545 00:30:01,840 --> 00:30:02,400 Speaker 1: whatever the hell. 546 00:30:03,000 --> 00:30:07,200 Speaker 3: Oh well, probably robot wolves. One more note, just because 547 00:30:07,240 --> 00:30:10,440 Speaker 3: it's interesting, and then I can't wait. Next segment, We've 548 00:30:10,480 --> 00:30:12,880 Speaker 3: got a guy from the Middle East challenging a bunch 549 00:30:12,920 --> 00:30:21,200 Speaker 3: of left wing stupid anti Trump demonstrators, anti Iran war demonstrators. 
Anyway, 550 00:30:23,040 --> 00:30:26,240 Speaker 3: Claude not only played a significant role in setting up 551 00:30:26,240 --> 00:30:28,600 Speaker 3: the attack on Iran, it also played a role in 552 00:30:28,640 --> 00:30:32,600 Speaker 3: the military action that captured old man Maduro. It has been 553 00:30:32,720 --> 00:30:36,160 Speaker 3: used for war gaming and mission planning. Anthropic has for years 554 00:30:36,240 --> 00:30:38,960 Speaker 3: been the most vocal AI company advocating for guardrails. 555 00:30:39,560 --> 00:30:43,080 Speaker 3: The stance at times has frustrated administration officials, blah blah blah. 556 00:30:43,120 --> 00:30:46,240 Speaker 3: Earlier this year, Anthropic effectively banned the use of the 557 00:30:46,280 --> 00:30:50,000 Speaker 3: word pathogen in model prompts as part of its safeguards 558 00:30:50,000 --> 00:30:53,840 Speaker 3: against AI creating a bioweapon, on its unclassified systems 559 00:30:53,920 --> 00:30:56,200 Speaker 3: used by many agencies, blah blah. The ban made it 560 00:30:56,240 --> 00:30:58,720 Speaker 3: difficult for employees at the Centers for Disease Control and 561 00:30:58,760 --> 00:31:02,200 Speaker 3: Prevention to use the AI tool. It took weeks to 562 00:31:02,240 --> 00:31:05,719 Speaker 3: get workers permission to circumvent those bans. 563 00:31:05,800 --> 00:31:07,720 Speaker 1: Yeah, it's impossible to do that. It reminds me of 564 00:31:07,760 --> 00:31:10,040 Speaker 1: the thing years ago we had about when Facebook was 565 00:31:10,040 --> 00:31:11,360 Speaker 1: trying to deal with the nipple. 566 00:31:11,840 --> 00:31:14,040 Speaker 3: Oh, one of the funniest episodes in the history of 567 00:31:14,080 --> 00:31:14,920 Speaker 3: the computer. 568 00:31:14,840 --> 00:31:17,280 Speaker 1: Ban the nipple, because they didn't, well.
Originally it was to 569 00:31:17,360 --> 00:31:21,400 Speaker 1: not have nudity, and then, well, what about breastfeeding, and then 570 00:31:21,440 --> 00:31:23,680 Speaker 1: you allow breastfeeding. But then what if it's an adult 571 00:31:23,760 --> 00:31:25,120 Speaker 1: breastfeeding on a woman? Is that? 572 00:31:26,680 --> 00:31:32,040 Speaker 3: What about a man breastfeeding another man? Blah blah. Yeah, 573 00:31:32,280 --> 00:31:33,480 Speaker 3: ay caramba. 574 00:31:33,280 --> 00:31:37,880 Speaker 1: Oh, oh God, oh boy. It's better than the Kentucky 575 00:31:37,920 --> 00:31:40,480 Speaker 1: Meat Shower, which today is the one hundred and fiftieth anniversary 576 00:31:40,520 --> 00:31:43,240 Speaker 1: of. Do tell. I'll tell you about that and 577 00:31:43,240 --> 00:31:50,680 Speaker 1: other things on the way. Why am I not in 578 00:31:50,800 --> 00:31:56,240 Speaker 1: Bath County, Kentucky today to celebrate the Meat Shower with 579 00:31:56,360 --> 00:31:58,800 Speaker 1: everybody else? It's an annual festival. We'll talk about that 580 00:31:58,840 --> 00:31:59,239 Speaker 1: coming up. 581 00:31:59,440 --> 00:32:02,520 Speaker 3: It sounds like something that was outlawed even in San Francisco. 582 00:32:04,400 --> 00:32:07,800 Speaker 3: We now bring you to Philthadelphia, Pennsylvania, where a bunch 583 00:32:07,840 --> 00:32:11,640 Speaker 3: of woke protesters are yelling about the attack on Iran, 584 00:32:11,960 --> 00:32:14,760 Speaker 3: and they are confronted by a fellow who's actually from 585 00:32:14,800 --> 00:32:15,480 Speaker 3: the Middle East. 586 00:32:15,880 --> 00:32:20,680 Speaker 1: Why don't you leave America? You hate this imperialist country. 587 00:32:20,680 --> 00:32:21,240 Speaker 1: Why don't you 588 00:32:21,320 --> 00:32:26,840 Speaker 7: leave? When Iranian women are murdered by their government, none 589 00:32:26,840 --> 00:32:30,440 Speaker 7: of you said anything. You're fake activists.
Why don't you speak 590 00:32:30,480 --> 00:32:32,120 Speaker 7: about the fifty thousand 591 00:32:31,920 --> 00:32:36,080 Speaker 1: Iranians who were executed, you fake clowns? None of 592 00:32:36,120 --> 00:32:37,760 Speaker 1: you, because none of you ever showed up. 593 00:32:40,200 --> 00:32:47,200 Speaker 3: That is some trolling. Fake, terrorist, effing cosplaying children. 594 00:32:48,560 --> 00:32:50,240 Speaker 3: Well said, sir. Well 595 00:32:50,080 --> 00:32:54,640 Speaker 1: said. What is the Kentucky Meat Shower? It happened one 596 00:32:54,760 --> 00:32:56,880 Speaker 1: hundred and fifty years ago today. They have an annual 597 00:32:56,920 --> 00:32:59,360 Speaker 1: festival there in the county where it occurred. 598 00:33:00,080 --> 00:33:02,280 Speaker 3: And I asked my wife for the Kentucky Meat Shower, 599 00:33:02,320 --> 00:33:03,840 Speaker 3: and she didn't speak to me for a week. 600 00:33:08,440 --> 00:33:11,840 Speaker 1: You date a Kentucky Meat Shower, you don't marry the 601 00:33:11,920 --> 00:33:16,640 Speaker 1: Kentucky Meat Shower. That's horrible. 602 00:33:17,560 --> 00:33:18,320 Speaker 3: It's terrible. 603 00:33:20,280 --> 00:33:22,520 Speaker 1: Katie's staying out of it. She's just gonna stay out 604 00:33:22,520 --> 00:33:29,560 Speaker 1: of it. So, I'm from rural America, so you come 605 00:33:29,600 --> 00:33:31,560 Speaker 1: up with any reason for your festival. Like, I lived 606 00:33:31,600 --> 00:33:34,360 Speaker 1: near Cawker City, Kansas, where they have the world's largest 607 00:33:34,360 --> 00:33:36,880 Speaker 1: ball of twine. And every year you have a big, 608 00:33:36,920 --> 00:33:38,760 Speaker 1: giant party with bands and you drink beer, and they 609 00:33:38,760 --> 00:33:40,800 Speaker 1: add some more twine to the ball to make it bigger. 610 00:33:40,840 --> 00:33:43,680 Speaker 1: I mean, you just come up with these things.
In 611 00:33:43,840 --> 00:33:46,479 Speaker 1: Bath County, Kentucky, they have the Kentucky Meat Shower Festival. 612 00:33:46,520 --> 00:33:49,640 Speaker 1: It's the one hundred and fiftieth anniversary today. On this day, 613 00:33:49,680 --> 00:33:53,320 Speaker 1: March third, eighteen seventy-six, chunks of meat began falling 614 00:33:53,360 --> 00:33:58,440 Speaker 1: from the sky for several minutes around eleven a.m., covering 615 00:33:58,440 --> 00:34:03,160 Speaker 1: a one hundred fifty yard area in chunks and chunks 616 00:34:03,160 --> 00:34:06,120 Speaker 1: and chunks of meat. It was a sight. I don't. 617 00:34:06,040 --> 00:34:09,800 Speaker 3: Know what's happening exactly, but something has gone terribly wrong. 618 00:34:12,480 --> 00:34:16,399 Speaker 1: This poor woman was sitting there, it's not good, washing 619 00:34:16,480 --> 00:34:19,279 Speaker 1: clothes or something like that, when chunks of meat, some 620 00:34:19,440 --> 00:34:21,840 Speaker 1: of them as large as her hand, began tumbling from 621 00:34:21,719 --> 00:34:23,560 Speaker 3: the sky. People want the beef. 622 00:34:25,520 --> 00:34:28,080 Speaker 1: The story went national fast as people tried to figure 623 00:34:28,080 --> 00:34:30,960 Speaker 1: out why chunks of meat fell from the sky in 624 00:34:31,000 --> 00:34:35,000 Speaker 1: that little part of Kentucky. Scientists got involved, examined samples. 625 00:34:35,400 --> 00:34:39,000 Speaker 1: Seven samples were examined by scientists, who confirmed that several 626 00:34:39,000 --> 00:34:43,000 Speaker 1: were lung tissue, some were muscular tissue, two were cartilage. 627 00:34:43,360 --> 00:34:46,480 Speaker 1: Several people actually tasted it and said it tasted like mutton 628 00:34:46,600 --> 00:34:49,719 Speaker 1: or venison.
Ah, this is the sort of thing you 629 00:34:49,719 --> 00:34:51,919 Speaker 1: did in the eighteen seventies, apparently. If a chunk 630 00:34:51,960 --> 00:34:53,600 Speaker 1: of meat fell out of the sky, you'd say, oh, 631 00:34:53,680 --> 00:34:54,440 Speaker 1: let me taste that. 632 00:34:54,840 --> 00:34:57,800 Speaker 3: I wonder what that is. I'm ready to figure it out, Clem. 633 00:34:58,120 --> 00:35:03,160 Speaker 1: Anyway, apparently vultures vomit occasionally, and for whatever reason, a 634 00:35:03,320 --> 00:35:07,080 Speaker 1: big flock of vultures vomited up all the dead food 635 00:35:07,080 --> 00:35:09,440 Speaker 1: they had eaten that day. And at the same time, 636 00:35:09,680 --> 00:35:13,960 Speaker 1: two hundred pounds of unidentified meat fell in one little spot. 637 00:35:14,239 --> 00:35:17,200 Speaker 1: And that's the Kentucky Meat Shower, and that's why we 638 00:35:17,360 --> 00:35:18,120 Speaker 1: have a festival 639 00:35:18,160 --> 00:35:21,719 Speaker 3: every year. Please send the bill for detailing your car after 640 00:35:21,760 --> 00:35:24,719 Speaker 3: you vomited in it to Jack Armstrong. He'll be happy to 641 00:35:24,760 --> 00:35:25,480 Speaker 3: pay that bill. 642 00:35:26,400 --> 00:35:31,320 Speaker 1: We got more on the way. Armstrong and Getty.