Speaker 1: Broadcasting live from the Abraham Lincoln Radio Studio, the George Washington Broadcast Center: Jack Armstrong and Joe Getty. Armstrong and Getty... and he... Armstrong and Getty.

Speaker 1: I don't know that that happened. And Pete said he did not want that. He didn't even know what people were talking about. So we'll look at, we'll look into it. But no, I wouldn't have wanted that, not a second strike.

Speaker 2: So that's a story that's burbled up over the last couple of days. So when we blasted, I think it was the first drug boat we blasted, after we blasted it, there were a couple of people clinging to the boat, and then, according to The Washington Post, Pete Hegseth ordered those two people to be blasted again, which would be a violation of all that is good and decent. Here is a report on NBC about that. Then we'll discuss.

Speaker 3: The Washington Post, citing two sources with direct knowledge, reports Defense Secretary Pete Hegseth issued a verbal directive to target a boat suspected of ferrying drugs, with one source saying the order was to, quote, kill everybody. But the Post reports when it was discovered two people on board had survived the strike, a Special Operations commander ordered a double tap, a second strike that killed the survivors, one source telling the Post the follow-on strike was to clear the debris of the wreckage for other boats, not to kill the survivors.

Speaker 2: Now, these are anonymous sources from The Washington Post; they're the original reporting on this. Nobody has confirmed those reports that I'm aware of. Yet every story I saw about this today, NBC, whoever I was watching, they'll say, we have not independently confirmed these reports from The Washington Post. Pete Hegseth says that is not what happened; that was news to him. Trump says, if it did happen that way, I would not have wanted it, which is interesting. But I'm sorry...
Speaker 4: Who said... was that Pentagon sources who said the second strike was to clear debris? So that was acknowledging a second strike occurred.

Speaker 2: Okay.

Speaker 4: So the only question then remaining is: were there survivors clinging to that wreckage, and did the US military know it before they fired the kill shots at people who are obviously no military threat?

Speaker 2: The thing that is a war crime, and the crime is murder. The thing I really don't know, that needs to be nailed down, and I haven't heard anybody talk about it: how much, how many people are involved in this communication between Pete and whoever? You know, how many people are down the line from Pete to somebody else, to somebody else, to somebody else who actually pulls the trigger on the aerial shot from the drone? I mean, I don't, I don't know how many people are involved there. And then, is there any record of that communication? Do they record that? Is it all verbal? I don't...

Speaker 4: I don't know any of it. Good questions, all. Yeah, The Free Press is writing about it. According to the Post, mission commander Admiral Frank "Mitch" Bradley ordered the second strike specifically to kill the two survivors, so he, he would certainly be guilty of the war crime, and depending on how direct the order was from Pete Hegseth, he might...

Speaker 2: ...be as well, I guess. You know, allegedly. And then...

Speaker 4: It's worth pointing out we have Republicans joining in the "hey, we've got to figure out what happened here"...

Speaker 2: ...cry. It's not just, you know, the usual suspects, right. You only open an investigation in the House of Representatives if the majority party wants it, and they've opened investigations in a couple of different committees in the House of Representatives, because the Republican committee chairs have said, yeah, we want to look into this. Also a Senate committee looking into it.
And is it just a coincidence that the week before this story broke, the Democrats put out that video of "you do not have to follow illegal orders, you do not have to follow illegal orders," and that whole dust-up? Is it just a coincidence that that happened right before this information came out? That would be a hell of a coincidence.

Speaker 4: Sure seems like it, because this attack allegedly happened on September the second. So yeah, you absolutely could believe that there were leaks, and people were hearing, oh, this was a clear war crime in the blasting-the-drug-boats thing, and that's why they put out that video. It makes more sense now.

Speaker 2: Right, because it kind of seemed like it came out of nowhere. Although, would they be after this... what is it, Admiral whoever? Yeah, Admiral Bradley. You know, I'm pretty cynical about our politics today. I'm not sure how many of these Democrats who are making a big deal out of this are actually concerned about these Venezuelans clinging to the boat, versus they want a scalp in the Trump administration, to damage the Trump administration, which is more important to them. So I feel like some admirable... no, admiral... I did it. Yeah, I feel like, I feel like some admiral whose name nobody had ever heard before would be enough for them to feel like they got a victory.

Speaker 4: Yeah. I don't disagree with anything you said, but I don't think it's appropriate to look at this story through the lens of what is the motivation of Democrats, whether it is holy or unholy. If a war crime was committed, we've got to get to the bottom of it.

Speaker 2: Yeah, I guess I was assuming that it didn't happen the way it's being portrayed in The Washington Post, and we don't know that. Yeah, I don't know.

Speaker 4: No idea. The fact that... were you going to play our two Congress fellas?

Speaker 2: I wasn't, but we can. Let's do it. Sixty-three and four, back to back.
Speaker 4: Mike Turner and Don Bacon, both Armed Services Committee guys or House Intelligence Committee guys, both Republicans.

Speaker 5: Congress does not have information that that had occurred. Both the chairman of the Senate Armed Services Committee and the chairman of the House Armed Services Committee, and ranking members, have opened investigations. Obviously, if that occurred, that would be very serious, and I agree that that would, would be an illegal act.

Speaker 6: If the facts go to where the Washington Post article takes it, well, then we'll have to go from there. But if it was as the article said, that is a violation of the law of war. If people are survivors, you don't kill them; they have to pose an imminent threat. It's hard to believe that two people on a raft trying to survive would pose an imminent threat.

Speaker 2: I would just like to have somebody point out that if it turns out this didn't happen, how about a little pressure on The Washington Post to not put out this sort of crap? Because that happens a lot too, where you get these stories with anonymous sources, they become a big deal for a couple of twenty-four-hour news cycles, then they kind of drizzle away and nobody ever gets it...

Speaker 4: ...straight. Years during the Russia collusion hoax. Pelosi, Adam Schiff, you scumbags.

Speaker 2: No kidding. And then never... nobody ever circles back to say, hey, that stuff, that didn't turn out to be true.

Speaker 4: That's not cool. Right, right. Just take down The Washington Post. Sorry, Jeff Bezos, I know you wasted a lot of money on it, but I'm sorry, we're shutting it down. Yeah, we'll just have to wait for the facts to come out. It's troubling. We need to be...

Speaker 2: ...better than the worst people on earth, as a country.
Well, it's getting pretty hard to imagine either Pete or the guy underneath him thinking that was okay. Yeah, yeah, it is. I mean, that would be extraordinary.

Speaker 4: I mean, if Pete said something to the effect of "kill them all," which I think is in the Washington Post story, I can't remember the phrase exactly, and this guy interpreted that to kill helpless survivors, yeah, that would be an enormous error in judgment, the sort of which results in, you know, prosecutions.

Speaker 1: Uh.

Speaker 4: The commander overseeing the operation from Fort Bragg in North Carolina, this Admiral Bradley, told people on this secure conference call, according to The Washington Post, that the survivors were still legitimate targets because they could theoretically call other traffickers to retrieve them and their cargo. According to two people, he ordered the second strike to fulfill Hegseth's directive that everyone must be killed.

Speaker 2: So, I don't know anything about this. So if you're in that line of work, you're the sort of person that pulls the trigger on the drone to blast these people in the boat, are you taught all of this in some sort of training, or are you just supposed to know it from watching movies? Or... I don't know that. Would you have been taught, now, if you hit the boat and there are survivors, they're no longer a threat, so it would be illegal to kill them? Are you actually taught that? I would hope so, but maybe you are, maybe you aren't. I don't know. If you know anything about this, like you served in the military and you got that sort of training, let us know on the text line or the email.

Speaker 4: Yeah, I'm looking to see if we have any decisive, knowledgeable emails. No, just a bunch of dopey comments online.

Speaker 2: Well, this is a Grade A scandal. If it can be substantiated in any way, it would drive Hegseth out of his job, and perhaps have some sort of trial or something.
Speaker 4: I don't know. Here's a Trump loyalist, says: Joe, you're missing a possibility. Maybe the Dems made that illegal-orders video because they were going to plant a false story about Hegseth.

Speaker 2: See, that's... yeah, that's a stretch.

Speaker 4: And this from Robert: So the WaPo reports something quoting anonymous sources; why is that given any credibility at all, given the history of the WaPo with fabricating things? The dude was just put in place (that would be the admiral) on August first. Navy SEAL. I'm guessing he knows the law and is not afraid of the Secretary of Defense. This is an entire argument against a straw man, to put and keep it in the news. Unless there's proof this order exists or something illegal happened, it's just people trying to whip up news. Yeah, you know what, and he says our media is so broken, I would agree with you. And both Don Bacon and Mike Turner said, if this is true, it's a serious deal. But all right, now we're in the "let's let the facts come out" part of the process.

Speaker 2: I'm kind of interested... I often like what Barry McCaffrey says, retired general; he's usually on NBC. Can we hear sixty-seven? I'd just like to hear his take on this.

Speaker 7: The easy question is whether or not we are allowed to use lethal force against people clinging to the wreckage. They have submitted; there's no threat to the US military forces. That, it seems to me, needs to be investigated, needs to be confirmed. It seems to be clearly an illegal order and a war crime. And this admiral should have, when he allegedly got this verbal order from the Secretary of Defense, should have put it in writing, said we will not comply, and explained why. So that one's straightforward. It's no different than a Nazi submarine machine-gunning survivors in the water of a sunken ship in World War Two.

Speaker 4: For which they were prosecuted at the Nuremberg trials and...
Speaker 2: ...executed, man. And so, how fast does this stuff happen, too? So the first shot on the boat went... how much time was there between the first strike and the second strike? Probably not a lot, I'm guessing.

Speaker 4: Yeah, the WaPo account is it took a while for the smoke to clear, and that everybody was surprised to see survivors, and the order was given: hit them again. But again, at this point, I feel like the entire question is, okay, did that stuff actually happen in the way it was described? If so, okay, then the process begins. If not, let's move on with our lives, after kicking the hell out of the WaPo for false stories.

Speaker 2: Again, there needs to be better reporting on how this whole thing works, and I'm surprised that's not included. That's one of the reasons that makes me very skeptical about this. Because if I'm the reporter receiving this information from some anonymous source, I'm asking, okay, how does the chain of command work here? Pete talks to who? Who talks to who? Who actually fires the rocket or the missile? Is all of this videotaped? We've seen the video of the original strikes on a lot of these boats; are all of them videotaped? So are you claiming there is a videotape of this out there? Also, I would ask those questions if I was the reporter receiving this information, and include that in my report, and say a videotape exists of this, for instance, because that would be a big deal. It's very easy to get to the bottom of, then: whichever committee just needs to say, okay, we need to see the videotape, like, by noon today. I'm always surprised at how long it takes to answer these questions. Seems like they should be answerable, like, in an hour.

Speaker 4: Yeah, of course, but everybody lawyers up and drags their feet as much as possible. I mean, you can have the most legitimate request in the world for information from the government, and it's just, it's...

Speaker 2: ...like, you know, I don't know, mining for rare earth minerals.
You do have Republican committees, in theory, trying to protect a Republican administration and a Republican SecDef and all that. I would think they'd all want to... yeah, let's get the video out, this is what happened... unless something bad happened. Yeah. Interesting thoughts? Text us, please: four one five, two nine five, KFTC.

Speaker 2: I might want to get super involved in the AI argument, because it might be the most important thing that's ever happened in the history of the world. Want to talk about that coming up next segment.

Speaker 4: Became aware of this new book, a kinder, gentler feminism... well, the title is actually The Dignity of Dependence, a feminist manifesto, that Caleb Bart reviewed. Super interesting. We don't really have time to get into it now, but the author, who's a woman, talks about, you know, feminism got way off track, but it got off track when it began to deny women's essential differences from men. She observes with great acuity, depth, and wisdom that much of feminism today seeks to make women more like men, insofar as they are autonomous and impenetrable, blah blah blah, and, and just that that's a very perverse way to look at empowering women.

Speaker 2: There's no... make them into... there's no doubt that that's true, though. A lot of modern feminism is just, anything that's being like a woman is wrong. Yeah, yeah, it's just crazy.

Speaker 4: And then that reminded me of something else I'd seen, and I'd mentioned earlier in the show that in the midst of my holiday revelry and a lovely time with my family, I made the mistake of clicking on an article in The New York Post, which was just a perfect example of Internet snippy bitterness, superiority, trolling, the rest of it.

Speaker 2: Here's another good one.

Speaker 4: This is a Vice article that came out a couple of months ago, but now it's gone viral.

Speaker 2: It's about "mankeeping." I will quote.

Speaker 4: Mankeeping describes the emotional labor women end up doing in heterosexual relationships.
It goes beyond remembering birthdays or coordinating social plans. It means being your partner's one-person support system: managing his stress, interpreting his moods, holding his hand through feelings he won't share with anyone else, all of it unpaid, unacknowledged, and often unreciprocated.

Speaker 2: What are we talking about? Well, mankeeping.

Speaker 4: And as Nellie Bowles writes, hmm, when I was young, it was called loving someone. Man or woman, you do a lot of that stuff for your partner. But now we call it mankeeping, and we hate it.

Speaker 2: Oh, do we hate it? Holding his hand, ew, through feelings, eh, he won't share with anyone but you, ooh. And it's unpaid.

Speaker 4: This is worse than an internship.

Speaker 2: Took the words right out of my mouth. That's, that's called love.

Speaker 4: What? That is a viral (whatever that means) article on the Internet that's really making the rounds. Young women can say, I don't need that, this...

Speaker 2: ...mankeeping. Good grief. Unplug the effing Internet.

Speaker 4: I'm gonna miss buying stuff online, but I'll schlep to the store, I'll write letters, I'll call people on the telephone. Unplug the Internet.

Speaker 2: Well, it's too late to unplug the Internet. Is it too late to unplug the AI, the idea of AI taking over the world and destroying mankind? It probably isn't too late, but we'd all have to rally together, which I might be interested in trying to start. What do I need? A sledgehammer, a crowbar, and gasoline? What do I... tell me.

Speaker 3: And I'll bring up... Armstrong and Getty.

Speaker 8: Retailers are hoping to add a new spin to their sales events, leaning into artificial intelligence. In the weeks leading up to peak holiday spending, ChatGPT maker OpenAI partnered with Walmart and Target. Over half of shoppers said in a recent survey they plan to use AI this year.
This year, Amazon's offering its own chatbot, called Rufus, where you can take a screenshot of a shopping list and automatically add those items to your cart.

Speaker 2: Oh, how exciting is that? That's not that great. All the discussions we're having around AI are missing the main topic, which should be discussed probably constantly, and maybe be the number one political topic in America, if not in the top couple. As it's been pointed out by a number of the people I've been watching and listening to and reading over the last several weeks about AI: these chatbots we're all using, they're like a web page as opposed to the Internet. It's got nothing to do with what AI is going to become or the impact it's going to have on the world. The fact that ChatGPT is a cooler Google, you know, that sort of thing, it's kind of misleading people as to what AI actually is. So I've been on this kick for quite a while; if you listen, you know. I'm big into AI, and I read about it and listen to podcasts about it and everything like that. The book that came out fairly recently, that I mentioned before the break, by Eliezer Yudkowsky, has gotten a lot of attention in AI circles. It's called If Anyone Builds It, Everyone Dies. He was one of the biggest proponents of AI over the last several decades, one of your leading cheerleaders for AI. If you ever read anything or watched a show or anything like that, any network television show, Oprah in the afternoon, whatever, and somebody was talking about AI, it was probably him. Up until recently, when he decided: no, we can't control this. Superintelligence is absolutely going to happen. We're creating a beast significantly smarter than us; how do we think that's possibly going to turn out to our benefit? We need to stop it immediately. And he wrote this book,
If Anyone Builds It, Everyone Dies, and he's trying to get a whole bunch of people on board to do something about this. Elon said the other day he thinks there's a ten to twenty percent chance that AI destroys mankind. Why in the hell would you build anything where there's even a ten percent chance that it destroys all mankind? It seems crazy.

Speaker 4: That is... I mean, the answer is so, so clear and so unimplementable. At the point that it gets real good at curing cancer, for instance, we all love it. And then, stop? Right? But it's unimplementable.

Speaker 2: I was watching a...

Speaker 4: I mean, for the obvious reasons: we don't have the cooperation of everyone on Earth.

Speaker 2: I was watching a... I'm gonna start wearing T-shirts from the doom people, because I'm one of them. I'm a doomer, definitely, that believes that it's all going to go to hell. Doom Debates is a website I've been watching a lot. Practically everybody involved in this argument is within sixty miles of this radio studio, and I wish we could get some of them on the air. The Doom Debates, where they have some of the leading people on to discuss various sides of it. I was watching a doom debate between this guy Max Tegmark, who I've mentioned a lot; I've read a couple of his books, Life 3.0 and a bunch of different stuff; he's an MIT scientist. With this other guy, who's one of your leading AI researchers, who thinks, you know... And the main pro argument from this dude is there's just no regulating it anyway. I mean, how the hell are you going to regulate it? Which he might be right about. But Tegmark and a lot of others, and this is what I might want to get involved with personally, they're trying to get people's attention, and maybe have marches or something, to try to alert the government: we need to come up with a plan.
We're just screaming a thousand miles an hour toward developing a beast, or whatever you want to call it, that is absolutely going to doom humanity, and nobody's putting any brakes on it. What the freaking hell are we doing?

Speaker 4: Yeah. Twice already you've used the word "something," and I'm not saying that we shouldn't be trying to get people's attention so that they're willing to do something. Then the obvious next step is: what thing?

Speaker 2: Well, Tegmark and other people's argument, and he's been, he's been trying to lobby Congress, but there's just not enough public knowledge out there to really have any heft yet, and that's why I was wondering where maybe we can come in, or I can come in or whatever, get the media involved to alert people to what could happen. It would just be to say: everybody's got to stop. OpenAI, ChatGPT, Elon, Zuckerberg, you've got to stop, no more, until we get our heads around this and come up with some sort of regulations. One of the points being made is we have way more regulations on sandwiches in America currently than we do on AI. It's not even close. There are sandwich regulations; there are almost zero, almost zero regulations on AI at this point. Right. And, you know, if you're a listener to this show, you know it's not like me to want to be pro any kind of regulation. But as he points out, you're not allowed to just make any kind of drug you want to and put it out to the people. But we have no regulations on AI, for what we're just going to unleash on humanity.

Speaker 4: So, I assume he gets to my next, you know, devil's advocate type question, which is: okay, if we regulate it but China does not, then where are we?

Speaker 2: That always gets sticky also, and that's one of the pushbacks from people. You'd have to have some sort of world pressure, same way we had around nuclear weapons: inspectors going in, or you have to announce when you're... blah blah blah, all kinds of different things.
Yeah, I don't know if that's feasible. Not knowing what the questions and what the solutions are yet does not in any way deny that we ought to be trying to come up with them. But boy, it's a head-scratcher, I'd say. One of the arguments is that we're the smartest beast on earth. Every other living organism lives at our pleasure. It's only because we as a society, and this is kind of in the modern age, have decided we want to let chimpanzees live, and we should value them and not just murder them for their teeth or whatever. They live at our pleasure, even though they're the second smartest beast on earth. Why would you think that's not going to occur when there's a smarter thing than us? That we're not going to live at its pleasure, whether it wants us to be around or not? Makes sense to me. Yeah.

Speaker 4: Yeah, I've got this horrible thought that when the super AI is developed, the first thing that's going to happen is Kim Jong Un is going to empty everybody's bank accounts worldwide. It's all going to flow into the North Korean treasury, and that'll be it. That'll be plenty. Can you imagine?

Speaker 2: Yeah, I don't know if that... I don't think any individual is going to have any control. That's what all the experts say: no individual is going to have any control over super AI. It will do whatever the hell it wants. It's not going to do the bidding of the Chinese or North Korea or us or any other human. It's going to do whatever it wants, and what it wants, nobody has the slightest idea.

Speaker 4: May I hit you with an intriguing email from our friend JT in Livermore about the dangers from superintelligent general AI? "I think it comes down to this question: would an intelligence vastly superior to human intelligence base its actions on the most base, animalistic behaviors and emotions, or would it be driven by a higher understanding and intelligent empathy?"
"Claiming that a super AI would attack us out of a desire for self-preservation, or out of paranoia, or out of indifference to the value of life, presupposes that the basest emotions would be the dominant drivers of the AI's actions. But don't most higher thinkers believe in empathy, helping those less fortunate, the sanctity of life, and the beauty of life? Wouldn't a super AI be more likely to adopt those higher forms of enlightenment rather than the basest motivations of a selfish, scared toddler?" And then, by way of illustration: at the end of the fabulous original Blade Runner movie, the last ultra-advanced replicant has almost every reason to kill the human that has killed all of his friends and that was trying to kill him, but he chose to let Harrison Ford's character live, because it believed in the beauty and sanctity of life.

Speaker 2: Yeah. That'd be fantastic if it went that direction, but I don't know how you count on it. The example was used that when Germany started two World Wars, they were pretty much the most sophisticated, advanced society on planet Earth, with the finest arts and writers and everything else, and they went completely off the rails and did the things that they did, because intelligent beings are capable of doing that and convincing themselves they're doing the right thing.

Speaker 4: And a reminder that C. S. Lewis, to paraphrase him, put it that the most oppressive oppression is from people who think they're doing it for your own good. Right. And I could easily see one power or another deciding that, you know, all of humanity would be a hell of a lot better off, you know, under our boot heel.

Speaker 2: Yeah. One example was used... so the alignment problem all along has been, can you align whichever AIs you're talking about? And I have become aware that they all use the term "AIs" when they're talking about multiples; that is just the term of art.
So whichever AIs you're talking about, you try to align them with some sort of morals or decency that your company puts in them. But the argument was made, and I thought this was really good: we're programmed with really one alignment, completely, as human beings, and that is to stay alive and procreate. Yet we invented birth control and abortion, which seems to run completely contrary to the one thing we were aligned to do. And then we regularly do things that aren't, like, in our best interest, in terms of eating or exercising or all kinds of different things. We're aligned to stay alive, but we do all kinds of things that will kill us. So you don't necessarily stay on track, which would be the same problem with whichever AI is built to be aligned with whatever Elon Musk's or Sam Altman's goals are for the supercomputer.

Speaker 4: So now that you've terrified everybody: what's the latest thinking on timetables? Is there any predominant opinion on when, you know, various terrifying or awe-inspiring bench stones are... milestones are... reached? It's all over the place, but "bench stone"? It's benchmark or milestone. Anyway, back to you.

Speaker 2: It's all over the place, but everybody agrees that we got here, to where we are today, way faster than everybody thought we would. It ended up arriving much faster than most predictions. So, so far, it has been on the forward end of things happening, as opposed to the back end, in terms of how fast it can happen.

Speaker 1: Now.

Speaker 2: They were generally arguing, in the debate I was watching last night, somewhere in the early thirties, which is only five and a half years from now, between five and ten years from now, that it will arrive. But what do you think of the idea of trying to raise public will to do this? Do you think you could possibly do that, convince people? And the other thing... I would actually love to talk to these people about this, some of the thinkers that are trying to move the masses.
534 00:29:27,880 --> 00:29:34,720 Speaker 2: If a whiff of partisanship comes into this, it's over. 535 00:29:35,160 --> 00:29:35,480 Speaker 1: Though. 536 00:29:35,800 --> 00:29:38,680 Speaker 2: If Trump weighs in one way or the other on AI, 537 00:29:39,280 --> 00:29:42,520 Speaker 2: forget it, We're done. Or if some pundit, you know, 538 00:29:42,560 --> 00:29:46,600 Speaker 2: assigns being pro AI is what Trump wants or being 539 00:29:46,600 --> 00:29:49,520 Speaker 2: anti AI is Trump, it'd be like masks and vaccines 540 00:29:49,560 --> 00:29:52,680 Speaker 2: and everything else. It's over at that point, right, And 541 00:29:52,680 --> 00:29:54,600 Speaker 2: I don't know if there's any avoiding that. 542 00:29:55,000 --> 00:29:59,000 Speaker 4: I just keep thinking back to the hilarious movie Don't 543 00:29:59,040 --> 00:30:01,760 Speaker 4: Look Up, which a lot of conservatives didn't like because 544 00:30:01,800 --> 00:30:05,760 Speaker 4: it's scured conservatives, but I thought it scured lefties. Every 545 00:30:05,760 --> 00:30:10,520 Speaker 4: bit is brilliantly. I don't know that you can get 546 00:30:10,520 --> 00:30:13,040 Speaker 4: that going. I don't know that you can get people 547 00:30:13,040 --> 00:30:14,000 Speaker 4: to pay attention. 548 00:30:13,880 --> 00:30:16,120 Speaker 2: Well like and Don't Look Up. There was a movement 549 00:30:16,200 --> 00:30:19,440 Speaker 2: toward making sure the media is going to benefit so 550 00:30:19,520 --> 00:30:22,360 Speaker 2: many people that we need to make sure the media 551 00:30:22,520 --> 00:30:24,440 Speaker 2: hits here. But what if it hits in the United 552 00:30:24,440 --> 00:30:26,320 Speaker 2: States and not in countries where they have you know, 553 00:30:26,440 --> 00:30:30,720 Speaker 2: more inequality, you know, all that sort of stuff, right, 554 00:30:31,080 --> 00:30:33,200 Speaker 2: which absolutely will be the topic for AI. 555 00:30:33,440 --> 00:30:38,480 Speaker 4: But metior to destroy life, the poor and minorities affected most. 556 00:30:38,640 --> 00:30:41,120 Speaker 2: Yeah, do you think there's a possibility because we have 557 00:30:41,160 --> 00:30:42,560 Speaker 2: to take a break here soon, do you think there's 558 00:30:42,560 --> 00:30:46,320 Speaker 2: a possibility that you could raise awareness, get people worked 559 00:30:46,400 --> 00:30:48,320 Speaker 2: up enough that you could end up with And I've 560 00:30:48,360 --> 00:30:52,200 Speaker 2: been against every march that's occurred since nineteen sixty eight, 561 00:30:52,720 --> 00:30:55,760 Speaker 2: but if you could, if you could get people like 562 00:30:55,800 --> 00:30:59,720 Speaker 2: in the streets and say regulate AI, let's do something 563 00:30:59,720 --> 00:31:04,040 Speaker 2: about Do you think you could even come close to that? Yeah? 564 00:31:04,160 --> 00:31:06,320 Speaker 4: Yeah, and I think it would probably be useless because 565 00:31:06,360 --> 00:31:09,800 Speaker 4: of the China factor, but I don't know for sure. 566 00:31:11,400 --> 00:31:17,000 Speaker 2: So far, we're way ahead of them, We're the leading 567 00:31:17,080 --> 00:31:17,479 Speaker 2: edge of it. 568 00:31:19,440 --> 00:31:22,480 Speaker 4: This has to be how people felt when they saw 569 00:31:22,520 --> 00:31:24,080 Speaker 4: the mushroom clouds in the forties. 570 00:31:25,520 --> 00:31:30,280 Speaker 2: They thought this can't end. 
well. Yeah, and that's an argument for... see, everybody thought that would doom humanity, but that's been in the hands of a small number of people, and human beings making the decision. The nuclear weapons didn't get to make their own decisions, you know, about what would be best for life, and go ahead and develop independently of human needs. Well, and, come on, Japan didn't have mutually assured destruction in nineteen forty-five, right? Well, they were on the receiving end. Anybody with any thoughts on this, I'd love to hear it, for the mailbag tomorrow or on the text line or whatever. It's a hell of a topic. Anyway, we will finish strong next. Waning moments of the first day back after a long vacation, in which I gained three and a quarter pounds. Congratulations to me. Michael, can we hear forty-seven, please?

Speaker 2: Why do you blame the Biden administration?

Speaker 1: Because they let him in. Are you stupid? Are you a stupid person? Because they came in, on a plane, along with thousands of other people that shouldn't be here. And you're just asking questions because you're a stupid person.

Speaker 2: Love him or hate him, he's always a gentleman, Donald J. Trump. "Are you stupid? Are you a stupid person?"

Speaker 4: Yes, indeed. So, that's just... he's not getting more controlled. It should be interesting going forward. I played that partly as an excuse to bring this up again. We were talking earlier about how so many countries around the West have finally said openly (and Marco Rubio made an announcement about it Thanksgiving week): look, we're not going to import people who hate our values from the third world. It's suicide. And that's unquestionably true, and if you don't understand that, you are a fool, or too dopey to participate in the national conversation. Now I sound like the president.

Speaker 2: I hope I'm not.

Speaker 4: Just came across this stat. Brace yourself.
The percentage of children born in Canada who had a foreign-born mother last year: forty-two point three percent.

Speaker 2: No. Yes? No. Yes.

Speaker 4: That proportion has almost exactly doubled since nineteen ninety-seven. The West is rapidly becoming not the West anymore, to support the economy, say politicians and enough of the people who live here.

Speaker 2: And I don't understand the crowd that thinks that's automatically a good thing. Maybe we'll discuss that tomorrow. They're idiots. What's to understand? They're idiots.

Speaker 2: Armstrong. Armstrong. You're ready, with Katie Green and...

Speaker 1: Armstrong.

Speaker 2: Here's your host for final thoughts, Joe Getty.

Speaker 4: Hey, let's get a final thought from everybody in the crew to wrap things up for the day. There is our technical director, Michaelangelo. Michael, what's your final thought?

Speaker 6: I'm a little irritated having to work today, on Cyber Monday. I was at home, you know; I wanted to decorate for the holidays. But here I am, working on Cyber Monday of all days. Son of a...

Speaker 4: Katie Green, our esteemed newswoman, has a final thought. Katie?

Speaker 2: We did Thanksgiving at someone else's house, and I only regret: no leftovers. Yes, that's the worst part of doing Thanksgiving somewhere else. I mean, the upside: you didn't have to make the meal, you don't have to clean up. But no leftovers sucks.

Speaker 4: If you have any decency, you help with the cleanup. Point of order. Jack, do you have a final thought for us?

Speaker 2: Yeah, that's a good point, the no-leftovers thing. Yeah. I mean, on the other hand, maybe I'm better off without a bunch of pie sitting around my house, and stuffing and everything like that, as I did gain three and a quarter pounds.

Speaker 4: Yeah. My final thought is also dessert-related.
We had all sorts of different pies prepared for Thanksgiving dinner, but we had twelve people in the house, family and all. And twelve, yeah. And so, like, by two o'clock the next afternoon, it was all gone.

Speaker 2: Damn it. That's a bummer.

Speaker 4: I could eat apple pie for breakfast every day of my life.

Speaker 2: Heck, yeah. Oh, I may, some days. Exactly. Armstrong and Getty, wrapping up another grueling four-hour workday. So...

Speaker 4: ...many people to thank, so little time. Go to Armstrong and Getty dot com. No shipping charge today; free shipping at Armstrong and Getty dot com, the A&G swag store. Get your favorite A&G fan (maybe it's you) a little souvenir.

Speaker 2: And you're gonna be running out of time soon to get it to them before the holiday, you know. We'll see you tomorrow. God bless America.

Speaker 1: Armstrong and Getty...

Speaker 4: Easy for you to say. "Signed, Stegas Store Segast." That would have been a good punch. Would have been great.

Speaker 2: Oh yeah, there you go. Month is off to a bad start. A pie, that's your problem.

Speaker 1: Armstrong and Getty.