Speaker 1: On this episode of Newt's World: the AI company Anthropic created a special model called Claude Gov, the first to be used on classified systems. This model does not have the same guardrails and restrictions that their models available to the public have. On Tuesday, February twenty-fourth, Secretary Hegseth met with Dario Amodei, the Anthropic chief executive, at the Pentagon, and they could not come to an agreement over Anthropic asking for reasonable assurances that its model would not be used for surveillance of Americans or in autonomous weapons, such as drone operations that did not involve human oversight. Anthropic is now suing the Department of Defense, escalating their dispute over the use of artificial intelligence in warfare.

I am really pleased to welcome my guest, Michael Horowitz. He is a Senior Fellow for Technology and Innovation at the Council on Foreign Relations. He's also director of Perry World House and Richard Perry Professor at the University of Pennsylvania. He previously served as Deputy Assistant Secretary of Defense for Force Development and Emerging Capabilities and Director of the Emerging Capabilities Policy Office. He is the author of The Diffusion of Military Power: Causes and Consequences for International Politics and co-author of Why Leaders Fight. Michael, welcome, and thank you for joining me on Newt's World.

Speaker 2: Thanks so much for having me. I'm looking forward to the conversation.

Speaker 1: I'm delighted that you would take time to be with us, particularly with all the things that are going on right now. You've suggested that the Defense Department is really still approaching military artificial intelligence in kind of an experimentation mode. Why do you think adoption has been slower and more tentative than some people expected?

Speaker 2: You're a student of history, and I think it comes straight from there.
You know, we have been privileged that the United States has the world's most powerful military, the world's best soldiers, and the world's best technology. And sometimes when you're the best, it's hard to transform, because every day the US military is in fact the best, and so change seems risky. So when it comes to the incorporation of emerging technologies like artificial intelligence, I think many people in the Pentagon, both before I served, while I was there, and now, recognize the enormous potential that AI has to transform warfare, like so many other elements of our society. But it's been hard to get senior decision makers to invest the real dollars that you need to scale capabilities across the military.

Speaker 1: Why do you think that's true? There's a sort of sticking with the old system, and this is not new. This was also true, for example, of the battleship admirals who kept rejecting the idea that aircraft would work from carriers. Exactly what should we as a country do to get a genuinely modernizing Department of War?

Speaker 2: I think that we now live in an era that I would call the age of precise mass in war, where the combination of commercial manufacturing, the fact that precision guidance is now fifty-year-old technology, and advances in autonomous systems and artificial intelligence mean that now every country and militant group around the world can launch precision strikes one way or another, and sometimes in large volumes, as we see Iran doing in the Middle East right now.
And in that era, the way that the US has thought about setting up its military, I think, needs to change a little bit, in that the United States needs to be a lot more aggressive in incorporating some of these artificial intelligence tools that can help our troops and our senior decision makers make sense of the battlefield, processing information much faster and getting information to the right people at the right time, so they can make the correct decisions and help the American military fight and win the nation's wars.

Speaker 1: I helped found the Military Reform Caucus in nineteen eighty-one. I was the third witness on behalf of the Goldwater-Nichols bill that profoundly changed the system and moved us toward jointness, and which, by the way, every currently serving active-duty four-star opposed. They hated the idea of being forced into jointness, and they actually got Reagan and Secretary of Defense Weinberger to oppose it. And we won anyway, because it was so obvious how out of cycle the system was. During the liberation of Grenada, an Army officer had to go to a payphone, use his credit card, and call the Pentagon to get a friend to call the Navy to tell them that the Army and Navy radios were not working together.

Speaker 2: Unbelievable. I mean, it is believable. That's the problem.

Speaker 1: What it did was it led to hearings where these four-stars would come in and explain the system was really working, and then an everyday congressman who might not have been all that sophisticated said, let me get this straight: you think that having to use a credit card at a payphone is a solution? And of course they had no answers, and they were just totally discredited. So I've been working this problem of how you modernize defense for a fair length of time. It always involves the building doing everything it can to avoid it, you know, great speeches but very timid decisions.
When you look at that, it strikes me that artificial intelligence is both going to have a supportive role, in things like managing inventories and tracking data. There are just a thousand ways that it's going to dramatically improve and accelerate what we can do. And at the same time, it may well end up having a war-fighting role. So you've got to actually look at both sides of this: how are you using it to dramatically modernize and improve the ongoing systems, but also, how do you potentially have to rethink the whole art of war in what could really become a real-time, astonishingly complex situation? How do you approach that? How do you think about that?

Speaker 2: It's a great question. I think that artificial intelligence is really changing the character of warfare. I would think about three different buckets that artificial intelligence falls into for a military like the United States. The first bucket is uses of artificial intelligence for the military that, as you suggested, are exactly like what you would see in a company in the private sector: uses for, you know, human resources, payroll processing, basic logistics, all those things where artificial intelligence can speed workflows, simplify things, generate efficiencies, et cetera. That's bucket one.
Bucket two is what I would call the intelligence, surveillance, and reconnaissance arena, and that is the way that the military is getting all sorts of data, from satellites, from human sources, from airborne sensors, from troops on the battlefield, and taking all of that data and trying, strategically, to understand the capabilities of potential adversaries like China or Russia, and tactically, in the context of a conflict that's happening, to figure out, you know, what is actually going on in Iran, or what is actually going on in the Strait of Hormuz. In that case, the ability of artificial intelligence to process data quickly and to pull out the signal from the noise faster than a human might be able to means that it becomes an incredible teammate to help speed up that process of getting more accurate information to commanders. So that's bucket two.

Then bucket three is the sort of pointy end of the spear, and that is the use of artificial intelligence as decision support, essentially tools to help commanders make good decisions close to the battlefield, and then, potentially, the incorporation of artificial intelligence into what are called autonomous weapon systems, which are weapon systems that can select and engage targets themselves after a human activates them. And to be clear, as you know, the military has been using artificial intelligence for decades, and the United States has actually fielded some of these kinds of weapons, with very, very simple, more primitive types of artificial intelligence, since the nineteen eighties. But today, as artificial intelligence tools have become much more powerful, the prospect for them to really impact the battlefield has become much more powerful as well. But those are the three buckets, I think.

Speaker 1: When you think about those three buckets, let me start with the last one. You mentioned the pointy end of the spear.
How realistic is it, in your mind, that you're going to have, for example, one fighter aircraft with thirty or forty smart drones that are capable of operating autonomously?

Speaker 2: I sure hope that happens sooner rather than later. When I was in the Pentagon a couple of years ago, we moved forward on a new program for the Air Force called Collaborative Combat Aircraft. The whole idea is that you have an advanced US fighter jet, like, say, an F-35, surrounded by a series of unmanned aircraft that can operate mostly autonomously but are controlled by that pilot or by someone in a different aircraft nearby. That really can amplify the ability of our individual fighter pilots to have impact on the battlefield, because each of those drones can carry missiles and carry sensors. And we are on track to start fielding some of those systems before twenty thirty. So this isn't science fiction anymore. This is becoming reality. And it might be one pilot surrounded by five drones to start, not one pilot surrounded by thirty drones, but that's the direction of travel we are moving in.

Speaker 1: From that standpoint, does that become truly autonomous? Or is there some middle ground here where you still have humans engaged, interacting with semi-autonomous systems?

Speaker 2: So the collaborative combat aircraft that the US Air Force is envisioning right now are what we would call semi-autonomous, in that they have autonomous flight and can do lots of things autonomously, but when it comes to the use of force, actually firing a missile or dropping a bomb, the human pilot in, say, the F-35 or in some other airplane that's controlling them is still the one that makes the decision.
Now, that of course raises the question, as technology improves. And sorry, just to be clear about this: the Air Force isn't, at least I hope not, having the human make the decision as a matter of principle per se. It's because they think that's the most effective way to use force. And in part that's because the technology still has a ways to go, for something like collaborative combat aircraft, before you could have autonomous weapons release. But one could imagine future increments, future collaborative combat aircraft that the Air Force deploys, that would be completely autonomous, and it's just a question of how quickly the technology advances and whether the Air Force can prove that it reliably works. You know this better than I do: nobody wants their technology to work more than the military, because it's their lives that are on the line.

Speaker 1: The battlefield is a very harsh rejector of incompetence, and the prices are enormous, so you're exactly right. I think people who've never served or never been close to the military really do not appreciate how harsh it is and how great the pressure is to try to do it right. There are some anti-tank weapons that in fact are relatively autonomous, in that they have been programmed to look for tanks, and if they see a tank, they've been programmed to go ahead and kill it. And I think they currently don't do that with any kind of permission from the initial source. I think it's all on.

Speaker 2: That's absolutely right, though from a very technical Pentagon policy perspective, those are still called semi-autonomous, because a human is firing the individual Javelin. But another example is radar-guided missiles. Radar-guided missiles have been around for forty-some-odd years. A pilot in a US fighter aircraft gets a ping that there's an enemy radar, they fire a missile, that missile goes off in the direction the pilot fired it, it opens a seeker, and it goes for the radar.
And what if that radar is on top of a school? What if that radar is on top of a hospital? It doesn't know; it just goes and hits the radar. There's no human supervision on it. That's a system that dozens of militaries around the world have used, I think, since the late sixties, essentially, if you go back to heat-seeking missiles.

Speaker 1: In that sense, there's a lot deeper pattern of gradually evolving these systems than people sometimes think. It's not like there's a brand new world that opened up on Tuesday. Explain to me: what is the big fight between the Defense Department and Anthropic?

Speaker 2: A lot of controversy there the last couple of weeks. I think this is a tragedy, and my bottom line up front is that the winner is China. Let me explain how I get to that conclusion. Anthropic was the first of the big artificial intelligence frontier labs, you know, the leading American companies pushing the edge forward, that was willing to do classified work with the Pentagon. And there's no dispute between the Pentagon and Anthropic about any projects that they are working on together today. There is also no dispute between Anthropic and the Pentagon about any projects the Pentagon has asked Anthropic to work on. It sounds like what happened is that after the Maduro operation in January, an Anthropic staffer called up a staffer at a company called Palantir, which operates a sort of dashboard for US military commanders called Maven Smart System, and said, hey, was our system, Anthropic's system, Claude, used in the Maduro operation? And the Pentagon heard about this and they were upset: why is Anthropic asking questions about the use of their technology? Of course we're using their technology responsibly. And this dispute kind of escalated, with the Pentagon essentially talking to Anthropic and thinking about adopting Anthropic's technology the way that you think about acquiring a missile from Lockheed.
When Lockheed sells the Pentagon a missile, Lockheed doesn't get to tell the Pentagon, hey, you can only use it against these countries but not those other countries. And meanwhile, Anthropic was nervous that the Pentagon might be thinking about employing its technology in areas where Anthropic thinks it's not ready and wouldn't be effective. So you really had essentially a breakdown in trust, I think, on both sides, where the Pentagon didn't trust that Anthropic would be there for important national security use cases, and Anthropic didn't trust that the Pentagon would use its technology responsibly. This is a tragedy in some ways, because Anthropic is already working with the American military, and in fact Anthropic's technology is supporting our troops in the operation against Iran. I think that this is really about personality and politics, in some ways, masquerading as a policy dispute.

Speaker 1: It's almost like you ought to lock some people in a room and tell them, work this out and we'll unlock the door.

Speaker 2: It's unbelievable in some ways to me that on Friday at five-oh-one p.m. you have announcements from the President and from senior leadership in the Pentagon putting Anthropic essentially on the naughty list, from the perspective of the US government, and then the next day US Central Command in the Middle East is using Anthropic's technology to help in the initial hours of the strikes against Iran, which illustrates how useful the actual war fighters think this technology is. And so you would think that there should be a path to agreement, then, where both sides could get along. But they seem really dug in at this point. I think Anthropic's leadership doesn't want to back down, you know, they don't want to look weak to their workforce. And I think the Pentagon under Secretary Hegseth is, as I'm sure you've noticed, a little aggressive at times, and so I'm sure also doesn't want to back down.
That's a recipe for a disagreement that, at least in my view, isn't needed.

Speaker 1: I've been fascinated that there's a churning effect, where when you have the kind of fight we're seeing with Anthropic, they pull back, but then all of a sudden you have somebody else jump in. Is it your judgment that this fight will drive people who are good at artificial intelligence away, or that in fact it just creates a vacuum where other players come in because they see it as their advantage?

Speaker 2: That's a really good question. You know, OpenAI stepped into the breach here, and I think OpenAI deserves some credit in this context, because they were, I think, initially trying to broker a peace, essentially, to come up with an agreement that they could make with the Pentagon and that Anthropic would make as well. I do think that these technology companies see what the Pentagon has done to Anthropic in labeling them a supply chain risk, and that creates a risk for everyone in some ways: if you have negotiations that go south with the Pentagon, then the Pentagon doesn't just cancel your contract, they also salt the earth, essentially, and try to take out your business. I mean, we've seen Microsoft, for example, support Anthropic's lawsuit against the Pentagon, you know, because they're a software company. They have terms of service in their contracts, and they want to be able to have terms of service in their contracts with the Pentagon the same way they have for decades. I think OpenAI seems likely to step in. xAI, Elon Musk's company, has said that they're willing to do classified work with the Pentagon. But it's going to take time. Anthropic's technology, its platform Claude, is now built into the operational workflow of how US military commanders around the world are getting information and making decisions, and untangling that is going to be messy.
And that's why, simultaneously, Anthropic has been labeled a supply chain risk, but the Pentagon has six full months to disentangle from them, which I think is an illustration of how useful the actual warfighters think this technology is.

Speaker 1: Well, in six months' time, who knows what negotiations will go on behind closed doors.

Speaker 2: Is that a six-month bargaining period? Maybe. I would hope so.

Speaker 3: I would hope so, but we'll see.

Speaker 1: Do you think that the evolution, the sheer power, of artificial intelligence as it's being developed across an enormous range of companies and entrepreneurs means, despite these occasional glitches, it's almost inevitable that it will permeate the entire Department of War?

Speaker 2: The short answer, in one word, is yes. And the reason is that artificial intelligence, to me, is a general purpose technology, and it's not the only general purpose technology in history. Think back to electricity in the late nineteenth century, or the combustion engine and the airplane in the early twentieth century. You have these breakthroughs that are not just a single widget but are transformational to the economy, to society, and to the military. Think about the way that electricity permeates every area of American society, the economy, and everything that's happening in the military, in a way that's ubiquitous: you don't think about, oh, there's the impact of electricity. And I think artificial intelligence is poised to be very similar. In that way, you would expect that every office in the Pentagon will be using artificial intelligence in one way or another, and there will be algorithms embedded in every American military platform in the coming decades. And I do think that that is inevitable to some extent.
The question is just: do we get there fast enough, in a way that preserves America's military edge, and do we do it right, in a way that ensures the technology is effective and keeps us ahead of China, our greatest competitor?

Speaker 1: The other thing that's going on is declining cost. If you look at what, for example, Ukraine now makes drones for, compared to what the Pentagon would make a drone for if left to its own devices, there are staggering crashes in cost coming our way, and that's going to make a lot of mid-size countries more dangerous than they used to be.

Speaker 2: It used to be that if you wanted to launch a missile one thousand or two thousand kilometers, maybe you'd be looking at trying to buy an advanced missile from the United States that costs one million or two million dollars per shot. This goes back to what I said earlier about us being in this age of precise mass. Look at what the Houthis did in the Red Sea. The Houthis generated a billion dollars in economic damage essentially by firing flying lawnmowers that cost twenty thousand, forty thousand dollars a pop at US Navy ships, and our sailors would sometimes fire million-dollar missiles to shoot them down. That's not a good cost exchange ratio. And what Ukraine has done is remarkable, in the way they have leveraged these one-way attack drones to hold back the Russian invaders over time. If you want a good news story here, though: one of the things that happened in the context of the Iran operation is that the US deployed for the first time a system called LUCAS, the low-cost unmanned combat aerial system. LUCAS costs about thirty-five thousand dollars and is reverse engineered from Iran's Shahed-136, which Iran has sold tens of thousands of to the Russians, and the Russians have built their own version.
Well, the US captured a couple of these Iranian Shaheds, reverse engineered them, and then thought, hey, maybe we could build these ourselves. After all, you know, Russia is using tens of thousands of these a month sometimes; this is a system that's proven to work. And so in the opening hours of the campaign against Iran, the US actually used Iran's own technology against it, in the form of this LUCAS platform, which, rather than being a two-million-dollar Tomahawk, is a thirty-five-thousand-dollar system. The US is starting to get it. It's just been slow.

Speaker 1: We had better get to be a lot faster, both in competing with China but also with just random opponents. I mean, a lot of these countries are going to be able to buy stuff on the free market, or be able to go to their own ChatGPT and say, now, how would you build this, and have an amazing amount of information being distributed in virtually real time.

Speaker 2: Absolutely. We're in a period where any country around the world is now going to be able to have its own, essentially, one-thousand- or two-thousand-kilometer strike weapons. They're not going to have to rely just on the United States for that, or just on Russia for that, which is a more dangerous world. But it's not like you could stop that with arms control; that'd be like trying to stop water or something. This is an inevitability. This is happening. Which means that the United States needs to get on the train, and I'm encouraged by some of the things that I'm seeing now. But it also means that we need to figure out how to defend against these systems at a much lower cost. We're really lucky to be the wealthiest, most powerful country in the world, but we can't forever shoot down, you know, twenty- or thirty-thousand-dollar things with million-dollar defensive missiles. And it's why I was encouraged that the US is now adopting some of the technology that Ukraine has been using to defend against these strikes.
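The cost-exchange arithmetic in that exchange can be made concrete with a small calculation. This is a minimal editorial sketch in Python, not anything from the episode itself; the only inputs are the rough per-shot figures quoted in the conversation, and the function name is just a label for the ratio being described.

```python
# Cost-exchange sketch using the rough figures quoted in the conversation.
# All numbers are illustrative round numbers, not official program costs.

def cost_exchange_ratio(defender_cost_per_shot: float, attacker_cost_per_shot: float) -> float:
    """Dollars the defender spends for every dollar the attacker spends."""
    return defender_cost_per_shot / attacker_cost_per_shot

# Houthi-style one-way attack drones versus a million-dollar interceptor.
for drone_cost in (20_000, 40_000):
    ratio = cost_exchange_ratio(1_000_000, drone_cost)
    print(f"${drone_cost:,} drone vs. $1,000,000 interceptor -> {ratio:.0f}:1 against the defender")

# Strike-side comparison: LUCAS airframes per Tomahawk at the quoted prices.
tomahawk_cost, lucas_cost = 2_000_000, 35_000
print(f"One $2M Tomahawk buys roughly {tomahawk_cost // lucas_cost} $35K LUCAS airframes")
```

At those prices the defender loses the exchange at twenty-five or fifty to one, which is the point Horowitz is making about the age of precise mass.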
Speaker 1: The ability to get to low-cost, massively produced numbers: Lord Nelson, in the Napoleonic Wars, once said that numbers annihilate. It seems to me that being able to produce, field, and use relatively inexpensive weapons is a major key to surviving the battlefield of the next thirty or forty years.

Speaker 2: I think that's absolutely right. You know, the US has been in a position for the last thirty or forty years where we've relied on increasingly small numbers of really exquisite systems. And that was okay, because our systems were the best in the world; even having small numbers of them was good enough to take on any adversary. But that is going to be increasingly difficult in this era, when so many different adversaries are going to have so many more capabilities and are increasingly leveraging artificial intelligence. And so in this new era, I think the US needs to shift to what I would call a high-low mix, where it's not that we need to get rid of, you know, our Tomahawk missiles or our fancy systems, but we need to complement those by spending a lot more on some of these low-cost weapons and low-cost sensors that are genuinely low cost, not, like, gold-plated Pentagon low cost.

Speaker 1: Aren't we today actually spending a remarkably tiny amount of money on the kind of weapons you're describing, compared to what we spend on very exquisite, complex systems?

Speaker 2: When I was in the Pentagon, we tried at one point to move, I think, five hundred million dollars toward something called the Replicator Initiative, to fund some of these kinds of systems. And it took more than forty briefings of the appropriations committees in the House and the Senate by really senior, busy leaders in the Pentagon, up to the Deputy Secretary of Defense, to get that done. And that was so much time to move what was essentially less than one percent of the Pentagon budget. Even today, there's actually a bunch of money that the Trump administration, really smartly, put into the Big Beautiful Bill to try to accelerate these capabilities. But even with that, there's only about five billion dollars or so there. Even then, we're talking about one percent of the defense budget, something like that. It's a lot of money to me or you, but for the Pentagon it's a fraction of what one aircraft carrier costs.
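To put those fractions in perspective, here is a back-of-the-envelope check, again an editorial sketch rather than anything from the episode. The five-hundred-million and roughly five-billion-dollar figures come from the conversation; the roughly eight-hundred-fifty-billion-dollar defense topline is an assumed round number for illustration.

```python
# Back-of-the-envelope: how small these sums are against the defense topline.
# The ~$850B topline is an assumed round number; the $500M Replicator move
# and ~$5B reconciliation funds are the figures mentioned in the conversation.

ASSUMED_TOPLINE = 850e9  # dollars per year, illustrative assumption

for label, amount in [("Replicator move", 500e6), ("Reconciliation-bill funds", 5e9)]:
    share = amount / ASSUMED_TOPLINE * 100
    print(f"{label}: ${amount / 1e9:.1f}B is about {share:.2f}% of an ~$850B budget")
```

Both come out well under one percent of the topline, consistent with the "follow the money" point that follows.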
Speaker 1: Which then tells you where the attention of the senior leadership is.

Speaker 2: Yep. Follow the money.

Speaker 1: It's a real problem. Listen, Michael, I want to thank you for joining me. Our listeners can follow the work you're doing at the Council on Foreign Relations by visiting the website at CFR dot org. Your book, The Diffusion of Military Power: Causes and Consequences for International Politics, is available on Amazon. I thought this was a very, very helpful conversation.

Speaker 2: Thanks so much for having me, and happy to chat anytime.

Speaker 1: Thank you to my guest, Michael Horowitz. Newt's World is produced by Gingrich 360 and iHeartMedia. Our executive producer is Guernsey Sloan. Our researcher is Rachel Peterson. The artwork for the show was created by Steve Penley. Special thanks to the team at Gingrich 360. If you've been enjoying Newt's World, I hope you'll go to Apple Podcasts and both rate us with five stars and give us a review so others can learn what it's all about. Join me on Substack at Gingrich 360 dot net. I'm Newt Gingrich. This is Newt's World.