1 00:00:15,356 --> 00:00:22,516 Speaker 1: Pushkin. If you go to the front line of the 2 00:00:22,556 --> 00:00:26,356 Speaker 1: war in Ukraine and you look up, there's a very 3 00:00:26,356 --> 00:00:30,236 Speaker 1: good chance you'll see a drone. Could be a friendly drone, 4 00:00:30,676 --> 00:00:34,116 Speaker 1: could be an enemy drone, could be a reconnaissance drone, 5 00:00:34,716 --> 00:00:36,516 Speaker 1: or it could be a drone that is trying to 6 00:00:36,636 --> 00:00:39,796 Speaker 1: kill you. The war in Ukraine is the first war 7 00:00:39,876 --> 00:00:44,596 Speaker 1: in history where huge numbers of cheap drones, powered largely 8 00:00:44,596 --> 00:00:48,276 Speaker 1: by commercial-grade hardware and open source software, are playing 9 00:00:48,276 --> 00:00:53,516 Speaker 1: a central role. This technological change has profound implications. It 10 00:00:53,556 --> 00:00:57,156 Speaker 1: gives smaller countries like Ukraine new ways to fight, and 11 00:00:57,196 --> 00:01:00,836 Speaker 1: it means the militaries of big traditional powers, militaries 12 00:01:00,836 --> 00:01:04,756 Speaker 1: built on twentieth-century technologies like tanks and aircraft carriers, 13 00:01:05,276 --> 00:01:14,876 Speaker 1: need to radically change just to keep up. I'm Jacob Goldstein, 14 00:01:14,956 --> 00:01:17,076 Speaker 1: and this is What's Your Problem. My guest today is 15 00:01:17,156 --> 00:01:20,996 Speaker 1: Christopher Kirchhoff. Nine years ago, Christopher helped to start the 16 00:01:20,996 --> 00:01:24,836 Speaker 1: Defense Innovation Unit, a Defense Department office in Silicon Valley. 17 00:01:25,156 --> 00:01:28,836 Speaker 1: He currently works as an advisor there. Christopher says, in 18 00:01:28,876 --> 00:01:31,636 Speaker 1: many ways, the US military has failed to keep up 19 00:01:31,676 --> 00:01:35,676 Speaker 1: with the way military technology has changed. That, fundamentally, is 20 00:01:35,676 --> 00:01:38,596 Speaker 1: the problem DIU is trying to solve: how can the 21 00:01:38,676 --> 00:01:42,436 Speaker 1: giant bureaucracy that is the US military catch up with 22 00:01:42,516 --> 00:01:48,676 Speaker 1: the latest military technology? Christopher is a civilian. He had 23 00:01:48,676 --> 00:01:52,356 Speaker 1: a bunch of national security related jobs in the Obama administration, 24 00:01:52,876 --> 00:01:55,476 Speaker 1: and he recently co-wrote a book about his work 25 00:01:55,556 --> 00:01:59,156 Speaker 1: at the Defense Innovation Unit, the DIU. Christopher and I 26 00:01:59,276 --> 00:02:02,356 Speaker 1: talk about the book and the DIU later in the interview, 27 00:02:02,796 --> 00:02:05,596 Speaker 1: but to start, we talked about the weapons and tech 28 00:02:05,756 --> 00:02:08,836 Speaker 1: being used in the war in Ukraine. So, I mean, 29 00:02:08,836 --> 00:02:12,876 Speaker 1: in some ways the war in Ukraine 30 00:02:12,916 --> 00:02:16,516 Speaker 1: is some version of the war, or the kind of 31 00:02:16,556 --> 00:02:20,196 Speaker 1: war, you've been thinking about for a long time, years 32 00:02:20,236 --> 00:02:20,636 Speaker 1: and years. 33 00:02:21,356 --> 00:02:24,516 Speaker 2: Well, I've always looked at Ukraine, you know, at sort 34 00:02:24,516 --> 00:02:27,596 Speaker 2: of two levels. The first, of course, is that it's 35 00:02:27,636 --> 00:02:32,036 Speaker 2: just a very tragic and frankly needless war. 
You know, 36 00:02:32,076 --> 00:02:38,596 Speaker 2: there are very few reasons, right, ever to go to war, 37 00:02:38,836 --> 00:02:41,316 Speaker 2: and this is a particularly stupid war that's just going 38 00:02:41,356 --> 00:02:44,556 Speaker 2: to cost a lot of young men and women their lives. 39 00:02:45,276 --> 00:02:47,956 Speaker 2: So it's tragic, and I'm terribly sorry for the suffering 40 00:02:47,996 --> 00:02:50,316 Speaker 2: of the Ukrainian people, and frankly the suffering of the 41 00:02:50,356 --> 00:02:53,076 Speaker 2: young conscripted Russian soldiers as well, so let me just 42 00:02:53,196 --> 00:02:56,956 Speaker 2: say that to start. But the second way that I 43 00:02:56,956 --> 00:03:01,396 Speaker 2: look at Ukraine is as, you know, essentially a laboratory of war. 44 00:03:02,276 --> 00:03:06,676 Speaker 2: We saw, you know, a decade ago already the 45 00:03:06,796 --> 00:03:10,076 Speaker 2: rise of commercial technologies that it was obvious 46 00:03:10,156 --> 00:03:14,156 Speaker 2: could be weaponized, and weaponized at scale. And I mean 47 00:03:14,196 --> 00:03:16,476 Speaker 2: I remember sitting on the E-Ring of the Pentagon 48 00:03:16,596 --> 00:03:21,236 Speaker 2: watching YouTube videos of drones, you know, first of all, 49 00:03:21,316 --> 00:03:24,636 Speaker 2: quadcopters flying around, you know, first singly, then in pairs, 50 00:03:24,756 --> 00:03:28,116 Speaker 2: then doing dances, and it was really easy to imagine 51 00:03:29,036 --> 00:03:31,556 Speaker 2: that at some point in the future those drones could 52 00:03:31,596 --> 00:03:35,756 Speaker 2: themselves also be carrying payloads of various kinds. And not 53 00:03:35,796 --> 00:03:38,876 Speaker 2: only that, but I mean these are drones that, you know, 54 00:03:38,916 --> 00:03:41,316 Speaker 2: you could order on Amazon, right, they were a few 55 00:03:41,436 --> 00:03:46,396 Speaker 2: hundred dollars, so they were incredibly inexpensive to begin experimenting with. 56 00:03:46,516 --> 00:03:51,636 Speaker 2: So all the ingredients were there for a complete revolution 57 00:03:52,076 --> 00:03:56,756 Speaker 2: in the tools with which war could be fought, 58 00:03:57,516 --> 00:04:00,876 Speaker 2: and they came together kind of all at once in 59 00:04:00,916 --> 00:04:08,156 Speaker 2: a way when Russian armored columns began advancing into Ukraine. Right. 60 00:04:08,316 --> 00:04:10,796 Speaker 1: I mean, it's kind of easy to forget now, 61 00:04:10,836 --> 00:04:15,316 Speaker 1: but at the time when Russia invaded Ukraine, it seemed 62 00:04:15,356 --> 00:04:19,556 Speaker 1: like the basic conventional wisdom was, oh, giant country with 63 00:04:19,596 --> 00:04:23,676 Speaker 1: a huge army invading, they're going to win in a week. 64 00:04:24,236 --> 00:04:28,116 Speaker 1: And that didn't happen. I mean, what role did 65 00:04:28,556 --> 00:04:32,236 Speaker 1: this sort of new set of technologies play in allowing 66 00:04:32,316 --> 00:04:33,756 Speaker 1: Ukraine to defend itself? 67 00:04:34,596 --> 00:04:37,756 Speaker 2: Well, you know, the technologies actually came into play 68 00:04:37,836 --> 00:04:42,596 Speaker 2: even before the invasion commenced. 
And one of the most 69 00:04:42,676 --> 00:04:45,196 Speaker 2: visible ways it came into play is, I mean, this 70 00:04:45,276 --> 00:04:47,596 Speaker 2: is the first war in a way where there are so 71 00:04:47,716 --> 00:04:53,236 Speaker 2: many private imagery companies that have their own satellite constellations 72 00:04:53,236 --> 00:04:56,636 Speaker 2: that are beaming back to Earth high resolution imagery, resolution 73 00:04:56,756 --> 00:04:59,116 Speaker 2: on the order of what would have been, you know, 74 00:04:59,396 --> 00:05:03,316 Speaker 2: a Cold War spy satellite just a few years ago. 75 00:05:03,556 --> 00:05:07,756 Speaker 2: And importantly, these images weren't just going 76 00:05:07,836 --> 00:05:10,396 Speaker 2: to, you know, the US government or to other militaries. 77 00:05:10,396 --> 00:05:13,196 Speaker 2: They were available to anybody that wanted to see them. 78 00:05:13,956 --> 00:05:18,756 Speaker 2: And so when Putin and the Kremlin were issuing statements 79 00:05:18,756 --> 00:05:21,276 Speaker 2: to the effect that no, they weren't going to invade Ukraine, 80 00:05:21,316 --> 00:05:24,316 Speaker 2: in fact, they were demobilizing the Russian military and preparing 81 00:05:24,356 --> 00:05:29,516 Speaker 2: to ship materiel back to Russia by rail, the Biden 82 00:05:29,516 --> 00:05:34,716 Speaker 2: administration was able to release commercial imagery that showed no, 83 00:05:34,916 --> 00:05:38,556 Speaker 2: in fact, there are armored columns lining up on the border 84 00:05:38,996 --> 00:05:39,556 Speaker 2: ready to go. 85 00:05:40,196 --> 00:05:42,356 Speaker 1: Uh huh. I mean the fact that it was commercial 86 00:05:42,396 --> 00:05:45,316 Speaker 1: means you didn't even need the Biden administration, right, like 87 00:05:45,476 --> 00:05:49,276 Speaker 1: anybody who paid for a subscription could see that the 88 00:05:49,356 --> 00:05:50,596 Speaker 1: Russian tanks were there. 89 00:05:50,476 --> 00:05:55,556 Speaker 2: Right, exactly. And this carried on. Then 90 00:05:55,596 --> 00:06:01,396 Speaker 2: the invasion begins, and the Ukrainians not only have eyes 91 00:06:01,436 --> 00:06:06,356 Speaker 2: and ears from these commercial data feeds, not just imagery, 92 00:06:06,436 --> 00:06:11,716 Speaker 2: but also radio frequency intelligence, and that gets paired quickly 93 00:06:11,796 --> 00:06:15,876 Speaker 2: with Starlink at a moment when, you know, the Russians 94 00:06:15,876 --> 00:06:18,836 Speaker 2: are using very advanced cyber attacks to attempt to take 95 00:06:18,876 --> 00:06:21,476 Speaker 2: down not only the Internet in Ukraine, but also the 96 00:06:21,476 --> 00:06:25,516 Speaker 2: command and control structure of the Ukrainian military. And what 97 00:06:26,036 --> 00:06:29,356 Speaker 2: happened next was really quite unique and amazing. The 98 00:06:29,876 --> 00:06:36,276 Speaker 2: Ukrainian government had become quite adept at offering digital services 99 00:06:36,316 --> 00:06:39,676 Speaker 2: to its citizens, both through online websites and through a 100 00:06:39,716 --> 00:06:43,556 Speaker 2: smartphone app that the government built. In fact, Zelensky actually 101 00:06:43,636 --> 00:06:47,756 Speaker 2: campaigned as somebody who would bring in a new 102 00:06:47,836 --> 00:06:51,956 Speaker 2: generation of digital services, whether it was taxes or, you know, 103 00:06:51,996 --> 00:06:56,636 Speaker 2: the ability to incorporate online. 
So the Ukrainian government very 104 00:06:56,716 --> 00:07:01,036 Speaker 2: quickly reconfigured a smartphone app that Ukrainians used every day 105 00:07:01,116 --> 00:07:04,956 Speaker 2: in their ordinary lives to allow Ukrainians to be able 106 00:07:04,996 --> 00:07:07,236 Speaker 2: to send in spot reports of, you know, I just 107 00:07:07,276 --> 00:07:10,076 Speaker 2: saw a tank column moving past my town. 108 00:07:10,436 --> 00:07:12,556 Speaker 1: You basically just pull out your phone and open your 109 00:07:12,596 --> 00:07:16,236 Speaker 1: app, and instead of, whatever, calling an Uber, you say, 110 00:07:16,356 --> 00:07:17,316 Speaker 1: I just saw a tank. 111 00:07:17,476 --> 00:07:20,876 Speaker 2: Well, yeah, so the nickname for this became Uber for artillery. 112 00:07:21,476 --> 00:07:25,796 Speaker 2: It literally was possible for citizens to file a spot 113 00:07:25,876 --> 00:07:29,196 Speaker 2: report that went right away to a command center in 114 00:07:29,236 --> 00:07:33,596 Speaker 2: the Ministry of Defense, and artillery and other weapons were 115 00:07:33,596 --> 00:07:35,836 Speaker 2: literally dispatched on that information. 116 00:07:36,516 --> 00:07:41,036 Speaker 1: So, okay. So that's the sort of initial state of play. 117 00:07:41,676 --> 00:07:44,316 Speaker 1: And then there's this evolution, right? It seems like it's 118 00:07:44,396 --> 00:07:47,356 Speaker 1: been three years now, a long time. There has been 119 00:07:47,396 --> 00:07:54,036 Speaker 1: this really intense, yeah, evolution, innovation, right, on both sides. 120 00:07:54,276 --> 00:07:57,396 Speaker 1: Talk to me about that. How has technology 121 00:07:57,516 --> 00:08:00,716 Speaker 1: driven this sort of rapid evolution of war fighting over 122 00:08:00,756 --> 00:08:01,636 Speaker 1: the last few years? 123 00:08:02,156 --> 00:08:06,676 Speaker 2: Well, the Ukrainian military has been described, 124 00:08:06,916 --> 00:08:09,436 Speaker 2: you know, as a military that already came 125 00:08:09,476 --> 00:08:13,956 Speaker 2: with a Silicon Valley wing attached to it, because Ukraine 126 00:08:14,516 --> 00:08:18,636 Speaker 2: over the last two decades has redefined itself as a 127 00:08:18,676 --> 00:08:23,036 Speaker 2: country of programmers. It has a large endemic IT industry 128 00:08:23,796 --> 00:08:27,036 Speaker 2: that was there before the war, and the Ukrainian government 129 00:08:27,076 --> 00:08:31,316 Speaker 2: was really smart in that it mobilized, of course, military 130 00:08:31,356 --> 00:08:34,476 Speaker 2: age males to serve. It really is a citizen army 131 00:08:34,516 --> 00:08:37,636 Speaker 2: in that respect. But it took anybody that had engineering 132 00:08:37,716 --> 00:08:42,116 Speaker 2: or programming skills and it allowed them either to work 133 00:08:42,156 --> 00:08:45,596 Speaker 2: for a private company developing technology or to work for 134 00:08:45,636 --> 00:08:49,996 Speaker 2: the military developing technology. And so from the first weeks 135 00:08:49,996 --> 00:08:55,276 Speaker 2: of the war you had an incredible number of engineers 136 00:08:55,516 --> 00:09:01,956 Speaker 2: spitballing on how to develop indigenous technology that could take 137 00:09:01,996 --> 00:09:04,316 Speaker 2: on the Russians. 
And this mattered because as soon as 138 00:09:04,356 --> 00:09:08,836 Speaker 2: the Russians got their advanced electronic warfare systems to the 139 00:09:08,876 --> 00:09:12,316 Speaker 2: front and moved them forward and turned them on, even 140 00:09:12,516 --> 00:09:16,116 Speaker 2: advanced Western drones that Ukraine either already had in its 141 00:09:16,196 --> 00:09:18,516 Speaker 2: inventory or rushed in in the opening weeks of the 142 00:09:18,516 --> 00:09:23,716 Speaker 2: war became essentially ineffective, because they were not programmed, nor 143 00:09:23,756 --> 00:09:26,476 Speaker 2: did they have the sensors on them, to be able 144 00:09:26,556 --> 00:09:31,556 Speaker 2: to defeat the extremely advanced Russian countermeasures that were immediately 145 00:09:31,636 --> 00:09:31,996 Speaker 2: rolled out. 146 00:09:32,036 --> 00:09:34,956 Speaker 1: So the Russians were essentially jamming the drones in some way. 147 00:09:34,996 --> 00:09:38,436 Speaker 1: They had some kind of technology that rendered sort of 148 00:09:38,436 --> 00:09:43,116 Speaker 1: off-the-shelf or even military-grade drones ineffective. 149 00:09:43,356 --> 00:09:46,596 Speaker 2: The jamming was so powerful that if you were in 150 00:09:46,756 --> 00:09:49,556 Speaker 2: Kiev and, say, you were on the third or 151 00:09:49,596 --> 00:09:51,836 Speaker 2: fourth floor of a hotel and you tried to order 152 00:09:51,996 --> 00:09:57,476 Speaker 2: an Uber, your location on the map in the Uber app 153 00:09:57,476 --> 00:09:59,676 Speaker 2: would show up in the middle of the Indian Ocean, 154 00:10:00,036 --> 00:10:04,396 Speaker 2: because the GPS jamming was so effective that even in cities, 155 00:10:04,796 --> 00:10:07,356 Speaker 2: it wasn't possible to pick up a reliable signal, in, 156 00:10:07,396 --> 00:10:10,276 Speaker 2: you know, cities that were a hundred miles from the front. 157 00:10:10,956 --> 00:10:16,076 Speaker 2: So at the front, the electromagnetic spectrum was just flooded 158 00:10:16,116 --> 00:10:21,956 Speaker 2: with energy, and that right away touched off 159 00:10:21,996 --> 00:10:28,596 Speaker 2: this technology race to invent drones that could essentially fly 160 00:10:28,716 --> 00:10:33,236 Speaker 2: on their own, even through incredibly intense jamming. And so 161 00:10:33,316 --> 00:10:36,236 Speaker 2: the Ukrainians, and it was remarkable to go see when 162 00:10:36,716 --> 00:10:38,596 Speaker 2: my co-author Raj Shah and I got a chance to 163 00:10:38,676 --> 00:10:43,556 Speaker 2: visit Ukraine a year and a half ago, we went 164 00:10:43,676 --> 00:10:47,116 Speaker 2: to a number of drone companies, and 165 00:10:47,276 --> 00:10:51,196 Speaker 2: the Ukrainians have essentially, in a matter of months, recreated the entire 166 00:10:51,836 --> 00:10:56,556 Speaker 2: American Cold War precision strike complex that we first got 167 00:10:56,556 --> 00:11:00,516 Speaker 2: to know so well in the original Gulf War, where 168 00:11:00,556 --> 00:11:03,756 Speaker 2: we all of a sudden were dropping bombs down chimneys. 169 00:11:03,796 --> 00:11:08,836 Speaker 2: So the Ukrainians have, in essence, used a combination of 170 00:11:09,196 --> 00:11:15,236 Speaker 2: open source software and low-cost hardware to recreate everything 171 00:11:15,356 --> 00:11:19,396 Speaker 2: from the function of our AWACS planes. 
These are 172 00:11:19,476 --> 00:11:24,076 Speaker 2: the airliner-looking airplanes with the giant radar domes on top 173 00:11:24,196 --> 00:11:27,996 Speaker 2: that provide a kind of God's-eye view of the battlefield. 174 00:11:28,316 --> 00:11:31,356 Speaker 2: So they literally have drones whose sole purpose is to 175 00:11:31,436 --> 00:11:35,596 Speaker 2: keep track of large parts of the battlefield, to other, 176 00:11:36,356 --> 00:11:41,956 Speaker 2: smaller surveillance drones that will go take pictures of a 177 00:11:41,996 --> 00:11:46,116 Speaker 2: specific area that the Ukrainians want to target. They'll relay 178 00:11:46,156 --> 00:11:49,316 Speaker 2: that imagery back, kind of almost like miniature U-2 spy 179 00:11:49,596 --> 00:11:54,076 Speaker 2: planes flying and dropping film canisters and whatnot, and they'll 180 00:11:54,156 --> 00:11:57,356 Speaker 2: quickly be able to then load the images of Russian 181 00:11:57,356 --> 00:12:01,916 Speaker 2: targets that they want to hit up onto killer drones 182 00:12:02,236 --> 00:12:06,156 Speaker 2: that will then use image recognition and terrain-following software 183 00:12:06,956 --> 00:12:11,556 Speaker 2: to go hunt and find the target. It's with full, 184 00:12:11,796 --> 00:12:14,756 Speaker 2: you know, autonomy, on fully autonomous routes, right, that don't 185 00:12:14,796 --> 00:12:18,436 Speaker 2: require phoning home for them either to navigate or to have 186 00:12:18,476 --> 00:12:22,036 Speaker 2: permission to engage. On top of that, you actually have 187 00:12:22,116 --> 00:12:26,196 Speaker 2: the Ukrainians using open source software to fuse different sensor streams, 188 00:12:26,276 --> 00:12:29,876 Speaker 2: so you can have a Ukrainian commander that on a 189 00:12:29,916 --> 00:12:33,596 Speaker 2: tablet computer in front of them has data feeds from 190 00:12:33,876 --> 00:12:37,516 Speaker 2: surveillance drones that are relaying video in real time, 191 00:12:37,676 --> 00:12:45,916 Speaker 2: to maps of the electronic warfare spectrum. So 192 00:12:45,996 --> 00:12:48,756 Speaker 2: it's really incredible what they've been able to do for 193 00:12:49,316 --> 00:12:52,956 Speaker 2: essentially pennies on the dollar of what the American military 194 00:12:53,076 --> 00:12:56,076 Speaker 2: has spent to recreate the same technological stack. 195 00:12:56,836 --> 00:12:59,316 Speaker 1: So you went to Ukraine a year and a half 196 00:12:59,396 --> 00:13:02,356 Speaker 1: ago or so. You mentioned a couple of the places 197 00:13:02,396 --> 00:13:04,956 Speaker 1: you visited. One of them had 198 00:13:04,996 --> 00:13:08,196 Speaker 1: been sort of the Ukrainian version of, like, Best Buy, right, and 199 00:13:08,316 --> 00:13:10,356 Speaker 1: had been transformed. Tell me about that. 200 00:13:11,116 --> 00:13:14,636 Speaker 2: Yeah, it was wild running around the city of Kiev 201 00:13:15,516 --> 00:13:19,756 Speaker 2: visiting a number of the places that the government had 202 00:13:19,796 --> 00:13:23,356 Speaker 2: commandeered for the war effort. 
And so one of these 203 00:13:23,356 --> 00:13:26,836 Speaker 2: places was literally, you know, Ukraine's version of 204 00:13:26,876 --> 00:13:29,356 Speaker 2: Best Buy, a big-box store. Essentially, on the wall 205 00:13:29,396 --> 00:13:33,916 Speaker 2: were still signs saying, you know, televisions, appliances, you know, 206 00:13:34,036 --> 00:13:37,316 Speaker 2: there were ads for washing machines, you know, on the walls, 207 00:13:37,356 --> 00:13:39,276 Speaker 2: like walking into the Best Buy. But instead of there 208 00:13:39,316 --> 00:13:41,356 Speaker 2: being a whole bunch of electronics for you to buy, 209 00:13:42,116 --> 00:13:45,356 Speaker 2: there were benches, workbenches, laid out in assembly- 210 00:13:45,436 --> 00:13:50,636 Speaker 2: line-style rows, and this particular electronics store was 211 00:13:50,676 --> 00:13:55,356 Speaker 2: being used to mass-produce quadcopter kamikaze drones. So 212 00:13:55,676 --> 00:13:58,036 Speaker 2: at one end of the assembly line you would start 213 00:13:58,076 --> 00:14:01,796 Speaker 2: with the frame of the drone, and by the time 214 00:14:01,796 --> 00:14:03,396 Speaker 2: you got to the other end, you would have a 215 00:14:03,476 --> 00:14:07,916 Speaker 2: completely working drone with communications software ready to be mated 216 00:14:08,516 --> 00:14:13,396 Speaker 2: with either an anti-personnel charge or a shaped charge 217 00:14:13,476 --> 00:14:16,076 Speaker 2: to go through armor, you know, a couple-kilogram 218 00:14:16,196 --> 00:14:20,076 Speaker 2: charge that would turn these inexpensive mass-produced quadcopters 219 00:14:20,116 --> 00:14:21,716 Speaker 2: into quite potent weapons. 220 00:14:23,116 --> 00:14:25,996 Speaker 1: When you say a charge, should I think bomb? It's 221 00:14:26,076 --> 00:14:27,276 Speaker 1: basically a little. 222 00:14:27,036 --> 00:14:29,196 Speaker 2: Bomb, right, yeah, larger than a grenade. 223 00:14:29,396 --> 00:14:32,956 Speaker 1: Yeah. And then it would go out to the 224 00:14:32,956 --> 00:14:34,756 Speaker 1: front and they'd put the bomb on it and they'd 225 00:14:35,036 --> 00:14:38,596 Speaker 1: go crash it into a Russian tank or something like that. 226 00:14:38,996 --> 00:14:42,356 Speaker 2: They sure would. And this goes back to the incredibly 227 00:14:44,276 --> 00:14:48,156 Speaker 2: novel hybrid warfare that we're seeing in Ukraine, where, yes, 228 00:14:48,316 --> 00:14:51,956 Speaker 2: in many respects, in the war in 229 00:14:52,076 --> 00:14:55,156 Speaker 2: Ukraine there are a lot of continuities with 230 00:14:55,196 --> 00:14:57,596 Speaker 2: the First and the Second World War. There are trench lines, 231 00:14:57,636 --> 00:15:01,636 Speaker 2: there is artillery, there are tanks and mechanized infantry that 232 00:15:01,876 --> 00:15:04,756 Speaker 2: are moving around. But at the same time, in the 233 00:15:04,836 --> 00:15:08,716 Speaker 2: trenches are drone pilots, and they are launching armed drones 234 00:15:08,756 --> 00:15:14,196 Speaker 2: and sensing drones that are watching enemy movements behind enemy lines. 235 00:15:14,276 --> 00:15:18,116 Speaker 2: And so the battle, you know, in World War One 236 00:15:18,196 --> 00:15:21,236 Speaker 2: or World War Two typically happened at the front, where 237 00:15:21,316 --> 00:15:25,356 Speaker 2: forces actually came into contact with one another. 
Today, drone 238 00:15:25,356 --> 00:15:27,956 Speaker 2: warfare is pushing the battle behind the front. 239 00:15:28,396 --> 00:15:31,036 Speaker 1: It makes the front sort of more porous. 240 00:15:31,156 --> 00:15:35,156 Speaker 2: Exactly. So if you were, for example, you know, a Russian armor unit, 241 00:15:35,196 --> 00:15:39,156 Speaker 2: and you're trying to advance into Ukrainian territory, you have 242 00:15:39,236 --> 00:15:43,276 Speaker 2: to bring your tanks and your armored personnel carriers with 243 00:15:43,396 --> 00:15:47,396 Speaker 2: your infantry towards the front. Well, the Ukrainians have gotten 244 00:15:47,396 --> 00:15:51,156 Speaker 2: in the habit of using their inexpensive kamikaze drones to 245 00:15:51,236 --> 00:15:54,756 Speaker 2: loiter and track armored personnel carriers, and at the moment 246 00:15:54,876 --> 00:15:58,076 Speaker 2: at which troops would disembark the armored personnel carrier to 247 00:15:58,716 --> 00:16:03,676 Speaker 2: move towards the front, they would strike. So this, you know, 248 00:16:03,796 --> 00:16:07,396 Speaker 2: long-range, or sort of short-range, precision strike has 249 00:16:07,476 --> 00:16:11,036 Speaker 2: completely changed how the war is being fought. It's being 250 00:16:11,076 --> 00:16:13,756 Speaker 2: fought unlike any other war in modern times. 251 00:16:14,476 --> 00:16:16,196 Speaker 1: So the war has been going on for a few years. 252 00:16:16,956 --> 00:16:20,636 Speaker 1: Both Russia and Ukraine have clearly been learning and, you know, 253 00:16:20,716 --> 00:16:23,996 Speaker 1: developing new technology at this incredibly rapid pace. How does 254 00:16:24,036 --> 00:16:27,116 Speaker 1: the war in Ukraine look different now than it did 255 00:16:27,676 --> 00:16:30,396 Speaker 1: three years ago, or two years ago, shortly after it started? 256 00:16:31,196 --> 00:16:34,116 Speaker 2: Yeah, well, we're, you know, back to a place that's 257 00:16:35,476 --> 00:16:39,236 Speaker 2: a well-known place to historians, to military historians, where, 258 00:16:40,156 --> 00:16:46,276 Speaker 2: you know, there was an initial explosion of conflict in 259 00:16:46,316 --> 00:16:51,876 Speaker 2: the confused invasion, where massive Russian armored columns 260 00:16:52,196 --> 00:16:54,276 Speaker 2: rushed in at multiple points in the country. They were 261 00:16:54,316 --> 00:16:58,756 Speaker 2: ultimately beaten back to a certain point. Those lines, you know, 262 00:16:59,236 --> 00:17:04,036 Speaker 2: over time came to freeze. Because each side has very 263 00:17:04,076 --> 00:17:10,076 Speaker 2: potent precision strike weapons, it's very hard to change those lines. 264 00:17:10,116 --> 00:17:13,476 Speaker 2: In other words, the defense in a way has an advantage. 265 00:17:14,476 --> 00:17:18,716 Speaker 2: It's very, very hard to use offensive mass to gain territory, 266 00:17:18,756 --> 00:17:22,436 Speaker 2: which is why we've seen the battle lines not shift 267 00:17:22,596 --> 00:17:25,276 Speaker 2: all that much after they settled in the initial months 268 00:17:25,276 --> 00:17:27,716 Speaker 2: of the war. But what's going on behind them has 269 00:17:27,716 --> 00:17:34,796 Speaker 2: been this incredible combat through cycles of innovation: who can 270 00:17:35,316 --> 00:17:38,516 Speaker 2: come up with better electronic countermeasures, who can come up 271 00:17:38,556 --> 00:17:41,396 Speaker 2: with the next generation of drones that will defeat them? 
272 00:17:41,636 --> 00:17:43,796 Speaker 2: And so both the Russians and the Ukrainians have been 273 00:17:43,836 --> 00:17:48,596 Speaker 2: innovating behind the front lines and then bringing that technology forward. 274 00:17:48,836 --> 00:17:51,396 Speaker 1: Does the war look different in some way now than 275 00:17:51,436 --> 00:17:52,956 Speaker 1: it did a couple of years ago? Or is it 276 00:17:52,996 --> 00:17:56,716 Speaker 1: different on such a subtle level, like the GPS jamming is 277 00:17:56,756 --> 00:17:58,836 Speaker 1: better and the countermeasures are better, and so the net 278 00:17:58,836 --> 00:18:01,316 Speaker 1: effect is the same? 279 00:18:02,236 --> 00:18:06,116 Speaker 2: I'd say it's different in pretty dramatic ways. So today 280 00:18:06,956 --> 00:18:12,076 Speaker 2: there are such effective small-scale kamikaze drones that 281 00:18:12,796 --> 00:18:18,156 Speaker 2: they can effectively hunt individuals. So we are truly today 282 00:18:18,316 --> 00:18:22,556 Speaker 2: in a Black Mirror episode, where if you're a civilian 283 00:18:22,876 --> 00:18:26,516 Speaker 2: in a border city near the front, or a soldier 284 00:18:27,236 --> 00:18:30,876 Speaker 2: at the front, you are in danger, every time 285 00:18:30,916 --> 00:18:34,916 Speaker 2: you leave the bunker or leave your house, of literally being 286 00:18:34,956 --> 00:18:37,036 Speaker 2: pursued by a drone that's trying to kill you. 287 00:18:37,156 --> 00:18:39,876 Speaker 1: I mean, how do you survive a kamikaze drone attack? 288 00:18:40,356 --> 00:18:44,556 Speaker 2: You essentially have to take cover, and so oftentimes you 289 00:18:44,596 --> 00:18:48,916 Speaker 2: know the sound of a drone buzzing, you know. And 290 00:18:48,956 --> 00:18:51,316 Speaker 2: this is a challenge, right, because on every square mile 291 00:18:51,356 --> 00:18:54,756 Speaker 2: of the front, there are probably a dozen or more drones 292 00:18:54,796 --> 00:18:56,396 Speaker 2: in the air at any given time. 293 00:18:56,756 --> 00:18:59,196 Speaker 1: So you're fighting this war where there's basically always a 294 00:18:59,276 --> 00:19:01,796 Speaker 1: drone flying over your head, and maybe it's coming to 295 00:19:01,916 --> 00:19:04,476 Speaker 1: kill you personally, and maybe it's not. 296 00:19:04,876 --> 00:19:09,756 Speaker 2: The airspace on both sides of the lines is incredibly overrun 297 00:19:09,916 --> 00:19:12,676 Speaker 2: with drones. I mean they are literally smacking into 298 00:19:12,796 --> 00:19:15,996 Speaker 2: each other in the air. It's like flocks of birds 299 00:19:16,076 --> 00:19:19,756 Speaker 2: at this point. So, no, the changes we're seeing, frankly, 300 00:19:19,916 --> 00:19:22,316 Speaker 2: are not subtle at all. They're quite dramatic. 301 00:19:22,956 --> 00:19:26,276 Speaker 1: So what does the US learn from the war in Ukraine? 302 00:19:27,156 --> 00:19:29,516 Speaker 2: It's woken the United States and our allies up in 303 00:19:29,796 --> 00:19:33,516 Speaker 2: two very profound ways. The first is that, you know, 304 00:19:33,516 --> 00:19:38,316 Speaker 2: it turns out we had not been stockpiling adequate materiel 305 00:19:38,596 --> 00:19:41,836 Speaker 2: for war, whether it was howitzer shells or anti- 306 00:19:41,876 --> 00:19:48,276 Speaker 2: tank missiles or sophisticated rockets. We had basically, you know, 307 00:19:48,396 --> 00:19:50,876 Speaker 2: run to the storage cupboard and found it to be 308 00:19:50,996 --> 00:19:54,836 Speaker 2: almost bare. 
And that's very alarming. You know, it happened 309 00:19:54,836 --> 00:19:57,436 Speaker 2: because, of course, with munitions, when the budget gets tight, it's 310 00:19:57,476 --> 00:19:59,676 Speaker 2: easy to say, oh, we'll buy more next year. So 311 00:19:59,716 --> 00:20:01,836 Speaker 2: that's how we ended up in that situation. So we 312 00:20:02,236 --> 00:20:05,636 Speaker 2: discovered a crisis, actually, of military readiness on our side, 313 00:20:05,956 --> 00:20:09,196 Speaker 2: and, more than that, we discovered a manufacturing base that 314 00:20:09,476 --> 00:20:13,236 Speaker 2: was not easily able to be mobilized, because many factories 315 00:20:13,236 --> 00:20:16,476 Speaker 2: that produce these weapons had been idle for years or 316 00:20:16,516 --> 00:20:19,756 Speaker 2: even shuttered. And in a world in which supply chains 317 00:20:19,796 --> 00:20:23,276 Speaker 2: are threatened globally, if you can't assume the seas are 318 00:20:23,316 --> 00:20:25,156 Speaker 2: going to be safe for transit, you have to be 319 00:20:25,196 --> 00:20:30,556 Speaker 2: able to manufacture this weaponry within your borders. But there 320 00:20:30,636 --> 00:20:33,076 Speaker 2: was an even more profound wake-up call, and let's 321 00:20:33,116 --> 00:20:35,876 Speaker 2: just go back to the use of kamikaze drones. We 322 00:20:35,996 --> 00:20:39,476 Speaker 2: gave the Ukrainians very early in the war M1 323 00:20:39,636 --> 00:20:43,236 Speaker 2: A1 Abrams battle tanks. These are the most sophisticated 324 00:20:43,316 --> 00:20:47,276 Speaker 2: battle tanks in the world. We've all seen pictures of them. 325 00:20:48,196 --> 00:20:50,476 Speaker 2: You can be driving down the highway in an Abrams 326 00:20:50,796 --> 00:20:52,756 Speaker 2: battle tank at seventy miles an hour and you can 327 00:20:52,756 --> 00:20:55,516 Speaker 2: decide which window of a skyscraper to put a round through. 328 00:20:55,556 --> 00:20:58,956 Speaker 2: It's an incredible piece of hardware. And almost all thirty- 329 00:20:58,956 --> 00:21:01,996 Speaker 2: one of those tanks have been destroyed or disabled by 330 00:21:02,076 --> 00:21:05,156 Speaker 2: Russian kamikaze drones that each cost no more than a 331 00:21:05,156 --> 00:21:06,396 Speaker 2: few thousand dollars apiece. 332 00:21:07,076 --> 00:21:09,956 Speaker 1: And what's the tank cost, the tanks? 333 00:21:09,436 --> 00:21:11,756 Speaker 2: Oh gosh, I don't know, twelve or fifteen million dollars 334 00:21:11,836 --> 00:21:16,596 Speaker 2: or maybe more. So that tells me, and other military strategists, 335 00:21:17,276 --> 00:21:21,996 Speaker 2: that if you're an army that bought a lot of tanks, 336 00:21:22,116 --> 00:21:25,836 Speaker 2: and you presume that your military strength comes from those tanks, 337 00:21:26,436 --> 00:21:29,916 Speaker 2: you could easily be surprised by an adversary using really 338 00:21:29,956 --> 00:21:32,876 Speaker 2: inexpensive kamikaze drones. In other words, the era 339 00:21:34,236 --> 00:21:37,676 Speaker 2: of manned mechanized warfare is over. In the same way 340 00:21:37,716 --> 00:21:41,116 Speaker 2: that during the First World War, tanks showed us that 341 00:21:41,156 --> 00:21:43,956 Speaker 2: the cavalry age was coming to a close, we are 342 00:21:43,996 --> 00:21:46,916 Speaker 2: now at the end of the era of the battle tank. 
343 00:21:47,676 --> 00:21:51,596 Speaker 1: So you're saying tanks now are about as useful as 344 00:21:51,596 --> 00:21:59,836 Speaker 1: horses were in nineteen fifteen. Yes. What do we do 345 00:21:59,916 --> 00:22:03,276 Speaker 1: about that? What does that mean for the United States? 346 00:22:04,356 --> 00:22:11,276 Speaker 2: Well, the reason my co-author and I wrote a book 347 00:22:11,356 --> 00:22:14,556 Speaker 2: all about just how far behind the US military finds 348 00:22:14,596 --> 00:22:21,476 Speaker 2: itself is that this phenomenon of less expensive technology overcoming 349 00:22:22,476 --> 00:22:25,556 Speaker 2: legacy weapons systems, you know, the sort of building blocks 350 00:22:25,596 --> 00:22:29,116 Speaker 2: of the Cold War American military, isn't true just for tanks. 351 00:22:29,676 --> 00:22:32,316 Speaker 2: And in fact we're already seeing this in Ukraine, where 352 00:22:32,316 --> 00:22:36,436 Speaker 2: the Russian Black Sea Fleet is today being kept in 353 00:22:36,636 --> 00:22:44,716 Speaker 2: port by Ukraine's ingenious use of autonomous boats that are 354 00:22:45,316 --> 00:22:50,196 Speaker 2: carrying payloads of high explosives, that have rammed into and 355 00:22:50,276 --> 00:22:54,396 Speaker 2: sunk the largest Russian ship, the Moskva, on the Black Sea. 356 00:22:55,076 --> 00:22:58,796 Speaker 2: So naval strategists call the Russian fleet now 357 00:22:58,836 --> 00:23:01,276 Speaker 2: a fleet in being; it's a fleet kept in port 358 00:23:01,716 --> 00:23:05,596 Speaker 2: because of cheap autonomous technology. So, in other words, many 359 00:23:05,636 --> 00:23:12,556 Speaker 2: of America's most powerful military systems, aircraft carriers, battle tanks, 360 00:23:12,636 --> 00:23:16,716 Speaker 2: battleships, may now no longer be as effective as they 361 00:23:16,756 --> 00:23:18,756 Speaker 2: were just a few years ago. 362 00:23:19,116 --> 00:23:22,076 Speaker 1: I mean, so in terms of changing the equilibrium, is 363 00:23:22,076 --> 00:23:26,116 Speaker 1: the effect of essentially autonomous technology, drones and then ship 364 00:23:26,196 --> 00:23:31,036 Speaker 1: drones, to give more power to countries with less money? 365 00:23:31,956 --> 00:23:33,156 Speaker 1: Is that one effect? 366 00:23:33,396 --> 00:23:36,876 Speaker 2: Well, there's two effects. So the first and most obvious 367 00:23:36,876 --> 00:23:41,316 Speaker 2: effect is that a country with fewer financial resources can 368 00:23:41,996 --> 00:23:46,836 Speaker 2: amass larger degrees of military power. But there's a second 369 00:23:46,836 --> 00:23:51,316 Speaker 2: and more insidious effect, which is that militaries are inherently 370 00:23:51,356 --> 00:23:55,876 Speaker 2: conservative institutions, for a couple of reasons. One, war is 371 00:23:55,916 --> 00:24:00,676 Speaker 2: a very unforgiving teacher. And it's almost impossible through just 372 00:24:00,756 --> 00:24:04,676 Speaker 2: simulations alone to actually replicate combat conditions. And so you 373 00:24:04,756 --> 00:24:07,636 Speaker 2: can test, you can test new systems all you want, 374 00:24:07,676 --> 00:24:09,676 Speaker 2: but you won't actually have the confidence that they're 375 00:24:09,676 --> 00:24:11,156 Speaker 2: going to work in the way you expect them to 376 00:24:11,636 --> 00:24:13,476 Speaker 2: work or want them to work until you go to war. 
377 00:24:14,116 --> 00:24:18,196 Speaker 2: So this is what leads militaries almost always, perpetually, 378 00:24:18,236 --> 00:24:21,636 Speaker 2: to be overfocused on winning the last war rather than winning 379 00:24:21,676 --> 00:24:23,036 Speaker 2: the war that's to come. 380 00:24:26,236 --> 00:24:28,796 Speaker 1: Still to come on the show: why the US military 381 00:24:28,956 --> 00:24:31,436 Speaker 1: is not ready to fight a war right now, according 382 00:24:31,436 --> 00:24:45,476 Speaker 1: to Chris, and what it might take to change that. So, okay, 383 00:24:45,516 --> 00:24:48,236 Speaker 1: there is this problem, right: the US military is not 384 00:24:48,396 --> 00:24:51,556 Speaker 1: really keeping up with technological change. And this is a 385 00:24:51,596 --> 00:24:54,396 Speaker 1: problem that Christopher has been working on for a while now. 386 00:24:54,796 --> 00:24:56,836 Speaker 1: It's a key part of the reason he was tapped 387 00:24:56,876 --> 00:25:00,396 Speaker 1: to help launch the Defense Innovation Unit back in twenty sixteen. 388 00:25:01,316 --> 00:25:03,236 Speaker 1: And to kind of set the scene here, right, to 389 00:25:03,236 --> 00:25:05,876 Speaker 1: give a sense of what he and his colleagues were 390 00:25:05,916 --> 00:25:08,076 Speaker 1: up against, what they were trying to change when they 391 00:25:08,156 --> 00:25:11,756 Speaker 1: launched the Defense Innovation Unit, I asked Christopher about this 392 00:25:11,876 --> 00:25:14,876 Speaker 1: story that he and his co-author tell in their book. 393 00:25:15,436 --> 00:25:18,036 Speaker 1: It's the story of something they saw at the US 394 00:25:18,076 --> 00:25:22,276 Speaker 1: military's Combined Air Operations Center in Qatar. 395 00:25:22,956 --> 00:25:27,876 Speaker 2: So the Combined Air Operations Center is this nerve center 396 00:25:28,036 --> 00:25:31,996 Speaker 2: that is in charge of running all air operations in 397 00:25:32,076 --> 00:25:35,356 Speaker 2: the Middle East, in an area that's actually larger than 398 00:25:35,396 --> 00:25:40,396 Speaker 2: the lower forty-eight states. And so any plane, any missile, 399 00:25:40,956 --> 00:25:45,116 Speaker 2: any air strike that goes on in this extraordinary piece 400 00:25:45,196 --> 00:25:50,316 Speaker 2: of geography is the responsibility of a set of airmen 401 00:25:51,076 --> 00:25:55,076 Speaker 2: and women that operate in a base in Qatar. And 402 00:25:55,516 --> 00:25:59,516 Speaker 2: in twenty sixteen, you'll remember, there was a crisis in 403 00:25:59,516 --> 00:26:05,636 Speaker 2: the Middle East. ISIS was storming through countries, was encircling 404 00:26:05,756 --> 00:26:09,436 Speaker 2: the Iraqi population in Mosul, was trying to prosecute the 405 00:26:09,516 --> 00:26:13,116 Speaker 2: largest genocide of the twenty-first century against the Yazidi 406 00:26:13,196 --> 00:26:17,716 Speaker 2: people, who were trapped on Mount Sinjar. And these civilians 407 00:26:17,756 --> 00:26:22,236 Speaker 2: were kept alive only by an extraordinary campaign of US 408 00:26:22,276 --> 00:26:28,636 Speaker 2: air strikes. We were striking ISIS forces every few minutes, 409 00:26:28,956 --> 00:26:31,636 Speaker 2: twenty-four seven, for weeks on end, in a 410 00:26:31,676 --> 00:26:36,516 Speaker 2: bombing campaign that's of the highest intensity in history. 
And to 411 00:26:36,556 --> 00:26:39,316 Speaker 2: sustain a bombing campaign of this intensity, you have to have, 412 00:26:39,356 --> 00:26:42,236 Speaker 2: of course, fighter aircraft and surveillance, but you also have 413 00:26:42,316 --> 00:26:46,316 Speaker 2: to have refueling tankers. These fighter jets are gas guzzlers. 414 00:26:46,756 --> 00:26:49,876 Speaker 2: They need to refill their gas tanks every forty minutes 415 00:26:49,956 --> 00:26:53,036 Speaker 2: or so, and to do that they have to mate 416 00:26:53,116 --> 00:26:56,476 Speaker 2: up with air refueling tankers. And this is actually a 417 00:26:56,596 --> 00:27:00,116 Speaker 2: very complex thing to figure out, because fighters come in 418 00:27:00,196 --> 00:27:03,316 Speaker 2: different types, they refuel at different altitudes, they have different 419 00:27:03,356 --> 00:27:07,116 Speaker 2: nozzles on them, and if you have hundreds of aircraft 420 00:27:07,516 --> 00:27:11,876 Speaker 2: a day aloft operating in combat support roles, you need 421 00:27:12,236 --> 00:27:17,556 Speaker 2: dozens of tankers to refuel them. So how did the 422 00:27:17,836 --> 00:27:22,876 Speaker 2: Air Force men and women in uniform in Qatar orchestrate 423 00:27:23,556 --> 00:27:28,116 Speaker 2: the complex planning of tanker routes? Well, they used a whiteboard. 424 00:27:29,036 --> 00:27:34,916 Speaker 2: We showed up in twenty sixteen and found people essentially 425 00:27:34,956 --> 00:27:37,316 Speaker 2: operating like we did in World War Two, except instead 426 00:27:37,356 --> 00:27:42,876 Speaker 2: of having a chalkboard and magnets, we had a whiteboard. 427 00:27:43,156 --> 00:27:45,916 Speaker 2: And they were having to do all these calculations by hand, 428 00:27:46,636 --> 00:27:49,876 Speaker 2: which took an enormous number of hours. Why by hand? 429 00:27:50,196 --> 00:27:52,756 Speaker 2: Because the computer program that was supposed to do this 430 00:27:53,276 --> 00:27:55,956 Speaker 2: had broken and they were waiting for an upgrade. But 431 00:27:55,996 --> 00:27:59,076 Speaker 2: they had been waiting seven years for an upgrade that 432 00:27:59,116 --> 00:28:01,476 Speaker 2: one of the large defense companies was in charge of 433 00:28:01,836 --> 00:28:04,996 Speaker 2: developing, that was way over budget and way behind schedule. 434 00:28:05,156 --> 00:28:07,156 Speaker 1: They'd been waiting seven years for an upgrade, and they 435 00:28:07,156 --> 00:28:10,156 Speaker 1: were in the middle of this, like, multi-multi-million- 436 00:28:10,236 --> 00:28:15,116 Speaker 1: dollar command center the US had built, coordinating hundreds of 437 00:28:15,156 --> 00:28:16,716 Speaker 1: flights all the time. 438 00:28:16,796 --> 00:28:20,316 Speaker 2: It gets worse. So the minute that there's a deviation 439 00:28:20,596 --> 00:28:24,396 Speaker 2: from the plan, whether that's bad weather or, you know, 440 00:28:24,996 --> 00:28:27,956 Speaker 2: unexpected combat operations in a different area than you'd planned 441 00:28:28,076 --> 00:28:30,436 Speaker 2: for, now you have to redo the calculations on the fly. 
442 00:28:31,156 --> 00:28:34,596 Speaker 2: So this is actually introducing a lot of unneeded risk 443 00:28:34,956 --> 00:28:37,756 Speaker 2: into our combat operations, that not only puts at risk 444 00:28:38,676 --> 00:28:40,796 Speaker 2: the men and women of the US military, but also 445 00:28:40,836 --> 00:28:43,796 Speaker 2: puts at risk the forces on the ground that we're 446 00:28:43,796 --> 00:28:46,116 Speaker 2: trying to protect, or the civilians we're trying to defend. 447 00:28:46,756 --> 00:28:50,356 Speaker 1: You describe, like, somewhere in the process, there'd be like 448 00:28:50,356 --> 00:28:53,196 Speaker 1: somebody with one spreadsheet and then there was another spreadsheet, 449 00:28:53,196 --> 00:28:56,596 Speaker 1: but like the two spreadsheets couldn't, like, talk to each other, right, 450 00:28:56,636 --> 00:28:58,676 Speaker 1: so somebody had to read out the numbers to get 451 00:28:58,716 --> 00:29:01,996 Speaker 1: it from one spreadsheet to the other. You describe extra 452 00:29:02,676 --> 00:29:07,036 Speaker 1: refueling flights all the time because they couldn't, you know, 453 00:29:07,196 --> 00:29:10,276 Speaker 1: process the data fast enough, essentially, by hand. 454 00:29:10,516 --> 00:29:14,036 Speaker 2: They had multiple different databases they had to manually 455 00:29:14,036 --> 00:29:17,716 Speaker 2: fuse together. Somebody would literally stand behind the computer 456 00:29:17,796 --> 00:29:20,076 Speaker 2: operator to make sure they were typing in the numbers right, 457 00:29:20,516 --> 00:29:23,956 Speaker 2: and it would take fifty person-hours to put together 458 00:29:24,356 --> 00:29:28,796 Speaker 2: the refueling plan for the next day. So we realized 459 00:29:28,836 --> 00:29:32,156 Speaker 2: this was an opportunity to show what Silicon Valley does best, 460 00:29:32,196 --> 00:29:37,116 Speaker 2: which is rapid iteration, using software to create an optimization 461 00:29:37,196 --> 00:29:39,156 Speaker 2: algorithm that could solve this in seconds. 462 00:29:39,596 --> 00:29:41,836 Speaker 1: Why hadn't they just fixed it? Let me ask the 463 00:29:41,956 --> 00:29:45,396 Speaker 1: naive question, like, as a technical matter, it's 464 00:29:45,476 --> 00:29:47,556 Speaker 1: not a hard problem. What was the actual problem? 465 00:29:47,996 --> 00:29:50,756 Speaker 2: So the problem is that the way the military buys 466 00:29:50,836 --> 00:29:56,156 Speaker 2: technology and the way Silicon Valley makes it, it's like 467 00:29:56,396 --> 00:30:00,236 Speaker 2: Mars and Venus, they're on two different planets. So the 468 00:30:00,276 --> 00:30:04,956 Speaker 2: military way, it's called a requirements-based system of developing technology. 469 00:30:05,036 --> 00:30:07,196 Speaker 2: So a bunch of people sit in a windowless room 470 00:30:07,236 --> 00:30:10,836 Speaker 2: and they imagine, if I were to need a piece 471 00:30:10,876 --> 00:30:13,276 Speaker 2: of technology to help with air refueling, what should it 472 00:30:13,316 --> 00:30:16,356 Speaker 2: look like? And they write a bunch of specifications down 473 00:30:16,796 --> 00:30:19,036 Speaker 2: that they think would be great. So it's drawn up 474 00:30:19,076 --> 00:30:22,236 Speaker 2: in the abstract, and then, because contracting takes so long 475 00:30:22,276 --> 00:30:26,836 Speaker 2: in the military, 
oftentimes multiple generations of technology come along 476 00:30:27,356 --> 00:30:30,196 Speaker 2: before that system ever gets built, but of course it 477 00:30:30,236 --> 00:30:32,396 Speaker 2: has to be built to the original specifications. 478 00:30:32,996 --> 00:30:35,316 Speaker 1: So the thing you get is basically a thing somebody 479 00:30:35,356 --> 00:30:41,476 Speaker 1: designed ten years ago or more, which is so obviously 480 00:30:41,516 --> 00:30:44,796 Speaker 1: a bad idea, right? And I mean certainly you make 481 00:30:44,836 --> 00:30:46,836 Speaker 1: a compelling case in the book that it's obviously a 482 00:30:46,836 --> 00:30:50,956 Speaker 1: bad idea. Is there any counterargument? Is there any 483 00:30:50,996 --> 00:30:52,276 Speaker 1: case for the way they do it? 484 00:30:52,916 --> 00:30:55,276 Speaker 2: There is. And we have to be very sympathetic to 485 00:30:55,316 --> 00:30:57,396 Speaker 2: the people that are a part of the system, because 486 00:30:57,436 --> 00:30:59,956 Speaker 2: the system actually was built to do something a little 487 00:30:59,956 --> 00:31:02,796 Speaker 2: bit different. And that is, you know, there aren't too 488 00:31:02,836 --> 00:31:06,876 Speaker 2: many companies that can build a nuclear-powered aircraft carrier 489 00:31:07,036 --> 00:31:10,236 Speaker 2: or a submarine or a fifth-generation stealth fighter, and 490 00:31:10,276 --> 00:31:13,676 Speaker 2: so you can't exactly go on Amazon and comparison price 491 00:31:13,716 --> 00:31:15,796 Speaker 2: shop if you're the US government. 492 00:31:15,676 --> 00:31:18,156 Speaker 1: There is no market. There is no market for a 493 00:31:18,236 --> 00:31:20,756 Speaker 1: nuclear sub in a conventional sense. Yeah. 494 00:31:20,956 --> 00:31:25,076 Speaker 2: So the defense procurement system and the associated auditing system 495 00:31:25,316 --> 00:31:30,876 Speaker 2: that governs it is designed to save taxpayer dollars by 496 00:31:31,436 --> 00:31:35,996 Speaker 2: coming up with exceptionally precise specifications to which things must 497 00:31:36,076 --> 00:31:39,076 Speaker 2: be built, and then a cost accounting system that is 498 00:31:39,116 --> 00:31:42,316 Speaker 2: meant to make sure that the taxpayer is never overcharged 499 00:31:42,716 --> 00:31:46,076 Speaker 2: by the one or two companies that can supply this technology. 500 00:31:46,236 --> 00:31:48,356 Speaker 2: So it was built for a different era, and in 501 00:31:48,476 --> 00:31:52,516 Speaker 2: terms of building aircraft carriers it actually does a reasonable job. 502 00:31:52,916 --> 00:31:56,036 Speaker 2: But for commercial technology, there is a market. You don't have 503 00:31:56,116 --> 00:31:57,796 Speaker 2: to do it that way, and in fact, doing it 504 00:31:57,836 --> 00:31:59,156 Speaker 2: that way only slows you down. 505 00:32:00,476 --> 00:32:06,516 Speaker 1: So, okay, so you go, you see this terrible, inefficient, dangerous, 506 00:32:06,636 --> 00:32:13,396 Speaker 1: expensive technology failure that is happening at this base in Qatar, 507 00:32:13,436 --> 00:32:13,916 Speaker 1: what do you do? 508 00:32:16,596 --> 00:32:20,316 Speaker 2: Phone home? 
So my co-author Raj got on the phone 509 00:32:20,316 --> 00:32:29,356 Speaker 2: that night, and we immediately FedExed iMacs and deployed a 510 00:32:29,396 --> 00:32:32,196 Speaker 2: few Air Force coders and a few coders from 511 00:32:32,196 --> 00:32:35,756 Speaker 2: Silicon Valley, out with Pivotal, to Qatar, to sit 512 00:32:36,036 --> 00:32:39,836 Speaker 2: side by side with the airmen and women doing the 513 00:32:39,876 --> 00:32:44,076 Speaker 2: tanker planning and to build an automated app that they 514 00:32:44,076 --> 00:32:47,396 Speaker 2: could use eventually to push a button and have this done. 515 00:32:47,836 --> 00:32:50,836 Speaker 2: And it took just over a million dollars and just 516 00:32:50,916 --> 00:32:52,476 Speaker 2: over one hundred days to do it. 517 00:32:55,036 --> 00:32:58,156 Speaker 1: And you said it saved a million dollars in excess 518 00:32:58,196 --> 00:33:00,716 Speaker 1: flights in like a couple of days. 519 00:33:00,836 --> 00:33:04,876 Speaker 2: Right. So before, the Air Force had to actually keep 520 00:33:04,916 --> 00:33:09,396 Speaker 2: a couple refueling tankers on standby to scramble, because 521 00:33:09,396 --> 00:33:12,356 Speaker 2: they weren't able to pivot the plan fast enough. And 522 00:33:12,396 --> 00:33:15,516 Speaker 2: of course, each time you scramble an enormous seven-sixty- 523 00:33:15,516 --> 00:33:20,516 Speaker 2: seven-sized aircraft full of kerosene, it's energetically expensive. So 524 00:33:20,596 --> 00:33:23,876 Speaker 2: the tool that we used to optimize air tanker refueling 525 00:33:24,036 --> 00:33:28,156 Speaker 2: actually has saved the Air Force, today, something like, I 526 00:33:28,156 --> 00:33:31,676 Speaker 2: think, twenty-four million gallons of kerosene a year. So 527 00:33:31,716 --> 00:33:33,916 Speaker 2: we're talking, you know, a lot of money in savings, right? 528 00:33:33,956 --> 00:33:36,036 Speaker 2: So this is a no-brainer. Spend the money, 529 00:33:36,156 --> 00:33:40,236 Speaker 2: automate it. You get better mission outcomes for lower cost. 530 00:33:40,356 --> 00:33:43,476 Speaker 2: It's the Silicon Valley way of co-developing technology with 531 00:33:43,516 --> 00:33:47,356 Speaker 2: the users, of iterating on a minimum viable product 532 00:33:47,716 --> 00:33:50,076 Speaker 2: and then adding features to that product as you go along. 533 00:33:51,036 --> 00:33:55,356 Speaker 1: So this is like an early win in a fairly 534 00:33:55,396 --> 00:33:58,716 Speaker 1: straightforward way. I mean, also worth noting that having the 535 00:33:58,796 --> 00:34:01,156 Speaker 1: coders go sit with the people who are using the 536 00:34:01,196 --> 00:34:04,676 Speaker 1: thing is like classic Silicon Valley, right, get close to 537 00:34:04,756 --> 00:34:07,756 Speaker 1: the user, see what they need, as opposed to, like, 538 00:34:08,276 --> 00:34:11,596 Speaker 1: have a guy in Washington ten years ago draw up a 539 00:34:11,636 --> 00:34:13,516 Speaker 1: list of things he thinks people are going to need 540 00:34:13,556 --> 00:34:17,076 Speaker 1: ten years from now. Right. There are a couple of 541 00:34:17,516 --> 00:34:21,676 Speaker 1: new companies selling weapons and technology to the government that 542 00:34:21,716 --> 00:34:24,436 Speaker 1: seem worth talking about, that seem quite different than the 543 00:34:24,476 --> 00:34:31,116 Speaker 1: sort of twentieth-century giant military contractors, right: Palantir and Anduril. 
544 00:34:32,356 --> 00:34:34,156 Speaker 1: Tell me about those two companies and how they fit 545 00:34:34,236 --> 00:34:35,676 Speaker 1: with this broader story. 546 00:34:36,276 --> 00:34:41,036 Speaker 2: Well, it's important to recognize that the defense contractors we 547 00:34:41,076 --> 00:34:43,156 Speaker 2: have today, particularly the large ones that we know about, 548 00:34:43,196 --> 00:34:46,476 Speaker 2: the household names, so Lockheed Martin, Boeing, and so on 549 00:34:46,516 --> 00:34:51,156 Speaker 2: and so forth, they are really best thought of kind 550 00:34:51,156 --> 00:34:57,516 Speaker 2: of like utility companies, as highly regulated enterprises that are 551 00:34:57,556 --> 00:35:01,796 Speaker 2: not like traditional product companies, because they have to adhere 552 00:35:01,836 --> 00:35:06,636 Speaker 2: to this massive system of cost accounting and auditing, this 553 00:35:07,156 --> 00:35:11,236 Speaker 2: two-thousand-page regulation that governs everything they do. 554 00:35:11,836 --> 00:35:15,556 Speaker 2: And so, in fact, the Pentagon has created these companies 555 00:35:16,316 --> 00:35:18,996 Speaker 2: and the way they operate, which is extraordinarily inefficient in 556 00:35:19,076 --> 00:35:22,356 Speaker 2: terms of how modern technology is produced by the commercial sector. 557 00:35:22,996 --> 00:35:27,076 Speaker 2: So what's so interesting about companies like Anduril, like Palantir, 558 00:35:27,276 --> 00:35:31,916 Speaker 2: like Shield AI: they are traditional product companies in the 559 00:35:31,996 --> 00:35:36,036 Speaker 2: sense that Google and Microsoft are. They are unburdened by 560 00:35:36,876 --> 00:35:41,236 Speaker 2: having to erect this massive, separate accounting system to deal 561 00:35:41,316 --> 00:35:45,516 Speaker 2: with the Department of Defense bureaucracy, because they are primarily 562 00:35:45,596 --> 00:35:49,836 Speaker 2: selling their technology through a completely different kind of contract 563 00:35:49,836 --> 00:35:52,956 Speaker 2: that we pioneered at DIU, that our twenty-nine-year- 564 00:35:52,996 --> 00:35:57,156 Speaker 2: old employee, Lauren Dailey, came up with by staying up 565 00:35:57,196 --> 00:35:59,876 Speaker 2: late at night and reading congressional legislation. 566 00:35:59,796 --> 00:36:02,556 Speaker 1: Basically a way that the Department of Defense can buy 567 00:36:02,636 --> 00:36:06,836 Speaker 1: weapons without going through the traditional two-thousand-page process. 568 00:36:06,956 --> 00:36:07,436 Speaker 2: Exactly. 569 00:36:07,996 --> 00:36:11,956 Speaker 1: It's so amazing that that's what's driving the technological change, 570 00:36:12,036 --> 00:36:15,276 Speaker 1: right? In a way, not surprising but quite interesting, right, 571 00:36:15,356 --> 00:36:20,756 Speaker 1: that this, the process, ends up driving the outcome 572 00:36:21,316 --> 00:36:22,796 Speaker 1: in a backwards way. 573 00:36:23,876 --> 00:36:27,116 Speaker 2: Well, political scientists and sociologists, you know, are always 574 00:36:27,196 --> 00:36:29,796 Speaker 2: keen to look at what, you know, structures are operating, 575 00:36:29,916 --> 00:36:31,876 Speaker 2: and there is of course a whole discipline that looks at 576 00:36:31,916 --> 00:36:34,916 Speaker 2: public institutions, which are constrained quite a bit more than 577 00:36:34,956 --> 00:36:38,556 Speaker 2: public companies. So in one sense it isn't a surprise. 
But what is 578 00:36:38,596 --> 00:36:41,996 Speaker 2: a surprise is the difference that one individual made in this. 579 00:36:42,116 --> 00:36:45,956 Speaker 2: So Lauren Dailey, you know, the daughter of a tank driver, 580 00:36:46,796 --> 00:36:50,436 Speaker 2: somebody who entered the Army as an acquisition specialist as 581 00:36:50,476 --> 00:36:54,076 Speaker 2: her way of serving the country, figures out that there's 582 00:36:54,156 --> 00:36:57,596 Speaker 2: a loophole that a renegade Senate staffer has 583 00:36:57,676 --> 00:37:01,236 Speaker 2: just inserted in the National Defense Authorization Act that 584 00:37:01,436 --> 00:37:04,396 Speaker 2: DIU can drive a truck through, and in a matter 585 00:37:04,436 --> 00:37:07,276 Speaker 2: of weeks, through some hustling at the Pentagon and the 586 00:37:07,276 --> 00:37:11,876 Speaker 2: support of Ash Carter, codified this new method of procuring 587 00:37:11,916 --> 00:37:15,356 Speaker 2: technology called the Commercial Solutions Opening that essentially allowed 588 00:37:15,436 --> 00:37:18,756 Speaker 2: us to negotiate with companies for a few weeks very openly, 589 00:37:18,916 --> 00:37:21,916 Speaker 2: in a very conversational way, on commercial terms, so the 590 00:37:21,956 --> 00:37:24,516 Speaker 2: way they were used to negotiating, and then to sign 591 00:37:24,516 --> 00:37:27,396 Speaker 2: a contract. So this was a game changer. And I 592 00:37:27,396 --> 00:37:30,916 Speaker 2: am so proud to report that from twenty sixteen to date, 593 00:37:31,436 --> 00:37:35,236 Speaker 2: about eighty billion dollars of technology acquisition by the Department of 594 00:37:35,276 --> 00:37:37,836 Speaker 2: Defense has gone through this new mechanism. And in fact, 595 00:37:38,236 --> 00:37:42,156 Speaker 2: last week Trump's Secretary of Defense, Pete Hegseth, signed out 596 00:37:42,156 --> 00:37:47,156 Speaker 2: a memo mandating that all software bought by the Department 597 00:37:47,196 --> 00:37:51,076 Speaker 2: of Defense be acquired using this means, not the old 598 00:37:51,396 --> 00:37:54,596 Speaker 2: two thousand page way. 599 00:37:55,556 --> 00:37:59,956 Speaker 1: Huh. So it's sort of the end of the old, 600 00:38:00,556 --> 00:38:03,516 Speaker 1: pretty clearly bad way of buying software for the military. 601 00:38:04,436 --> 00:38:06,996 Speaker 2: Yes. And so you know, there's two things to keep 602 00:38:07,036 --> 00:38:10,116 Speaker 2: in mind here. You know, one, it doesn't make sense 603 00:38:10,156 --> 00:38:13,236 Speaker 2: to use a two thousand page rule book to buy 604 00:38:13,276 --> 00:38:16,156 Speaker 2: things where you don't need it, where there's an actual market. 605 00:38:16,396 --> 00:38:19,516 Speaker 2: But the second thing that's really important: we would never 606 00:38:19,596 --> 00:38:22,436 Speaker 2: go to Microsoft or Amazon and ask them to build 607 00:38:22,436 --> 00:38:25,996 Speaker 2: an aircraft carrier. So why were we going to Raytheon 608 00:38:26,076 --> 00:38:29,076 Speaker 2: or Lockheed Martin, or Northrop Grumman, and asking them to 609 00:38:29,076 --> 00:38:32,956 Speaker 2: build software? It's not their core expertise. So the Pentagon, 610 00:38:33,116 --> 00:38:35,956 Speaker 2: in the last eight years, thanks in part to its 611 00:38:35,996 --> 00:38:39,556 Speaker 2: Silicon Valley office, has gotten a lot smarter about how 612 00:38:39,596 --> 00:38:42,916 Speaker 2: to procure advanced technology.
And so fast forward to today. 613 00:38:43,676 --> 00:38:47,516 Speaker 2: The most exciting contract the Air Force is letting, to 614 00:38:47,556 --> 00:38:52,556 Speaker 2: build up to potentially ten thousand supersonic autonomous drones that 615 00:38:52,596 --> 00:38:57,956 Speaker 2: will fight alongside our fifth generation manned fighters: that contract 616 00:38:58,356 --> 00:39:02,076 Speaker 2: went to Anduril and General Atomics, not to Boeing, not 617 00:39:02,156 --> 00:39:04,396 Speaker 2: to Lockheed, not to one of the primes. So we're 618 00:39:04,436 --> 00:39:09,116 Speaker 2: seeing this complete sea change in who is being successful 619 00:39:09,476 --> 00:39:13,276 Speaker 2: at pulling down the biggest contracts with the most advanced 620 00:39:13,276 --> 00:39:14,996 Speaker 2: technology coming out of the Department today. 621 00:39:16,356 --> 00:39:25,196 Speaker 1: So do you think the US is ready to fight 622 00:39:25,196 --> 00:39:28,996 Speaker 1: a war right now? No. Go on. 623 00:39:31,636 --> 00:39:37,476 Speaker 2: We are in the unfortunate position of having made major 624 00:39:37,796 --> 00:39:46,916 Speaker 2: decadal investments in weapons systems that have essentially reached the 625 00:39:46,996 --> 00:39:47,956 Speaker 2: end of their lifespan. 626 00:39:48,756 --> 00:39:52,436 Speaker 1: You mean like tanks and aircraft carriers. Yes. Not a 627 00:39:52,436 --> 00:39:55,396 Speaker 1: particular tank or aircraft carrier, but just any tank or 628 00:39:55,436 --> 00:39:58,796 Speaker 1: aircraft carrier is now basically obsolete. 629 00:39:58,956 --> 00:40:03,836 Speaker 2: And we have a three part challenge. We're not yet 630 00:40:03,876 --> 00:40:08,516 Speaker 2: exactly sure, because we haven't experimented enough, what technology to 631 00:40:08,596 --> 00:40:12,996 Speaker 2: buy to replace them. We're also not sure exactly how 632 00:40:12,996 --> 00:40:16,436 Speaker 2: to use that new technology in what the military calls 633 00:40:16,556 --> 00:40:20,796 Speaker 2: concepts of operation. So when we invented carrier aviation, that's 634 00:40:20,836 --> 00:40:25,636 Speaker 2: a whole new way of using aircraft to fight naval 635 00:40:25,356 --> 00:40:28,396 Speaker 1: battles. You have to figure out what to do with 636 00:40:28,676 --> 00:40:30,676 Speaker 1: ten thousand supersonic drones. 637 00:40:30,756 --> 00:40:34,116 Speaker 2: And the third problem we have is that once we 638 00:40:34,196 --> 00:40:37,476 Speaker 2: do figure out what technology to buy and what operating 639 00:40:37,556 --> 00:40:41,036 Speaker 2: concepts to insert them into, we have to scale that up. 640 00:40:41,676 --> 00:40:43,756 Speaker 1: Is there not a fourth problem, which is we're still 641 00:40:43,756 --> 00:40:47,756 Speaker 1: spending billions and billions of dollars on weapons that, you're 642 00:40:47,876 --> 00:40:50,316 Speaker 1: arguing, aren't going to work, are obsolete. 643 00:40:51,156 --> 00:40:55,116 Speaker 2: Yes, that is a fourth problem. The Biden administration shifted 644 00:40:55,196 --> 00:40:57,796 Speaker 2: less than one percent of the Department of Defense's procurement 645 00:40:57,836 --> 00:41:01,556 Speaker 2: budget towards autonomous 646 00:41:01,076 --> 00:41:04,396 Speaker 1: weapons. Less than one percent, meaning basically 647 00:41:03,916 --> 00:41:05,876 Speaker 2: zero. Basically zero.
648 00:41:06,156 --> 00:41:11,396 Speaker 1: Yes. I feel like that's a bigger problem than the 649 00:41:11,396 --> 00:41:13,196 Speaker 1: other three. And I mean, that comes through in the 650 00:41:13,236 --> 00:41:16,556 Speaker 1: book in a way, right? Like, your antagonist in the book, 651 00:41:17,676 --> 00:41:21,156 Speaker 1: in a narrative sense, is not Russia or China or 652 00:41:21,196 --> 00:41:24,996 Speaker 1: ISIS, right? It's bureaucrats. It's, you know, you go 653 00:41:25,076 --> 00:41:27,716 Speaker 1: to Capitol Hill and somebody's like, we're going to take 654 00:41:27,716 --> 00:41:30,396 Speaker 1: your money away because you're not spending any money in Indiana, 655 00:41:30,436 --> 00:41:34,156 Speaker 1: and that's where my congressman is from. And it seems 656 00:41:34,156 --> 00:41:36,516 Speaker 1: like that is still a problem. 657 00:41:37,156 --> 00:41:39,436 Speaker 2: It is. And you know, this is one of these 658 00:41:39,476 --> 00:41:43,276 Speaker 2: moments where the strategic environment is changing so fast, and 659 00:41:43,316 --> 00:41:46,956 Speaker 2: anybody watching or reading the news knows it from what's 660 00:41:46,996 --> 00:41:50,236 Speaker 2: happening in Ukraine, what's happening in the Middle East, what's happening elsewhere. 661 00:41:51,036 --> 00:41:54,236 Speaker 2: And so we're in this situation where the ordinary way 662 00:41:54,796 --> 00:41:58,916 Speaker 2: we buy technology and manage the military won't work. If 663 00:41:58,916 --> 00:42:00,796 Speaker 2: we keep doing it, we're going to walk the plank. 664 00:42:01,476 --> 00:42:05,596 Speaker 2: So we're now in a fierce battle to outmaneuver our 665 00:42:05,636 --> 00:42:09,356 Speaker 2: own system, to transform it before there's another conflict. 666 00:42:10,116 --> 00:42:13,756 Speaker 1: Frankly, it's hard for me to imagine the kind of 667 00:42:14,036 --> 00:42:17,556 Speaker 1: change you're suggesting is needed. Like, when I think of 668 00:42:17,636 --> 00:42:20,636 Speaker 1: how, whatever, Congress works, and how the country works, and 669 00:42:20,676 --> 00:42:23,996 Speaker 1: how the military works, and how everything works, it's hard 670 00:42:24,036 --> 00:42:26,036 Speaker 1: for me to imagine. I mean, unless there was a war, 671 00:42:26,236 --> 00:42:28,956 Speaker 1: which I don't want to happen, obviously, right? Like, I 672 00:42:28,956 --> 00:42:30,636 Speaker 1: don't know, what do you think is going to happen? 673 00:42:31,356 --> 00:42:34,596 Speaker 2: Well, you know, to make a more optimistic case, there 674 00:42:34,636 --> 00:42:36,596 Speaker 2: are a lot of things that are now going in 675 00:42:36,636 --> 00:42:40,876 Speaker 2: our favor. Most fundamentally, there is a new consensus in 676 00:42:40,916 --> 00:42:43,996 Speaker 2: the Pentagon and in the Congress that we're in a 677 00:42:44,036 --> 00:42:47,316 Speaker 2: crisis, and the way to solve it is to pivot 678 00:42:47,356 --> 00:42:50,116 Speaker 2: the way the Department of Defense buys technology and uses 679 00:42:50,196 --> 00:42:53,716 Speaker 2: this technology. And at the same time, there's a new 680 00:42:53,716 --> 00:42:57,156 Speaker 2: commitment in industry. We have a very dynamic economy in the 681 00:42:57,196 --> 00:43:02,116 Speaker 2: United States of America, with venture backed investment creating small 682 00:43:02,116 --> 00:43:06,716 Speaker 2: scale infusions of technology that scale up companies in incredible ways.
683 00:43:07,076 --> 00:43:10,636 Speaker 2: So since the Defense Innovation Unit figured out this more 684 00:43:11,236 --> 00:43:13,876 Speaker 2: seamless way to quickly buy technology and actually make the 685 00:43:13,916 --> 00:43:17,196 Speaker 2: government a good customer in Silicon Valley, there's been about 686 00:43:17,196 --> 00:43:21,436 Speaker 2: two hundred and fifty billion dollars of venture funds now 687 00:43:21,476 --> 00:43:25,236 Speaker 2: in play to back defense tech startups. Anduril's 688 00:43:25,276 --> 00:43:27,676 Speaker 2: among the most successful, but there are hundreds more. 689 00:43:28,276 --> 00:43:32,676 Speaker 2: So the market is already responding. And today there is 690 00:43:32,796 --> 00:43:36,756 Speaker 2: more dynamic invention going on in the defense tech sector 691 00:43:36,756 --> 00:43:41,756 Speaker 2: than, I think, anywhere else. And so if the consensus 692 00:43:41,796 --> 00:43:44,876 Speaker 2: in the Pentagon and in Congress can actually be turned into 693 00:43:46,116 --> 00:43:50,436 Speaker 2: shifts in the budget, we will quickly be headed in the 694 00:43:50,476 --> 00:43:50,956 Speaker 2: right direction. 695 00:43:51,476 --> 00:43:54,716 Speaker 1: I was with you until you said, if consensus in 696 00:43:54,756 --> 00:43:59,116 Speaker 1: the Pentagon and in Congress can be changed, right? That's like, 697 00:43:59,156 --> 00:44:01,916 Speaker 1: there's so much money there and so much inertia, and 698 00:44:01,996 --> 00:44:07,476 Speaker 1: like, the government seems quite bad in many domains at 699 00:44:07,716 --> 00:44:14,356 Speaker 1: changing when change is needed. So, it seems, I don't know, 700 00:44:15,036 --> 00:44:16,956 Speaker 1: you think that might happen? It seems unlikely to me. 701 00:44:18,556 --> 00:44:22,996 Speaker 2: Well, you know, again, to paint the optimistic picture. 702 00:44:24,116 --> 00:44:26,516 Speaker 1: Well, just for a sec, what do you think? What 703 00:44:26,556 --> 00:44:27,556 Speaker 1: do you think is going to happen? 704 00:44:28,196 --> 00:44:33,596 Speaker 2: I am worried that people now understand the problem, but 705 00:44:33,916 --> 00:44:37,996 Speaker 2: the pendulum of change is going to swing too slowly. Huh. 706 00:44:38,836 --> 00:44:43,476 Speaker 1: So I read that you recently did a residency at Anthropic, 707 00:44:43,596 --> 00:44:47,236 Speaker 1: which is one of the big frontier AI companies, the one that 708 00:44:47,356 --> 00:44:50,796 Speaker 1: makes Claude, the large language model. And so I'm curious. 709 00:44:50,876 --> 00:44:54,436 Speaker 1: This is very interesting to me. Like, what did you 710 00:44:54,716 --> 00:44:57,196 Speaker 1: learn there? Like, what do you know now that you 711 00:44:57,356 --> 00:45:00,716 Speaker 1: didn't know before you were there? 712 00:45:01,316 --> 00:45:04,956 Speaker 2: The United States is unquestionably right now ahead, at least 713 00:45:04,956 --> 00:45:08,596 Speaker 2: by a little bit, in artificial intelligence. So it's a 714 00:45:08,636 --> 00:45:11,916 Speaker 2: technology that we're better at than anybody else.
And so 715 00:45:11,996 --> 00:45:14,996 Speaker 2: I think that the sort of sixty four million dollar 716 00:45:15,076 --> 00:45:18,236 Speaker 2: question in terms of military strategy for the United States 717 00:45:18,356 --> 00:45:22,356 Speaker 2: is whether or not we can harness AI in designing 718 00:45:22,396 --> 00:45:25,676 Speaker 2: a new military, because right now that is the one 719 00:45:25,716 --> 00:45:28,916 Speaker 2: thing that will give us an unquestioned advantage on the battlefield. 720 00:45:30,156 --> 00:45:32,956 Speaker 1: I mean, more generally, how do you think about AI 721 00:45:33,116 --> 00:45:35,036 Speaker 1: in the future of war? I mean, what does it 722 00:45:35,116 --> 00:45:39,876 Speaker 1: mean for the equilibrium? And what does it mean for 723 00:45:39,916 --> 00:45:44,116 Speaker 1: the sort of relative abilities of big countries and small 724 00:45:44,116 --> 00:45:48,596 Speaker 1: countries and, you know, rogue actors? Like, how 725 00:45:48,596 --> 00:45:51,316 Speaker 1: does it change the power dynamics? 726 00:45:51,796 --> 00:45:55,356 Speaker 2: Well, I'm really glad you used the word equilibrium, because 727 00:45:55,476 --> 00:46:00,836 Speaker 2: it's a core concept in strategy. There's a 728 00:46:00,876 --> 00:46:05,756 Speaker 2: whole literature on something called strategic stability, and it's a 729 00:46:05,796 --> 00:46:08,796 Speaker 2: literature that comes in part out of the Cold War. 730 00:46:09,636 --> 00:46:13,876 Speaker 2: It's a literature that has to reckon with the counterintuitive 731 00:46:13,916 --> 00:46:17,356 Speaker 2: finding that with the proliferation of nuclear weapons, in other words, the more 732 00:46:17,396 --> 00:46:22,316 Speaker 2: countries that became nuclear powers, the fewer incidents of conflict 733 00:46:22,356 --> 00:46:27,156 Speaker 2: between them there were. So nuclear weapons, an incredibly dangerous technology, 734 00:46:27,156 --> 00:46:30,396 Speaker 2: in a strange way actually made the world safer. 735 00:46:30,796 --> 00:46:32,116 Speaker 1: So far, so far. 736 00:46:32,196 --> 00:46:34,996 Speaker 2: Now, would this hold true if everybody had a nuclear weapon? 737 00:46:35,396 --> 00:46:35,996 Speaker 2: Probably not. 738 00:46:36,276 --> 00:46:39,076 Speaker 1: Right. And there's also a tail risk where it's like, 739 00:46:39,556 --> 00:46:43,036 Speaker 1: maybe this was just the lucky, whatever, eighty years. But 740 00:46:43,156 --> 00:46:44,996 Speaker 1: go on. Yeah, so far, so far. 741 00:46:45,116 --> 00:46:49,316 Speaker 2: So I think that the question is, what will AI 742 00:46:49,596 --> 00:46:55,876 Speaker 2: do to strategic stability? Will it actually enhance it, in 743 00:46:55,916 --> 00:46:58,996 Speaker 2: the following sense: AI is clearly going to make 744 00:46:59,356 --> 00:47:04,036 Speaker 2: warfare much more destructive than it is today, and more 745 00:47:04,076 --> 00:47:07,436 Speaker 2: destructive systems will raise the cost of going to war. 746 00:47:07,956 --> 00:47:12,196 Speaker 2: And so if multiple countries have really advanced AI, even 747 00:47:12,236 --> 00:47:13,916 Speaker 2: if one country is a little bit ahead of the other, 748 00:47:14,516 --> 00:47:17,476 Speaker 2: that won't change the fact that for either one, 749 00:47:17,516 --> 00:47:22,196 Speaker 2: starting a war will be massively costly.
So it 750 00:47:22,276 --> 00:47:27,476 Speaker 2: could be that AI, as a very diffuse technology, actually 751 00:47:27,516 --> 00:47:32,516 Speaker 2: increases deterrence, in effect creating a higher bar to 752 00:47:32,556 --> 00:47:36,956 Speaker 2: starting a war. The optimistic scenario, the truly optimistic scenario, 753 00:47:37,036 --> 00:47:39,476 Speaker 2: is that not only does the diffusion of AI 754 00:47:39,556 --> 00:47:45,476 Speaker 2: technology make great power war less likely, but, because there 755 00:47:45,516 --> 00:47:47,556 Speaker 2: are going to be some bad actors that get a 756 00:47:47,556 --> 00:47:51,596 Speaker 2: hold of this stuff and go cause a ruckus, that's 757 00:47:51,636 --> 00:47:55,036 Speaker 2: actually going to increase the interest of nations like the 758 00:47:55,156 --> 00:47:59,676 Speaker 2: US and China in working together to ensure they retain their 759 00:47:59,676 --> 00:48:02,356 Speaker 2: own sovereignty and monopoly on the means of violence to keep 760 00:48:02,356 --> 00:48:05,676 Speaker 2: their citizens safe. So it could actually not only decrease 761 00:48:05,756 --> 00:48:07,996 Speaker 2: the likelihood of great power war, it could increase 762 00:48:08,316 --> 00:48:11,116 Speaker 2: international cooperation, because we desperately need it if we're 763 00:48:11,116 --> 00:48:14,316 Speaker 2: all going to stay safe. Now, there is, of course, 764 00:48:14,516 --> 00:48:18,916 Speaker 2: a darker alternative. One flavor of that darker alternative is 765 00:48:18,956 --> 00:48:22,116 Speaker 2: that there is a country that believes they have a 766 00:48:22,196 --> 00:48:27,396 Speaker 2: breakout, you know, AI, and that changes 767 00:48:27,396 --> 00:48:30,236 Speaker 2: for them the calculus of wanting to go to war, 768 00:48:30,236 --> 00:48:32,556 Speaker 2: because they think that they'll just be able to eke 769 00:48:32,596 --> 00:48:36,236 Speaker 2: out enough advantage that they'll launch an attack. Also, 770 00:48:36,436 --> 00:48:38,596 Speaker 2: I mean, there's another danger too. We have to remember that, 771 00:48:38,676 --> 00:48:43,716 Speaker 2: you know, Vietnam started in part because the US Navy 772 00:48:43,756 --> 00:48:46,476 Speaker 2: thought its destroyers were attacked in the Gulf of Tonkin, 773 00:48:46,756 --> 00:48:49,436 Speaker 2: leading the, you know, US Congress to pass the Gulf 774 00:48:49,476 --> 00:48:53,156 Speaker 2: of Tonkin Resolution, which later led to the deployment of 775 00:48:53,156 --> 00:48:55,916 Speaker 2: troops in Vietnam. Well, turns out the attack never happened. 776 00:48:55,956 --> 00:48:59,836 Speaker 2: It was just mass confusion. And so AI introduces 777 00:48:59,956 --> 00:49:02,356 Speaker 2: a lot more uncertainty into the battlefield. That is 778 00:49:02,396 --> 00:49:04,996 Speaker 2: a risk that is worth noting. And then the 779 00:49:05,316 --> 00:49:09,516 Speaker 2: truly dark scenario is that AI allows small groups 780 00:49:10,196 --> 00:49:13,396 Speaker 2: to project lethality in a way that's very hard to control. 781 00:49:13,716 --> 00:49:15,596 Speaker 1: What do you mean when you say that?
782 00:49:15,676 --> 00:49:18,076 Speaker 2: It means that you could go buy or make a 783 00:49:18,116 --> 00:49:21,076 Speaker 2: weapon for low cost that there 784 00:49:21,156 --> 00:49:25,076 Speaker 2: is no effective defense for, so that you could go 785 00:49:25,236 --> 00:49:28,476 Speaker 2: toe to toe, you know, with the most advanced 786 00:49:28,476 --> 00:49:31,236 Speaker 2: police force or homeland security force or military and still 787 00:49:31,276 --> 00:49:33,996 Speaker 2: cause a bunch of damage. So that is indeed a 788 00:49:34,236 --> 00:49:37,356 Speaker 2: very legitimate worry. It always has been. AI would just 789 00:49:37,396 --> 00:49:41,596 Speaker 2: be the latest technological innovation to land in the hands 790 00:49:41,596 --> 00:49:44,716 Speaker 2: of bad actors that are looking to do bad things. 791 00:49:48,196 --> 00:50:01,876 Speaker 1: We'll be back in a minute with the lightning round. Okay, 792 00:50:03,676 --> 00:50:05,796 Speaker 1: let's finish with the lightning round, which is going to 793 00:50:05,876 --> 00:50:11,236 Speaker 1: be a little bit lighter than 794 00:50:11,276 --> 00:50:15,476 Speaker 1: the rest of this conversation. I read in a bio 795 00:50:15,556 --> 00:50:19,876 Speaker 1: of yours online that you have backpacked in over thirty countries, 796 00:50:19,876 --> 00:50:25,116 Speaker 1: including a trek from Moscow to Singapore overland, and I'm 797 00:50:25,116 --> 00:50:30,876 Speaker 1: curious about that trip in particular. What was one really 798 00:50:30,916 --> 00:50:32,276 Speaker 1: bad moment on that trip? 799 00:50:34,476 --> 00:50:39,356 Speaker 2: I'll go with food poisoning in eastern Siberia that caused 800 00:50:39,396 --> 00:50:43,116 Speaker 2: me to hallucinate. At the time, I was staying in 801 00:50:43,156 --> 00:50:47,636 Speaker 2: a Russian farmhouse, up a ladder in a loft, so 802 00:50:47,756 --> 00:50:51,836 Speaker 2: I had to, while I was hallucinating, climb down to 803 00:50:52,956 --> 00:50:56,716 Speaker 2: use the bathroom every few minutes late one night. 804 00:50:56,756 --> 00:50:58,996 Speaker 2: But I will say it was an incredible trip. 805 00:50:59,036 --> 00:51:01,556 Speaker 1: Wait, I want to get to the incredible trip part. 806 00:51:01,636 --> 00:51:04,356 Speaker 1: But what was the nature of the hallucination? 807 00:51:07,116 --> 00:51:12,036 Speaker 2: It was like profoundly lifelike dreams, and I recognized them at 808 00:51:12,036 --> 00:51:15,436 Speaker 2: the time for, you know, what they were, and it 809 00:51:15,516 --> 00:51:17,996 Speaker 2: was just bizarre to be, you know, on this rural 810 00:51:18,036 --> 00:51:22,036 Speaker 2: farm in eastern Siberia, hallucinating and trying to climb down 811 00:51:22,036 --> 00:51:26,556 Speaker 2: a ladder so I don't, you know, crap where I'm sleeping. 812 00:51:27,236 --> 00:51:30,356 Speaker 1: Yes, yes. Okay, you were going to say how great 813 00:51:30,356 --> 00:51:31,676 Speaker 1: the trip was. Tell me that part.
814 00:51:31,836 --> 00:51:36,276 Speaker 2: It was just absolutely astonishing to go from Moscow ultimately 815 00:51:36,276 --> 00:51:38,876 Speaker 2: all the way to Singapore overland, because we 816 00:51:38,956 --> 00:51:41,076 Speaker 2: got to watch out the window of a train moving 817 00:51:41,116 --> 00:51:44,916 Speaker 2: at about twenty miles an hour as Europe turned into Asia, 818 00:51:45,716 --> 00:51:49,556 Speaker 2: and we brought history books along and read them, and 819 00:51:49,636 --> 00:51:53,316 Speaker 2: it was an incredible education, and I'm so glad we 820 00:51:53,356 --> 00:51:55,876 Speaker 2: took that trip. It certainly shaped how I see the 821 00:51:55,876 --> 00:51:58,036 Speaker 2: world, and I'll never forget it. 822 00:51:58,596 --> 00:52:03,116 Speaker 1: And is it right that you have, like, some super 823 00:52:03,316 --> 00:52:05,276 Speaker 1: top secret clearance? 824 00:52:05,716 --> 00:52:08,076 Speaker 2: Well, they haven't told me where the aliens are, so 825 00:52:08,316 --> 00:52:10,076 Speaker 2: I don't know how high a clearance I actually 826 00:52:10,076 --> 00:52:10,316 Speaker 2: have. 827 00:52:10,836 --> 00:52:14,956 Speaker 1: Cross that question off the list. I'm curious, what, like, 828 00:52:16,116 --> 00:52:17,836 Speaker 1: what do you have to do to get a, you know, 829 00:52:17,876 --> 00:52:19,236 Speaker 1: a really high level clearance? 830 00:52:20,596 --> 00:52:21,796 Speaker 2: Well, you just had to fill out a bunch of 831 00:52:21,796 --> 00:52:24,676 Speaker 2: paperwork and take a drug test. 832 00:52:25,316 --> 00:52:25,716 Speaker 1: Uh. 833 00:52:26,156 --> 00:52:28,596 Speaker 2: But what I will say is that, you know, holding 834 00:52:28,596 --> 00:52:32,196 Speaker 2: a security clearance, you know, is not all fun, right, 835 00:52:32,356 --> 00:52:34,756 Speaker 2: because you learn things about the world, and you 836 00:52:34,756 --> 00:52:38,156 Speaker 2: see things that you often can't forget. 837 00:52:39,196 --> 00:52:42,276 Speaker 2: So although I was really thrilled when I first was 838 00:52:42,316 --> 00:52:45,196 Speaker 2: granted a top secret clearance, there are days now that 839 00:52:45,236 --> 00:52:46,756 Speaker 2: I wish I never had one. 840 00:52:47,076 --> 00:52:54,596 Speaker 1: Huh. Because the world is scarier than you 841 00:52:54,596 --> 00:52:56,396 Speaker 1: would have thought? 842 00:52:58,036 --> 00:53:01,596 Speaker 2: The world is scarier and more 843 00:53:01,676 --> 00:53:05,836 Speaker 2: dangerous, and people sometimes do really 844 00:53:06,036 --> 00:53:06,756 Speaker 2: awful things. 845 00:53:07,196 --> 00:53:11,956 Speaker 1: Yes. Anything else you want to talk about? 846 00:53:11,756 --> 00:53:14,676 Speaker 2: I would just say to everybody that's listening that 847 00:53:14,796 --> 00:53:19,356 Speaker 2: now is an incredible time to be watching the world, 848 00:53:19,476 --> 00:53:24,076 Speaker 2: because we really are seeing historic things happen almost every day. 849 00:53:24,196 --> 00:53:26,316 Speaker 2: And anybody that picks up a news story about 850 00:53:26,396 --> 00:53:30,636 Speaker 2: Ukraine or about China or about the Middle East, it's 851 00:53:30,676 --> 00:53:33,476 Speaker 2: sort of like reading the papers,
I suppose, you know, 852 00:53:33,516 --> 00:53:35,396 Speaker 2: in the run up to the First World War or the 853 00:53:35,716 --> 00:53:38,756 Speaker 2: Second World War. You just see things that are so 854 00:53:38,916 --> 00:53:41,996 Speaker 2: striking and that tell you so much about where the 855 00:53:42,596 --> 00:53:46,436 Speaker 2: world is going that it's 856 00:53:46,436 --> 00:53:47,276 Speaker 2: worth paying attention. 857 00:53:48,796 --> 00:53:50,676 Speaker 1: I got nervous when you said run up to the 858 00:53:50,676 --> 00:53:52,316 Speaker 1: First World War or Second World War. 859 00:53:53,516 --> 00:53:58,716 Speaker 2: Well, I say that quite specifically, in the sense 860 00:53:58,796 --> 00:54:02,996 Speaker 2: that, you know, warfare is changing remarkably fast, and we've 861 00:54:03,036 --> 00:54:08,276 Speaker 2: seen a succession of advanced militaries fail: the Russian military; 862 00:54:08,876 --> 00:54:13,476 Speaker 2: the Israeli military demonstrably failed during the attacks that Hamas 863 00:54:13,476 --> 00:54:17,116 Speaker 2: waged on October seventh. And I think it would be 864 00:54:18,116 --> 00:54:21,356 Speaker 2: hubris to imagine that our military, the American military, couldn't 865 00:54:21,396 --> 00:54:25,076 Speaker 2: fail in a similar way. So that's why I think 866 00:54:25,116 --> 00:54:29,316 Speaker 2: it's really important to pay attention to what's happening in 867 00:54:29,436 --> 00:54:32,956 Speaker 2: conflict around the world, and to realize that we as 868 00:54:32,996 --> 00:54:35,676 Speaker 2: Americans now need to pay a lot more attention, because 869 00:54:35,716 --> 00:54:38,396 Speaker 2: we no longer have the margin of error that we're 870 00:54:38,396 --> 00:54:41,316 Speaker 2: accustomed to in being able to project military power. 871 00:54:48,596 --> 00:54:51,956 Speaker 1: Christopher Kirkoff is an advisor at the Defense Innovation Unit. 872 00:54:53,116 --> 00:54:56,396 Speaker 1: Today's show was produced by Gabriel Hunter Cheng. It was 873 00:54:56,636 --> 00:55:00,116 Speaker 1: edited by Lyddy Jane Kott and engineered by Sarah Brugier. 874 00:55:00,556 --> 00:55:03,876 Speaker 1: You can email us at problem at Pushkin dot FM. 875 00:55:04,316 --> 00:55:06,676 Speaker 1: I'm Jacob Goldstein and we'll be back next week with 876 00:55:06,756 --> 00:55:11,956 Speaker 1: another episode of What's Your Problem?