Oz Woloshyn: Welcome to Tech Stuff. I'm Oz Woloshyn, here with Kara Price. Hi, Kara.

Kara Price: Hi, Oz.

Oz Woloshyn: I'm excited to share today's interview with Dexter Filkins, who's a longtime war correspondent and now a reporter at large for The New Yorker. He recently wrote a piece to answer a very big question: is the US ready for the next war? In search of an answer, Dexter traveled all over the world, from Israel to Taiwan to, of course, Ukraine, where he visited a drone factory. Here's Dexter on how Ukraine became ground zero for drone warfare.

Dexter Filkins: The Ukrainians, on their own, really, they have invented basically a whole new way of warfare. They build really cheap drones that are really, really accurate, and they build them by the thousands and the thousands and the thousands. So last year, twenty twenty-four, the Ukrainians built and deployed two million drones on the battlefield. The moment the Russians start to move in any kind of numbers at all, they walk right into a wall of drones.

Kara Price: Yeah, I've actually seen videos of these drones. It looks like a swarm.

Oz Woloshyn: It really does, and there are these insane stories about how Ukrainians are protecting themselves from drones. In fact, old fishing nets from the south of France are being shipped to Ukraine and kind of put over roads to basically prevent the suicide drones from hitting people and vehicles on the roads. There's a great story recently in The Guardian about how proud the fishermen of Provence are to be helping Ukrainians out with their old nets. There's also another detail in the story which I found just absolutely mind-blowing, which is that drone operators in Ukraine are given points for kills and for destruction of tanks and other milestones, essentially, and the units that get the most points get more drones. It's basically a video game mechanic for distributing weapons of war. But the story is not just about drones. It's also about how the wider future of warfare is being driven by technology.
Dexter Filkins: The most important thing is not the jet or the bomber, the aircraft carrier that costs billions and billions of dollars. The most important component is software.

Oz Woloshyn: On the one hand, it's inside baseball, but on the other hand, it's absolutely fascinating. The whole paradigm of arms manufacturing is being turned on its head. In the past, the government would decide what it wanted and then put in an order. Now the private sector is taking a build-it-and-they-will-come approach. And this simple fact explains the explosion of venture capital funding into defense tech that we talk about quite often on the show.

Kara Price: Yeah, you mentioned Ukraine, but you know another place my mind goes to in terms of the future of warfare is obviously Israel.

Oz Woloshyn: Yeah, that's right. Dexter talks about that, about how the IDF is using AI to identify and locate people with potential links to Hamas based on a number of factors.

Dexter Filkins: Based on their telephone calls, based on their location, based on the buildings they go into. And there's just enormous amounts of data, so much data, that's all being crunched into these big computers that then essentially spit out target recommendations.

Kara Price: And what did the IDF do with this information?

Oz Woloshyn: They use it to decide who to kill. Dexter was told that after the AI analyzes the data, it makes a recommendation on who to target, and then a team of humans in the IDF tries to verify whether or not these targets are really connected to Hamas, and whether or not to make the kill decision. Dexter reports, however, that this IDF team would get about one hundred recommendations from the AI per day, and that sometimes the latency between the AI recommendation and the kill decision is less than a minute.

Kara Price: You know, at that point, is there even, like, a human in the loop?

Oz Woloshyn: I mean, technically, legally, yes. But practically, at less than a minute? Not really. So we talk about that, and we also talk about Palmer Luckey.
He's the founder of Anduril, Silicon Valley's hottest, most darling defense tech unicorn. And we talk about how all these different places in the world come together: how the battlefield of Ukraine and the war in Gaza are ultimately affecting how the US and China think about a potential conflict in Taiwan.

Dexter Filkins: It's fair to say it would be catastrophic for the United States and for the kind of the interests of the Western world.

Oz Woloshyn: Before we get there, Dexter and I started off by talking about his visit to a Ukrainian drone factory.

Dexter Filkins: So, you know, I was blindfolded. I was taken to this factory. They took the blindfold off. I was inside this, you know, it was like a small, small, like, warehouse. There were a couple hundred people in there, most of them women, and they were making drones. But the drone, it's just a little thing. It's nothing. It's a carbon fiber frame. It's like a foot and a half by a foot and a half. It's just a square. It's got little propellers on it. I mean, it's nothing. And then it's got a bomb rack and then a camera, so that the person who's driving the drone can kind of see where he's going. That's it. It costs five hundred dollars, and so with not a lot of money, they can deploy thousands of these things. And that's what they're doing. And I should say the Russians are doing the same thing. And so it's this incredible kind of dogfight that's happening in eastern Ukraine. But the technology for now appears to favor the defense. And so it's largely because of drones and drone swarms and the tactics Ukrainians have developed that they've been able to hold the Russians off.

Oz Woloshyn: You talk about how it largely favors the defense, but there's this extraordinary David and Goliath moment where some of the cheapest technology of warfare knocked out some of the most expensive, thousands of miles behind the front lines.

Dexter Filkins: That's right, that's right.
And so, for instance, the guy that I spoke to, he was, like, running a trucking company when the war started, and he's like, well, you know, the Prime Minister called and they said, hey, can you make drones? And that was three years ago. He'd never made a drone in his life. And so this was a government project that I saw, with private companies, and they've developed these anti-ship drones. They're like these little toy boats. They're, like, six feet long, no people, little cameras. Some of them are just loaded with explosives. Others had, like, kind of torpedoes on them. But they have chased the entire Russian Black Sea fleet, some of it out of the Black Sea and the rest of it all the way to the eastern side of the Black Sea, where they've basically reopened what the Russians had blockaded, which is the sea lanes that allow Ukraine to export grain, which is, you know, lifeblood. It's all hard currency. That's the kind of thing that's happening here, and it's an amazing thing to see up close.

Oz Woloshyn: The other moment in the piece that really caught my imagination was the fact that the drone operators have a kind of internal economy, and the more points they accrue, the more drones they can get for their unit. I mean, literally like a video game.

Dexter Filkins: It does feel like a video game. The drones are flown to their destination by a guy on a joystick, you know, manning a joystick. They have a screen, they have a laptop. Often they're not sure what their destination is. They're looking for targets, or, more typically, they'll have a surveillance drone finding targets and then directing the other drones to the target. So: tank spotted, go here. That's essentially how the game works. There's drone teams all over the front line, everywhere, and the best drone team, the team with the most kills, gets more. You get points, you know: you get a number of points when you blow up a tank and you can prove it. You get a number of points when you kill a Russian soldier, when you kill a missile battery, and those points add up. So the team with the most points, they don't get a prize, you know, or they don't get a cash award. They get more drones. And so you can see how well that works. The really killer teams, and there's some just utterly lethal teams there, they were just getting as many drones as they could handle.
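To make the points mechanic Dexter describes concrete, here is a minimal sketch of how such a ledger could drive allocation. The target categories, point values, team names, and proportional-split rule are all illustrative assumptions; the interview does not give the real Ukrainian system's numbers.

```python
# Minimal sketch of the points-for-drones economy described above.
# Categories, point values, and the proportional split are assumptions.

POINTS = {"tank": 40, "soldier": 6, "missile_battery": 50}  # assumed values

def score_team(confirmed_kills: dict[str, int]) -> int:
    """Sum a team's points from verified kills, e.g. {'tank': 2, 'soldier': 5}."""
    return sum(POINTS[target] * n for target, n in confirmed_kills.items())

def allocate_drones(teams: dict[str, dict[str, int]], stock: int) -> dict[str, int]:
    """Split a fixed stock of new drones in proportion to each team's points."""
    scores = {name: score_team(kills) for name, kills in teams.items()}
    total = sum(scores.values()) or 1  # avoid division by zero
    return {name: stock * s // total for name, s in scores.items()}

teams = {"unit_a": {"tank": 3, "soldier": 12}, "unit_b": {"missile_battery": 1}}
print(allocate_drones(teams, stock=1000))  # the high scorers get more drones
```

The design choice the mechanic exploits is the feedback loop: allocating the scarce resource in proportion to verified results concentrates drones with the most effective crews.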
Oz Woloshyn: One of the sort of iconic images I've seen of the Ukrainian battlefield is this crisscrossed fiber-optic cable.

Dexter Filkins: Yeah, it's crazy. But the biggest challenge to be overcome, if you're trying to run a drone war, is electronic jamming. And so you're sitting there with your joystick and you're flying your drone around, and you're looking for a tank, or you're looking for a platoon, somebody to crash your drone into. The Russians are really, really good at jamming the radio signals, and so that renders the Ukrainian drones inoperable. When I picked up the five-hundred-dollar drone in the factory in western Ukraine, it had a kind of frequency hop. So it goes up on one frequency, and the guy with the joystick is steering the drone, and then if the Russians jam that, he quickly bounces to another frequency. If they jam that, it bounces to another. That's where the long cables come in, and they're fiber-optic cables, so they're just, you know, super, super thin. You can't jam fiber-optic cable. And you can find this on the web, you can find video footage, but there's drone teams, like Ukrainian drone teams, racing down roads in eastern Ukraine that they've covered with netting, the whole road.
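Here is a minimal sketch of the frequency-hopping fallback Dexter describes: steer on one channel, bounce to the next when it is jammed, and treat the fiber-optic tether as the unjammable last resort. The channel list and the jamming test are hypothetical placeholders, not details from the piece.

```python
# Minimal sketch of frequency hopping with a fiber-optic fallback.
# Channel values and the jamming check are hypothetical placeholders.

CHANNELS_MHZ = [433, 868, 915, 2400, 5800]  # assumed control-link bands

def is_jammed(channel_mhz: int) -> bool:
    """Placeholder: a real link would watch the noise floor and packet loss."""
    return channel_mhz in {433, 868}  # pretend the first two bands are jammed

def pick_control_channel(channels: list[int]) -> int | None:
    """Hop through channels until one is clear; None means fall back to fiber."""
    for ch in channels:
        if not is_jammed(ch):
            return ch
    return None  # every radio band is jammed

ch = pick_control_channel(CHANNELS_MHZ)
print(f"steer on {ch} MHz" if ch else "all bands jammed: spool out the fiber")
```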
Oz Woloshyn: Of course, there's a big connection between jamming drones and the race towards autonomous weapons. I want to come back to Palmer Luckey and full autonomy, but in between is Israel and Gaza. And some have suggested, or questioned, that Israel is already letting AI make kill decisions on the battlefield in Gaza. I think in your piece, the answer is: AI is identifying targets, and sometimes the gap between identifying the target and making the kill decision is extremely short in terms of time, but nonetheless there is always a human in the loop. I mean, what's the difference between a killer robot and a robot telling a human who to kill?

Dexter Filkins: I mean, that's the heart of the matter. So in Israel, you get a recommendation from the AI, the mother computer, and it's basically a coordinate. It doesn't tell you who it is, because it doesn't necessarily know who it is, and I think typically it gives you a kind of percentage of certainty. So: if you fire at this coordinate on the map, the chances are eighty percent you're going to kill a bad guy, or seventy percent, or fifty percent. And that's essentially what they're acting on. So what the Israelis say is that it's always the human that kind of pulls the trigger. What we do as the targeting officers is try to verify the target recommendation from the computer independently. But my great impression from talking to them was that perfection is kind of not really possible in wartime. And so what you have is, like, in the heat of battle, when that little waypoint, the coordinate on the map, is moving, and you just have a few seconds to decide whether or not to kill it, how much independent verification can you do? Do you really have time to send another drone over it to look at it, or do you just kill it? That's the really hard part right there. That's what's happening in the room where they're making these targeting decisions.

Oz Woloshyn: Do the algorithms have a built-in calculation of collateral damage?
Is that information that the officers have access to when they're making these decisions, or is that not even a consideration?

Dexter Filkins: Now, this is where it gets really, really, I think, morally tricky. As I understood it from my reporting, the computer will say: fire here, here's your coordinate on the map, you have an eighty percent certainty, but you might kill eight civilians. And that has to be included in the decision that you're making. You know, if I were a newspaper reporter, I would have just written a separate story on this fact alone. I heard this from an IDF targeting officer. He said, we were allowed, in the beginning phases of the war, and I can't remember the exact number, but it was very, very high, we were allowed to kill a target that was recommended by the computer if we believed that there was a possibility that we were going to kill as many as twenty civilians. Now, you know, hit pause there, because that's pretty extraordinary. You're basically saying, I'm going to kill the one bad guy who's standing in a crowded market, and I think we might kill twenty people, but we're going to do it. And that's a human decision. I mean, that is in fact what we're seeing in Gaza. If you want to know why and how there's so much destruction in Gaza and so many civilians dead, that's the reason. It's the rules-of-engagement decision. But they've since dialed it way, way back. And I had a conversation with a senior American civilian in the Pentagon under Biden, who had been involved in a lot of the decisions in the Gaza war, helping the Israelis, and he said: if we had ever, in the course of, say, the Iraq War or the Afghan War, thought that there was a chance we were going to kill twenty civilians, a decision like that would go to the President, or it would go to the Secretary of Defense. He said, the Israelis are doing this every day, sometimes more than once. So that gives you a sense of how, I think, extraordinarily loose the rules of engagement have been for the Israelis. And I think it's not as much a technological kind of decision as it is a human decision, I think. Yeah.
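As a way of seeing the structure of what Dexter is describing, here is a minimal sketch of a recommendation being screened against a rules-of-engagement cap before a human decides. The field names, the eighty percent floor, and the cap of twenty are illustrative numbers taken loosely from the interview; this is emphatically not the IDF's actual software.

```python
# Minimal sketch of the reported decision flow: a coordinate plus a
# certainty plus a collateral estimate, screened against an ROE cap.
# All names and thresholds are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Recommendation:
    lat: float
    lon: float
    certainty: float           # 0.8 == "eighty percent it's a bad guy"
    est_civilian_deaths: int   # collateral estimate attached to the strike

def passes_roe(rec: Recommendation, min_certainty: float = 0.8,
               civilian_cap: int = 20) -> bool:
    """Screen a recommendation; a human still makes the final call."""
    return rec.certainty >= min_certainty and rec.est_civilian_deaths <= civilian_cap

rec = Recommendation(lat=31.50, lon=34.45, certainty=0.8, est_civilian_deaths=8)
print(passes_roe(rec))                  # True under the loose early-war cap
print(passes_roe(rec, civilian_cap=1))  # False once the cap is dialed way back
```

The point of the sketch is that the morally decisive variable is not the model but the `civilian_cap` parameter, which is a human policy choice.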
Oz Woloshyn: I mean, the cliché about, you know, technology is that these are tools and we choose how to use them, right? I mean, a soldier sitting in a bunker making kill decisions with an eighty percent probability it's a quote-unquote bad guy, and, you know, four civilians or six civilians or eight civilians being killed. I mean, the technological element kind of creates this remove from the consequences of a decision, almost.

Dexter Filkins: Yeah, you can't hear the screams. But, you know, you mentioned the Obama administration: when they had the drone war in Pakistan, they made a lot of decisions like this, and they're very similar decisions. I had lots of conversations with people during the Obama administration about what they were trying to do in Pakistan, and they were trying to kill al-Qaida guys in Pakistan, but it was kind of the same. They'd have a drone in the air watching a house, and they'd say, well, we don't really know who's in the house, but there's guys in a pickup truck, and they come and they all have guns, and they're going in and they're going out. And the guys with guns, like, when they left the house, they went to another place, they went to a military camp. We don't know who they are, but we think they're probably bad guys. Are there civilians in the house? We don't know the answer to that. But I think it's pretty well documented that some of those guys were traumatized by what happened, because, you know, they're killing people. It's totally bizarre. You've got a joystick and you're looking at a video screen, and then you blow up a house that's got fifteen people in it, and then you get in your car and, like, you're gone, and you have dinner with your family.
It's like you're removed from those consequences, but, like, you can see how it's still, I think, haunting a lot of the people that have been involved.

Oz Woloshyn: After the break: could technology be the spark that creates World War Three? Stay with us.

Let's talk about full autonomy, i.e., killer robots making their own targeting and kill decisions. Is there a technological barrier to arriving there, or is it a rules-of-engagement barrier? And if the latter, how long do you see that holding for?

Dexter Filkins: Well, that's a good question. I think ultimately, as it was explained to me by Palmer Luckey and others: we can do this, we can, and we are preparing to go fully autonomous. That's what we are making these things to do, because the technology is taking us there. One of the things he said to me, and we were standing in the showroom in Anduril's office, so we're surrounded by the drones and the pilotless airplanes and the anti-drone systems and all the stuff that they sell, and he said: every vehicle in here, imagine we're cut off from communication, from controlling our own weapons. What it means is that these weapons have their own brains, and they will have to go and find their targets on their own. And you can see all the problems that could potentially arise from that, because you're really talking about kind of autonomous warfare. And so I had a conversation with Brian Schimpf, I think he's now the president of Anduril, and Brian sketched out the following scenario, which I thought was really interesting. He said: look, just imagine a hundred-mile square in the Pacific Ocean. Imagine somewhere inside of there, in that vast stretch of ocean, there is a fleet of Chinese ships. What Anduril weapons are being designed to do, and what they will be able to do, is: we will fire our missiles at those ships, and they will, independently of us, with no human control, find the ships, target the ships, communicate with each other and decide who's going to take which ones, and then go destroy them. But ultimately those are human decisions. And so in the Pentagon there's a whole executive order that Biden put out, and then I think Trump slightly revised, that basically kind of allows for full autonomy, as I understand it. Palmer Luckey was really clear about this, which was: this is where it's going. Full-on killer robots, like in a science fiction movie. These things are fighting each other and they're killing people, and, like, the Chinese are gonna have theirs and we're gonna have ours, and the humans are going to be basically taking a ringside seat.
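The coordination problem Schimpf sketches, several autonomous weapons discovering targets and agreeing among themselves on who takes which, can be made concrete with a toy assignment scheme. This greedy nearest-pairing is my illustration of one simple approach, not Anduril's actual algorithm; all positions and names are invented.

```python
# Minimal sketch of swarm target assignment: repeatedly pair the closest
# remaining (missile, ship). Purely illustrative; not a real system.

import math

def dist(a: tuple[float, float], b: tuple[float, float]) -> float:
    return math.hypot(a[0] - b[0], a[1] - b[1])

def assign_targets(missiles: dict[str, tuple[float, float]],
                   ships: dict[str, tuple[float, float]]) -> dict[str, str]:
    """Greedy pairing: closest missile-ship pair first, then repeat."""
    pairs: dict[str, str] = {}
    free_m, free_s = dict(missiles), dict(ships)
    while free_m and free_s:
        m, s = min(((m, s) for m in free_m for s in free_s),
                   key=lambda ms: dist(free_m[ms[0]], free_s[ms[1]]))
        pairs[m] = s
        del free_m[m], free_s[s]
    return pairs

missiles = {"m1": (0.0, 0.0), "m2": (10.0, 0.0)}
ships = {"destroyer": (9.0, 1.0), "frigate": (1.0, 2.0)}
print(assign_targets(missiles, ships))  # {'m2': 'destroyer', 'm1': 'frigate'}
```

In a real swarm the same decision would have to be reached by the vehicles negotiating over a radio mesh, with no central computer, which is exactly what makes the "no human control" scenario hard.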
Oz Woloshyn: What was your sense of Palmer Luckey? What's the scale of his ambition, and where is he on achieving it?

Dexter Filkins: Palmer Luckey is a remarkable guy, and he's deceptively modest about the whole thing. Like, when you meet him, he's just, you know, he's like a tech guy. He's young, he's wearing flip-flops, he's got, like, a flowered shirt on, he's got shorts on. When I met him, he was eating from a bag. And he's revolutionizing the way that America is approaching its wars, and approaching the wars of the future. And so the system that we've inherited, and the system that we have now, is: hundreds of people in the Pentagon will decide what they need. And so they'll say, you know, we need a jet fighter, and it has to do all these things.
It has to fly at supersonic speeds, and it needs to be able to dive at this particular angle, and it has to have a range, and one thing after another. And this takes literally years, for the Pentagon to devise a new weapon, whether it's an F-35 fighter jet or a new Navy ship or a submarine. It takes them years. And so the paradigm that I think is finally being broken is that, you know, the tech world moves very, very rapidly. I'm talking about our tech world in California. They move at super high speed, because if you don't move at high speed, you go out of business. The Pentagon moves very slowly, and these two worlds never really came together. That's where Palmer Luckey comes in. So Palmer Luckey has figured out a way to bring these two worlds together, and he did that by standing the Pentagon procurement process entirely on its head. And so instead of, you know, hundreds of people taking years and years to design a weapons system, Palmer went out there and just built the most advanced anti-drone system and the most advanced pilotless airplane. He just built the most sophisticated one that he could, and he went to the Pentagon and said: this is for sale. We can make these, and we can make a ton of them really fast, really cheaply. Take it or leave it, here it is. And so instead of an F-35 fighter jet that costs two hundred million dollars, you have a drone that costs, like, fifty thousand dollars. And what Palmer, again, has kind of figured out is that it's all about the software. So they just have these kind of very cheap shells, you know, i.e., the drone itself, the missile carrying its explosives, but the only thing that really matters is the software, so they can constantly make it better. The drone missed its target? We'll fix the software. We'll tinker with the software. How do we make it more accurate? How do we make it better? The Pentagon says it wants us to do something differently? We'll just go in and tinker with the software. And they can do that really, really cheaply, for nothing. And that's how Palmer Luckey stood the whole Pentagon procurement process on its head.
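The "cheap shell, smart software" idea Dexter describes can be sketched as a fixed airframe whose guidance logic is hot-swapped in the field. The class, the Guidance type, and the toy update are illustrative assumptions, not how Anduril's stack actually works.

```python
# Minimal sketch: the hardware stays fixed, the guidance software is the
# upgradeable part. Types and update mechanism are illustrative only.

from typing import Callable

Vec = tuple[float, float]
Guidance = Callable[[Vec, Vec], Vec]  # (position, target) -> next position

class CheapDrone:
    def __init__(self, guidance: Guidance):
        self.guidance = guidance  # the only part that really "matters"

    def load_update(self, new_guidance: Guidance) -> None:
        """Field update: fix a miss by shipping new software, not new hardware."""
        self.guidance = new_guidance

    def step(self, pos: Vec, target: Vec) -> Vec:
        return self.guidance(pos, target)

def guidance_v1(pos: Vec, target: Vec) -> Vec:   # crude: step halfway there
    return ((pos[0] + target[0]) / 2, (pos[1] + target[1]) / 2)

def guidance_v2(pos: Vec, target: Vec) -> Vec:   # "patched": step 90% of the way
    return (pos[0] + 0.9 * (target[0] - pos[0]),
            pos[1] + 0.9 * (target[1] - pos[1]))

drone = CheapDrone(guidance_v1)
drone.load_update(guidance_v2)  # the drone missed? tinker with the software
print(drone.step((0.0, 0.0), (10.0, 4.0)))  # (9.0, 3.6)
```

That inversion, iterating on software instead of re-procuring hardware, is the whole cost argument: the fifty-thousand-dollar shell stays, the update is nearly free.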
Oz Woloshyn: I mean, both China and the US have looked extremely hard at what's been happening in Ukraine. How are the two countries responding from a technology point of view? And where does Taiwan sit in all of this?

Dexter Filkins: Well, Taiwan is right in the middle, between China and the United States. You know, you could, like, look at a map of the Pacific and go, like, who cares, you know, it's just this big island a hundred miles off the coast of China. Who cares? But, like, it's a democracy. It's an ally, a very close ally. And it's also, something like ninety-five percent of the most sophisticated microchips in the world are made in Taiwan, and made nowhere else. The big gorilla in chip manufacturing, the most sophisticated, the best, is TSMC in Taiwan. And I remember more than a couple of Taiwanese kind of jokingly said to me, you know, if the Chinese invade, the safest place in the whole country is going to be in the basement of the TSMC building, something like that. But to answer your question, you've got to stand way back. America, on one side, has basically dominated the Western Pacific, and those are the most important sea lanes in the world. Those sea lanes are absolutely crucial to the world economy in every conceivable way. And the Chinese are undergoing a rapid, massive military buildup of their navy, their air force, everything. They're challenging that. They want to push the United States out of the Western Pacific. You can see the dilemma that the Americans have. The US Navy, Air Force, everything, the entire armed forces, are built on this principle, which is completely antiquated now, which is: we have very few, but very expensive and very sophisticated, weapons platforms.
I mean, just the other day in the Red Sea, an aircraft carrier had to make a sharp turn because they were fighting against the Houthis, you know, and two F-35s slid off the deck and went to the bottom of the ocean. There's two hundred million dollars, yeah, in, like, five minutes. So the Chinese, I think to their credit, really, they have designed their entire armed forces to defeat the United States in the Western Pacific. That is what they want to do. They want to push the United States out of the Western Pacific, and they have spotted and identified our vulnerabilities, you know, very precisely. So they have designed their armed forces to sink aircraft carriers and to take out very, very expensive American planes. They have designed their entire armed forces to do that. And so the United States has kind of suddenly, and way, way too late, woken up to that, and that's what's driving all this, what we're talking about. I mean, the Pentagon, it is kind of, I wouldn't describe it as a panic, but I think the smart people are panicked. You know, the smart people are pretty freaked out.

Oz Woloshyn: And they're actively shipping drones to Taiwan now?

Dexter Filkins: Yes, Anduril's sending drones to Taiwan. So if you take Ukraine as a template: Russia, they come in with these very expensive platforms, like tanks, like big radar installations that cost millions and millions of dollars, and the Ukrainians are taking them out with five-hundred-dollar drones. This is what the United States wants to do in Taiwan. I think Sam Paparo, who's the head of the Indo-Pacific Command, the American forces in the Western Pacific, has said: I want to turn Taiwan into a hellscape for the Chinese, and, he said this, to make them utterly miserable until we have enough time to get the rest of our military out there. What does that mean? It's super simple.
Flood Taiwan with drones. And so if the Chinese try to, say, invade the island, and there's, you know, a lot of evidence, a lot of evidence, to suggest that the Chinese are thinking very, very carefully about doing that, you flood the air, you flood the sea, you flood the beaches with unmanned vehicles to basically sink an invasion force. And that, in a nutshell, is kind of what the plan is. Now, you know, the Chinese, no doubt, are trying to counter it too, and I think, to me, the kind of most likely scenario is that the Chinese won't invade Taiwan, but they'll blockade Taiwan. They'll just cut it off, and they'll basically dare us to run the blockade. So that's a whole other ball of wax.

Oz Woloshyn: But to your point earlier about the advantage being with the defender in today's world of military technology: an all-out attack from China would make Taiwan the defender, and therefore trigger the advantages of all these very cheap unmanned aerial and water drones. But on the other hand, with a blockade, they essentially become the defender, and therefore the Americans become vulnerable if they try to run the blockade.

Dexter Filkins: Yeah, exactly. Everybody's spending a lot of time thinking about this. I went to a war game, actually. The Pentagon has these war games, most of which are classified, but I went to an unclassified war game at CSIS, the Center for Strategic and International Studies. It's a think tank in Washington, and it was fascinating, but also, like, pretty disturbing, the whole thing. And it was just a war game, but it was like, oh my god. The scenario was a blockade, it wasn't an invasion: the Chinese started stopping ships in the Strait of Taiwan. So there was a Chinese team and there was an American team, and everybody had essentially the weapons that each country is believed to have. But it was fantastically bloody.
I think it's safe to say tens of thousands of people died on each side. They stopped the game after, like, four go-rounds. I wouldn't say it was a draw, because, like, neither side had stopped, but they really, really bloodied each other. And I think that's the one thing we can be sure about: if the United States and China go to war, it would be an utter and complete catastrophe. And this was with no nuclear weapons used, because it had been sort of decided at the beginning of the war game: no nukes. Sometimes that prohibition is not in the war games that they have, and sometimes nuclear weapons have been used. The Chinese did not attack the American mainland, even though they are quite capable of doing that. The United States attacked the Chinese mainland, killed thousands of Chinese. And I think what kind of sobered everyone in the room, and the people in the room, you know, they were pros, what sobered everybody was how fast things escalated. You know, you started with this, like, kind of nothing event where the Chinese navy, like, stops a Taiwanese ship, and then it, like, stops an American one, and then there's some confusion, and then they shoot an American plane down, and then the Americans shoot a Chinese one down, and then it's World War Three, like, rapidly, very, very quickly. You could feel it in the room. It just took off. And so this is what happens in war. You know, wars get out of control, and in the war game that I went to, the war between the United States and China got out of control very quickly.

Oz Woloshyn: So I have to ask you: what is the future of warfare that you saw, and how afraid should we be?

Dexter Filkins: Well, it's all pretty sobering, you know. It's the old Latin phrase, si vis pacem, para bellum: if you want peace, prepare for war. And I think each side would say, in this case the United States and China, we're both preparing for war. Hopefully the two cancel out.
Everybody's really sober about it, everybody's really careful, and because the prospect of going to war is so terrifying to each country, that war will be avoided. That's the best-case scenario. That's the best we're going to do. If deterrence fails, that's what everybody's trying to avoid. And I think everybody knows this, including the Chinese. I mean, I think the Chinese have been very careful in Taiwan. You know, they've been increasing the pressure, increasing the pressure, but they're careful. Like, they haven't done anything rash, and I think it's for the reasons that we're talking about. Like, they've done war games too, and they know how ugly these things are and how unpredictable they are, how rapidly they can get out of control.

Oz Woloshyn: What's the biggest question for you?

Dexter Filkins: The kind of unanswered, unanswerable question is: can Anduril and the Palmer Luckeys of the United States deploy these weapons that we're talking about rapidly enough to deter the Chinese from taking Taiwan? Because I think if the Western Pacific were to kind of fall under Chinese domination, that would be a catastrophe too. And I think the best way to prevent that is for these weapons to get out there as fast as they can go, to deter the Chinese from doing that. And so that's really the question: can they get that stuff out there and ready quickly enough, before the Chinese are ready to go? Because the Chinese have been very, very clear, as clear as day: they want Taiwan. They've set twenty twenty-seven as the year that they want it to be in Chinese hands. They couldn't be more clear about what they want, and so I think it's fair to say the United States is racing to kind of prevent that. But my sense is that we'll work out the technological stuff. These guys, the Palmer Luckeys of the world, will figure out the technology. But can they do it in time?

Oz Woloshyn: Dexter Filkins, thank you. Thank you so much.
Dexter Filkins: Thank you for having me.

Kara Price: That's it for this week for Tech Stuff. I'm Kara Price.

Oz Woloshyn: And I'm Oz Woloshyn. This episode was produced by Eliza Dennis, Tyler Hill, and Melissa Slaughter. It was executive produced by me, Kara Price, Julian Nutter, and Kate Osborne for Kaleidoscope, and Katrina Norvell for iHeart Podcasts. Jack Insley mixed this episode, and Kyle Murdoch wrote our theme song.

Kara Price: On Friday, instead of our usual weekend tech episode, we will be airing part one of the podcast Shell Game. This season, host Evan Ratliff founds a startup run by fake people. Listen to see how close we are to AI taking our jobs.

Oz Woloshyn: And please do rate and review the show wherever you listen, and send us your ideas, your suggestions, and your feedback to techstuffpodcast@gmail.com.

Kara Price: We love hearing from you.