Speaker 1: On this episode of Newt's World: America spends nearly a trillion dollars a year on its military. This extraordinary spending not only detracts from our ability to address pressing social problems, but compels us into foreign wars to justify our vast arsenal. Sold to us in the name of security, our military industrial complex actually makes us far less safe. Top policy experts William Hartung and Ben Freeman follow the profits of militarism, from traditional Pentagon contractors, who receive more than half of the Pentagon's budget, to the upstart high-tech firms that shamelessly promote unproven and destabilizing technologies. Bill and Ben unmask the enablers of the war machine: politicians, lobbyists, the media, Hollywood, think tanks, and so many more whose work enriches a wealthy elite at the expense of everybody else, spreading conflict around the world and embroiling America in endless wars. Here to discuss their new book, The Trillion Dollar War Machine: How Runaway Military Spending Drives America into Foreign Wars and Bankrupts Us at Home.
Speaker 1: I'm really pleased to welcome my guests, William Hartung and Ben Freeman. Bill and Ben, thank you for joining me on Newt's World.

Speaker 2: Yes, thank you. Thanks for having us, Newt.

Speaker 1: Pentagon spending today, adjusted for inflation, is one hundred billion dollars higher than it was at the height of the Cold War. Yet we have half the troops, half the ships, half the aircraft. How did the US end up paying more but getting so much less?

Speaker 3: That stat there, that was one of the stats that really first shocked us when we started writing this book. It revealed a hidden truth behind the US military of today: that we're spending more and more on security and getting less of it. And what our book attempts to do is explain the story of why, why that is happening, and how we've gotten to this point. And I think a very direct answer to your question, Newt, is that we have a broken defense acquisition system. The way the DoD buys things is fundamentally broken, and a considerable amount of money is just simply being wasted.
Speaker 2: And the notion was fewer, better weapons. But a lot of the weapons are too complex, hard to maintain, and at a certain point quality can't make up for lack of quantity. Norman Augustine, the former head of Lockheed Martin, he sort of joked that the way we're going, in twenty fifty we'd have one fighter plane and the services would have to share it out over the course of the week. We're not there yet, but it was insightful in its own way.

Speaker 1: It was actually picking up on a comment by President Coolidge, who, when told they needed a certain number of planes for training, said, well, couldn't they just buy one and share it? I tell audiences all the time, I think I'm the longest-serving teacher to the senior military. I've been teaching major generals on the art of war since nineteen eighty three, and I routinely tell groups of military leaders: if we reduced the Pentagon to a triangle and put the other two thirds of the building into a museum, we would actually have a better defense system in about two weeks, because so much of what we do now is just nonsensical bureaucracy.
Speaker 1: What led you to write this book?

Speaker 2: Well, I've been writing around this issue since nineteen seventy nine. I worked with a guy named Gordon Adams, who wrote a book called The Iron Triangle. It's kind of an update on the state of the military industrial complex. Interestingly, many of the companies that he profiled have since been absorbed in the merger boom. And at Columbia I studied with an engineering professor, Seymour Melman, who had written a book, The Permanent War Economy of the United States. But I blame this all on Ben, you know. When they approached me, I'm like, I've been writing about this forever. The world's on fire. I don't have time for this. Ben wisely convinced me otherwise, and the book is far better because he's a little bit of a clearer writer, less of a hothead. I think it'll reach more people.

Speaker 3: Now I'll tell you, Newt, to be perfectly candid, this book began by ruining a perfectly good day on a boat. You see, Bill came down to Florida, where I live, on vacation, and he was on this beautiful boat ride.
Speaker 3: He was boohooing that his old publisher wanted him to write a new book. Unbeknownst to Bill, I already had a book in mind that I wanted Bill to write, and so with a little lobbying of my own, Bill agreed to it. And the book I wanted Bill to write, which he has ultimately very kindly roped me into writing with him. The reason I thought this book had to be written was that the military industrial complex of today was just a fundamentally different beast than in Eisenhower's time, and we were documenting a lot of this influence at the Quincy Institute. We were seeing the military industrial complex's overlap with the foreign influence industry, you know, foreign governments' lobbyists working together with defense contractors to push arms sales. And we were seeing the think tanks influenced by defense contractors. We were seeing their influence at universities, Hollywood, on the DC Metro, for crying out loud. Just everywhere you turned, we were seeing the military industrial complex's influence, and we realized nobody had put all of these pieces together. And so that's where this book really started.
Speaker 3: We wanted to provide people with this holistic look at the military industrial complex, including the defense tech sector, which Bill had also been covering in great detail too.

Speaker 2: And I would say that sector, even since we started the book, has mushroomed a little bit out of the mix, because I was still stuck talking about the old guard like Lockheed Martin, which I wrote a book about. There really is a competition, although big projects like Golden Dome may sort of pay off for both sides, even the fighter plane, you know, with unmanned wingmen. So that whole drama I think is going to be critical. And I think Ben has more of a sense that there's positives, and that you need powerful companies to dislodge powerful companies and to create competition. The question is: what are they offering? How do we know if it's going to work? How does it fit into our strategies? I think we're at kind of a pivotal moment, probably more so than we realized when we started the book.
Speaker 1: When you say we're at a pivotal moment, what do you mean?

Speaker 2: Well, I think, like it or not, we're not as powerful as we once were. There's power in the world, it's just diffuse. The challenges are hard. I mean, Russia's invasion of Ukraine, the situation of Israel and Gaza, how do we address the challenge from China, and now the new focus on Latin America, what they're calling the Donroe Doctrine, a Donald Trump and Monroe mashup. And then domestically, there's a lot of division. Our economy, it's not as strong as it was. It's especially not serving certain parts of the working class. We used to have good-paying manufacturing jobs. All that's in the mix, and part of that is, all right, what's our strategy? And also, how does the Pentagon budget and the military fit into what kind of society, what kind of economy? How do we want to educate and train the next generation? And then, you know, more narrowly, there's just this new fight involving the big contractors which formed in the nineties during the merger boom. I think this is the biggest moment in the defense industrial base since then.
Speaker 3: I really think that last comment of Bill's is really the key there, and I think that is the reason we're going to look back on twenty twenty five specifically as the year when the defense industry fundamentally changed. There's a paradigm shift that's been underway in the last nine months under the Trump administration, and that transition is explicitly away from the major defense contractors, the primes who rose up during the nineties defense consolidation that Bill mentioned. And if that was a defense industry consolidation, I think we're in the midst of a defense industry displacement, where those old guard firms, to some extent or another, are actively being pushed around, if not completely pushed under, by the defense tech sector. And to Bill's point, I was perhaps a little optimistic, a little too optimistic if I'm critical of myself, about the rise of them being able to push out the defense contractors.
Speaker 3: But what we're seeing is that a rising Pentagon budget is kind of lifting all boats, and so the old guard's getting plenty of money, and the defense tech startups are getting more and more money. But I think as we transition to the future, we're going the way of a more sophisticated defense industry, and so I think the old guard defense firms are in big trouble.

Speaker 2: And there's no question we have to integrate new technology. It's just, how do we do it? How much hope do we put in it? What's the strategy? And of course we'll need a different kind of workforce. There'll be more involvement of our universities, and AI will be suffused throughout society. So how do the military uses relate to the broader uses?

Speaker 1: One of the things which explains part of this: you all made the case that these big Pentagon contractors have nine hundred and forty-five lobbyists.
Speaker 1: They spent one hundred and forty-eight million dollars in twenty twenty-four alone. So they have about two lobbyists and two hundred and seventy-five thousand dollars per member of Congress, which makes reform pretty tricky, because you're clearly going uphill against a praetorian guard protecting the past.

Speaker 2: Yeah, and as Ben points out, in addition to money, a lot of the Hill is run by twenty-three to twenty-five year olds, and they have to deal with the whole range of issues. So in addition to money, the lobbyists bring expertise to the table. When staffers are reaching for ideas, for framing, the lobbyists are more present than some other groups that don't have as many lobbyists or as much money, and there's kind of that interaction. A lot of them came from government. They have expertise, so sometimes they control the discussion internally as well as through whatever money they spend. I love your reference to the praetorian guard.
Speaker 2: I similarly call it the autoimmune response at DoD when it comes to buying stuff, where, you know, when innovation comes in, automatically they're just swarming to crush it, to fight off that innovative thing that's not part of what they're producing.

Speaker 3: And it is, to your point, too, the fact that they have that autoimmune response because they have the lobbyists. They have all this other influence too, in think tanks and elsewhere, which we point out. I will say this: the new guard, they know the game, and they're starting to play it more and more. Anduril, for example, which is a very big player in this new defense tech surge, they have forty-two lobbyists on their payroll right now, and that's just one firm, one of the defense tech firms. They're leaning into the jobs argument too, creating a big facility in a swing state like Ohio. So you're seeing the defense tech firms create this lobbying and influence campaign of their own. So I think at the very least they're going to be able to push back on the praetorian guard, as you put it.
Speaker 2: And there's a new revolving door: ex-military folks not going directly to the startups, but to the VC firms that fund them. And as a former head of procurement at the Pentagon said, you could cash in big time if one of your investments hits, more than making a couple hundred thousand as a board member. Palantir recruited Mike Gallagher, who ran the congressional China committee. And interestingly, he's in his forties, so a lot of times people do it later in their career, but he's sort of staking the center of his career on this connection.

Speaker 1: You know, the Japanese used to have this tradition that bureaucrats worked for a limited salary, but then they got absorbed into these very big companies with amazing salaries once they retired. And somebody said, we fit that. I mean, one of the points you all make is that I think there are seventeen hundred former senior DoD personnel employed by the major defense contractors, that according to a Government Accountability Office study. There's a whole process here which I find challenging.
Speaker 1: There's something wrong if somebody is arguing for an inferior weapon or an inferior system for profit reasons, in a way which actually weakens our capacity to defend ourselves.

Speaker 2: Yeah, and Congress plays a role, because sometimes the Pentagon wants to get rid of something and they put it back, so military reform becomes either harder or more expensive. And then internally, for example, of course, Air Force pilots are not thrilled with the move toward unmanned systems. Some of them call the drone operators the chair force.

Speaker 1: Let's say that I agree with your general concern. It's too bureaucratic; it is too driven by non-defense interests. I always tell people about Jerry Maguire, the movie in which the football player keeps saying to the agent, I want you to show me the money. That explains a great deal of the various arguments in Washington, DC: that, under whatever the rationale, the real underlying argument is show me the money. And it's true in healthcare, it's true in defense.
Speaker 1: It's true, sadly, at NASA, where we have been funding a Boeing project that is absurd and amazingly indefensible, but the politicians sustain it. How do you break through that kind of a system?

Speaker 2: I think part of it, somehow, we have to generate an actual discussion about what our strategy should be. I know you were part of the military reform movement, which tried to power through a lot of this. We could use that again. There's not that many members who are really well informed on this, and a lot of them spend more time trying to protect one or another's states or districts than talking strategy or how to do things better. And then in the Pentagon, some people are attached to the existing bureaucracy. There's jobs, there's careers. And there's others who would like to innovate. So I think one part of it is just a national discussion about strategy, about technology, the alert, knowledgeable citizenry that Eisenhower talked about. Obviously that's difficult now. I mean, there's mis- and disinformation, there's conspiracy theories, there's division. Everybody reads their own information sources.
Speaker 2: I think we might need something old school, like meeting people, teach-ins, or even podcasts, where you can have a longer conversation. So that's a little bit independent of the money, but I think it's a piece of it.

Speaker 3: I think Bill's right. You need a top-down and a bottom-up approach here. Like, this is a big, wasteful monster we're trying to slay here. And at the end of the day, as we point out in the book, it doesn't make America safer, and in a lot of cases, it makes America less safe. And so we have to provide the information to the public so they understand that this idea that more spending equals more security, that's just not how the system works. And then to explain to them how their tax dollars are being wasted. And I think there's an opportunity for that in the near future, where you're seeing people worried about their healthcare premiums quadrupling, or going up even more in some cases, and with inflation, putting food on the table is getting harder and harder.
Speaker 3: Yet at the same time, we're willing to spend billions of dollars on what is basically a failed F-35 program. I think pointing out those disconnects is key, and getting people mad about it is going to be key. Then, top down, I think you really have to have a true DOGE at DoD. You know, when DOGE came around, a lot of us in this community who knew there was all this wasteful spending at DoD were just waiting for Elon Musk and his folks to get to DoD, because, you know, we were just sitting there saying, man, if you want government waste, go to the largest government bureaucracy; it's loaded with waste. Folks in the military will tell you how much wasteful spending there is. Yet when DOGE went to DoD, they barely found the change in the couch cushions. They hardly cut anything from DoD. I think you need somebody to come in and really do that to get meaningful reform.

Speaker 2: And I think you need intelligent efficiency. There's all kinds of paperwork which I think the big firms hide behind.
Speaker 2: They hire former acquisition officials and make it hard to enter the field. But I think to do it, you couldn't do it in three months. I think you would have had to study what works and what doesn't work in the various bureaucracies, and of course it would be a huge fight to make those changes. And I don't know that many people who would know, well, okay, what does work, what doesn't work. I think you do want independent testing. You want safeguards against price gouging. You want to be able to be flexible, to adapt technology to changing circumstances. If it takes twenty years to develop something, you really can't do that. But I think we need some of those former officials to tell us what they think would work. I think we need Congress to be more cognizant. They used to have an Office of Technology Assessment. Given that a lot of the new stuff is tech-based, I would like the government to have the expertise to evaluate all that. But it might be controversial, because it might cost a little money to keep people with that expertise in government rather than industry.
Speaker 2: A lot to think about, but because it's in flux, and there is possibly this clash between the new guard and the old guard. Anduril has a manifesto, Arsenal of Democracy two point zero, and it's a critique of the current system. But it's pretty good. I mean, you know, we could have probably written it. What's the new system, and what is going to work and not work, and what are the safeguards and the strategy? So there is some hope in that change, because we can't really have a military based on twentieth century technology.

Speaker 1: Part of that is the technologies are changing so rapidly. If you look at what's happened with drones in Ukraine, on both sides, Russia and Ukraine, it forces a profound rethinking of how we set up operations. At least for the moment, these drones have given an enormous advantage to the defense and have made the classic armored warfare that we liked up through the Iraq campaign basically unsustainable, because the tanks just get killed. And yet you're taking on the armored community, and you're taking on the piloted aircraft community.
If you say, let's really study 334 00:19:33,440 --> 00:19:37,480 Speaker 1: what, as I understand it, first became obvious in a fight between 335 00:19:37,520 --> 00:19:43,680 Speaker 1: Azerbaijan and Armenia, where the Azerbaijanis had mastered this technology 336 00:19:44,000 --> 00:19:47,760 Speaker 1: and just massacred the Armenians. And that was the forerunner, 337 00:19:48,160 --> 00:19:50,960 Speaker 1: which the Russians obviously had not studied, or they would 338 00:19:50,960 --> 00:19:54,399 Speaker 1: have realized that running large armored columns down the road 339 00:19:54,520 --> 00:19:58,520 Speaker 1: towards Kiev was going to be a nightmare and a disaster. 340 00:19:59,160 --> 00:20:02,040 Speaker 1: But if you look at the American system right now, 341 00:20:02,480 --> 00:20:06,440 Speaker 1: to go in and suggest that scale of change arouses 342 00:20:07,240 --> 00:20:11,240 Speaker 1: so much opposition. And I am puzzled about the whole 343 00:20:11,280 --> 00:20:16,040 Speaker 1: program of a next generation fighter and a next generation bomber. 344 00:20:16,800 --> 00:20:20,760 Speaker 1: When I'm looking at the rise of automated systems, and 345 00:20:20,840 --> 00:20:25,920 Speaker 1: you mentioned Anduril earlier. They have this Ghost Shark automated submarine, 346 00:20:26,560 --> 00:20:29,840 Speaker 1: which when you take out the human in a submarine, 347 00:20:30,600 --> 00:20:33,080 Speaker 1: and the amount of space that a human needs, and 348 00:20:33,119 --> 00:20:35,679 Speaker 1: the amount of support that a human needs, and the 349 00:20:35,760 --> 00:20:39,800 Speaker 1: dining room and everything else, you suddenly get a radically 350 00:20:39,840 --> 00:20:44,480 Speaker 1: smaller vehicle that actually carries far more torpedoes.
And as 351 00:20:44,520 --> 00:20:48,800 Speaker 1: I understand it, from the unclassified information, the Ghost Shark 352 00:20:48,840 --> 00:20:53,360 Speaker 1: that they're building for Australia basically can go out about 353 00:20:53,440 --> 00:20:59,040 Speaker 1: eighteen hundred miles autonomously by itself. If you wanted to 354 00:20:59,080 --> 00:21:03,040 Speaker 1: really screw up any Chinese communist effort to invade Taiwan, 355 00:21:03,080 --> 00:21:07,120 Speaker 1: it wouldn't take very many Ghost Sharks to simply make 356 00:21:07,160 --> 00:21:10,399 Speaker 1: it impossible to cross one hundred and forty miles of 357 00:21:10,520 --> 00:21:15,760 Speaker 1: the Strait of Taiwan. But we invest huge amounts in 358 00:21:16,119 --> 00:21:21,159 Speaker 1: exquisite attack submarines that are nuclear powered and have brilliant crews. 359 00:21:21,600 --> 00:21:24,480 Speaker 1: But you'd have to ask yourself in the next cycle, 360 00:21:25,560 --> 00:21:28,920 Speaker 1: is that the right investment? And because all of the 361 00:21:29,000 --> 00:21:33,440 Speaker 1: senior admirals will have come out of traditional ships, how 362 00:21:33,480 --> 00:21:34,840 Speaker 1: hard is it going to be to get them to 363 00:21:34,840 --> 00:21:37,680 Speaker 1: stop and say, you know, maybe we're in a different 364 00:21:37,720 --> 00:21:41,840 Speaker 1: world now. Yeah. And there's a couple of things. One 365 00:21:41,960 --> 00:21:44,760 Speaker 1: is I think drones are leveling the playing field a bit, 366 00:21:44,800 --> 00:21:47,639 Speaker 1: because not many countries could build a tank or a 367 00:21:47,640 --> 00:21:51,120 Speaker 1: fighter plane, but they can make relatively cheap drones.
368 00:21:51,640 --> 00:21:56,200 Speaker 2: Israel, Turkey, Iran, Ukraine also has a parallel DIY program 369 00:21:56,240 --> 00:22:00,159 Speaker 2: where they take Chinese drones, put weapons and cameras on them, and 370 00:22:00,200 --> 00:22:03,360 Speaker 2: if it's a suicide drone, that does the trick. So 371 00:22:03,400 --> 00:22:07,040 Speaker 2: some of these legacy systems, they wouldn't fare well in a 372 00:22:07,080 --> 00:22:10,280 Speaker 2: new approach. They don't give you as much of an advantage 373 00:22:10,280 --> 00:22:13,199 Speaker 2: as maybe they did in the past. And I think 374 00:22:13,240 --> 00:22:15,400 Speaker 2: there's got to be a range. I mean, if they're 375 00:22:15,400 --> 00:22:18,560 Speaker 2: too exquisite and they take too long to maintain in 376 00:22:18,600 --> 00:22:21,000 Speaker 2: a war like Ukraine, it would put you at a disadvantage, 377 00:22:21,000 --> 00:22:23,399 Speaker 2: where Russia is cranking out stuff that may not be 378 00:22:23,480 --> 00:22:27,200 Speaker 2: as technically proficient as ours, but they can do it quickly. 379 00:22:27,640 --> 00:22:29,800 Speaker 1: I was just going to say, the sheer number of drones 380 00:22:30,480 --> 00:22:32,600 Speaker 1: that the Ukrainians are going to make in house 381 00:22:32,640 --> 00:22:37,080 Speaker 1: this year is a revolution in capability. We don't see 382 00:22:37,080 --> 00:22:37,560 Speaker 1: anything like it. 383 00:22:38,480 --> 00:22:41,040 Speaker 2: No, these aren't just victory gardens. These are the weapons 384 00:22:41,080 --> 00:22:41,360 Speaker 2: of war. 385 00:22:41,880 --> 00:22:45,359 Speaker 3: The Ukraine conflict is fundamentally changing conflict, and not just 386 00:22:45,400 --> 00:22:48,879 Speaker 3: that conflict.
You're seeing this in Israel and Gaza, too. And 387 00:22:48,920 --> 00:22:51,000 Speaker 3: I think to your point, Newt, you know, as much 388 00:22:51,000 --> 00:22:54,399 Speaker 3: as we might love our friends and family in the 389 00:22:54,480 --> 00:22:57,480 Speaker 3: US Navy, and in the US Air Force for that matter, 390 00:22:57,600 --> 00:23:00,879 Speaker 3: we're getting to a point where the technology of some 391 00:23:00,960 --> 00:23:06,080 Speaker 3: of this equipment is exceeding what humans can withstand. I 392 00:23:06,080 --> 00:23:08,760 Speaker 3: wrote in the book about flying with the Thunderbirds as 393 00:23:08,760 --> 00:23:11,160 Speaker 3: a special guest with them, and you know, when you do that, 394 00:23:11,760 --> 00:23:14,240 Speaker 3: you go through, you know, like a half day training 395 00:23:14,359 --> 00:23:17,119 Speaker 3: on how to make sure you don't pass out even 396 00:23:17,119 --> 00:23:19,200 Speaker 3: in, you know, a flyover. You know, we were 397 00:23:19,200 --> 00:23:20,600 Speaker 3: not in combat, you know, we were just going to 398 00:23:20,640 --> 00:23:23,719 Speaker 3: do a flyover, and you get very serious training on 399 00:23:23,800 --> 00:23:26,119 Speaker 3: how not to pass out, you know, from the g's. 400 00:23:26,560 --> 00:23:28,639 Speaker 3: And that's in an F sixteen that doesn't have the 401 00:23:28,680 --> 00:23:30,760 Speaker 3: speed of an F twenty two or an F thirty five, 402 00:23:31,320 --> 00:23:34,800 Speaker 3: let alone future aircraft. So we're very quickly getting to 403 00:23:34,840 --> 00:23:38,280 Speaker 3: a point where the human body just can't withstand the 404 00:23:38,320 --> 00:23:41,199 Speaker 3: capabilities of the technology that we have.
And we have 405 00:23:41,240 --> 00:23:43,879 Speaker 3: to recognize that, because if we don't recognize it, America's 406 00:23:43,920 --> 00:23:47,359 Speaker 3: adversaries certainly are, and they're going to be creating these 407 00:23:47,440 --> 00:23:50,400 Speaker 3: unmanned systems with greater capabilities than our manned systems. 408 00:23:50,720 --> 00:23:52,400 Speaker 1: It's a little bit like when we went into Iraq. 409 00:23:53,040 --> 00:23:57,359 Speaker 1: I think in retrospect we should have expected IEDs 410 00:23:58,080 --> 00:24:03,199 Speaker 1: and other kinds of low cost but very deadly operations. 411 00:24:03,600 --> 00:24:05,439 Speaker 1: And at the time I think it shocked us. 412 00:24:05,960 --> 00:24:08,919 Speaker 1: We were fully prepared to fight a traditional war against 413 00:24:08,920 --> 00:24:13,199 Speaker 1: a traditional opponent and we would win it. And I 414 00:24:13,200 --> 00:24:16,520 Speaker 1: remember I happened to be in Australia about five months 415 00:24:16,560 --> 00:24:19,280 Speaker 1: before we went into Iraq and was at a dinner with 416 00:24:19,320 --> 00:24:21,720 Speaker 1: the head of the Australian military, and I said, if 417 00:24:21,760 --> 00:24:24,560 Speaker 1: you were faced with the problem of defeating the United States, 418 00:24:25,560 --> 00:24:27,399 Speaker 1: what would you do? How would you try to do it? 419 00:24:27,960 --> 00:24:33,280 Speaker 1: And he said, I wouldn't. He said, it's inconceivable in 420 00:24:33,320 --> 00:24:35,880 Speaker 1: a normal traditional war that I'm going to be able 421 00:24:35,880 --> 00:24:38,639 Speaker 1: to beat the United States. So I would invest in 422 00:24:38,720 --> 00:24:42,680 Speaker 1: guerrilla capabilities. I would have lots of equipment that was decentralized.
423 00:24:43,119 --> 00:24:45,760 Speaker 1: I would assume we're going to lose round one, and 424 00:24:45,800 --> 00:24:48,320 Speaker 1: then I'd be able to make round two really painful. 425 00:24:49,440 --> 00:24:52,160 Speaker 1: I thought that was just so totally outside the way 426 00:24:52,520 --> 00:24:54,320 Speaker 1: we thought, and then of course that's what happened. 427 00:24:55,359 --> 00:24:58,960 Speaker 2: And sometimes they measure power by money. The US spends 428 00:24:59,000 --> 00:25:01,480 Speaker 2: more than the next several countries combined. But if you're 429 00:25:01,480 --> 00:25:03,840 Speaker 2: buying the wrong things, money is not the issue, if 430 00:25:03,840 --> 00:25:08,200 Speaker 2: there's asymmetric warfare, different levels of morale. I mean, the 431 00:25:08,200 --> 00:25:10,800 Speaker 2: revolution in military affairs was supposed to be we have 432 00:25:10,880 --> 00:25:15,280 Speaker 2: superior information, superior networks, precision munitions, but 433 00:25:15,520 --> 00:25:18,280 Speaker 2: in Iraq that didn't get the job done. Or, for example, 434 00:25:18,920 --> 00:25:22,879 Speaker 2: the bombing of the Houthis, the cost exchange was not great. 435 00:25:22,960 --> 00:25:25,200 Speaker 2: Some of our missiles were a few million dollars. Theirs 436 00:25:25,240 --> 00:25:28,520 Speaker 2: were cheap, so even just economically, sustaining some of these 437 00:25:28,520 --> 00:25:29,600 Speaker 2: wars is a challenge. 438 00:25:29,760 --> 00:25:31,960 Speaker 3: To your point there, Bill, and to your point, Newt, 439 00:25:32,000 --> 00:25:35,040 Speaker 3: I think part of the system that's in place we 440 00:25:35,119 --> 00:25:38,920 Speaker 3: talk about a lot in the book is how Hollywood 441 00:25:38,960 --> 00:25:41,439 Speaker 3: has helped to perpetuate this and helped to keep that 442 00:25:41,520 --> 00:25:44,600 Speaker 3: old guard going.
When we see movies like Top Gun 443 00:25:44,640 --> 00:25:48,439 Speaker 3: and Top Gun two, these are movies with pilots in seats, 444 00:25:48,520 --> 00:25:51,399 Speaker 3: with butts in seats. It's not sexy in Hollywood 445 00:25:51,600 --> 00:25:54,080 Speaker 3: to put the drones out there. Very often when the 446 00:25:54,119 --> 00:25:57,320 Speaker 3: AI or the advanced military technology is put in a movie, 447 00:25:57,800 --> 00:26:01,080 Speaker 3: it's scary. It's Terminator two, you know, it's Cyberdyne Systems 448 00:26:01,119 --> 00:26:03,639 Speaker 3: trying to end humanity as we know it. And so 449 00:26:03,760 --> 00:26:08,760 Speaker 3: again there's this built in autoimmune function in the cultural 450 00:26:08,840 --> 00:26:13,640 Speaker 3: part of the MIC that is repelling the technological transition. 451 00:26:13,680 --> 00:26:16,359 Speaker 2: And some of these weapons that don't function well in reality 452 00:26:16,440 --> 00:26:17,680 Speaker 2: are killing it in the movies. 453 00:26:18,040 --> 00:26:21,720 Speaker 1: But I saw the second Top Gun. They were trying 454 00:26:21,760 --> 00:26:29,240 Speaker 1: to use basically tactical aircraft to penetrate, and I 455 00:26:29,280 --> 00:26:32,240 Speaker 1: said at the time, well, before what happened, you know, 456 00:26:32,280 --> 00:26:36,200 Speaker 1: in the real world, you'd use B two bombers and 457 00:26:36,240 --> 00:26:39,120 Speaker 1: you would have such precision that, in fact, you would 458 00:26:39,160 --> 00:26:44,320 Speaker 1: penetrate the target instantaneously with zero risk. And it 459 00:26:44,320 --> 00:26:46,880 Speaker 1: occurred to me just listening to you guys, I hope 460 00:26:46,920 --> 00:26:50,320 Speaker 1: this is not an inappropriate thought.
These guys 461 00:26:50,359 --> 00:26:54,040 Speaker 1: are obviously extraordinarily well trained and they practice and they're very, 462 00:26:54,119 --> 00:26:57,879 Speaker 1: very courageous. But if you actually tried to film the 463 00:26:58,000 --> 00:27:01,920 Speaker 1: B two assault on Iran, the number of hours of 464 00:27:02,080 --> 00:27:07,359 Speaker 1: just flying, you couldn't make it a very interesting documentary. Now, 465 00:27:07,760 --> 00:27:11,040 Speaker 1: it had taken the Israeli Air Force to degrade and 466 00:27:11,119 --> 00:27:15,120 Speaker 1: destroy the Iranian air defense system, which they did 467 00:27:15,160 --> 00:27:20,040 Speaker 1: pretty successfully. And the net effect was we sent seven 468 00:27:20,119 --> 00:27:23,600 Speaker 1: vehicles a long distance, they refueled a number of times, 469 00:27:24,000 --> 00:27:26,359 Speaker 1: they got to the right target area, there was no 470 00:27:26,440 --> 00:27:30,760 Speaker 1: effective anti-aircraft capability. They took out the sites and 471 00:27:30,800 --> 00:27:31,600 Speaker 1: then they flew home. 472 00:27:32,200 --> 00:27:35,480 Speaker 2: Yeah, and even in Iraq and Afghanistan, a lot of 473 00:27:35,480 --> 00:27:39,200 Speaker 2: the casualties were just transporting things, and so the Pentagon 474 00:27:39,200 --> 00:27:43,600 Speaker 2: became interested in modular solar and so forth, which I 475 00:27:43,600 --> 00:27:46,800 Speaker 2: think there's still some interest in, even though in some circles 476 00:27:47,480 --> 00:27:50,520 Speaker 2: alternative energy has a bad name. So there's a lot 477 00:27:50,560 --> 00:27:53,560 Speaker 2: in it. And of course Hollywood likes heroes. An unmanned 478 00:27:53,640 --> 00:27:57,000 Speaker 2: vehicle is not a hero. Tom Cruise somehow winning a 479 00:27:57,040 --> 00:27:58,920 Speaker 2: battle that they couldn't win in real life is a hero.
480 00:27:59,440 --> 00:28:03,080 Speaker 1: That's right. If every citizen could read your new book, 481 00:28:03,440 --> 00:28:07,359 Speaker 1: The Trillion Dollar War Machine, how runaway military spending drives 482 00:28:07,359 --> 00:28:09,760 Speaker 1: America into foreign wars and bankrupts us at home. 483 00:28:10,280 --> 00:28:13,560 Speaker 1: I think, just in terms of starting the conversation and 484 00:28:13,640 --> 00:28:16,119 Speaker 1: picking up on what Eisenhower tried to warn us about: 485 00:28:16,640 --> 00:28:21,360 Speaker 1: that when you build huge systems that are self serving, 486 00:28:22,000 --> 00:28:25,040 Speaker 1: that have a huge interest in manipulating everybody else to 487 00:28:25,119 --> 00:28:29,520 Speaker 1: their advantage, you are inherently putting both your national security 488 00:28:30,000 --> 00:28:33,360 Speaker 1: and your democracy and your freedom at risk. And Eisenhower, 489 00:28:33,440 --> 00:28:36,000 Speaker 1: who of course had been a career soldier, a West 490 00:28:36,000 --> 00:28:39,760 Speaker 1: Point graduate, a five star general, I think he genuinely 491 00:28:39,800 --> 00:28:42,680 Speaker 1: had a fear that we had now gotten into a 492 00:28:42,680 --> 00:28:46,440 Speaker 1: cycle where the systems that are supposed to serve us 493 00:28:46,920 --> 00:28:49,560 Speaker 1: in fact have us serving them. And I think your 494 00:28:49,600 --> 00:28:53,200 Speaker 1: book helps make that point. So I encourage everybody to 495 00:28:53,240 --> 00:28:55,800 Speaker 1: take a look at The Trillion Dollar War Machine. Bill 496 00:28:55,840 --> 00:28:58,120 Speaker 1: and Ben, I really want to thank you for taking 497 00:28:58,160 --> 00:29:00,360 Speaker 1: the time to share with us, and I think it's 498 00:29:00,400 --> 00:29:04,480 Speaker 1: a very important conversation.
Thank you both for joining me, 499 00:29:04,480 --> 00:29:07,160 Speaker 1: and I think this was a very, very good conversation. 500 00:29:07,520 --> 00:29:09,400 Speaker 2: Thank you so much. Yes, thank you. 501 00:29:13,080 --> 00:29:15,719 Speaker 1: Thank you to my guests, Bill Hartung and Ben Freeman. 502 00:29:16,320 --> 00:29:19,800 Speaker 1: Newt's World is produced by Gingrich three sixty and iHeartMedia. Our 503 00:29:19,840 --> 00:29:24,640 Speaker 1: executive producer is Guernsey Sloan. Our researcher is Rachel Peterson. The 504 00:29:24,720 --> 00:29:28,520 Speaker 1: artwork for the show was created by Steve Penley. Special 505 00:29:28,520 --> 00:29:31,320 Speaker 1: thanks to the team at Gingrich three sixty. If you've 506 00:29:31,320 --> 00:29:34,120 Speaker 1: been enjoying Newt's World, I hope you'll go to Apple Podcasts 507 00:29:34,440 --> 00:29:36,880 Speaker 1: and both rate us with five stars and give us 508 00:29:36,880 --> 00:29:39,360 Speaker 1: a review so others can learn what it's all about. 509 00:29:40,280 --> 00:29:44,160 Speaker 1: Join me on Substack at Gingrich three sixty dot net. I'm 510 00:29:44,240 --> 00:29:46,160 Speaker 1: Newt Gingrich. This is Newt's World.