Ah, brute force. If it doesn't work, you're just not using enough. You're listening to SOFREP Radio: special operations, military news, and straight talk with the guys in the community.

We are live at sofrep.com, on time, on target. It's Scotty O and Jack Murphy, and here with us is — I was actually making sure I was seeing the name right — Paul Scharre. That's what you were saying before, Jack. Just get closer to the light. Yeah, you can move those and adjust them. All right. Yeah, we're excited to have you on.

So Paul is a former team leader in the Army's 3rd Ranger Battalion, a teammate of Jack's for how many years?

Oh, we worked together for a while. We deployed together to Afghanistan. Paul was on the recce team and I was in the sniper section at the time. And you were yourself a graduate of sniper school?

Yeah. So I was in the sniper detachment in 3rd Batt for a while, and then when we created a recce team, we sort of poached some of the snipers, and I ended up standing up one of the recce teams there. Yeah, same time as Jack — we did a deployment together.

Very cool. So Paul is also the author of Army of None: Autonomous Weapons and the Future of War, which I actually should have on this table — I realize I'm going to do that so that people can see the cover. But yeah, he's the author of Army of None; it came out last week. One week out, so here's some product placement — let's actually put it up here somewhere. It's funny, because Paul was asking before about what we'd be talking about, what's on the script — as you can see, we're the least scripted show. Put the book on the table and move those headphones there. Yeah, no, it's mine that are in the way, actually. There you go. And just to finish off who Paul is:
Paul is currently the Senior Fellow and Director of the Technology and National Security Program at the Center for a New American Security.

Excited to be on, man. Yeah, thanks for having me.

Absolutely, it's awesome. I was thinking about it this morning — I haven't seen Paul in, like, twelve years. The last time we saw each other, we ran into each other at, like, a Salsarita's or something in Fayetteville, North Carolina. We had both left Ranger Battalion. I was going through the Q Course, and Paul, you had been stop-lossed, right?

Yeah. So I was recalled — I'd gotten out and then was recalled through the IRR when they were doing that. A little bit less high-speed than the Q Course: I was sent through, like, a shake-and-bake civil affairs training.

With a bunch of reservists?

It was — I mean, to be honest, it was a bit of a circus. And yeah, I was down at Fort Bragg ready to deploy when I ran into you there. But that was the last time I saw you. That was, like, twelve years ago.

Yeah, time flies. And now you've written a book — you've got a book out.

Yep.

So what's your book about?

All right. So the book is basically about military robotics and where we're seeing the technology evolve going forward. We've got a situation today where at least ninety countries already have drones, and many non-state groups do too — the Islamic State and others; Hamas has bought drones. There are at least sixteen countries that have armed drones. That's what we track in the book: we've got a map of them, a map of where they're coming from. Most of them are coming, internationally, from China — they're the major proliferator of armed drones. And then of course non-state groups like ISIS as well: they've got hobbyist drones that they cobble together and put some explosives on. The number keeps going up, though. With each generation we're seeing more automation and more autonomy in these things.
So just like in automobiles — you know, if you go buy a top-of-the-line automobile today, it's got self-parking, intelligent cruise control, automatic lane keeping — each generation of military robotics has more autonomy. The question the book grapples with is: what happens when a Predator drone has as much autonomy as a self-driving car? Where is this going? And in particular we're talking about life-and-death decisions. The book is all about this idea of handing over targeting decisions to the machine itself.

Obviously there are lots of domains in warfare where we have a lot of automation today. We've got torpedoes and missiles and other things where, once you let them go — a lot of these are fire-and-forget weapons; you're not getting them back. There's a lot of automation involved, a lot of technology, but humans are still in control of these decisions. Someone is still saying, "Yep, there's an enemy ship, I want to attack it." What happens when that changes? The technology is definitely taking us there: it's going to make it possible to delegate those decisions to machines. So the first half of the book talks about the technology — what are people building — and the second half about the implications. What if we start to cross this line?

How do we feel about deferring responsibility for killing to a machine?

That's right. So it talks about the law, the ethics, the strategic issues, stability. One of the things it grapples with is what happens when you have these bots interacting, maybe in unexpected ways. We've seen this in stock trading. We're now in a domain of stock trading that's largely automated — most trades today are done by bots, and a lot of this happens in milliseconds, at speeds where humans simply can't react. So we've seen this arms race in speed in stock trading. Well, one of the effects is that you get things like flash crashes. You get these unexpected interactions when bots start to interact, or people manipulate the bots — they spoof them in ways that take advantage of their behavior — and then you get these sort of crazy effects, these crises, these events. In stock trading, they've been able to mitigate the problem by installing circuit breakers that take a stock offline if the price moves too quickly. But what happens if that happens in war? There's no referee to call time-out when things aren't going well. What happens if you get a flash war?

Especially since, like, 2008 — the SEC, I think, and some others. There have been federal regulators that actually oversee algorithms and things like that, like banking algorithms.

That's right. The regulators have stepped in, and the stock exchanges themselves installed these circuit breakers, because they don't want to have these events. But there's no equivalent — there's no referee — in warfare.
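For readers curious what that circuit-breaker guard actually looks like, here is a minimal sketch in Python. The thresholds, look-back window, and halt duration are hypothetical illustrative values, not real exchange rules:

```python
# Minimal sketch of an exchange-style "circuit breaker": halt trading when the
# price moves too far, too fast. All thresholds here are hypothetical.
from collections import deque

class CircuitBreaker:
    def __init__(self, max_move=0.07, window=300, halt=900):
        self.max_move = max_move  # e.g. halt on a 7% move...
        self.window = window      # ...within a 5-minute look-back window
        self.halt = halt          # seconds the stock stays offline
        self.prices = deque()     # (timestamp, price) samples
        self.halted_until = 0.0

    def on_price(self, now, price):
        """Record a price tick; return True if trading should halt."""
        if now < self.halted_until:
            return True  # still inside an earlier halt
        self.prices.append((now, price))
        # Drop samples that have aged out of the look-back window.
        while self.prices and self.prices[0][0] < now - self.window:
            self.prices.popleft()
        oldest = self.prices[0][1]
        if oldest and abs(price - oldest) / oldest >= self.max_move:
            self.halted_until = now + self.halt
            return True  # the referee calls time-out
        return False

breaker = CircuitBreaker()
assert breaker.on_price(0, 100.0) is False
assert breaker.on_price(60, 91.0) is True  # 9% drop in one minute: halt
```

The point of the analogy in the conversation is that last branch: markets have a referee empowered to call time-out; warfare has no equivalent mechanism.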
I think one of the big differences we've seen between when you and I were in the military and today is — when I talk to people who are working on the front lines, someone in Ukraine was telling me how the Russian separatists had a drone up with a thermal sensor on it, looking for them. When we were in the military, the Taliban or al-Qaeda, whoever we were up against, didn't really have any night-vision capability, and they definitely didn't have any thermal capability. That's changed. And then ISIS in Iraq — they were flying drones and dropping hand grenades down on troops. That was something we didn't have to grapple with.

No. Actually, a buddy of mine, in like the 2005 time frame, shot down a drone in Iraq, and he was all pleased. You know, he saw it overhead, looked up and shot, and he's like, "I got one." I was like, "Dude, you know that was ours, right? They don't have them." But that's changed now.

That has changed. So now you've got the threat coming from above, and you've got this world where the technology is so proliferated, because the stuff is out there on the commercial market. You can go buy a drone online for a couple of hundred bucks. It's very automated — there's a lot of autonomy — and you can get a pretty high-quality one. And in some ways this blows my mind: you can buy a cheap drone online with more autonomy than an Air Force Reaper drone. You can buy things that have automated takeoff and landing, they can do obstacle avoidance, they can navigate around trees, they can do things like track and follow moving objects. That's more than we have in military-grade large aircraft. Now, a Reaper is bigger, it's got bigger range, it carries more payload, it's got bombs on it. But the autonomy is just software, and it's so diffuse that anybody is going to have access to this kind of technology.

Is that because the military has, like, an ethical hold-up on granting autonomy to these drones? Or because the acquisition system is so —

I mean, that's not fair. Part of it is that, right? But part of it is a cultural resistance to giving up control to robots of jobs people like. I mean, there are elements of autonomy that we could be instituting today, and some of them are not controversial at all. Automated takeoff and landing — those are things we totally should be doing. But then you talk to pilots, and pilots are like, "Well, that's my job, right? My job is to fly the aircraft." And they don't want to give it up.

Like a taxicab driver getting mad at Uber or something.

Exactly right. And I see this across the force. In the Army, the Army medical community has actively resisted casualty evacuation with robotic vehicles. Why? Because it's their job to go into harm's way to do medevac. And so they're like, "We don't agree with that; we think that's wrong." In the Marine Corps, they're all about it — they don't have that many people.

And they have Navy corpsmen.

Exactly. They don't have a dedicated medevac community the way the Army does. So a lot of times, when the robot is taking away somebody's job, you get a lot of resistance.

Yeah, it's interesting. And you also get into the book a lot about artificial intelligence. I mean, if we're ever going to hand autonomy over to the drone, then obviously AI goes hand in hand with that, right?

Yeah. So we've seen amazing advances in the past couple of years in artificial intelligence and machine learning, and we're able to do things that, like, five years ago would have been impossible. One of the things that's pretty remarkable, and really directly applicable, is object recognition. Ten years ago, if we were having this conversation, I would have said things like, "Look, a computer can't tell the difference between an apple and a tomato" — which was true at the time. One of the problems was that old-school AI was largely rule-based: you give the system a set of rules for its behavior. So if you're trying to program a set of rules for telling an apple from a tomato — well, what rules would you give to somebody who's never seen one before? They're both roundish, shiny, red or yellow or green, stem on top. If you see them, they look obviously different — a toddler knows the difference between them — but it's hard to come up with a set of rules. So now, in the past couple of years, we've seen this explosion in machine learning, driven in large part by huge data sets — and actually a lot of it comes out of graphics processing units, GPUs, from gaming.
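A minimal sketch of the contrast being drawn here, using made-up feature vectors (redness, roundness, glossiness): the first function is the hand-written-rule approach, the second fits its rule from labeled examples. Real object recognition uses deep neural networks trained on huge image datasets; a nearest-centroid learner just makes the idea concrete:

```python
# Hand-written rules vs. learning from examples, on invented feature data.

def rule_based(redness, roundness, gloss):
    """Old-school AI: a human writes the rule. Fragile and hard to get right."""
    return "tomato" if redness > 0.8 and gloss > 0.7 else "apple"

def train(examples):
    """Nearest-centroid learner: average the feature vectors of each class."""
    centroids = {}
    for label in {lbl for _, lbl in examples}:
        pts = [x for x, lbl2 in examples if lbl2 == label]
        centroids[label] = tuple(sum(col) / len(pts) for col in zip(*pts))
    return centroids

def predict(centroids, x):
    dist = lambda a, b: sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(centroids, key=lambda lbl: dist(centroids[lbl], x))

# Hypothetical labeled examples: ((redness, roundness, gloss), label)
data = [((0.90, 0.95, 0.90), "tomato"), ((0.85, 0.90, 0.80), "tomato"),
        ((0.70, 0.80, 0.40), "apple"),  ((0.60, 0.85, 0.50), "apple")]
model = train(data)
print(predict(model, (0.88, 0.92, 0.85)))  # "tomato" -- learned, not hand-coded
```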
Well, isn't it people training the AI?

So that's one way to do it: people train the AI. But we've also been able to create systems that train themselves. The latest version of Google DeepMind's AlphaGo — the program that plays the Chinese strategy game Go — taught itself to play entirely on its own. And they had a version of this that played chess, and in four hours — four hours of self-play — it reached superhuman performance. Four hours. All it has access to is the board and the rules of the game, and it eclipses millennia of human knowledge of chess.

So this is the Skynet thesis: you flip the switch, and it automatically comes to the determination that mankind must be extinguished.

It's a little creepy. A little creepy. And one of the challenges in the technology is that it's very powerful and it's also very brittle. You can build these systems and they do really, really well at the thing they were trained to do, but then if you change the environment a little bit, or change the context, the performance can just fall apart. So for example, this Go-playing program, AlphaGo — one of the earliest versions of it beat the top human player in the world, but if you changed the size of the board slightly, its performance would drop off dramatically, because that's not how it was trained. It was only trained on a certain size of board. That's not a problem for humans; humans are very flexible and adaptable. But the machines we're building today don't know how to do that very well, and that's a big limitation when we think about using them.
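A toy illustration of that brittleness, with an invented task and invented numbers: a learner that keys on the absolute geometry of the board it was trained on looks flawless there, then degrades the moment the board grows, while a human would simply re-apply the concept "left half":

```python
# Distribution shift in miniature: train on one board size, test on another.
import random
random.seed(0)

def make_data(n, board):
    """Label a stone 1 if it sits in the left half of *this* board."""
    data = []
    for _ in range(n):
        x = random.randint(0, board - 1)
        data.append((x, int(x < board / 2)))
    return data

def fit_threshold(data):
    """'Learn' an absolute coordinate cutoff that best fits the training set."""
    score = lambda t: sum((x < t) == bool(y) for x, y in data)
    return max(range(50), key=score)

t = fit_threshold(make_data(1000, board=19))

def accuracy(t, data):
    return sum((x < t) == bool(y) for x, y in data) / len(data)

print(accuracy(t, make_data(1000, board=19)))  # ~1.00 on the trained board size
print(accuracy(t, make_data(1000, board=39)))  # ~0.75: the rule didn't transfer
```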
How are we going to improve AI over time? Do you think it's going to continue like this?

So I think how the technology evolves going forward is wide open — you hear just a huge array of speculation. But even if all basic research progress in AI stopped tomorrow, which I think is unlikely, you could get decades of mileage out of just taking the things that already exist and applying them in different sectors. Take object recognition, which we can now do with machine learning better than humans can. Ten years ago, apple versus tomato was hard; now we have machines that can beat humans at benchmark tests of identifying objects. They could look at this room and identify our faces, identify all the objects: this is a glass of water, this is a book, there's a pen. So you apply that to warfare: could we have a smart sensor on some robot that can tell whether someone is holding, you know, a rifle or a rake? Yeah, we could totally do that. And we could probably do it better than humans, actually, because you could do it in different kinds of conditions, different lighting, and you could fuse together different sensors — you've got thermal and lidar and other things. There's a lot of potential there.
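The sensor-fusion point is easy to make concrete. A minimal sketch, with invented probabilities: two imperfect detectors — say a visual classifier and a thermal one — each report how likely the object is a rifle, and combining their evidence (here by naive-Bayes log-odds addition, assuming the sensors err independently) is stronger than either alone:

```python
import math

def logit(p):
    return math.log(p / (1 - p))

def fuse(p_visual, p_thermal, prior=0.5):
    """Add log-odds evidence from (assumed independent) sensors."""
    log_odds = (logit(prior)
                + (logit(p_visual) - logit(prior))
                + (logit(p_thermal) - logit(prior)))
    return 1 / (1 + math.exp(-log_odds))

# Poor light: the camera is unsure, but the thermal signature is distinctive.
print(fuse(p_visual=0.60, p_thermal=0.90))  # ~0.93, stronger than either alone
```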
So from, I guess, a cultural standpoint, or a policy standpoint at the Pentagon — how far off do you think we are before we defer that responsibility to robots, essentially?

So what Pentagon leaders have said so far is that they plan to keep humans in the loop. They're going to build ever more advanced robotic systems — they're working on that; they're building air and ground and sea robotic vehicles, and undersea as well — but they're still going to have people making that decision. What they've said so far is that they plan to keep people in charge. But we'll see what others do, and that's where the Pentagon hedges a bit. It's like: look, if Russia or China take people out of the loop, and that's faster and that's better, we may have to respond in kind.

Well, that's — I was going to bring that up. I mean, the Chinese do not seem to have the same ethical considerations. We do a lot of handwringing here in the Western world about human rights, human ethics — you know, genetic testing, cloning, human experimentation, what's right, what's wrong. It doesn't seem like some foreign countries are quite as concerned with some of these notions as we are. I think we talked about that, like, last show.

That's very true. I mean, it's certainly true that the competitors we really care about don't see these ethical issues in the same light. The Russians have talked about trying to build a fully roboticized unit someday that's conducting independent operations. The Chinese statements on this issue have been much murkier and kind of confusing, but it's pretty clear that they don't see these ethical issues the same way. What they do care about is control. They care about control over their own weapons, and particularly in Chinese culture and the Chinese authoritarian system, they don't want people freelancing things. I think there's a lot of speculation about what that means for automation. I've read some people say, well, they're going to automate things because they don't want their own people to be in charge.

Yeah — a sense of autonomy is contradictory to the communist system.

Totally right. They don't want anyone to have autonomy; all these decisions should go right up to the senior leadership. So I think it's TBD how they actually implement it. But if they hold back, it won't be for ethical reasons — it's about authoritarian control.

That's interesting, and it'll be interesting to see how that evolves.
Yeah. And you know, a colleague of mine, Elsa Kania, at the Center for a New American Security, published a great report last year on Chinese military views on artificial intelligence, doing a bunch of research looking at actual original-language Chinese documents and what they're doing. Her report is called Battlefield Singularity, and some Chinese scholars have used that term. They've talked about evolving toward some point in time where the pace of action on the battlefield eclipses the speed of human reaction — where you reach a "battlefield singularity" and everything is automated. But what we see when we look at the Chinese documents is that in a lot of their actual concepts there's a lot of mirroring. There's a lot of them reading our stuff, and then they're like, "Oh, that's a good idea," and they do it — maybe with a twist, but they're doing similar things.

Their doctrine is to take Western technology but strip the Western values out of it and use it to further the Chinese government's agenda.

Right, right. It's not as original as you might think.

Yeah. Well, I mean, there are also questions about — when you have a society like that, that learns by rote and that doesn't value autonomy, how original are they going to be able to get on their own? I mean, I think there's no question in my mind that one of the long-standing advantages of the American military has been the freedom we give junior officers and NCOs, and that sort of mission-level command. I remember being at basic training, and this captain comes in and says, "Do you know why we won World War Two?" And it's like — well, we had more people and more tanks. And really, frankly, seventy percent of German casualties happened on the Eastern Front, so the Soviets did a lot of the heavy lifting there.
But he's like, "The one thing — we won it because of the sand table." Right? The sand table. His argument — which seems a little far-fetched — but, you know, you've got the sand table. For people who aren't familiar: they lay this table out and brief everybody on the mission. You can see the terrain map of the mission, with the little colored pieces of yarn marking the ranges, the little green army men: we're going to march up this hill here. The idea was a visual way to brief everybody in the unit on what the plan is, so that if all the leadership gets wiped out, you can handle situations where there's, like, one guy left — some private — and he can still carry out the mission. And in the war there were instances where things like that happened, like Pointe du Hoc, where the unit basically got wiped out. So that's something that I think is a major advantage of the United States: we have really educated people, and we give them a lot of autonomy. And it's very clear that while AI is powerful, it can't do what humans do. So a military that still uses AI, but retains humans in these tactical decisions, is, I think, going to be optimally suited to win on the battlefield.

Well, what are some of the upcoming technologies that you're excited about, seeing how they develop one way or the other?

Yeah. Some of the things I walk through in the book are different robotic vehicles, different missiles. You know, when you think about autonomous weapons or military robots, you tend to envision a vehicle or an aircraft or a robot. But a lot of the things that are happening that are pretty amazing are just better, smarter missiles. Physically the system might look exactly the same — it looks like an existing missile — but the brains inside are getting smarter. We've got a missile today, the Long Range Anti-Ship Missile, the LRASM, that can autonomously change its route on the way to its target. So if it's moving toward the ship it's targeted at and there's what's called a pop-up threat on the way, it can reroute around all that. We're also seeing swarming missiles that can talk to each other. I think there are tremendous advantages in swarming combat. I open the book with a scene out at the Naval Postgraduate School, where they're doing swarm research, and they've got a ten-versus-ten aerial dogfight of swarms. They're trying to figure out the tactics to fight swarm versus swarm, because nobody's ever done that before.

How do you do that? If you've got these autonomous things and you're watching it, you have no idea what's going on, really. You can't tell who's fighting who.

Yeah — it's pretty remarkable, and it's all fast, so it's all automated. In that scene, all the human does is push the button and say go, and then the swarms are all fighting each other. They actually do physical experiments using these really cheap drones that they built — they're styrofoam drones, but they're totally autonomous — and they've got them up in the air dogfighting each other. It's just this furball of aerial combat, they're swirling around each other, and you see it on the computer screen — they're just racking up kills against each other.
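A toy sketch of that swarm-versus-swarm experiment: two teams of autonomous agents, each simply chasing its nearest opponent, with mutual attrition when they close. The team sizes, speeds, and kill radius are invented, and the only "human" input is the go at the start:

```python
import math, random
random.seed(1)

def nearest(agent, enemies):
    return min(enemies, key=lambda e: math.dist(agent, e))

def step(team, enemies, speed=1.0):
    """Every agent autonomously moves toward its nearest opponent."""
    moved = []
    for ax, ay in team:
        ex, ey = nearest((ax, ay), enemies)
        d = math.dist((ax, ay), (ex, ey)) or 1e-9
        moved.append((ax + speed * (ex - ax) / d, ay + speed * (ey - ay) / d))
    return moved

def attrition(team_a, team_b, radius=1.5):
    """Any opposing pair inside the kill radius takes each other out."""
    a = [p for p in team_a if all(math.dist(p, q) > radius for q in team_b)]
    b = [q for q in team_b if all(math.dist(p, q) > radius for p in team_a)]
    return a, b

blue = [(random.uniform(0, 50), random.uniform(0, 50)) for _ in range(10)]
red = [(random.uniform(50, 100), random.uniform(0, 50)) for _ in range(10)]
# The human pushes the button and says go; everything below is autonomous.
while blue and red:
    blue, red = step(blue, red), step(red, blue)
    blue, red = attrition(blue, red)
print("survivors -> blue:", len(blue), "red:", len(red))
```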
But that guy pressing the go button — that's the human involvement? Is that how we're going to bless off on this and say, "Oh, it's okay, a human is in the mix"? Because our machines come back to us and say, "Hey, Jack, is it okay if I nuke that target?" and I look at it for, like, five seconds, like a digital readout — "Yeah, yeah, looks good."

I mean, that's one of the things the book tries to grapple with. So one vision is something like you described: the machines make all of these recommendations, and the human is the final approval authority. There's a lot of appeal to that — you get a lot of the advantages of automation, but the human still ultimately has the final decision. One of the things we've seen, though, is that sometimes that isn't enough, if the human surrenders their judgment to the machine. This is an effect called automation bias: we trust the automation. You see things like the accidents with these Tesla cars now — two fatalities — where people were trusting the automation more than they should. It's a common problem. There were two incidents in 2003, during the Iraq invasion, where the Army's Patriot air defense system shot down two friendly aircraft — it shot down a British Tornado and a Navy F/A-18 and killed the pilots in both instances. And humans were in the loop; humans pushed the buttons on those things. But afterwards, the investigators looked at this and talked to the operators in the community, and what they said was that their view was that if they overruled the automation and someone was killed, they would be in more trouble than if they just let it go through and something bad happened.

Because it's like, you ignored your equipment.

Exactly right. So it wasn't just the technology — we had created a whole culture of automation bias.

Wasn't there that famous case of the — it was a Soviet radar operator, right? It was like '82 or something like that?

Yeah. So I open the book with the story of Stanislav Petrov. He's a Soviet colonel.
He's sitting in a bunker outside Moscow in 1983, and his job, basically, is the night watch on these missile early-warning systems at the height of the Cold War, waiting to see if the Americans are going to launch a surprise nuclear attack. And the system goes off — missile launch. It says the U.S. has just launched a nuclear ICBM at the Soviet Union. He's like, what's this about? And then bang, bang, bang — four more launches. So the system says there are five missiles inbound.

I remember hearing about that, yeah.

And it's flashing these warnings — missile launch — and the time on target is something like twenty minutes, so he's only got a few minutes to make a decision if the Soviet leadership is going to be able to respond. He's got to make a phone call really quick and say, "Hey, missiles are coming," and pass it up the chain, because they've got to get their version of the nuclear suitcase and make a decision — the fear being that the U.S. might be launching a decapitating strike to take out the Soviet leadership. But he also knows that this is a new system they've just deployed, and it's a little bit janky — maybe it's just not working right. He said later that he thought it was fifty-fifty, a coin flip, whether it was real or not. But the system is blaring, it's got a big flashing red light: missiles inbound. Now, at a certain point they should expect to see the missiles coming over the horizon on the Soviet early-warning radars, so he calls the radar operators, and they're like, "Yeah, we don't see anything." So he makes just a gut call. He's like, "I don't think this makes sense — why would the Americans launch only five missiles?" And he didn't trust the system.

He didn't trust it. And removed from the system, as a human being, he had that political awareness that this doesn't really make sense.

That's right. So he brings to bear this context where he thinks: why would the Americans launch only five missiles? If you're going to launch a first strike, you launch them all, right? So he makes a gut check, he says the system is malfunctioning — and obviously he was right. So he, you know, saved us all from a potential nuclear war. But it raises this question of what a machine would have done.

By the way, James Powell talked about that on a recent show, because he just died recently — that guy died very recently.

And so, you know, what would a machine have done? Whatever it was programmed to do. A machine would have launched the nukes, right, if we had told it to — because that's what the machine was saying: this is an actual launch. So that's something we probably don't want to give up: this ability of people to see the bigger picture.
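Petrov's reasoning can be read as a release gate that the automated warning alone can never satisfy: an independent sensor has to corroborate, the pattern has to be plausible, and a human keeps the veto. A minimal sketch, with hypothetical names and thresholds:

```python
def corroborated(warning, radar_contacts):
    """Cross-check the new (unproven) satellite warning against radar."""
    return warning["missiles_inbound"] > 0 and radar_contacts > 0

def plausible(warning):
    """Context the machine was never given: a real first strike is not 5 missiles."""
    return warning["missiles_inbound"] >= 100  # hypothetical sanity threshold

def decide(warning, radar_contacts, human_approves):
    if not corroborated(warning, radar_contacts):
        return "stand down: independent sensor does not confirm"
    if not plausible(warning):
        return "stand down: pattern inconsistent with an actual attack"
    return "escalate" if human_approves() else "stand down: human veto"

alert = {"missiles_inbound": 5}
print(decide(alert, radar_contacts=0, human_approves=lambda: False))
# -> "stand down: independent sensor does not confirm"
```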
So this is a little bit of a more far-out-there question, beyond the scope of your book, I think, but I want to get into these conceptual ideas, because I think they're interesting. I notice how in science fiction — you know, in Isaac Asimov, in all the movies that you and I grew up watching — a lot of it is about the human relationship with our machines. Like in 2001 — movies like that, films like that, where it's us interacting with our robots, with our artificial intelligences. But I notice now, when I watch movies — like the new Blade Runner film, or Alien: Covenant — what they're actually about is the relationship between our AIs. Humans are no longer the protagonists; we're in the background, and it's about our robots talking to each other, how they feel about each other, what they think about each other. What do you make of that? It seems like, subconsciously, we've come to this conclusion already — like, this is going to happen. Our relationship with our robots is passé now; now it's really about the relationship between the robots.

Yeah. I think one of the things that I like, when I see it in science fiction, is this idea of machine intelligence as different from humans. And I think you see this more and more. I think Ex Machina does it very well, actually — not envisioning AI as replacing us or being human-like, but envisioning something that might look like us, might talk like us, but thinks fundamentally differently from us. And that's actually what we're seeing in a lot of the technology.

You know what I thought was interesting about Ex Machina, though? They develop an AI, and we have all these thoughts about what AI is and isn't — but at the end of the day, the AI in that film had a very human desire for basic freedom, and acted on it.

That's true, that's true. So, I mean, what's fascinating to me when I look at the actual technology we're building is that we don't really understand intelligence. We don't really know how the human brain works; we don't understand that much about intelligence. We have one data point for what we think intelligence is, which is the human mind.
And so what what people 541 00:27:16,440 --> 00:27:18,679 Speaker 1: have actually be doing in the field of AI is 542 00:27:18,720 --> 00:27:21,760 Speaker 1: they take hard problems, things that seem to us like 543 00:27:21,800 --> 00:27:24,720 Speaker 1: things that require intelligence. I will, playing chess, we would say, 544 00:27:24,720 --> 00:27:28,040 Speaker 1: among humans, like punching numbers. Punching numbers, right, we said, well, 545 00:27:28,040 --> 00:27:29,560 Speaker 1: that's hard for humans to do. You know, if you're 546 00:27:29,560 --> 00:27:31,800 Speaker 1: a chess grand master, you're a smart person. Let's see. 547 00:27:31,920 --> 00:27:34,720 Speaker 1: And and the decades ago, the view was if we 548 00:27:34,720 --> 00:27:37,320 Speaker 1: could build a machine that could beat humans at chess, 549 00:27:37,560 --> 00:27:40,280 Speaker 1: we will have solved AI. Then it turns out to 550 00:27:40,280 --> 00:27:43,080 Speaker 1: be not that hard because the one of the things 551 00:27:43,080 --> 00:27:45,640 Speaker 1: we keep finding is that things that are easy for 552 00:27:45,760 --> 00:27:48,720 Speaker 1: humans are really hard for machines, and things are hard 553 00:27:48,720 --> 00:27:51,160 Speaker 1: for machines are actually kind of easy for for people 554 00:27:51,200 --> 00:27:54,320 Speaker 1: to do. Um, you know, going into a house and 555 00:27:54,359 --> 00:27:56,879 Speaker 1: make on a pot of coffee or folding laundry is 556 00:27:56,920 --> 00:27:59,240 Speaker 1: like really hard for machines. This is great videos online 557 00:27:59,359 --> 00:28:01,600 Speaker 1: like machines try to like about trying to fold laundry 558 00:28:01,600 --> 00:28:05,159 Speaker 1: and it's just like a t shirts just killing him. Um. 559 00:28:05,240 --> 00:28:08,800 Speaker 1: So so to me, the lesson is that as we're 560 00:28:08,800 --> 00:28:11,280 Speaker 1: building these systems, they don't think like us. They're very 561 00:28:11,359 --> 00:28:13,720 Speaker 1: different and a lot of the models that we have 562 00:28:13,880 --> 00:28:17,399 Speaker 1: for thinking about intelligence, um don't don't work well. We 563 00:28:17,400 --> 00:28:19,959 Speaker 1: tan their anthem morphies the machines and there's a lot 564 00:28:20,000 --> 00:28:23,439 Speaker 1: of risks in that. Actually, yeah, we think that the 565 00:28:23,480 --> 00:28:26,240 Speaker 1: machine is going to have human desires or are we 566 00:28:26,320 --> 00:28:28,520 Speaker 1: just so what happens sometimes is, you know, we'll see 567 00:28:28,520 --> 00:28:32,240 Speaker 1: that it's better than humans at one thing, and then 568 00:28:32,240 --> 00:28:35,440 Speaker 1: we extrapolate it to everything else, um, and then we're 569 00:28:35,480 --> 00:28:38,000 Speaker 1: surprised when the machine falls apart all of a sudden. 570 00:28:38,080 --> 00:28:39,720 Speaker 1: So we might get into a self driving car and 571 00:28:39,760 --> 00:28:41,760 Speaker 1: you're like, oh, the car drives better than humans in 572 00:28:41,800 --> 00:28:44,400 Speaker 1: some settings. Yes, it may and then there might be 573 00:28:44,440 --> 00:28:47,480 Speaker 1: another setting where it goes incredibly dumb all of a 574 00:28:47,480 --> 00:28:50,280 Speaker 1: sudden and it just falls apart, and we're not prepared 575 00:28:50,320 --> 00:28:53,479 Speaker 1: for that because because the machine sort of thinks differently 576 00:28:53,480 --> 00:28:56,560 Speaker 1: than us. 
I think this stuff is endlessly interesting. I mean, 577 00:28:56,600 --> 00:28:59,080 Speaker 1: how do you see the future of the human-machine 578 00:28:59,080 --> 00:29:03,160 Speaker 1: interface, then? Is it like, uh, working in tandem, like 579 00:29:03,240 --> 00:29:06,560 Speaker 1: the way humans worked with domesticated animals thousands of years ago? 580 00:29:06,800 --> 00:29:08,960 Speaker 1: Or, I mean, how is that going to shake out? Yeah, 581 00:29:09,720 --> 00:29:11,480 Speaker 1: the vision that sort of a lot of 582 00:29:11,480 --> 00:29:13,880 Speaker 1: people in the defense community have migrated towards in the 583 00:29:13,920 --> 00:29:17,480 Speaker 1: past couple of years is one of a centaur model. 584 00:29:17,800 --> 00:29:20,120 Speaker 1: So like the old mythical creature, you know, 585 00:29:20,120 --> 00:29:22,480 Speaker 1: half horse, half human, this sort of this idea of 586 00:29:22,480 --> 00:29:27,160 Speaker 1: a centaur warfighter, that's humans and machines working together. Um, 587 00:29:27,280 --> 00:29:29,480 Speaker 1: what that looks like, I think, is gonna keep evolving 588 00:29:29,520 --> 00:29:31,800 Speaker 1: over time. But the idea being that you incorporate 589 00:29:31,880 --> 00:29:35,520 Speaker 1: lots of automation, lots of machine cognition, using the machine 590 00:29:35,560 --> 00:29:37,760 Speaker 1: where it's good and the human where they're good, blending 591 00:29:37,800 --> 00:29:40,960 Speaker 1: these kinds of things together. Um, I would compare it 592 00:29:41,040 --> 00:29:43,880 Speaker 1: to, you know, we are in an era of warfare 593 00:29:43,920 --> 00:29:49,440 Speaker 1: already where humans work closely with mechanical machines. Um, it's 594 00:29:49,480 --> 00:29:52,200 Speaker 1: a little bit different, like, on the ground, um, you 595 00:29:52,240 --> 00:29:54,600 Speaker 1: know, where we were at, or people in the SOF community, 596 00:29:54,600 --> 00:29:57,000 Speaker 1: where, you know, you're running around on your own two legs. 597 00:29:57,240 --> 00:29:59,760 Speaker 1: But in other domains of warfare, you know, in air, 598 00:30:00,160 --> 00:30:03,760 Speaker 1: sea, or undersea, you cannot fight without the machine. You 599 00:30:03,760 --> 00:30:05,680 Speaker 1: can't fight in the air without the machine, or underwater. 600 00:30:05,720 --> 00:30:08,960 Speaker 1: It's not possible. Um, and so we have very close 601 00:30:09,280 --> 00:30:13,680 Speaker 1: human-machine integration, but they're, like, physical, mechanical objects. Even 602 00:30:13,760 --> 00:30:15,800 Speaker 1: for the, you know, the soldier on the ground, I mean, 603 00:30:16,200 --> 00:30:19,880 Speaker 1: the M4 rifle, the Humvee, I mean, these are 604 00:30:19,880 --> 00:30:22,520 Speaker 1: all machines that have to work in tandem with human beings. 605 00:30:22,640 --> 00:30:25,080 Speaker 1: That's right. Right, like, your integration with 606 00:30:25,160 --> 00:30:28,160 Speaker 1: your rifle is really intimate. You gotta know how to 607 00:30:28,280 --> 00:30:30,320 Speaker 1: use it, how to be effective, how to clear stoppages, 608 00:30:30,400 --> 00:30:33,760 Speaker 1: things like that, um. And I think we're gonna continue 609 00:30:33,840 --> 00:30:35,680 Speaker 1: seeing that, but it's gonna begin to happen at the 610 00:30:35,720 --> 00:30:38,800 Speaker 1: cognitive level.
So instead of just, like, physical machines: machines 611 00:30:38,800 --> 00:30:41,240 Speaker 1: that are making decisions, that are sensing the environment, that 612 00:30:41,280 --> 00:30:43,720 Speaker 1: are making recommendations to people, but that would still be doing 613 00:30:43,720 --> 00:30:47,400 Speaker 1: some things automatically. There might be some automated responses. You're talking 614 00:30:47,400 --> 00:30:50,000 Speaker 1: about, like, augmented reality? Having some sort 615 00:30:50,000 --> 00:30:53,080 Speaker 1: of heads-up display? Might be, might be a heads-up display. Um, 616 00:30:53,080 --> 00:30:56,400 Speaker 1: there's some incredible research that just came out, um, where 617 00:30:56,400 --> 00:31:00,120 Speaker 1: people are using AI systems to basically put EEG 618 00:31:00,280 --> 00:31:03,840 Speaker 1: sensors on your head and read brain waves and then 619 00:31:03,920 --> 00:31:06,640 Speaker 1: actually, like, do crude interpretations of what people are thinking. 620 00:31:07,640 --> 00:31:11,360 Speaker 1: So you can imagine integrating that into systems where you 621 00:31:11,400 --> 00:31:13,600 Speaker 1: can, like, have machines that react at the speed of thought. 622 00:31:13,920 --> 00:31:16,200 Speaker 1: So instead of me having to, like, push a button 623 00:31:16,560 --> 00:31:19,320 Speaker 1: to fire a missile or do something, the machine reads 624 00:31:19,360 --> 00:31:22,000 Speaker 1: my thoughts and automatically responds. When you hear this 625 00:31:22,080 --> 00:31:24,560 Speaker 1: type of thing, it really does become hard to decipher 626 00:31:24,600 --> 00:31:26,960 Speaker 1: between, like, what really is going on and what is 627 00:31:27,000 --> 00:31:31,240 Speaker 1: conspiracy nut-job stuff, because some of this sounds, like, 628 00:31:31,280 --> 00:31:34,640 Speaker 1: borderline what Alex Jones talks about, like MK Ultra stuff. 629 00:31:34,760 --> 00:31:37,000 Speaker 1: And then you think, like, what's real and what's not? 630 00:31:37,080 --> 00:31:40,160 Speaker 1: Because the technology is advancing so fast before our eyes. 631 00:31:40,480 --> 00:31:42,280 Speaker 1: It is. It's remarkable. And so, I work in this 632 00:31:42,280 --> 00:31:45,160 Speaker 1: space, we run this technology program at the Center 633 00:31:45,160 --> 00:31:47,880 Speaker 1: for a New American Security, and I'm continually amazed by how 634 00:31:47,960 --> 00:31:50,160 Speaker 1: fast things are moving. I will often be in situations where 635 00:31:50,160 --> 00:31:52,040 Speaker 1: I think to myself, like, I'll bet in the next 636 00:31:52,040 --> 00:31:55,160 Speaker 1: twelve months we'll see X, right? We'll see a drone 637 00:31:55,160 --> 00:31:57,360 Speaker 1: that can fully autonomously map a building or something, 638 00:31:57,640 --> 00:31:59,880 Speaker 1: or it can do obstacle avoidance on its own. And 639 00:32:00,000 --> 00:32:01,800 Speaker 1: it'll be like, let me just look online, and I'll 640 00:32:01,840 --> 00:32:03,480 Speaker 1: Google it, and I'm like, oh yeah, you can buy that 641 00:32:03,560 --> 00:32:05,000 Speaker 1: for a couple hundred bucks. It's on the market. 642 00:32:05,120 --> 00:32:07,160 Speaker 1: It happened last month. And it blows me away, and I 643 00:32:07,560 --> 00:32:10,480 Speaker 1: keep being surprised by this.
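For a concrete sense of the EEG research Paul mentions above, here is a minimal sketch of how a crude brain-wave decoder is typically built. Nothing in it comes from the show; the sampling rate, frequency band, and the band_power helper are illustrative assumptions, and real systems differ mainly in scale and signal quality.

```python
# Toy sketch of "crude interpretation" of brain waves: summarize each
# EEG window as band-power features, then train an ordinary classifier.
# All names and parameters here are illustrative assumptions.
import numpy as np
from scipy.signal import welch
from sklearn.linear_model import LogisticRegression

def band_power(window, fs=256, band=(8.0, 12.0)):
    """Average spectral power per channel in one frequency band."""
    freqs, psd = welch(window, fs=fs)          # window: (n_channels, n_samples)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[..., mask].mean(axis=-1)        # -> (n_channels,)

def train_decoder(windows, labels):
    """windows: (n_trials, n_channels, n_samples); labels: (n_trials,)."""
    features = np.stack([band_power(w) for w in windows])
    return LogisticRegression(max_iter=1000).fit(features, labels)
```

The point of the sketch is that the pipeline itself is ordinary machine learning; what makes the research Paul describes striking is doing this well enough, and fast enough, to act on.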
I think people who are 644 00:32:10,640 --> 00:32:13,400 Speaker 1: our age, and we're probably the last generation, really, like, 645 00:32:13,440 --> 00:32:15,880 Speaker 1: we grew up in the analog era. We still had 646 00:32:15,880 --> 00:32:17,960 Speaker 1: cassette tapes and things like that, and then we grew 647 00:32:18,080 --> 00:32:22,240 Speaker 1: up, the Internet came around, and things went digital. Um, 648 00:32:22,280 --> 00:32:25,680 Speaker 1: I think, like, our kids, everything's always been digital 649 00:32:25,760 --> 00:32:29,400 Speaker 1: for them. Do you think that upcoming generations will have the 650 00:32:29,400 --> 00:32:32,360 Speaker 1: same hesitation that maybe you and I might have of 651 00:32:32,560 --> 00:32:36,000 Speaker 1: not trusting the machine? You know, the Soviet radar operator 652 00:32:36,080 --> 00:32:39,520 Speaker 1: that's like, I don't know about this, you know. I 653 00:32:39,560 --> 00:32:41,120 Speaker 1: don't know. I think, I mean, we're gonna have to 654 00:32:41,160 --> 00:32:45,000 Speaker 1: develop new intuitions for these kinds of machines and how 655 00:32:45,080 --> 00:32:48,640 Speaker 1: they think. One of the things that's different 656 00:32:48,840 --> 00:32:51,959 Speaker 1: is they have different kinds of failure modes. Um, so, 657 00:32:52,040 --> 00:32:55,240 Speaker 1: for example, these machine learning systems that do object classification, 658 00:32:55,440 --> 00:32:58,240 Speaker 1: that do better than humans, they're very, very good, but 659 00:32:58,320 --> 00:33:01,200 Speaker 1: they're vulnerable to a form of spoofing attack, where they 660 00:33:01,200 --> 00:33:04,920 Speaker 1: can be fed manipulated images that can trick the machine 661 00:33:05,280 --> 00:33:08,320 Speaker 1: into believing something with a high degree of confidence. And so, 662 00:33:08,360 --> 00:33:11,480 Speaker 1: for example, um, a group of researchers made a 3D- 663 00:33:11,600 --> 00:33:14,240 Speaker 1: printed turtle. So it looks like a little 664 00:33:14,240 --> 00:33:16,560 Speaker 1: plastic turtle, right, to a human, and we would all 665 00:33:16,560 --> 00:33:19,760 Speaker 1: agree that's a turtle. They've embedded little tiny 666 00:33:19,920 --> 00:33:23,160 Speaker 1: swirl images that are very tailored into the shell of 667 00:33:23,200 --> 00:33:26,160 Speaker 1: the turtle, so that when a machine looks at this, it 668 00:33:26,240 --> 00:33:32,000 Speaker 1: says it's a rifle, with high confidence. So that's a place where 669 00:33:32,040 --> 00:33:35,000 Speaker 1: it's weird and counterintuitive to many people that that's 670 00:33:34,840 --> 00:33:36,880 Speaker 1: a thing. I do think over 671 00:33:36,960 --> 00:33:41,000 Speaker 1: time we'll begin to develop better intuitions for those failure modes, 672 00:33:41,040 --> 00:33:43,320 Speaker 1: just like people have good intuitions today for, like, mechanical 673 00:33:43,360 --> 00:33:45,800 Speaker 1: objects, where you've got a good intuitive feel for 674 00:33:46,080 --> 00:33:48,800 Speaker 1: a Humvee or a tank or something else. Um, we're 675 00:33:48,840 --> 00:33:51,360 Speaker 1: going to develop those for these cognitive machines.
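For listeners who want the flavor of the "form of spoofing attack" described above: the turtle result comes from the adversarial-examples line of research, and the simplest version of the idea, the fast gradient sign method, fits in a few lines. This is a hedged sketch, not anything from the book or the show; the model choice and the epsilon value are illustrative assumptions.

```python
# Minimal sketch of a gradient-based image "spoofing" attack (FGSM),
# assuming PyTorch and any pretrained classifier. Model and epsilon
# are illustrative assumptions.
import torch
import torch.nn.functional as F
import torchvision.models as models

model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT).eval()

def fgsm(image, label, epsilon=0.01):
    """Return a copy of `image` nudged just enough to fool the model.

    image: (1, 3, H, W) tensor scaled to [0, 1]; label: (1,) true class id.
    """
    image = image.clone().requires_grad_(True)
    loss = F.cross_entropy(model(image), label)
    loss.backward()
    # Step every pixel slightly in the direction that most increases the loss.
    adversarial = image + epsilon * image.grad.sign()
    return adversarial.clamp(0, 1).detach()
```

The perturbation is small enough that a person still sees the original picture while the classifier's answer flips, which is exactly the turtle-versus-rifle failure mode; the physical turtle used a more robust variant of the same idea.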
But it 676 00:33:51,400 --> 00:33:53,720 Speaker 1: seems that one of the big problems, and something that's 677 00:33:53,720 --> 00:33:56,640 Speaker 1: been talked about a lot lately, is the notion of 678 00:33:56,720 --> 00:33:59,840 Speaker 1: uncertainty and how much uncertainty there is on the battlefield today. 679 00:34:00,240 --> 00:34:03,880 Speaker 1: And if you receive feedback from the sensors on your machine, 680 00:34:04,440 --> 00:34:07,840 Speaker 1: how do you know they're authentic? Um, when your machine, 681 00:34:07,880 --> 00:34:09,840 Speaker 1: when you're telling the machine, when you're pressing the button 682 00:34:09,840 --> 00:34:11,760 Speaker 1: telling the machine to do something, how does the machine 683 00:34:11,760 --> 00:34:15,040 Speaker 1: know that it's really getting that from you? How do 684 00:34:15,080 --> 00:34:17,000 Speaker 1: you think they're going to grapple with these questions when 685 00:34:17,040 --> 00:34:20,120 Speaker 1: we operate in such an uncertain battle space, where 686 00:34:20,120 --> 00:34:23,320 Speaker 1: we have adversaries who are manipulating the electronic environment? So 687 00:34:23,600 --> 00:34:26,800 Speaker 1: I think you have, like, these two, um, pressures for and 688 00:34:26,800 --> 00:34:30,160 Speaker 1: against automation in warfare that kind of run counter to each other. 689 00:34:30,400 --> 00:34:35,080 Speaker 1: So one is an advantage in speed. Speed, Sun Tzu, right? 690 00:34:35,160 --> 00:34:37,640 Speaker 1: Speed kills. Like, speed is a huge advantage in war. 691 00:34:37,719 --> 00:34:39,560 Speaker 1: Speed is the essence of war, is what Sun Tzu says. 692 00:34:39,719 --> 00:34:42,560 Speaker 1: So speed is, it's a huge advantage. Automation is very 693 00:34:42,600 --> 00:34:45,600 Speaker 1: good at quick reaction times, so if we can react faster, 694 00:34:45,640 --> 00:34:47,360 Speaker 1: those are places where you might want to use automation. 695 00:34:47,640 --> 00:34:50,040 Speaker 1: The flip side of that is that war is chaotic, 696 00:34:50,280 --> 00:34:53,160 Speaker 1: it's confusing, it's unpredictable. You've got an adversary that's trying 697 00:34:53,200 --> 00:34:56,920 Speaker 1: to do creative things. Machines are terrible in that situation, 698 00:34:56,920 --> 00:34:59,960 Speaker 1: that kind of environment. And we can do automation very 699 00:35:00,080 --> 00:35:02,759 Speaker 1: well in places where we can test it. You can 700 00:35:02,800 --> 00:35:04,680 Speaker 1: take cars out on the road. You can drive cars 701 00:35:04,680 --> 00:35:07,800 Speaker 1: on the road every single day and do millions of miles, 702 00:35:08,120 --> 00:35:10,680 Speaker 1: and then you can build up, you can increase reliability. 703 00:35:10,880 --> 00:35:13,120 Speaker 1: In warfare, you never actually get to do that. You 704 00:35:13,120 --> 00:35:15,160 Speaker 1: can train all you want, and then you get to 705 00:35:15,200 --> 00:35:17,080 Speaker 1: war and it's always going to be different. You're not 706 00:35:17,120 --> 00:35:19,720 Speaker 1: gonna get to go practice against the enemy for thirty 707 00:35:19,800 --> 00:35:23,000 Speaker 1: years before the war starts. So that's where, 708 00:35:23,040 --> 00:35:25,759 Speaker 1: you know, we tell humans things like, training is just 709 00:35:25,840 --> 00:35:28,200 Speaker 1: a foundation for what you're gonna have to do.
Exactly. 710 00:35:28,200 --> 00:35:31,279 Speaker 1: People don't go into war and just, you know, slavishly 711 00:35:31,360 --> 00:35:35,440 Speaker 1: follow the training that you were given. Um, you're gonna 712 00:35:35,440 --> 00:35:37,400 Speaker 1: have to adjust your tactics. You have to be creative. 713 00:35:37,800 --> 00:35:40,520 Speaker 1: No plan survives first contact with the enemy. Machines 714 00:35:40,560 --> 00:35:43,359 Speaker 1: can't do that kind of adaptability, and so I think 715 00:35:43,400 --> 00:35:45,960 Speaker 1: that the challenge is trying to figure out ways to 716 00:35:46,000 --> 00:35:50,040 Speaker 1: integrate both humans and machines effectively, the centaur model. And 717 00:35:50,120 --> 00:35:52,600 Speaker 1: my suspicion is that the military that gets that right, 718 00:35:52,960 --> 00:35:55,160 Speaker 1: that figures out how to blend those together, will have 719 00:35:55,200 --> 00:35:59,160 Speaker 1: tremendous advantages in future wars. I'm a little skeptical about us, 720 00:35:59,480 --> 00:36:02,840 Speaker 1: at least from my perspective, looking at the infantry. 721 00:36:03,120 --> 00:36:05,880 Speaker 1: Remember the Land Warrior program? It's been around since 722 00:36:05,920 --> 00:36:11,920 Speaker 1: about... there have been like three generations of it. I mean, 723 00:36:12,040 --> 00:36:14,400 Speaker 1: so you've got a confluence of problems. One is the 724 00:36:14,560 --> 00:36:17,120 Speaker 1: Army has not had a successful modernization program since the 725 00:36:17,120 --> 00:36:21,000 Speaker 1: Stryker, um. The Army has huge modernization challenges as a whole. 726 00:36:21,440 --> 00:36:24,960 Speaker 1: Um, it feeds into broader DoD problems, which is, the 727 00:36:25,520 --> 00:36:30,320 Speaker 1: Pentagon gets fixated by next-gen technology whiz-bang. I remember 728 00:36:30,360 --> 00:36:33,720 Speaker 1: you telling me, this is like two thousand five, saying 729 00:36:33,840 --> 00:36:35,919 Speaker 1: something like, we don't need all this high-tech stuff, 730 00:36:35,960 --> 00:36:38,280 Speaker 1: we just need better rucksacks, better boots. It's true, 731 00:36:38,360 --> 00:36:41,600 Speaker 1: right? At the time, like, like, let's just give infantry 732 00:36:41,640 --> 00:36:43,840 Speaker 1: guys better boots. Now, eventually we did get better boots. 733 00:36:44,120 --> 00:36:47,160 Speaker 1: But, but, you know, people, I think, especially people in 734 00:36:47,239 --> 00:36:49,560 Speaker 1: ground combat, where it's messy and it's dirty and things 735 00:36:49,560 --> 00:36:52,720 Speaker 1: are gonna jam and things are gonna break, want reliability and robustness. 736 00:36:52,760 --> 00:36:55,920 Speaker 1: They don't want some fancy gadget with wires and shit 737 00:36:56,000 --> 00:36:58,279 Speaker 1: coming out of it. That's right. And, but it's also 738 00:36:58,400 --> 00:37:00,640 Speaker 1: compounded by the fact that, for the infantry soldier, 739 00:37:01,160 --> 00:37:04,520 Speaker 1: all this stuff adds weight. And I think the fundamental 740 00:37:04,560 --> 00:37:07,640 Speaker 1: difference between what we've seen in infantry combat versus other 741 00:37:07,719 --> 00:37:10,240 Speaker 1: areas of combat, like being a fighter pilot or whatever, 742 00:37:10,680 --> 00:37:12,520 Speaker 1: is that you're limited by what you can 743 00:37:12,560 --> 00:37:14,160 Speaker 1: carry and the weight.
You go back to World War Two, 744 00:37:14,360 --> 00:37:18,239 Speaker 1: the three most dangerous jobs in the military were infantry, bombers, 745 00:37:18,520 --> 00:37:22,239 Speaker 1: and submarines. Now, we've been able to leverage modern technology to 746 00:37:22,360 --> 00:37:26,080 Speaker 1: build very advanced, survivable stealth bombers and submarines. Being in the 747 00:37:26,120 --> 00:37:28,879 Speaker 1: infantry is as dangerous as it's ever been. That's 748 00:37:28,880 --> 00:37:30,640 Speaker 1: where the bulk of the casualties are happening for 749 00:37:30,680 --> 00:37:33,879 Speaker 1: combat forces. Um, so, so why is that? I think 750 00:37:34,080 --> 00:37:36,520 Speaker 1: part of the problem is, just, everything you give somebody, 751 00:37:36,600 --> 00:37:38,640 Speaker 1: they've got to carry this stuff, and people are loaded 752 00:37:38,680 --> 00:37:41,960 Speaker 1: down with it. They're already way overloaded. Right, 753 00:37:42,080 --> 00:37:43,719 Speaker 1: and then you lose mobility. So we have 754 00:37:43,760 --> 00:37:47,000 Speaker 1: a project underway at the Center for a New American Security 755 00:37:47,040 --> 00:37:50,560 Speaker 1: called Super Soldiers. We did actually a project for the Army, 756 00:37:50,920 --> 00:37:54,080 Speaker 1: for the Army Research Laboratory, looking at emerging technologies to 757 00:37:54,080 --> 00:37:56,600 Speaker 1: try to solve this problem. Well, the Army has these 758 00:37:56,640 --> 00:38:00,680 Speaker 1: ideas about, like, giving soldiers an exoskeleton. Lockheed Martin 759 00:38:00,840 --> 00:38:04,160 Speaker 1: has, like, the fucking, you know, six-wheeled mechanical horse 760 00:38:04,239 --> 00:38:06,839 Speaker 1: that's going to carry your rucksacks. And I mean, 761 00:38:06,960 --> 00:38:09,240 Speaker 1: there's a, there's a... have you ever read that book? 762 00:38:09,280 --> 00:38:11,840 Speaker 1: It's actually mandatory reading at West Point. I 763 00:38:11,840 --> 00:38:15,320 Speaker 1: think it's called The Soldier's Load. Yes, yes. And it 764 00:38:15,680 --> 00:38:18,600 Speaker 1: breaks down how much weight our soldiers should 765 00:38:18,640 --> 00:38:21,000 Speaker 1: be carrying. Everybody has to read it, and then they 766 00:38:21,000 --> 00:38:22,880 Speaker 1: blow it off. We totally blow it off, right. And 767 00:38:22,920 --> 00:38:24,319 Speaker 1: so one of the things we walk through in our 768 00:38:24,360 --> 00:38:27,040 Speaker 1: report, um, that's coming out as part of the 769 00:38:27,080 --> 00:38:30,279 Speaker 1: Super Soldiers series this summer, is all of these... 770 00:38:30,320 --> 00:38:33,279 Speaker 1: there are all these studies over time of soldiers' loads, both 771 00:38:33,280 --> 00:38:35,160 Speaker 1: the Army and Marine Corps, and they all settle on 772 00:38:35,160 --> 00:38:37,640 Speaker 1: about fifty pounds. It's kind of like the right load. 773 00:38:37,960 --> 00:38:40,520 Speaker 1: Combat load. Combat load, right. But what do we actually load 774 00:38:40,560 --> 00:38:42,759 Speaker 1: people up with? Everybody knows. We carry a 775 00:38:43,040 --> 00:38:45,480 Speaker 1: hell of a lot more than fifty pounds. But, so, 776 00:38:45,480 --> 00:38:48,240 Speaker 1: there are some technology things, you know, the giant 777 00:38:48,320 --> 00:38:50,560 Speaker 1: Iron Man suit, it's not around the corner. I mean, it's 778 00:38:50,560 --> 00:38:53,680 Speaker 1: gonna require some more research.
We do look at exoskeletons 779 00:38:53,719 --> 00:38:56,319 Speaker 1: and exosuits, which is like a soft version of 780 00:38:56,360 --> 00:38:59,080 Speaker 1: that, that kind of helps. It's like artificial 781 00:38:59,160 --> 00:39:01,319 Speaker 1: muscles and tendons for mobility. It's a little bit more 782 00:39:01,400 --> 00:39:04,359 Speaker 1: near-term than an exoskeleton. But we also talk 783 00:39:04,400 --> 00:39:09,279 Speaker 1: about, like, policy changes, like, what if you allowed commanders... 784 00:39:09,280 --> 00:39:14,640 Speaker 1: you delegated authority to lower-level commanders to change PPE 785 00:39:14,680 --> 00:39:17,120 Speaker 1: and body armor. So if you're, like, a company commander 786 00:39:17,160 --> 00:39:19,760 Speaker 1: up in the mountains of Afghanistan, you can be like, hey, 787 00:39:19,760 --> 00:39:23,640 Speaker 1: we're not wearing plates this time. Right. We say 788 00:39:23,680 --> 00:39:26,360 Speaker 1: everything is METT-TC dependent, and then in practice 789 00:39:26,440 --> 00:39:29,280 Speaker 1: we don't give authority to people to actually adapt the equipment, 790 00:39:29,280 --> 00:39:32,160 Speaker 1: and there's all these policy letters that dictate what commanders 791 00:39:32,200 --> 00:39:34,200 Speaker 1: can do. We don't give people freedom like that. So 792 00:39:34,440 --> 00:39:37,319 Speaker 1: I think that's a huge mistake. Um, well, I also 793 00:39:37,320 --> 00:39:39,239 Speaker 1: think one of the recommendations is we set a weight 794 00:39:39,280 --> 00:39:42,160 Speaker 1: limit on individual equipment. So the Army should establish, for 795 00:39:42,160 --> 00:39:45,480 Speaker 1: personal gear, a weight limit, and then every time they 796 00:39:45,480 --> 00:39:47,440 Speaker 1: want to add a pound of something, something's got to 797 00:39:47,520 --> 00:39:50,399 Speaker 1: come off. And they actually don't have that right now. Um, 798 00:39:50,520 --> 00:39:52,520 Speaker 1: they pay attention to weight, but they don't have, like, 799 00:39:52,520 --> 00:39:54,600 Speaker 1: a number in place. So there are some things we 800 00:39:54,600 --> 00:39:57,520 Speaker 1: could do today. I think, uh, I think my last 801 00:39:57,520 --> 00:40:00,840 Speaker 1: deployment I was wearing like ninety pounds of gear. Yeah, I 802 00:40:00,840 --> 00:40:03,480 Speaker 1: mean, there's been, you know... um, there have been a 803 00:40:03,520 --> 00:40:05,080 Speaker 1: lot of studies done that show, like, your 804 00:40:05,120 --> 00:40:07,719 Speaker 1: average infantry unit is carrying, you know, when you add in 805 00:40:07,760 --> 00:40:10,080 Speaker 1: the rucksack, right... And I wasn't carrying a ruck. I 806 00:40:10,120 --> 00:40:13,080 Speaker 1: wasn't even counting that. Yeah, you're carrying eighty, like a 807 00:40:13,160 --> 00:40:15,560 Speaker 1: hundred and ten pounds, depending on your duty position, you know, 808 00:40:15,560 --> 00:40:17,960 Speaker 1: whether you're mortars or, like, a machine gun team. I 809 00:40:18,040 --> 00:40:21,000 Speaker 1: remember one mission we did this one time, and 810 00:40:21,040 --> 00:40:22,920 Speaker 1: you remember some of the guys in the sniper section. 811 00:40:22,920 --> 00:40:24,640 Speaker 1: You can imagine why we would ever do this, right? 812 00:40:25,080 --> 00:40:27,720 Speaker 1: We carried, I swear to God, a hundred and sixty 813 00:40:27,719 --> 00:40:30,839 Speaker 1: pounds up a mountain in Afghanistan.
So we only did that 814 00:40:30,960 --> 00:40:34,279 Speaker 1: once, and we were like, this is dumb, right? 815 00:40:34,560 --> 00:40:37,600 Speaker 1: And, needless to say, our 816 00:40:37,640 --> 00:40:40,000 Speaker 1: mobility was so poor that we couldn't get to the hide site we 817 00:40:40,000 --> 00:40:42,520 Speaker 1: were trying to get to. The sun comes up, we've got to 818 00:40:42,560 --> 00:40:44,960 Speaker 1: hunker down, and we get compromised right out 819 00:40:44,960 --> 00:40:46,560 Speaker 1: the gate. You know, as soon as the sun comes up, 820 00:40:46,600 --> 00:40:48,560 Speaker 1: people found us, and we ended up getting into a 821 00:40:48,560 --> 00:40:51,640 Speaker 1: firefight and had to exfil. And the 822 00:40:51,680 --> 00:40:53,839 Speaker 1: next day we were like, everything's coming out, right? 823 00:40:53,880 --> 00:40:55,600 Speaker 1: Everything's coming out of the rucksack, and we 824 00:40:55,600 --> 00:40:58,840 Speaker 1: went real light and fast. Um. But, you know, we 825 00:40:58,920 --> 00:41:00,759 Speaker 1: have this culture, part of the policies, the point of 826 00:41:00,800 --> 00:41:03,800 Speaker 1: it, it's just like, we're hard, right? Like, carry more weight. Yeah, 827 00:41:04,520 --> 00:41:06,800 Speaker 1: I mean, I've read a lot of books, um, about, 828 00:41:06,880 --> 00:41:09,480 Speaker 1: you know, the Rhodesian and the South African units that 829 00:41:09,560 --> 00:41:13,279 Speaker 1: made, like, some really long-term infiltrations, and, uh, yeah, 830 00:41:13,320 --> 00:41:15,880 Speaker 1: every time they, you know, especially the long-range 831 00:41:15,920 --> 00:41:18,600 Speaker 1: reconnaissance guys, going out with a hundred and fifty, it's 832 00:41:18,600 --> 00:41:21,960 Speaker 1: not possible. Like, you can't operate like that. The more 833 00:41:22,000 --> 00:41:24,640 Speaker 1: effective operations are where the guys are carrying thirty pounds 834 00:41:24,640 --> 00:41:28,000 Speaker 1: of gear. They're very light, mobile, parachuting into theater, 835 00:41:28,880 --> 00:41:31,600 Speaker 1: mopping up the enemy, and cruising out. Yeah, yeah, you've 836 00:41:31,600 --> 00:41:34,319 Speaker 1: gotta... you gotta value mobility. Um, you know, when 837 00:41:34,320 --> 00:41:37,279 Speaker 1: you think about survivability and effectiveness, like, protection and gear 838 00:41:37,320 --> 00:41:39,640 Speaker 1: is part of it, but part of it is mobility. It's 839 00:41:39,640 --> 00:41:41,960 Speaker 1: really key. I mean, especially, like, in these wars where 840 00:41:41,960 --> 00:41:43,440 Speaker 1: you've got our guys all armored up with all 841 00:41:43,480 --> 00:41:46,359 Speaker 1: this gear and they can't move, and they're fighting people 842 00:41:46,400 --> 00:41:48,920 Speaker 1: skipping around on the mountaintops above them in pajamas 843 00:41:48,920 --> 00:41:52,759 Speaker 1: and sandals. Yeah. Huge, it's a huge problem. Yeah. In 844 00:41:52,760 --> 00:41:54,799 Speaker 1: Iraq we also got very used to being able to 845 00:41:54,920 --> 00:41:57,920 Speaker 1: drive around in vehicles, and we'd move a hundred meters to the 846 00:41:57,960 --> 00:42:01,400 Speaker 1: target and then, you know, back to the vehicles. Um, 847 00:42:01,440 --> 00:42:03,880 Speaker 1: you know, guys in Afghanistan haven't always had that luxury.
848 00:42:04,239 --> 00:42:08,480 Speaker 1: Very different, right? Yeah. So how far off do you 849 00:42:08,520 --> 00:42:12,520 Speaker 1: think we really are, um, to having autonomous androids that 850 00:42:12,560 --> 00:42:14,920 Speaker 1: can do our door breaching for us, our room clearing 851 00:42:14,920 --> 00:42:20,120 Speaker 1: for us? When the operator is such a glorified position today, 852 00:42:20,480 --> 00:42:22,879 Speaker 1: you know, the pinnacle of the alpha male in America, 853 00:42:23,200 --> 00:42:26,640 Speaker 1: when is the operator going to get replaced? Will it happen? So, 854 00:42:26,640 --> 00:42:28,440 Speaker 1: so I think, you know, one of the ways that 855 00:42:28,440 --> 00:42:30,279 Speaker 1: people think about it sometimes is, let's say, you know, 856 00:42:30,320 --> 00:42:32,399 Speaker 1: a robot could never do my job. And I find, no 857 00:42:32,440 --> 00:42:35,040 Speaker 1: matter who you talk to in the military, everybody 858 00:42:35,080 --> 00:42:37,720 Speaker 1: thinks that, right? Everybody's like, oh, my job is important. 859 00:42:37,719 --> 00:42:40,919 Speaker 1: And the answer is probably yes, a robot could 860 00:42:41,000 --> 00:42:44,279 Speaker 1: never do everything that a human does. But that's not... 861 00:42:44,360 --> 00:42:46,600 Speaker 1: that shouldn't be the standard, right? The question is, could 862 00:42:46,600 --> 00:42:49,319 Speaker 1: we use robots to do some of the things better? Right? 863 00:42:49,520 --> 00:42:52,600 Speaker 1: So, you know, could we build, like, this android robot 864 00:42:52,719 --> 00:42:54,920 Speaker 1: that's gonna do everything a human does? No, no, no, no. 865 00:42:55,520 --> 00:43:00,960 Speaker 1: Like, uh, like Michael in Prometheus. That's the guy's name, yeah, 866 00:43:01,000 --> 00:43:03,400 Speaker 1: that Michael Fassbender plays. Yeah, I don't remember that. 867 00:43:03,440 --> 00:43:06,200 Speaker 1: I don't remember that. He's almost like a human being, 868 00:43:06,400 --> 00:43:08,040 Speaker 1: does all the things you would expect a human to. 869 00:43:08,239 --> 00:43:10,960 Speaker 1: That's, that's like, that's like light-years away, right? That's... well, 870 00:43:11,800 --> 00:43:13,440 Speaker 1: a light-year is a unit of distance, I should say, but 871 00:43:13,480 --> 00:43:16,120 Speaker 1: it's like, it's decades away at best, right? Maybe centuries. 872 00:43:16,280 --> 00:43:18,840 Speaker 1: That's really hard to do. Um, but could we be 873 00:43:18,960 --> 00:43:21,440 Speaker 1: using robots so that the first thing in the door 874 00:43:21,719 --> 00:43:25,359 Speaker 1: is not an operator catching bullets? Yeah. Right, we can 875 00:43:25,400 --> 00:43:27,759 Speaker 1: imagine a robot being able to batter down the door, right? 876 00:43:27,840 --> 00:43:29,319 Speaker 1: You could use a robot to batter down the door, 877 00:43:29,360 --> 00:43:31,640 Speaker 1: and then once you breach the door, right, why don't 878 00:43:31,640 --> 00:43:34,239 Speaker 1: I send in a quadcopter with a flashbang 879 00:43:34,239 --> 00:43:36,880 Speaker 1: on it? Now that's smart. You could be sending a swarm 880 00:43:36,920 --> 00:43:40,520 Speaker 1: of quadcopters into a building to completely map it autonomously, 881 00:43:41,200 --> 00:43:44,560 Speaker 1: to find all the people inside it, use facial recognition.
882 00:43:45,120 --> 00:43:47,400 Speaker 1: Link that up to a broader network that we have 883 00:43:47,520 --> 00:43:50,399 Speaker 1: of intel to identify all these people and their relationships, 884 00:43:50,560 --> 00:43:53,080 Speaker 1: identify all of the objects, say, here's all the weapons 885 00:43:53,080 --> 00:43:55,239 Speaker 1: they're carrying, and then a person is like, yep, 886 00:43:55,360 --> 00:43:57,440 Speaker 1: that's the guy we're going after. Frag the room. I 887 00:43:57,480 --> 00:43:59,600 Speaker 1: would say the fear is, though, we're not always going 888 00:43:59,680 --> 00:44:02,160 Speaker 1: to be at war with these countries so far behind 889 00:44:02,239 --> 00:44:05,720 Speaker 1: in technology, that if we have this technology, Russia, China, 890 00:44:05,760 --> 00:44:07,719 Speaker 1: they're gonna have the same type of technology to use 891 00:44:07,760 --> 00:44:09,880 Speaker 1: on us. They are. Yeah, and we're gonna... I mean, 892 00:44:09,920 --> 00:44:12,240 Speaker 1: we're already seeing with the Islamic State, they're using drones 893 00:44:12,280 --> 00:44:15,440 Speaker 1: against, you know, against us and our allies in 894 00:44:15,520 --> 00:44:17,719 Speaker 1: Iraq and Syria. So we've got to be thinking about counter- 895 00:44:17,760 --> 00:44:20,680 Speaker 1: measures to these things too. As much as, uh, as 896 00:44:20,800 --> 00:44:23,960 Speaker 1: cheesy as I thought this movie was, um... Don't say 897 00:44:24,040 --> 00:44:26,840 Speaker 1: Terminator 2, because... No, no, not Terminator 2. 898 00:44:27,120 --> 00:44:31,880 Speaker 1: We all loved Arnold in Terminator 2, and, uh, um, 899 00:44:31,920 --> 00:44:35,480 Speaker 1: Sarah Connor, one of our all-time favorites. But, uh, 900 00:44:35,680 --> 00:44:39,759 Speaker 1: the new RoboCop remake, and Sam Jackson. In 901 00:44:39,800 --> 00:44:42,319 Speaker 1: the beginning of the movie, he makes this point that 902 00:44:42,360 --> 00:44:45,120 Speaker 1: I thought was really important. He's saying, we have these 903 00:44:45,160 --> 00:44:50,480 Speaker 1: amazing, um, drone technologies and android technologies that we're using 904 00:44:50,520 --> 00:44:54,360 Speaker 1: to fight counterinsurgencies abroad, but back home we have crime 905 00:44:54,360 --> 00:44:57,640 Speaker 1: in our streets, our police officers gunned down and killed. 906 00:44:57,920 --> 00:45:01,000 Speaker 1: Why can't we use this amazing technology here at home? 907 00:45:02,280 --> 00:45:04,040 Speaker 1: So we've seen little bits of this, right? There 908 00:45:04,120 --> 00:45:06,439 Speaker 1: was an incident, um, last year or two years ago, 909 00:45:06,880 --> 00:45:09,640 Speaker 1: where the police took this robot and put 910 00:45:09,640 --> 00:45:12,000 Speaker 1: a bomb on it to blow up the shooter. The R2 911 00:45:12,080 --> 00:45:16,200 Speaker 1: murder bot. I thought it was awesome. So, I mean, 912 00:45:16,239 --> 00:45:18,319 Speaker 1: I think, look, one of the things 913 00:45:18,320 --> 00:45:21,080 Speaker 1: that robots can do that's really beneficial is give greater 914 00:45:21,200 --> 00:45:24,120 Speaker 1: standoff from threats. Um, we put a lot of people 915 00:45:24,120 --> 00:45:27,840 Speaker 1: in harm's way, whether it's police officers or troops in 916 00:45:27,920 --> 00:45:31,080 Speaker 1: combat, um, where they have to make these split-second 917 00:45:31,120 --> 00:45:34,560 Speaker 1: decisions, and they're in an ambiguous situation.
There's, like, a shoot, no- 918 00:45:34,600 --> 00:45:38,080 Speaker 1: shoot situation. Maybe it's a police officer, somebody's fishing around 919 00:45:38,120 --> 00:45:40,080 Speaker 1: in their pocket for something, and maybe it's 920 00:45:40,080 --> 00:45:42,400 Speaker 1: a gun, maybe it's not. Maybe you've got some soldier 921 00:45:42,400 --> 00:45:45,279 Speaker 1: standing at a checkpoint and a vehicle's approaching, or a 922 00:45:45,360 --> 00:45:47,400 Speaker 1: person, and they're like, all right, is this a threat? 923 00:45:47,400 --> 00:45:50,240 Speaker 1: Do I have to shoot? Robots can give you greater standoff. 924 00:45:50,480 --> 00:45:52,120 Speaker 1: So you can have a situation where we have a robot 925 00:45:52,160 --> 00:45:55,399 Speaker 1: there and the human's still in control. Okay, the human's making 926 00:45:55,440 --> 00:45:57,960 Speaker 1: all the decisions. But now the human can say, 927 00:45:58,160 --> 00:46:00,360 Speaker 1: you know what, I'm gonna be a little bit more cautious. 928 00:46:00,960 --> 00:46:03,400 Speaker 1: Let him shoot the robot, let him take the first shot. 929 00:46:03,600 --> 00:46:05,000 Speaker 1: I mean, that's what I brought up to you earlier. 930 00:46:05,040 --> 00:46:07,520 Speaker 1: We kind of use canines that way now. I mean, 931 00:46:07,560 --> 00:46:08,880 Speaker 1: a lot of people don't like to hear it, but 932 00:46:08,920 --> 00:46:10,680 Speaker 1: you put the dog in the house first, 933 00:46:11,280 --> 00:46:14,200 Speaker 1: and maybe the dog gets shot, hopefully not. But yeah, 934 00:46:14,239 --> 00:46:15,520 Speaker 1: and so, like, that kind of stuff is a 935 00:46:15,520 --> 00:46:17,160 Speaker 1: great job for a robot. I mean, in some ways, the 936 00:46:17,480 --> 00:46:19,839 Speaker 1: so-called TALOS project, which I support... 937 00:46:19,840 --> 00:46:22,120 Speaker 1: I think we should actually be doing more on exoskeletons. 938 00:46:22,440 --> 00:46:25,319 Speaker 1: I think that the technology is not... it's 939 00:46:25,320 --> 00:46:27,520 Speaker 1: not gonna happen tomorrow, but it's not gonna happen at 940 00:46:27,520 --> 00:46:29,200 Speaker 1: all if we don't invest in it, and there's 941 00:46:29,239 --> 00:46:32,600 Speaker 1: potential there, right? But the TALOS project is sort 942 00:46:32,600 --> 00:46:34,560 Speaker 1: of the embodiment of one of the problems with this, 943 00:46:34,600 --> 00:46:36,839 Speaker 1: which is, we had this problem with operators running into 944 00:46:36,840 --> 00:46:38,879 Speaker 1: the fatal funnel getting shot, and so, look, we're gonna 945 00:46:38,880 --> 00:46:41,200 Speaker 1: put him in his Iron Man suit. Okay, that's great, let's 946 00:46:41,200 --> 00:46:43,000 Speaker 1: work on that. But why don't we just send, like, 947 00:46:43,160 --> 00:46:46,200 Speaker 1: robots in first? Send a quadcopter in. We can totally 948 00:46:46,239 --> 00:46:49,120 Speaker 1: do that today and let the robot catch the bullets. 949 00:46:50,320 --> 00:46:54,840 Speaker 1: How do you feel about the fact that these technologies, these military 950 00:46:54,880 --> 00:46:58,080 Speaker 1: technologies, could potentially be used here in the United States? 951 00:46:59,239 --> 00:47:01,399 Speaker 1: That it can very quickly,
like I mentioned to you before, start to sound like Minority Report, where they come and they 953 00:47:04,440 --> 00:47:07,160 Speaker 1: send the robots into the apartment complex looking for Tom 954 00:47:07,200 --> 00:47:10,400 Speaker 1: Cruise and they're scanning everybody's eyeballs. I mean, it sounds 955 00:47:10,400 --> 00:47:13,719 Speaker 1: like some Panopticon-type surveillance-state technology, like we see 956 00:47:13,760 --> 00:47:16,359 Speaker 1: in China. Yeah. I mean, obviously, look, there are a lot 957 00:47:16,400 --> 00:47:18,920 Speaker 1: of greater concerns when you start applying 958 00:47:18,960 --> 00:47:22,799 Speaker 1: this stuff domestically, um, particularly when it comes to surveillance 959 00:47:22,920 --> 00:47:24,880 Speaker 1: and some of those kinds of issues that come up. 960 00:47:25,120 --> 00:47:27,000 Speaker 1: I mean, it's weird in the sense that we 961 00:47:27,080 --> 00:47:32,040 Speaker 1: get really fixated on the physical aspect of these things. Um, well, 962 00:47:32,040 --> 00:47:34,800 Speaker 1: there's a lot of fear surrounding drones in the US, 963 00:47:35,239 --> 00:47:37,360 Speaker 1: and people are worried about, you know, somebody buys a 964 00:47:37,440 --> 00:47:40,919 Speaker 1: drone and they're flying it in the backyard, the neighbor's spying on them. Well, 965 00:47:41,000 --> 00:47:42,319 Speaker 1: we live in this world where you go to a 966 00:47:42,400 --> 00:47:45,080 Speaker 1: major city in the US and they've got cameras everywhere 967 00:47:45,600 --> 00:47:48,600 Speaker 1: recording everything. People don't talk about that. We didn't have 968 00:47:48,600 --> 00:47:51,160 Speaker 1: a debate as a society about that. We're just like, 969 00:47:51,200 --> 00:47:53,880 Speaker 1: boom, cameras went up post-9/11, and then 970 00:47:53,920 --> 00:47:56,520 Speaker 1: after the Boston bombing, when we used those cameras 971 00:47:56,520 --> 00:47:59,480 Speaker 1: to find people, that just sealed the deal. Now 972 00:47:59,520 --> 00:48:02,120 Speaker 1: it's like, the cameras are effective, we're going to use them. Um, 973 00:48:02,280 --> 00:48:04,359 Speaker 1: we live in a world where people freely give up 974 00:48:04,840 --> 00:48:08,000 Speaker 1: all of their personal data to tech companies, right? And 975 00:48:08,040 --> 00:48:10,080 Speaker 1: we don't... I mean, like, and then afterwards, when you 976 00:48:10,080 --> 00:48:11,640 Speaker 1: get stuff like Cambridge Analytica, people are like, 977 00:48:11,840 --> 00:48:14,279 Speaker 1: what are these tech companies doing? Like, you didn't know that 978 00:48:14,320 --> 00:48:17,040 Speaker 1: Facebook was collecting all your data?
Now, with the 979 00:48:17,080 --> 00:48:19,839 Speaker 1: facial recognition and all that, they're doing the same 980 00:48:19,880 --> 00:48:22,719 Speaker 1: type of thing. One thing I bring up to people, um, 981 00:48:22,760 --> 00:48:25,080 Speaker 1: in a military context, and some people tend to... I 982 00:48:25,280 --> 00:48:28,080 Speaker 1: get some really angry reactions when I say this, is 983 00:48:28,120 --> 00:48:31,719 Speaker 1: that the military and the special operations community 984 00:48:31,840 --> 00:48:36,520 Speaker 1: largely still operates as if it's, like, 1995, thinking 985 00:48:36,560 --> 00:48:38,239 Speaker 1: that we're gonna be able to infiltrate, like, an O- 986 00:48:38,400 --> 00:48:42,360 Speaker 1: D-A into an austere environment and it's gonna go unnoticed, 987 00:48:42,360 --> 00:48:44,800 Speaker 1: that it will be able to operate in a clandestine manner. 988 00:48:45,440 --> 00:48:48,120 Speaker 1: I mean, I've been in some total third-world countries, 989 00:48:48,239 --> 00:48:51,479 Speaker 1: I know you have as well, where every dude has 990 00:48:51,520 --> 00:48:54,400 Speaker 1: a phone in his pocket that has a camera on it, 991 00:48:54,480 --> 00:48:56,560 Speaker 1: and more than likely it's a smartphone that they can 992 00:48:56,640 --> 00:48:59,520 Speaker 1: upload pictures and video on. So the notion that we're 993 00:48:59,520 --> 00:49:01,839 Speaker 1: gonna be able to send, you know, white dudes like 994 00:49:01,880 --> 00:49:03,840 Speaker 1: me in MultiCam to a country and they're going to 995 00:49:03,960 --> 00:49:07,080 Speaker 1: operate in a clandestine manner, that's all out the window. 996 00:49:07,120 --> 00:49:09,279 Speaker 1: I mean, you can't assume any of that anymore. It's 997 00:49:09,320 --> 00:49:12,000 Speaker 1: not like in nineteen fifty, when you could send a Special 998 00:49:12,040 --> 00:49:15,400 Speaker 1: Forces team to the Congo and it would be pretty nondescript. 999 00:49:15,400 --> 00:49:17,759 Speaker 1: You'd probably never hear about it, you know. No, I 1000 00:49:17,800 --> 00:49:19,880 Speaker 1: don't think that we have really fully adapted to this 1001 00:49:19,960 --> 00:49:23,800 Speaker 1: world of radical transparency that we're already living in. Um, 1002 00:49:23,840 --> 00:49:26,280 Speaker 1: I mean, look, the bin Laden raid was reported on Twitter. 1003 00:49:26,920 --> 00:49:29,319 Speaker 1: Now, they didn't know what was happening, but there's... 1004 00:49:29,320 --> 00:49:32,440 Speaker 1: there's weird helicopters, something going on. Yeah, and that's, I mean, 1005 00:49:32,440 --> 00:49:34,120 Speaker 1: that's several years ago. So you start to fast- 1006 00:49:34,160 --> 00:49:38,000 Speaker 1: forward that, where everybody's got a smartphone, everybody's reporting things. 1007 00:49:38,200 --> 00:49:40,520 Speaker 1: How do we adapt to a world where the location 1008 00:49:40,560 --> 00:49:43,400 Speaker 1: and disposition of our forces and their actions are going 1009 00:49:43,440 --> 00:49:46,200 Speaker 1: to be reported in real time to everyone? And not 1010 00:49:46,280 --> 00:49:49,160 Speaker 1: everybody who's reporting this is necessarily the enemy. They might 1011 00:49:49,200 --> 00:49:52,160 Speaker 1: just be locals, might be concerned citizens.
So now 1012 00:49:52,160 --> 00:49:54,080 Speaker 1: you've got some kid on the street, like, whoa, 1013 00:49:54,160 --> 00:49:56,840 Speaker 1: the Americans are here, right? Like, maybe, maybe, maybe, like, 1014 00:49:57,080 --> 00:49:59,520 Speaker 1: it's cool, maybe he's supportive of us, but then, boom, 1015 00:49:59,520 --> 00:50:02,239 Speaker 1: like, you're blown, right? So I think, like, there are 1016 00:50:02,239 --> 00:50:05,000 Speaker 1: things in terms of adapting our operations that we need 1017 00:50:05,040 --> 00:50:06,680 Speaker 1: to do. There are also things in terms of OPSEC, 1018 00:50:07,080 --> 00:50:10,040 Speaker 1: like, we still put name tapes on people. Okay, not 1019 00:50:10,120 --> 00:50:12,560 Speaker 1: in the SOF community, but, like, in big Army. That's insane. 1020 00:50:13,000 --> 00:50:14,759 Speaker 1: That's insane that we do that, particularly when you think 1021 00:50:14,800 --> 00:50:18,319 Speaker 1: about all the information on social media, people's families and things. 1022 00:50:18,360 --> 00:50:20,839 Speaker 1: It's a huge force protection risk. We have not 1023 00:50:20,880 --> 00:50:23,399 Speaker 1: grappled with that at all. I mean, people talk about 1024 00:50:23,480 --> 00:50:25,799 Speaker 1: that kind of stuff, but you can use off-the- 1025 00:50:25,840 --> 00:50:29,720 Speaker 1: shelf facial recognition technology and, I mean, start popping people 1026 00:50:29,800 --> 00:50:32,560 Speaker 1: very, very quickly. Well, and then you've got situations like 1027 00:50:32,600 --> 00:50:35,520 Speaker 1: this Strava stuff that came up recently, the heat 1028 00:50:35,560 --> 00:50:39,520 Speaker 1: maps with all the running, the Fitbits. And 1029 00:50:39,560 --> 00:50:41,759 Speaker 1: so some of it's also ourselves, what we 1030 00:50:41,800 --> 00:50:44,400 Speaker 1: exposed. We exposed all these bases 1031 00:50:44,400 --> 00:50:46,840 Speaker 1: overseas. Well, that's what I was saying earlier. I 1032 00:50:46,880 --> 00:50:49,600 Speaker 1: think there are people out there who grew up with 1033 00:50:49,640 --> 00:50:51,920 Speaker 1: this technology and don't really think twice about it. You know, 1034 00:50:51,960 --> 00:50:55,560 Speaker 1: they're not bad people, it's just, it's like, you're 1035 00:50:55,640 --> 00:50:58,520 Speaker 1: so attached to that technology, it's so embedded in part 1036 00:50:58,520 --> 00:51:00,880 Speaker 1: of your lifestyle, that you really don't think twice about it. 1037 00:51:01,239 --> 00:51:03,080 Speaker 1: And I think, I think we've got to really adapt 1038 00:51:03,080 --> 00:51:05,640 Speaker 1: to that, both thinking about, you know, what happens when 1039 00:51:05,719 --> 00:51:08,560 Speaker 1: other people can go on the Internet and they can find 1040 00:51:08,600 --> 00:51:10,759 Speaker 1: these maps of our bases and they can tell in 1041 00:51:10,800 --> 00:51:12,960 Speaker 1: real time where we are and how many of us there are, 1042 00:51:13,400 --> 00:51:15,920 Speaker 1: but also, like, what happens in the world where every 1043 00:51:15,960 --> 00:51:19,280 Speaker 1: interaction that a U.S. soldier has with locals is recorded? 1044 00:51:19,960 --> 00:51:22,520 Speaker 1: That's a very different kind of world.
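As a sense of how "off the shelf" the facial recognition piece has become, here is a hedged sketch using the open-source face_recognition Python library; the photo file names are hypothetical, and nothing here comes from the show itself.

```python
# Matching an unknown crowd photo against one known face with the
# open-source face_recognition library. File names are hypothetical.
import face_recognition

known_image = face_recognition.load_image_file("known_person.jpg")
known_encoding = face_recognition.face_encodings(known_image)[0]

crowd = face_recognition.load_image_file("street_photo.jpg")
for encoding in face_recognition.face_encodings(crowd):
    # compare_faces returns a list of booleans, one per known face.
    if face_recognition.compare_faces([known_encoding], encoding)[0]:
        print("probable match")
```

A few lines like this, fed from public social media photos, is essentially the force-protection exposure being described.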
And we're seeing 1045 00:51:22,640 --> 00:51:25,360 Speaker 1: the police in America go through a sea change in 1046 00:51:25,360 --> 00:51:27,359 Speaker 1: the past couple of years, where now you've got video 1047 00:51:27,360 --> 00:51:29,839 Speaker 1: recordings of all these interactions, and you get, 1048 00:51:29,840 --> 00:51:31,799 Speaker 1: like, these sort of big debates about the use 1049 00:51:31,800 --> 00:51:34,759 Speaker 1: of force in policing. Um, we need to prepare for 1050 00:51:34,800 --> 00:51:37,280 Speaker 1: that to happen. I mean, where, you know, every 1051 00:51:37,280 --> 00:51:40,400 Speaker 1: time some Joe does something kind of silly, 1052 00:51:40,960 --> 00:51:43,640 Speaker 1: and maybe it's rude to somebody, that goes viral. 1053 00:51:43,960 --> 00:51:46,000 Speaker 1: And I don't think that we're prepared for the training 1054 00:51:46,000 --> 00:51:47,479 Speaker 1: that we need to give our people to be ready 1055 00:51:47,480 --> 00:51:49,160 Speaker 1: for that. Right, every time a Joe gets into a 1056 00:51:49,160 --> 00:51:51,960 Speaker 1: fight in a McDonald's or something like that, or is rude 1057 00:51:52,000 --> 00:51:53,799 Speaker 1: to some local, you know, points his weapon at some, 1058 00:51:53,960 --> 00:51:56,800 Speaker 1: you know, some old lady on a street corner. 1059 00:51:57,040 --> 00:51:59,040 Speaker 1: And in the past, those incidents happened, but they were 1060 00:51:59,040 --> 00:52:02,120 Speaker 1: isolated, and, or maybe, maybe that village was pissed off, 1061 00:52:02,200 --> 00:52:04,880 Speaker 1: or that family, but it didn't go viral across the world. 1062 00:52:05,040 --> 00:52:06,919 Speaker 1: Now it goes viral, and then it's seen as being 1063 00:52:06,920 --> 00:52:11,040 Speaker 1: emblematic of the entire American military operation. And good news 1064 00:52:11,080 --> 00:52:16,040 Speaker 1: doesn't typically go viral. No. No, you never see somebody, 1065 00:52:16,080 --> 00:52:20,640 Speaker 1: you know, passing out... here's a young infantryman, like, 1066 00:52:20,680 --> 00:52:23,759 Speaker 1: picking lilacs for the locals and giving them to the children. Yeah, 1067 00:52:23,760 --> 00:52:26,080 Speaker 1: it's probably not going to go viral. You know, we've 1068 00:52:26,080 --> 00:52:27,840 Speaker 1: talked about this on the show, and I feel like 1069 00:52:27,920 --> 00:52:31,239 Speaker 1: you'd be a good person to answer this, really. We've 1070 00:52:31,239 --> 00:52:34,279 Speaker 1: talked about the stuff going on with gun control and 1071 00:52:34,320 --> 00:52:36,399 Speaker 1: the whole issue right now, and I feel like when 1072 00:52:36,400 --> 00:52:38,560 Speaker 1: you turn on mainstream media, no one even brings up 1073 00:52:38,560 --> 00:52:41,480 Speaker 1: the issue of 3D printers, which you mentioned before. 1074 00:52:41,880 --> 00:52:43,680 Speaker 1: And I almost feel like this debate is going to 1075 00:52:43,760 --> 00:52:46,440 Speaker 1: become obsolete when the guy, you know, across the street 1076 00:52:46,440 --> 00:52:48,800 Speaker 1: from you can print out whatever weapon, print an AK, 1077 00:52:48,800 --> 00:52:51,440 Speaker 1: in his basement. Yeah.
So the FBI, like, when 1078 00:52:51,440 --> 00:52:53,360 Speaker 1: this first came out, they point to these regulations, 1079 00:52:53,360 --> 00:52:54,960 Speaker 1: that you can't 3D print things, and it's like, 1080 00:52:55,000 --> 00:52:59,520 Speaker 1: we're fighting against a tsunami of technology. And right 1081 00:52:59,560 --> 00:53:01,560 Speaker 1: now there are some limitations in terms of, if you're 1082 00:53:01,560 --> 00:53:03,440 Speaker 1: printing, you're 3D printing something that's plastic, it's not 1083 00:53:03,440 --> 00:53:05,520 Speaker 1: gonna have the same reliability. But people are working on that. 1084 00:53:05,880 --> 00:53:08,400 Speaker 1: It's getting better. You can 3D print metal stuff too. 1085 00:53:09,160 --> 00:53:11,560 Speaker 1: I have a friend who works in aerospace, and at 1086 00:53:11,560 --> 00:53:14,160 Speaker 1: work, he said, they have this huge 3D metal 1087 00:53:14,320 --> 00:53:18,200 Speaker 1: printer, and they print parts for rockets that we 1088 00:53:18,280 --> 00:53:20,799 Speaker 1: send into space. He's like, you program it in, you 1089 00:53:20,880 --> 00:53:22,640 Speaker 1: go home, you come back in the morning, and you 1090 00:53:22,680 --> 00:53:25,520 Speaker 1: pull out of the printer a metal part for 1091 00:53:25,560 --> 00:53:28,960 Speaker 1: a rocket. Yeah. Yeah, it's a huge challenge, right? And 1092 00:53:29,040 --> 00:53:31,919 Speaker 1: so, um, yeah, you know, when you can print off 1093 00:53:32,600 --> 00:53:37,640 Speaker 1: parts or entire firearms, um, it's gonna make things like, um, 1094 00:53:37,680 --> 00:53:40,439 Speaker 1: not only adding new gun control measures, if people wanted 1095 00:53:40,480 --> 00:53:42,560 Speaker 1: to do that, but even enforcing things that are 1096 00:53:42,600 --> 00:53:45,439 Speaker 1: in place today about automatic weapons, very, very difficult. 1097 00:53:45,640 --> 00:53:47,640 Speaker 1: The new Anarchist Cookbook is going to be, like, what, 1098 00:53:47,680 --> 00:53:51,520 Speaker 1: an STL file for your 3D printer? Right, 1099 00:53:51,800 --> 00:53:55,200 Speaker 1: I'm going on the darknet, baby. Yeah, I would think so. Yeah, 1100 00:53:55,880 --> 00:53:58,359 Speaker 1: I want to hear some stories of you guys 1101 00:53:58,440 --> 00:54:01,279 Speaker 1: serving together. There's got to be some stuff there. Any 1102 00:54:01,320 --> 00:54:03,800 Speaker 1: of that... I don't know, where do you want to 1103 00:54:03,800 --> 00:54:06,680 Speaker 1: go with that, Jack? I don't know, man. Um, yeah, I 1104 00:54:06,719 --> 00:54:09,040 Speaker 1: mean, a lot of it is actually gonna be in 1105 00:54:09,080 --> 00:54:11,200 Speaker 1: my book that I've been writing. I shot some of 1106 00:54:11,200 --> 00:54:13,920 Speaker 1: that over to Paul, actually. I sent, I sent excerpts 1107 00:54:13,920 --> 00:54:16,200 Speaker 1: of the book out to different people I worked with, 1108 00:54:16,280 --> 00:54:19,840 Speaker 1: just to see, like, how close am I, you know? 1109 00:54:19,960 --> 00:54:23,160 Speaker 1: Am I recalling all of this the right way? But, um, 1110 00:54:23,239 --> 00:54:24,839 Speaker 1: so I don't want to spoil Jack's book. I'll tell 1111 00:54:24,840 --> 00:54:27,360 Speaker 1: a story that's, um, that I think is interesting and 1112 00:54:27,440 --> 00:54:29,480 Speaker 1: goes with the topic. It's in the book.
Um, 1113 00:54:29,880 --> 00:54:32,279 Speaker 1: Army of None? It's not a... is it a Jack story 1114 00:54:32,320 --> 00:54:34,320 Speaker 1: in the book? Ah, no, it's not a Jack story. 1115 00:54:34,360 --> 00:54:36,239 Speaker 1: It's not a Jack story. It's a story 1116 00:54:36,239 --> 00:54:39,759 Speaker 1: from our sniper team. Jack was there, um, Jack, I 1117 00:54:39,880 --> 00:54:42,600 Speaker 1: know all the other players there. So this was pretty, um, 1118 00:54:42,600 --> 00:54:45,160 Speaker 1: pretty early in the war in Afghanistan, and we were 1119 00:54:45,200 --> 00:54:47,880 Speaker 1: sent to do an overwatch up on the Af- 1120 00:54:48,040 --> 00:54:50,440 Speaker 1: Pak border, where the idea was, we were supposed to 1121 00:54:50,480 --> 00:54:53,279 Speaker 1: watch sort of infiltration routes. It was fairly early, where 1122 00:54:53,280 --> 00:54:55,160 Speaker 1: I think people didn't fully realize just how totally porous 1123 00:54:55,239 --> 00:54:57,399 Speaker 1: the border is. Like, the idea that you're gonna 1124 00:54:57,440 --> 00:55:01,439 Speaker 1: put tiny OPs all along the border watching everything, like, yeah, 1125 00:55:01,480 --> 00:55:03,799 Speaker 1: it's laughable, right? But it was just kind of early on, 1126 00:55:03,840 --> 00:55:05,120 Speaker 1: and people were like, well, we're just gonna go up there 1127 00:55:05,320 --> 00:55:08,080 Speaker 1: and we're gonna figure out where they're coming 1128 00:55:08,080 --> 00:55:10,640 Speaker 1: from. And it's like, they're coming from everywhere, right? Okay. 1129 00:55:10,760 --> 00:55:14,040 Speaker 1: So, so we go up there, we infil at night. This 1130 00:55:14,080 --> 00:55:16,279 Speaker 1: was actually the time that we carried a hundred and 1131 00:55:16,280 --> 00:55:19,120 Speaker 1: sixty pounds up this mountain. This is crazy, right? Like, 1132 00:55:19,160 --> 00:55:21,719 Speaker 1: we're not ants. You can't do that kind of thing. Um, 1133 00:55:21,760 --> 00:55:24,480 Speaker 1: so we infil at night, and, uh, and the sun 1134 00:55:24,520 --> 00:55:27,799 Speaker 1: comes up and instantly we're compromised. And we can see this: 1135 00:55:27,800 --> 00:55:30,400 Speaker 1: this farmer comes out of his mud hut, and he 1136 00:55:30,440 --> 00:55:32,320 Speaker 1: comes out to, like, take a leak in the field, 1137 00:55:32,320 --> 00:55:34,360 Speaker 1: you know, and he looks up and he sees us. 1138 00:55:34,400 --> 00:55:36,600 Speaker 1: There's like eight of us behind this one rock, all right, 1139 00:55:36,680 --> 00:55:39,319 Speaker 1: and, like, there's no cover. There's no trees 1140 00:55:39,400 --> 00:55:41,279 Speaker 1: or anything where we are, we're above the tree line, and 1141 00:55:41,320 --> 00:55:43,080 Speaker 1: our heads are all, like, peeking out, 1142 00:55:43,920 --> 00:55:45,920 Speaker 1: and you can see him kind of, like... and he kind 1143 00:55:45,920 --> 00:55:49,600 Speaker 1: of scurries back in. So, um, about an hour later, a 1144 00:55:50,080 --> 00:55:53,279 Speaker 1: little girl comes along, and she's got two goats in 1145 00:55:53,360 --> 00:55:55,960 Speaker 1: tow, like she's herding, herding goats, but it's pretty clear 1146 00:55:56,040 --> 00:55:58,120 Speaker 1: she's been sent out to recon our position, 1147 00:55:58,920 --> 00:56:00,839 Speaker 1: and, um, and she comes up 1148 00:56:00,840 --> 00:56:02,919 Speaker 1: along the mountain and she walks kind of this long arc.
00:56:03,000 Speaker 1: She's, like, five or six. She's not very sneaky, to be perfectly honest with you. She's, like, staring right at us, you know, not a very stealthy kind of recon. And then we hear this beeping that we later realize is an ICOM radio she's got, right, that she's reporting back on. So we watch her and she watches us. Eventually she leaves, and a couple of minutes later some Taliban fighters come along, so we take care of them, and, uh, the firefight that happens basically echoes across the valley, and, like, the whole village comes out. Yeah, if there's anything in Afghanistan that'll draw people out, it's a gunfight; people love a fight there. So very quickly after the gunfight, we realize the village is massing on our position and we're gonna have to exfil. So we exfil, and we're talking afterwards about what we would do in that kind of situation, because it was fairly early and we were still coming to grips with the fact that up in these mountains in Afghanistan, I found, it doesn't matter where you go, there are civilians everywhere. We'd be up at, like, ten thousand feet and we'd run across some guy herding goats. You can be in a totally remote location and there's still, like, lean-tos, and you'll find footprints in the sand, some guy out there chopping wood, middle of nowhere, right up on top of the mountains. So one of the things we talked about is, all right, you know, if we see somebody, a civilian, maybe we try to apprehend them, pat them down, see if they've got a radio, figure out whether we're compromised now or might be compromised when they get back. Like, how do we deal with this problem? Something that never came up is that we're like, oh, we should just shoot this girl.
00:57:33,880 Speaker 1: That wasn't a topic of conversation, okay? And I'm not going to say that there aren't difficult situations that people are placed into in war, but this was not one. This is an easy one: let the girl go, and we'll deal with whatever comes afterwards. Um, you know, frankly, we were probably itching for a fight. So the girl leaves, a bunch of guys come after us, and we're like, win, right? You're like, hey, go tell your homeboys to come on over, send some more. So we weren't that brokenhearted about it.

So, um, what's interesting, though, when you think about automation and, uh, robots, is that legally she's a combatant. She is participating in hostilities; she's part of the weapon system, right? She scouted for the enemy. There's no age set under the laws of war for combatants. It doesn't matter how young you are; if you're participating in hostilities, you're a lawful combatant. It's like that Gothic Serpent scenario where they're in Mogadishu and the fighters are hiding behind the women, or you have kids running ammunition across the street to the fighters. Now, look, if you run across a civilian who's just a bystander, right, and they're like, hey man, I just happened to be in the wrong place at the wrong time, and you shoot that person, that's a war crime, okay? But here you've got, like, kids running ammunition; they're an active participant. So if you program a robot to follow the laws of war to the letter, it shoots this little girl. Um, and so one of the things I talk about in the book is, how do you think about the role of human judgment, how do you think about values and ethics in these kinds of decisions? And it's interesting, too, because the enemy was taking advantage of your perceived system of values; they think, well, they're probably not going to shoot this girl. But if they do, oh well, whatever. I mean, it shows you the value they put on her life, which is basically none at all; they were willing to roll the dice with her life. Um, but they're taking advantage of your perceived American values: they probably won't shoot her. Yeah. So I think that's one of the things people grapple with in this idea of more automation: what happens when you have automation that doesn't have that level of human moral engagement with what's happening?
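A rough sketch of the gap being described here, in code terms: a hypothetical rules-only check that encodes direct participation in hostilities literally, with no age threshold and no room for the judgment call the team actually made. The names and fields below are invented for illustration; this is not any real targeting system:

    from dataclasses import dataclass

    @dataclass
    class Person:
        age: int
        scouting_for_enemy: bool   # e.g., reporting positions over a radio
        running_ammunition: bool

    def directly_participating(p: Person) -> bool:
        # The letter of the law: participation, not age, determines status.
        return p.scouting_for_enemy or p.running_ammunition

    scout = Person(age=6, scouting_for_enemy=True, running_ammunition=False)
    print(directly_participating(scout))  # True: legally a combatant
    # A rules-only system stops at this answer. The team's "easy call" to
    # let her go lives entirely outside what this function can express.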
00:59:40,480 Speaker 1: And your book goes into these stories from your own experience, because although it's obviously not a memoir, and I know you're not putting out a memoir, that's the popular thing right now, operators putting out memoirs. This is completely different. Paul's too educated for that. I missed that class. Apparently there's some class everybody gets when they go through SOF training these days, like how to write a memoir. Must have come after my time. It took me like eight years before I finally broke down and did it. But nonetheless, I mean, Army of None is still a book that talks about your experiences as an operator and carries into what you're doing now, which I think is pretty cool. I do want to ask Paul, since we've got him here, and since we are SOFREP Radio: Paul was one of the first guys who stood up the recce teams for Ranger Battalion, and I was wondering if you could talk to us a little bit about that. Because we had the Ranger Reconnaissance Detachment, of course, RRD, that did recon for the Ranger Regiment for a long time. I wrote an article about them, kind of a history article; those guys are pretty cool. Um.
01:00:44,320 Speaker 1: But then each battalion got its own recce team, which you were part of. I think 3/75 was the first battalion that stood up recce teams. I think we were later than that. Like, I was just a sergeant at the time, so you're a little bit in the dark on that stuff. My recollection was that we were actually the last; the other battalions' teams moved quicker, and then for whatever reason the 3/75 leadership dragged their heels on it, maybe because we were next door to RRD and they thought they could rely on them. I don't know, that was all political. Yeah, that's what the colonels were up to. Honestly, I was a tabbed spec-four at the time, so I didn't know about all of that. But anyway, I was wondering if you could talk a little bit about standing up the recce teams, what your job was, what you guys were up to. That was, honestly, the most fun I've ever had. That was just a blast. So the basic gist was what you're describing. This was fairly early in the wars, like the 2003 to 2004 time frame, um, where, you know, RRD had been doing recon work for the battalions for years, and then once the wars really kicked off, there was so much work to be done that they just got sucked up doing other things, right? And so the idea was, we're going to create battalion-level detachments. Now, at the time, all the Ranger battalions had sniper detachments, and so it was basically expanding that platoon to add a couple of recce detachments that would go alongside them, um, that would do recon work. And so when we stood up, they poached a couple of people from the sniper teams to help stand them up.
01:02:13,000 Speaker 1: I was one of them, um, that moved from the sniper section over to the recon teams, and then we grabbed some other people from across the unit, and it was a blast. We had some folks doing work in Iraq, obviously; I was in Afghanistan on those deployments. The guys in Iraq weren't having that much fun, though, I don't think, like you were. Well, you know, at the time, to be honest with you, it was actually a really permissive environment in Afghanistan, right? This is like the 2005 time frame, and you could get away with a lot of the stuff we were doing. It was just wild. We were doing a lot of low-visibility work and aerial recces, we were doing some tactical stuff on foot, um, a lot of close-target reconnaissance work. It was a lot of fun; I really enjoyed it. And this was all new for Ranger Battalion, too. It was all very new. These were 3/75's first recce teams standing up, and when we deployed, Paul and I had, like, beards and long hair. All of this was totally outside the norm. Like, mind-blowing, yeah. Yeah, it was just so against the grain for Ranger Battalion. That was when they started shifting to more low-visibility work, so for the sniper and recce teams they said, well, you can do long-hair operations and grow beards and stuff, and, oh my god, we were like the Antichrist. Yeah, people were so upset. The Rangers had actually just moved, like a year before that, away from high-and-tights, so the idea that you could even have, like, Army-standard hair was still controversial inside Ranger Battalion. This was sacrilege. Oh, people just hated it, us wearing civilian clothes around battalion all day. Oh yeah, who do you think you are?
01:03:47,160 Speaker 1: You know, it's like, whatever. It was part of the job; there were times when it was important, you needed it. And you guys were disguising yourselves as Afghans? A lot of times we did stuff in, um, obviously civilian kind of clothing. We did stuff, um, you know, blending in with other Afghan forces; if we might integrate with them, we're in those uniforms. Um, obviously there's limits to how close you could get to people. It's not like I'm gonna put on a set of local clothes and walk right up to them. I mean, your team were all Caucasians and one African American. Yeah, you weren't going to blend in as an Afghan. Yeah, I think at the time, you know, if I was well tanned and kind of dirty, from a distance I probably looked like kind of an overfed Afghan. But you're not gonna blend in up close, and obviously we didn't even speak the language and stuff, so you're not doing those kinds of operations. But from a distance you could at least, um, create some kind of ambiguity. Um, I did know, you know, um, folks who, at least on tactical operations up in the mountains, had been face to face with Afghans who didn't know they were Americans. They could tell they weren't Afghan, but they assumed they were, like, Arab fighters. Yeah, because there were foreign fighters all over the place, and so if you weren't in uniform, people just didn't know who you were. That's pretty cool. Actually, up on top of a mountain, yeah, I knew a guy who got hugged by an Afghan, and when the Afghan hugged him, he realized the American had, like, an AK buried underneath, that he was carrying a weapon under his robes, and I think he assumed he was probably, like, a foreign fighter.
01:05:20,240 Speaker 1: It was an interesting time in Ranger Battalion, because you were getting into all of that. Um, it was the first time I think we started making approaches into, like, technical stuff, things like that, um, all kinds of different work. It was interesting because these were all things we perceived that, like, JSOC guys would do, and in the past it was like, Ranger, you are not high-speed enough to be able to attempt something like this. But now things were changing because of the war. Well, part of it was just, you know, there was sort of this waterfall effect. I think Rumsfeld actually at the time described it as sort of a step to the right: he wanted, um, JSOC to move toward doing things more like what the CIA had been doing, other SOF units doing things more like what JSOC had been doing, and so forth and so on, conventional units being more kind of like regular SOF. Everyone kind of took a step up. And that happened, you know, sort of down the line, because there was just so much work to be done, so much work out there, that people said, all right, we need to start, um, doing more interesting operations, adapting our tactics. And also, and I always, uh, you know, get a laugh out of this, people tell me, well, Rangers don't do FID. And I was like, oh, let me tell you a story about Afghanistan, my friend. They actually used both of us; your team was training the Afghan SWAT unit. Yeah, it was a local SWAT unit, once a week or so. Yeah, I think once a week, and yeah, we trained them. And, you know, again, you sort of do what the mission requires. So we did, um, a lot of things that are not, maybe, the core competency of a Ranger unit.
01:06:50,720 Speaker 1: You don't create a Ranger battalion to do FID, right? But, um, like a lot of people in that world, you're gonna do what jobs need doing. So we did some FID, we did some other things; I mean, that's the environment we found ourselves in. And, you know, I liked working with you guys because you were flexible like that, and it was all kinds of different things, all these different doors were open. We were trying a lot of new things, and we were breaking a lot of eggs at the time. You know, the funny thing is, other than the haircuts, which were just sacrilege, the other hardest thing we had to do was change tactics. So here we created these small recon teams; you've got, like, an eight-man recon team, and you know, you're operating very, very differently than a regular line infantry unit would, um, and you've got different equipment than a squad would have, I mean, similar types of gear, but still not exactly the same. Um, you know, we would have things like maybe an Air Force JTAC or an ETAC along with us, so we've got a big radio and we could call in air strikes. Um, you're not gonna have a heavy machine gun team along with you, but you also don't have a full platoon and a company backing you. And so, you know, even little things like how we'd do, um, movement-to-contact drills and battle drills had to change. One of the things we started doing is cracking open old books and saying, okay, look, what do LRRP units do, right? What are their kind of react-to-contact drills, and how do they adapt? And we had at the time a senior sergeant who ran the teams, the recon teams in 3rd Ranger Battalion, who had done a lot of that LRRP kind of work in other units. He was really good, really skilled. Gunny, you may remember him, right? So we had people who had that background.
01:08:28,439 Speaker 1: But when we started doing that kind of training, it really upset a lot of people in battalion, because we were changing the tactics. You know, who are you to change up how we do these battle drills? And so it was really upsetting to, like, the senior NCO leadership, the first sergeants and the sergeant major. We were attached to Charlie Company at the time. Doc Jenkins, Leo Jenkins, who's been on here, he was with Charlie Company at the time, and I even remember sitting in AARs where they were upset that you guys had actioned targets at times. Why, what, that's our job, why are those guys doing that? But I think it all worked out very well at the end of the day, because, you know, you guys were training the Kabul SWAT team, you ended up going out with those guys on raids and locking down the perimeter, so there was that relationship between you and them, and the Charlie Company guys could go in, uh, and, uh, strike targets. Yeah, they used us a lot of ways. I mean, one of the things we were doing, embedded into, sort of, um, raids, right, would be doing recon ahead of time so we could provide the assaulters with sort of a target package. Because, as people who've done this know, you end up doing a raid and then you hit the wrong building, right? So we'd do things like, you know, go do recon ahead of time so we could come back and present people a full package of information to say, look, here's the compound you're going after, here's the neighboring compound, there's an alleyway between them, here's what you might want to think about moving up, here's the doorway, here's what it looks like, you know, it's got a red door, or whatever. That was really valuable.
01:09:52,160 Speaker 1: And of course we'd all been in the line units doing raids on prior deployments, so we kind of knew what those guys were looking for and could give them valuable information. And then we'd do things like go in ahead of time and actually, um, provide early eyes-on. So sometimes we'd go in and we'd lead the assault force in, and then we'd have just a very small number of guys putting eyes on while the assaulters were infiltrating, which was always very exciting, because you were really kind of hanging out there on your own. It'd be, like, you and one other dude covering down the side of the building, and you're like, let's hope they don't hear us, because, yeah, it's gonna get pretty hairy pretty quick. It was. Yeah, that was a really interesting time. And, uh, the battalion recce teams went on over the years to do some really great work. I mean, they ended up doing, you know, vehicle interdictions and "kidnappings," quote unquote; they went on to do some really cool stuff. Yeah, that's great. I mean, I'm glad it got to be part of the foundation for that kind of stuff. Um, before we get to some questions, if there are any from the audience, I see a few; a lot of people are just enjoying the discussion, which is great.

This beard topic, I have to go back to it, because there's some funny stuff going on there. There was one guy, I can't remember which team he was on, good dude, but all he grew was, like, a little peach fuzz and a little bit here. And what he did is he went to go get beard dye, to dye it, because it's all blond. So he went to go dye his chin pubes, and he got jet-black dye, the kind that's for brothers, for black guys to dye their afros. And he was, like, Swedish.
01:11:28,960 Speaker 1: This dude, okay, he's an American, but of Nordic descent, and he had these jet-black chin pubes. They looked ridiculous. It was so funny. You have to grow them out, right? And some people can grow a really nice beard, and some people just don't have it; it comes in weird patches, weird colors. That had to be rough for him, because from that stock you'd expect he could grow a pretty damn full Jeremiah Johnson beard. Yeah, yeah. One guy couldn't grow one at all; all he had, officially, was just a little patch, and he kept saying, I'm Chilean, apparently, and he was really upset about it. He's like, you know, it's not, it's not... because it was kind of a point of pride. So before the deployment, he went out and he bought a fake beard kit. It's awesome, like Hollywood, like a Hollywood makeup kit, right? Because he was like, man, if that's the thing, if we gotta go in cover, I'm not gonna get left behind; I'm not sitting on the base because I don't have a beard. So he had this thing worked out, you know, and he's like, what do you think, does it look legit? Like, no, dude, it looks like you're wearing a fake beard. He looked like a pedophile in Central Park, hiding behind a tree. He just didn't look very legit. Now, fair enough, if somebody had gotten close enough to notice, we were probably already in trouble. Yeah, right; I mean, you know, you're supposed to have some standoff. But I remember it had two parts: there's the beard that goes over here, and he had, like, legit Hollywood makeup-effects glue, and then there's a mustache that went on over that. But it looked like a dude wearing a fake beard.
01:12:57,800 Speaker 1: But he was like, hey, man, if we gotta do it, I'm putting the thing on; I'm not getting left behind. He was a good dude. The other funny thing: beards are a big thing in Afghan culture, and the Afghans had this perception that whoever had the longer beard was the guy in charge, which was hilarious, because our platoon sergeant at the time, remember, could not grow a beard. Randy? Yeah, he could only grow this patchy, kind of goofy-looking beard, so the Afghans wouldn't believe that he was the boss. I had a pretty sick beard, so they thought that I was in charge, or the other team leader, who also had a really cool beard; but they would not believe that he was the boss man. They were like, no, no, no, definitely not. Yeah, and if you don't have a beard, you're, like, the chai boy. Yeah, right. So, you know, it was kind of a big deal, and he was actually kind of sensitive about it. Yeah, you know, I ran into him at a Ranger meetup in Las Vegas like two years ago. Yeah, it's always good catching up; he's doing good, good guy. That's great. I'm glad we got into that story.

Um, so we're getting a lot of people just enjoying the discussion; if you have any questions, shoot them over. It's funny, the other question that keeps getting asked is, where the hell is Brandon Webb, which is a nice plug for Brandon's podcast. So Brandon is doing the Power of Thought podcast every week; look up The Power of Thought. SOFREP Radio, as you know, is currently just Jack Murphy and myself and whoever else we get in here, um, and we should get Brandon on an episode soon; we've talked about that before. But the other question is: when does the AI say no to a command because it doesn't fit the overall battle plan? I think that's a good question. Do you want systems like that, that are gonna say no, um, that could overrule somebody?
1571 01:14:36,120 --> 01:14:37,880 Speaker 1: That's one account of the debass do you want to 1572 01:14:37,920 --> 01:14:40,519 Speaker 1: put um? Mostly this comes up in the context of 1573 01:14:40,720 --> 01:14:44,080 Speaker 1: legal issues, right, would you want a system that somebody's 1574 01:14:44,080 --> 01:14:45,640 Speaker 1: about to commit a war crime? And things like no, 1575 01:14:45,720 --> 01:14:48,120 Speaker 1: I'm not gonna do that, you know, I'm I'm sorry, 1576 01:14:48,160 --> 01:14:49,680 Speaker 1: I'm sorry, Dave, I can't do that. And I'm not 1577 01:14:49,720 --> 01:14:52,160 Speaker 1: going to open the pod bay to wars for you. UM. 1578 01:14:52,880 --> 01:14:55,080 Speaker 1: And that's I think a legitimate kind of question, a 1579 01:14:55,240 --> 01:14:58,080 Speaker 1: legitimate concern go forward, you know. I think one of 1580 01:14:58,120 --> 01:14:59,680 Speaker 1: the things that comes up is like who do you 1581 01:15:00,000 --> 01:15:02,479 Speaker 1: do you want to rest control? Who's controlling the system? 1582 01:15:02,520 --> 01:15:04,320 Speaker 1: Whose job is it is that the program or ahead 1583 01:15:04,320 --> 01:15:06,720 Speaker 1: of time? Is that the operator who uses it? One 1584 01:15:06,760 --> 01:15:08,160 Speaker 1: of the things I dig into the book is I 1585 01:15:08,200 --> 01:15:11,040 Speaker 1: do like a sort of a case study of the 1586 01:15:11,280 --> 01:15:14,360 Speaker 1: Army's use of the Patriot Air Defense System, particularly in 1587 01:15:14,400 --> 01:15:16,840 Speaker 1: two thousand three that led to these fracture sides and 1588 01:15:16,920 --> 01:15:19,599 Speaker 1: then adapted their their doctrine since then, but at least 1589 01:15:19,600 --> 01:15:22,960 Speaker 1: at the time, and how the Navy uses the AGES 1590 01:15:23,000 --> 01:15:25,840 Speaker 1: Combat System on their ships, which is basically is as 1591 01:15:25,880 --> 01:15:30,439 Speaker 1: a full auto mode UM that allows them to shoot 1592 01:15:30,479 --> 01:15:32,600 Speaker 1: down any incoming rockets or missiles that are going to 1593 01:15:32,680 --> 01:15:35,880 Speaker 1: overwhelm the ship. Um. If it's like wartime and their 1594 01:15:35,920 --> 01:15:37,760 Speaker 1: their own threat, they can just flip this thing on. 1595 01:15:37,800 --> 01:15:40,360 Speaker 1: It's gonna shoot down income with things. So they're built 1596 01:15:40,400 --> 01:15:43,559 Speaker 1: by the same company in defense contract basically the same 1597 01:15:43,640 --> 01:15:46,559 Speaker 1: concepts and technology, the same mission set, but they're used 1598 01:15:46,720 --> 01:15:50,639 Speaker 1: very differently. And for the Army, the Patriot is basically, 1599 01:15:50,680 --> 01:15:53,240 Speaker 1: at least at the time, it's used in a way 1600 01:15:53,560 --> 01:15:56,360 Speaker 1: to um to what I would describe it as replace 1601 01:15:56,520 --> 01:15:59,840 Speaker 1: human judgment and decision making. So they EMBEDDL this p 1602 01:16:00,040 --> 01:16:02,640 Speaker 1: gonna being into the machine and then the operator has 1603 01:16:02,640 --> 01:16:04,360 Speaker 1: a very just a couple of different modes to turn 1604 01:16:04,439 --> 01:16:06,760 Speaker 1: this thing on. It's kind of like semi auto went 1605 01:16:06,800 --> 01:16:09,559 Speaker 1: full auto okay um, and that's it. 
01:17:16,560 Speaker 1: I think that's a great answer. Um, wrapping things up here, I did want to mention, for you guys all watching and listening: there's only one club out there with gear handpicked by special operations military veterans from several branches, and that, of course, is Crate Club. Some past items from Crate Club? Yeah, Crate Club is doing awesome; people are very happy with all the gear we've been sending out. Um, you know, we've had items in the past like an EDC medkit that Benghazi survivor and, I should say for you guys, fellow Army Ranger Kris "Tanto" Paronto put together, and a ballistic shield insert for your backpack made by Crye Precision.
01:17:54,000 Speaker 1: And this year Scott Whittner, who might still be watching, is helping us put together a line of custom products. So we have different tiers of membership, depending on how prepared you want to be, and gift options are available as well. You can check that all out at CrateClub.us; once again, that's CrateClub.us.

For you dog owners, check this out, you're gonna love this: we've just launched Cuna. We have a team of trained canine handlers picking out a box for your dog each month, full of healthy treats and training aids, custom built for your dog's size and age as well. The products are US-sourced, all natural, and not only promote a healthy diet but also promote being active with your dog. So whether you've got a pit bull or a chihuahua, this is just what you're looking for. You can see all of this at Cuna.dog; that's spelled C-U-N-A dot D-O-G. I saw the first, um, box that we put together, we've got ads out for it, and people are really loving them. So, Cuna.dog.

Um, and then also the Spec Ops Channel, our channel that offers the most exclusive shows, documentaries, and interviews covering the most exciting military content today. Our premier show on there is Training Cell, which follows special operations forces as they participate in the most advanced training in the country: everything from shooting schools, defensive driving, jungle and winter warfare, climbing, and much more. You can watch this content by subscribing at SpecOpsChannel.com, where you can take advantage of a limited-time offer off your membership, so that's only four dollars a month. Um, and we have a new app; well, the app is already available, I should say, if you have an iPhone. It's out for iOS, and Android is coming in June, so check out the Spec Ops Channel app; you can watch all of that on your phone.
01:19:43,840 Speaker 1: Of course, I should mention what Paul is up to; that's why he's here. Army of None is available now: Army of None: Autonomous Weapons and the Future of War. You can follow Paul on Twitter at paul underscore scharre, that's Paul, underscore, S-C-H-A-R-R-E, and his website is paulscharre.com. He's currently, um, at the Center for a New American Security. Any other stuff that you want to plug before we get out of here? No, thanks for having me on the show. Are there any more questions for him from the audience? You know, it's a lot of people just liking what we're doing. Like, you know, Ariel Crew says the beard story was hilarious, uh, "excellent discussion" from Michael Sable, and just, you know, different stuff like that. So I'm not seeing a lot of questions; if you have any, shoot them over now, because we're about to wrap this up. But, um, I think we covered a lot of ground here. Really enjoyed it. I mean, this was pretty in-depth. I know you're in town doing media, so you're probably used to doing these hits where you're trying to cover a very complex subject in, what, two to five minutes? Yeah, a lot of them are radio hits where it's like five minutes, real quick, on and off the air, and this is not a five-minute subject; it's hard to condense, uh, you know. So I can do the five-minute spiel, but it's always fun to talk more.
01:20:59,560 Speaker 1: And it's great to be able to talk to you guys, to an audience that understands a lot of these things, and to be able to get into more of the visceral reality of the military operations that I try to talk about in the book. Yeah, a lot of our audience are, you know, the infantrymen that worked on the ground, or the radar operators, or the missile operator on the ship, all those kinds of guys who have some familiarity with this. And I think I have the same questions you have: where's all this going, right? And that's one of the things I try to weave in, using some of the experiences that I had, and the experiences of others in interviews: what's the reality of what people are doing today, which involves a lot of technology and a lot of automation, but also a lot of situations where humans are making tough calls. Whether it's, you know, um, people on the ground in wars like Iraq and Afghanistan, or situations like US fighter pilots over Syria, butting up against Russian fighter pilots, right, making decisions that have huge strategic consequences. Um, you know, what I find interesting about this is the gap: the technologists will say things like, hey, humans are just pushing buttons anyway, right, just trust it all to automation. And then when you talk to a lot of the military professionals, they'll be like, well, it's a little more complicated than that, regardless of their jobs. So I try to convey that in the book too. Do you think that escalation of force is going to change? Like, is war gonna become easier if, you know, it's just one side deploying their robots against the other? Like the Russia-versus-America dogfight over Syria: what if both of those planes are automated?
1737 01:22:35,800 --> 01:22:37,920 Speaker 1: Maybe at that point we're kind of like, fuck it. 1738 01:22:38,640 --> 01:22:40,320 Speaker 1: And if they blow each other up, 1739 01:22:40,320 --> 01:22:41,880 Speaker 1: do we even care? Like, is it that big a 1740 01:22:41,960 --> 01:22:44,000 Speaker 1: deal, because it's robots? I mean, there's no question that 1741 01:22:44,040 --> 01:22:46,120 Speaker 1: at a small level that's true. So here's a 1742 01:22:46,160 --> 01:22:49,280 Speaker 1: real example that already happened. Um, you know, last year, 1743 01:22:49,920 --> 01:22:52,439 Speaker 1: year and a half ago now, China snatched up a 1744 01:22:52,560 --> 01:22:55,840 Speaker 1: US underwater drone. Remember that? We just shrugged it off. 1745 01:22:56,320 --> 01:22:58,240 Speaker 1: We just said, like, give us the drone back, and the Chinese were like, 1746 01:22:58,439 --> 01:23:01,080 Speaker 1: oh, our bad, we didn't mean to, here's 1747 01:23:01,080 --> 01:23:04,200 Speaker 1: it back. Now, imagine if they had, you know, captured, 1748 01:23:04,320 --> 01:23:07,599 Speaker 1: like, um, you know, a small navy boat with sailors 1749 01:23:07,680 --> 01:23:11,120 Speaker 1: on board. Totally different situation, like in Iran, 1750 01:23:11,320 --> 01:23:14,760 Speaker 1: or in the P-3 incident years ago, right? It's 1751 01:23:14,760 --> 01:23:17,680 Speaker 1: a hugely different level of diplomatic consequence. Like, when a 1752 01:23:17,760 --> 01:23:20,640 Speaker 1: drone gets shot down or just goes down in Pakistan 1753 01:23:20,840 --> 01:23:24,720 Speaker 1: or Afghanistan, that sucks. Probably gotta go out there and 1754 01:23:24,880 --> 01:23:27,000 Speaker 1: drop a bomb on it to destroy it the 1755 01:23:27,080 --> 01:23:29,160 Speaker 1: rest of the way. But everyone kind of shrugs their 1756 01:23:29,200 --> 01:23:32,599 Speaker 1: shoulders like, yeah, it's not war, man. Well, and we've 1757 01:23:32,640 --> 01:23:36,120 Speaker 1: seen it empirically, actually: as drones have been proliferating, countries have 1758 01:23:36,240 --> 01:23:39,040 Speaker 1: been, um, countries are much more liberal when they take risks 1759 01:23:39,080 --> 01:23:41,280 Speaker 1: with drones. They'll send them across, you know, the line of 1760 01:23:41,320 --> 01:23:44,479 Speaker 1: an international border in ways that they totally wouldn't 1761 01:23:45,000 --> 01:23:47,439 Speaker 1: with people on board. It would be considered an act of war 1762 01:23:47,520 --> 01:23:50,880 Speaker 1: if you, like, parachuted troops into another country, right? It 1763 01:23:50,960 --> 01:23:53,439 Speaker 1: is certainly much more inflammatory where there's people involved. 1764 01:23:53,560 --> 01:23:56,920 Speaker 1: We certainly saw this in Pakistan with the Pakistani reaction 1765 01:23:56,960 --> 01:24:00,320 Speaker 1: to US drone strikes, which is not really enthusiastic. 1766 01:24:00,560 --> 01:24:01,880 Speaker 1: But then you compare this to, you know, in the 1767 01:24:01,920 --> 01:24:03,920 Speaker 1: Bush administration there were a couple of on-the-ground raids 1768 01:24:04,280 --> 01:24:08,200 Speaker 1: in Pakistan, and the reaction was completely different. Or the bin Laden 1769 01:24:08,320 --> 01:24:12,439 Speaker 1: raid, right? Right, completely different. Um, and 1770 01:24:12,520 --> 01:24:14,320 Speaker 1: we also see that people will just shoot 1771 01:24:14,439 --> 01:24:16,960 Speaker 1: enemy drones down.
People just... people shoot them down and 1772 01:24:17,000 --> 01:24:18,519 Speaker 1: they blow up the drone and then nobody cares. They 1773 01:24:18,560 --> 01:24:20,240 Speaker 1: kind of shrug it off. So I think, at the 1774 01:24:20,360 --> 01:24:24,040 Speaker 1: small level, that's definitely reasonable. We've already seen that happen. 1775 01:24:24,360 --> 01:24:26,600 Speaker 1: Now, when you scale that up, could you imagine, like, 1776 01:24:26,720 --> 01:24:29,439 Speaker 1: a full-scale war that's just robots fighting robots and 1777 01:24:29,479 --> 01:24:32,479 Speaker 1: no one's killed? I don't think so, actually. No. Well, 1778 01:24:32,520 --> 01:24:35,960 Speaker 1: I also think that, like, ultimately people will die, because 1779 01:24:36,000 --> 01:24:37,559 Speaker 1: that's what it's gonna take for the war 1780 01:24:37,600 --> 01:24:40,000 Speaker 1: to end. Like, it's gonna be pain and suffering to 1781 01:24:40,080 --> 01:24:44,400 Speaker 1: get someone to say enough, I quit. The cost, yeah. But 1782 01:24:44,640 --> 01:24:46,720 Speaker 1: in the small scale, definitely, I think it's 1783 01:24:46,760 --> 01:24:50,560 Speaker 1: already changing escalation dynamics. This is great. I think we 1784 01:24:50,680 --> 01:24:52,920 Speaker 1: covered so much, and there's a lot more to 1785 01:24:53,040 --> 01:24:55,760 Speaker 1: be covered. So pick up the book, Army of None. Um, 1786 01:24:55,920 --> 01:24:57,800 Speaker 1: I wrote down on my sheet here, like, there's so 1787 01:24:57,960 --> 01:25:01,360 Speaker 1: much going on with the North Korea-South Korea meeting. Actually, 1788 01:25:01,439 --> 01:25:04,000 Speaker 1: Paul, do you want to plug the study 1789 01:25:04,120 --> 01:25:07,760 Speaker 1: that you just did? Oh, we mentioned this? All right, 1790 01:25:07,880 --> 01:25:10,920 Speaker 1: so we just released a study, um, this week, 1791 01:25:11,120 --> 01:25:13,960 Speaker 1: um, that came out about the effect 1792 01:25:14,040 --> 01:25:17,760 Speaker 1: of blast pressure waves on the brain, particularly from 1793 01:25:17,800 --> 01:25:21,080 Speaker 1: shoulder-fired and heavy weapons. So, um, anti-tank 1794 01:25:21,240 --> 01:25:25,240 Speaker 1: rockets like the Carl Gustaf, AT4, LAW. People who 1795 01:25:25,320 --> 01:25:27,479 Speaker 1: fired these things know that they pack quite a punch. 1796 01:25:27,920 --> 01:25:31,599 Speaker 1: Um, and you know, I, we fired these things. Um, 1797 01:25:31,840 --> 01:25:34,200 Speaker 1: the Carl Gustaf in particular packs quite a 1798 01:25:34,200 --> 01:25:35,599 Speaker 1: punch; you feel like you got hit in the face 1799 01:25:35,600 --> 01:25:39,200 Speaker 1: when you shoot it. Um, there's new evidence 1800 01:25:39,280 --> 01:25:41,759 Speaker 1: that's just come out very recently, that we just released 1801 01:25:41,800 --> 01:25:45,720 Speaker 1: publicly this week, from DoD studies, that shows that even 1802 01:25:45,880 --> 01:25:49,439 Speaker 1: a very small number of shots from these things causes, uh, 1803 01:25:50,040 --> 01:25:54,439 Speaker 1: cognitive deficits in memory and executive function for people firing them. 1804 01:25:54,960 --> 01:25:57,519 Speaker 1: Um, and so we just released a new report 1805 01:25:57,560 --> 01:25:59,880 Speaker 1: on this. Um.
We're really interested in trying to get the 1806 01:26:00,000 --> 01:26:02,280 Speaker 1: word out to people who are using these. I think 1807 01:26:02,360 --> 01:26:04,759 Speaker 1: that, you know, people who are familiar with this have understood 1808 01:26:04,760 --> 01:26:06,720 Speaker 1: that if you massively exceed the limits and shoot this, you know, 1809 01:26:07,000 --> 01:26:09,120 Speaker 1: two dozen rounds a day, you're not gonna feel well. 1810 01:26:09,160 --> 01:26:12,040 Speaker 1: People know that. I think what is not fully understood 1811 01:26:12,560 --> 01:26:15,160 Speaker 1: is the potential long-term harmful effects on the brain, 1812 01:26:15,840 --> 01:26:17,840 Speaker 1: and even the effects that happen from just a small 1813 01:26:17,960 --> 01:26:21,160 Speaker 1: number of shots, even within the approved limits that are 1814 01:26:21,200 --> 01:26:23,839 Speaker 1: out there, that are standards. And we've talked to, particularly, 1815 01:26:23,840 --> 01:26:27,040 Speaker 1: the Ranger community; those limits are violated, um, pretty routinely. 1816 01:26:27,120 --> 01:26:29,600 Speaker 1: It's a significant problem. So we're trying to get the 1817 01:26:29,640 --> 01:26:32,600 Speaker 1: word out, um: don't fuck up your melon with 1818 01:26:32,680 --> 01:26:35,759 Speaker 1: these things, like, um. One of the things we recommended 1819 01:26:35,840 --> 01:26:39,360 Speaker 1: is that the Army, you know, change its firing limits, 1820 01:26:39,760 --> 01:26:42,719 Speaker 1: and that commanders are really active in enforcing those limits 1821 01:26:42,960 --> 01:26:44,800 Speaker 1: to make sure that people aren't exceeding them and causing 1822 01:26:44,840 --> 01:26:48,240 Speaker 1: harm. And where can people find this? Um, so there 1823 01:26:48,280 --> 01:26:51,360 Speaker 1: have been stories on NPR and the Wall Street Journal. 1824 01:26:51,800 --> 01:26:56,839 Speaker 1: Our study is available on CNAS dot org. The project 1825 01:26:57,000 --> 01:26:59,360 Speaker 1: is called Super Soldiers. Um, so if you go 1826 01:26:59,880 --> 01:27:02,400 Speaker 1: to C N A S dot org, that's Center for 1827 01:27:02,560 --> 01:27:05,040 Speaker 1: a New American Security. Something you had mentioned to me which 1828 01:27:05,120 --> 01:27:07,639 Speaker 1: was interesting is, you know, my first job in Ranger 1829 01:27:07,640 --> 01:27:11,280 Speaker 1: Battalion was as a Carl Gustaf gunner, and we had 1830 01:27:11,320 --> 01:27:14,360 Speaker 1: the BOP chart, the blast overpressure chart, and the 1831 01:27:14,520 --> 01:27:17,200 Speaker 1: medics understood, because I remember them describing it to me 1832 01:27:17,240 --> 01:27:18,800 Speaker 1: when I was a private. They're like, look, the reason 1833 01:27:18,880 --> 01:27:21,040 Speaker 1: why you look like you're drunk after you fire that 1834 01:27:21,120 --> 01:27:23,800 Speaker 1: thing is because it actually bruises your, like, there's inflammation 1835 01:27:23,840 --> 01:27:26,280 Speaker 1: of the brain happening. And they understood that. But you 1836 01:27:26,400 --> 01:27:30,400 Speaker 1: told me the BOP chart was developed around protecting against hearing 1837 01:27:30,479 --> 01:27:32,479 Speaker 1: loss; it has nothing to do with TBI at all, 1838 01:27:32,680 --> 01:27:34,840 Speaker 1: you know.
So, I mean, I would describe where we 1839 01:27:34,920 --> 01:27:38,240 Speaker 1: are, um, as like where the NFL was ten 1840 01:27:38,360 --> 01:27:41,320 Speaker 1: years ago, right, in trying to understand the effect of hits 1841 01:27:41,360 --> 01:27:43,720 Speaker 1: on the brain. And we're really playing catch-up, because what 1842 01:27:44,000 --> 01:27:48,120 Speaker 1: happens, um, from blast pressure is a different kind 1843 01:27:48,160 --> 01:27:50,120 Speaker 1: of mechanism than if you get hit in the head 1844 01:27:50,120 --> 01:27:52,360 Speaker 1: from a football hit, right? If that happens, you know, 1845 01:27:52,479 --> 01:27:55,439 Speaker 1: you smack your head against something in a vehicle accident, you're playing 1846 01:27:55,479 --> 01:27:58,080 Speaker 1: contact sports, the brain sloshes around in the skull and 1847 01:27:58,160 --> 01:28:01,439 Speaker 1: smashes against the inside of your skull and causes bruising. Um, 1848 01:28:01,920 --> 01:28:04,679 Speaker 1: there's something different that happens when a pressure wave comes 1849 01:28:04,720 --> 01:28:06,880 Speaker 1: through the body, and we don't fully understand it, but 1850 01:28:06,920 --> 01:28:09,240 Speaker 1: it's a different kind of mechanism, and it definitely has 1851 01:28:09,280 --> 01:28:12,880 Speaker 1: some kind of effect on the brain. And just 1852 01:28:13,120 --> 01:28:15,000 Speaker 1: like in the NFL, where people have known that 1853 01:28:15,040 --> 01:28:17,479 Speaker 1: getting repeat concussions is not good for you, now they're 1854 01:28:17,520 --> 01:28:20,120 Speaker 1: beginning to understand, um, the fact that you can 1855 01:28:20,160 --> 01:28:23,640 Speaker 1: get degenerative conditions that worsen over time, where you 1856 01:28:23,880 --> 01:28:26,400 Speaker 1: have, you know, some initial brain injury and then 1857 01:28:26,400 --> 01:28:28,280 Speaker 1: maybe you stop getting hit, but it just spirals out 1858 01:28:28,320 --> 01:28:33,200 Speaker 1: of control. That's one concern. Another concern is repeat subconcussive events, 1859 01:28:33,479 --> 01:28:35,599 Speaker 1: so events where the person might actually report that they're fine, 1860 01:28:35,640 --> 01:28:37,800 Speaker 1: they don't even feel anything, they don't have a concussion, 1861 01:28:38,200 --> 01:28:40,960 Speaker 1: but we can measure decreases in 1862 01:28:40,960 --> 01:28:44,600 Speaker 1: their cognitive performance following these things. Um, and so it 1863 01:28:44,680 --> 01:28:47,400 Speaker 1: suggests that the threshold at which we're protecting 1864 01:28:47,400 --> 01:28:49,200 Speaker 1: people needs to come down. And there's a lot of 1865 01:28:49,240 --> 01:28:51,439 Speaker 1: things that we thought were okay or recoverable that are 1866 01:28:51,520 --> 01:28:53,840 Speaker 1: not, um, and we need to do a better job 1867 01:28:53,920 --> 01:28:55,960 Speaker 1: of protecting people. We also look at things like better 1868 01:28:56,040 --> 01:28:59,200 Speaker 1: helmet designs. There's been computer modeling to suggest that 1869 01:28:59,280 --> 01:29:02,160 Speaker 1: if you have a full-face helmet, you can 1870 01:29:02,200 --> 01:29:05,760 Speaker 1: reduce the pressure wave coming into the head. Now, there's 1871 01:29:05,760 --> 01:29:07,519 Speaker 1: problems with that.
It's like a football helmet. Like a 1872 01:29:07,520 --> 01:29:11,160 Speaker 1: football helmet, right, or a motorcycle helmet. A motorcycle helmet, right. So 1873 01:29:11,439 --> 01:29:14,400 Speaker 1: it reduces visibility and other things. Maybe for training, 1874 01:29:14,680 --> 01:29:17,240 Speaker 1: maybe for training, maybe for some duty positions. You know, 1875 01:29:17,320 --> 01:29:21,080 Speaker 1: maybe if you're, like, the Carl Gustaf gunner, maybe you 1876 01:29:21,080 --> 01:29:24,479 Speaker 1: should have a helmet, right? Um, so that's, you know, 1877 01:29:24,760 --> 01:29:26,599 Speaker 1: some of the stuff that we're trying to get 1878 01:29:26,640 --> 01:29:29,920 Speaker 1: the word out there to say: um, respect the firing limits, 1879 01:29:29,960 --> 01:29:32,320 Speaker 1: we need to change the firing limits, put some 1880 01:29:32,479 --> 01:29:35,559 Speaker 1: procedures in place today. The other thing is, we have the technology 1881 01:29:35,640 --> 01:29:38,479 Speaker 1: to measure all this. We have these blast gauges that 1882 01:29:38,560 --> 01:29:40,800 Speaker 1: people can wear. They can measure the pressure that we 1883 01:29:40,920 --> 01:29:44,080 Speaker 1: receive and record it, so there's a record. And we're 1884 01:29:44,080 --> 01:29:46,479 Speaker 1: not actually doing that right now. I'm sure the VA 1885 01:29:46,640 --> 01:29:48,720 Speaker 1: doesn't really want you, or the Army doesn't want you, 1886 01:29:48,760 --> 01:29:50,599 Speaker 1: to do that, because of the disability claims. So I'm 1887 01:29:50,640 --> 01:29:52,320 Speaker 1: not, I'm not saying that that's why the Army 1888 01:29:52,360 --> 01:29:53,880 Speaker 1: isn't doing it, but they don't have a program in place 1889 01:29:53,920 --> 01:29:55,800 Speaker 1: to do it. And so the problem is, right now, 1890 01:29:55,880 --> 01:29:57,720 Speaker 1: let's say you go to the range and you get 1891 01:29:57,800 --> 01:30:00,160 Speaker 1: exposed to all these things, and then years later, 1892 01:30:00,640 --> 01:30:02,720 Speaker 1: you know, you've got all these symptoms. And since we 1893 01:30:02,800 --> 01:30:04,240 Speaker 1: released the report, we have had a lot of people 1894 01:30:04,280 --> 01:30:06,639 Speaker 1: come out on social media saying, like, hey, I shot 1895 01:30:06,720 --> 01:30:10,360 Speaker 1: these things and I've got symptoms, you know. Um, and 1896 01:30:10,600 --> 01:30:12,000 Speaker 1: if you go to the VA and you have 1897 01:30:12,120 --> 01:30:15,360 Speaker 1: these things, they're gonna say, where's your documentation? And we're 1898 01:30:15,400 --> 01:30:17,120 Speaker 1: not documenting it. So that's one of the things too 1899 01:30:17,120 --> 01:30:19,719 Speaker 1: that we say: the Army needs to institute a 1900 01:30:19,800 --> 01:30:22,120 Speaker 1: surveillance program to start recording all this, because 1901 01:30:22,160 --> 01:30:25,799 Speaker 1: we've seen this before with Agent Orange, with Gulf War syndrome, 1902 01:30:26,160 --> 01:30:29,840 Speaker 1: with burn pits, where we're exposing people to these hazards and 1903 01:30:29,920 --> 01:30:32,400 Speaker 1: then later on all these effects come out, and then we're 1904 01:30:32,439 --> 01:30:36,720 Speaker 1: scrambling or playing catch-up or in denial.
C N 1905 01:30:36,920 --> 01:30:39,360 Speaker 1: A S dot org, and the title of the study is 1906 01:30:39,640 --> 01:30:43,440 Speaker 1: Super Soldiers. So if you google CNAS Super Soldiers, 1907 01:30:43,680 --> 01:30:45,320 Speaker 1: you should be able to find it, our work on 1908 01:30:45,720 --> 01:30:48,600 Speaker 1: this issue of blast injury. Awesome. And 1909 01:30:48,960 --> 01:30:51,639 Speaker 1: as I was saying before, um, like I said, there's 1910 01:30:51,680 --> 01:30:53,719 Speaker 1: major issues we'll get into, I think, the next episode: the 1911 01:30:53,760 --> 01:30:56,800 Speaker 1: North Korea-South Korea meeting, that's the biggest news right now, 1912 01:30:56,840 --> 01:31:00,600 Speaker 1: I would say, um, and then the Israeli Mossad operation 1913 01:31:01,080 --> 01:31:05,280 Speaker 1: of finding the plans behind the Iranian nuclear situation. So 1914 01:31:05,680 --> 01:31:08,200 Speaker 1: there's articles on the website about both of those. Yes, 1915 01:31:08,560 --> 01:31:11,400 Speaker 1: so sofrep dot com. Go pick up Army of None. 1916 01:31:11,760 --> 01:31:15,800 Speaker 1: I will hold it up here as we exit this broadcast, 1917 01:31:15,920 --> 01:31:18,960 Speaker 1: and thanks for coming in. Appreciate it. Army of None, pick it 1918 01:31:19,040 --> 01:31:23,240 Speaker 1: up now. Well, appreciate it, um, and we'll be back 1919 01:31:23,400 --> 01:31:26,439 Speaker 1: with another episode that will go up on Friday. Right on. 1920 01:31:27,720 --> 01:31:30,599 Speaker 1: Check it out, guys, Army of None, available now. You've 1921 01:31:30,640 --> 01:31:34,760 Speaker 1: been listening to SOFREP Radio. New episodes up every 1922 01:31:34,800 --> 01:31:38,160 Speaker 1: Wednesday and Friday. For all of the great content from 1923 01:31:38,200 --> 01:31:41,439 Speaker 1: our veteran journalists, join us and become a Team Room 1924 01:31:41,560 --> 01:31:45,120 Speaker 1: member today at sofrep dot com. Follow the show 1925 01:31:45,240 --> 01:31:49,200 Speaker 1: on Instagram and Twitter at SOFREP Radio, and be 1926 01:31:49,360 --> 01:31:52,400 Speaker 1: sure to also check out the Power of Thought podcast, 1927 01:31:52,840 --> 01:31:57,480 Speaker 1: hosted by Hurricane Group CEO and Navy SEAL sniper instructor 1928 01:31:58,000 --> 01:31:58,880 Speaker 1: Brandon Webb.