Sleepwalkers is a production of iHeartRadio and Unusual Productions.

I was on a patrol up in the mountains of Afghanistan, and I was doing a reconnaissance patrol with a very small Ranger team, and I saw someone approaching us, a young man, maybe early twenties. That's Paul, a former Army Ranger, an elite special operative in the U.S. military. There were sort of a couple of possibilities that came into my mind. One was that he was a goat herder, just out, you know, taking his goats out to eat grass. Another was that he might be like a woodcutter protecting his property. And another one might be that he was someone who had seen us. There wasn't a lot of vegetation in the area, so we were pretty exposed, and he was coming to kill us. And those all seemed like very real possibilities. Paul knew of similar situations where enemy fighters pretended to be civilians, concealing their weapons until a last-minute ambush. I felt a fairly high degree of confidence that, you know, if we got into a gunfight, we could outmatch him, the three of us, but if he surprised us, he might have easily killed one of us in the process first. So I wanted to maneuver to a position where I could see him, and I found him. He was sitting on the edge of a cliff, looking out over this valley, and he had his back to me, and I settled into a position where I could watch him very closely through my sniper rifle. It was close enough that the wind carried his voice towards me and I could hear him talking. That alarmed me, because I couldn't see who he was talking to. Paul didn't speak the local Afghan languages, Dari or Pashto, so he couldn't understand what the man was saying. I thought he might have had a radio and he could be reporting back to maybe a group of fighters nearby.
I was weighing that possibility versus maybe he's just, you know, talking to himself or talking to his goats or something. And then eventually I heard him start singing, and it struck me that if he was singing, you know, singing out over the radio to someone about our position, it seemed like a very strange thing for a fighter to be doing. And the most likely explanation was that he was just an innocent goat herder, had no idea that we were there, and was singing to himself just to pass the time. And so Paul lowered his sniper rifle and he left. No provocation, no harm done to either side. But that incident really stuck with me, because there was a period of time where I didn't know whether he was, you know, an innocent person or a fighter who might have been trying to kill us. And the actions I would take were very different in those two worlds. And now I look back when I think about autonomous weapons and I think, what would a machine do? How could a machine understand the context that I did, and sort of realize, it doesn't make sense that a person would be singing in a tactical situation, that's strange, this is probably just an innocent person? Would a machine be able to do that?

It's a profound question Paul poses: what happens when something like a Predator drone has as much autonomy as a self-driving car? And can an AI system ever understand context, which can sometimes mean the difference between life and death? In this episode we'll examine AI on the battlefield and the future of technology-driven warfare. I'm Oz Woloshyn, and this is Sleepwalkers.

So, Kara, last episode we talked about the future of work, and we focused on one big question, which was: what can AI not do? And what Paul is talking about, identifying whether someone's a shepherd or an insurgent, identifying targets on the battlefield, seems to be one of those things. Sure, but on the other hand, it's important for us to ask what AI is good at.
You know, it's good at making predictions based on data about what might happen next. It's good at seeing patterns. So it makes me think, you know, that with enough training data about battlefield interactions, it could get just as good as or better than humans at this task. Yeah, and of course we don't have the counterfactual to Paul's story. Sadly, there are probably many stories where the Army Ranger doesn't wait to hear the shepherd sing before deciding what to do and just pulls the trigger. And equally, there are probably many sad cases where it does turn out to be an insurgent and an ambush happens to the soldiers. So just from that story, it's not necessarily clear to me that humans are always, or will always be, better than algorithms at identifying targets. You know, I do think it raises an important question. We are increasingly comfortable outsourcing many parts of our lives to technology. You know, who are we going to fall in love with? How are we going to get to our cousin's house? That's decision making, though. In the world of war, we can also build tools to do things that people don't want to do. A lot of people have heard about Boston Dynamics. They have this extremely terrifying video of a four-legged robo-dog, aptly named Spot, you know, that can run across bumpy ground. It can go into places that aren't safe for human beings. And they've just announced this model with a claw on the back of it that can open doors. There are so many potential uses for this, from assisting military raids to sending Spot into buildings that are unsafe for humans. Yeah, like, a whole pack of Spots can actually pull a truck. There's a video of it online. Look it up. It's frightening. Teamwork makes the dream work, exactly. But no, like, what's going to happen when that comes to the world of war and we have packs of robo-dogs? Well, who knows, right? The future is unclear, and that's really the question of this episode: how will AI and robotics change the battlefield?
And how good of an insurance policy is it to keep a human in the loop, to have somebody, a person, controlling the pack of Spots? We heard from Paul Scharre at the beginning. He's now the director of the Technology and National Security program at a bipartisan think tank called the Center for a New American Security, and he's recently written a book called Army of None: Autonomous Weapons and the Future of War.

We're moving to a world where machines may be making some of the most important decisions on the battlefield, about who lives and dies. And as Paul's experience in Afghanistan taught him, even elite military training isn't always enough to tell who is and isn't likely to be a threat. So understandably, there's great worry in handing over target identification to a computer, especially as the stakes are life and death. And as Paul tells us, autonomous weapons are already being produced and sold around the world. The best example today of a fully autonomous weapon is a drone built by Israel called the Harpy drone, and it's been sold to a number of countries: Turkey, India, China, South Korea. It's an anti-radar weapon that flies a search pattern in the sky looking for enemy radars, and when it finds one, it can then attack the radar without any further human permission. And that crosses the line to an autonomous weapon that's able to go find targets and then attack them all by itself. These are not the autonomous killer robots of science fiction, setting their own goals and killing at will. We still send them into battle and tell them what to look for, but we no longer control exactly what they do when they get there. So by analogy, you might think about a self-driving car. There are really degrees of how much a car could be self-driving. You could have something like the Tesla Autopilot today, where there's a steering wheel and the human can intervene and grab control of the vehicle.
You might have something like the Google self-driving car, where there is no steering wheel and a person is merely a passenger. But even in a totally self-driving car, the human is still choosing the destination. You're not getting in your car and just saying, car, you know, take me where you want to go. So at some level, humans are always involved, and that's going to be the case in warfare. The question is, you know, when do we cross the threshold where the humans have transferred control of some important and meaningful decisions to machines, and then what are the legal or ethical complications of that?

It is a hugely important question. In the aftermath of the Second World War, a hundred and ninety-six countries signed up to the Geneva Conventions, setting standards for behavior in battle. But how do we enforce those standards if the combatants are machines, not people? Beyond having a human in the loop, one important piece of the puzzle is a healthy testing and review process, a clear understanding of how a weapon works and the decisions it will make. But according to Richard Danzig, a former Secretary of the Navy, that's easier said than done. One of the things that concerns me is that the technologies are frequently highly classified. So whereas for self-driving cars we typically require that there be millions of miles driven, and we insist that external regulators review them for safety, in the military context we don't have millions of miles of experience before combat, and we don't typically have any kind of third-party review that says, wait a minute, here are the risks associated with this system spinning out of control. We ought to be using teams to say, hey, what could go wrong if an adversary wants to attack these and subvert them, or when they interact with other systems? This kind of war gaming is valuable, but the best-laid plans and standards can crumble in the face of existential threats, real or perceived.
You only need to think about Hiroshima and Nagasaki to understand how quickly restraint can give way to the desire for victory. Here's Paul again. As countries feel that their national survival might be at stake, they're going to be willing to take more risk and put out more experimental weapons. The use of poison gas in World War One is, I think, a terrible example of this happening in practice. Germany was in a panic to find some kind of wonder weapon that might break the stalemate. So a desire to get the upper hand will clearly, as we've seen historically, lead militaries to take risks and deploy more experimental technology.

This is one of the scariest things about war to me, Kara, particularly war involving new technology: the potential for misunderstanding and unintended consequences. Paul mentions the almost accidental use of poison gas in World War One, and then there was the Cuban Missile Crisis, where we almost stumbled into a nuclear war. And all this potential for misunderstanding is compounded exponentially in the world of AI, because it's not just humans who are trying to read each other and make decisions. It's algorithms, sort of loose in the wild, interacting with one another. The Guardian actually had a great story about this, called Franken-algorithms: The Deadly Consequences of Unpredictable Code, and the article makes the point that the stock market flash crash was actually caused by algorithms interacting with one another. You know, it's hard not to think that this could happen in the wild, so to speak. It's a scary thought, and it's made even scarier because there's just so much potential all around for misunderstanding. And that's something Richard Danzig is seriously concerned about. When we talk about sending a ship on a mission, policymakers by and large understand what that means and how others will perceive it if the ship comes into their waters.
Whereas when we start talking about artificial intelligence, policymakers, if there are five people in a room, may readily envision five different things.

We need the people making decisions to understand both the situations they're dealing with and also how the tools they're using actually work. And that's not easy when it comes to AI. Part of the issue is the so-called black box problem. Currently, we understand the principles of how a neural network uses probabilities to reach a conclusion, but we can't interrogate the millions of micro-decisions it makes along the way. This is a huge barrier to understanding weapons systems powered by AI. I wanted to know how the researchers developing new military technology think about the black box problem, so I spoke with Arati Prabhakar, the former head of DARPA, the Defense Advanced Research Projects Agency. And Arati shared a story about how the black box problem plays out. Many years ago now, there was a wonderful paper from Stanford that showed a machine system that could label images: this was a girl blowing out the candles on her birthday cake, or a construction worker doing something, so fairly complex analysis of what was going on in the picture. And it would get one right, and it would get ten right, and it would get a hundred right. And then there was a picture that every human being would say, that's a baby holding an electric toothbrush, but the machine said, it's a small boy holding a baseball bat. And you know, you just look at it and you think, what? What were you thinking? And this, I think, is a great illustration of the black box nature of these learning systems, because they've been trained on all this volume of data, but when you look inside to say, well, what went wrong there, you know, you just see a bunch of nodes with weights from being trained, and so they're really opaque. Arati's example is kind of cute, but think about it for a moment.
The difference between a baby holding an electric toothbrush and a small boy holding a baseball bat could also be, in Afghanistan, the difference between Paul's shepherd and a militant, in other words, the difference between life and death. Yeah, and it's a bit daunting that we are becoming more reliant on something that we continue not to understand fully, don't you think? Absolutely. And Henry Kissinger actually wrote a piece on this for The Atlantic called How the Enlightenment Ends. Big stuff, for sure. Heady. And what did Kissinger argue? He argued that because we're unable to interrogate the output of algorithms, as we rely on them more and more to classify the world around us, we may actually start to lose the ability to reason for ourselves. It's not inevitable, though, that AI will always be opaque. The EU is actually working on a policy that decisions made by AI need to be explainable to the people they affect. That may be a policy that's easier said than done, although the so-called next wave of AI is all about explainable AI, and it's actually a major initiative right now at DARPA. Here's Arati again. Explainable AI has been part of starting an entire new field of inquiry in artificial intelligence: to couple that kind of statistical power that machine learning systems have with systems that explain how they got the results that they got, in order for us human users to be able to know when to trust those machine learning systems and when not to trust them. What Arati is describing would be a huge breakthrough in AI. Understanding how neural networks make their decisions would allow us to harness the power of the technology much more safely, and not just on the battlefield. When we come back, we look at how much of the technology we take for granted in our everyday lives actually originated in the military.

So, DARPA, the defense agency with an annual budget of three point five billion dollars. Its motto is to cast a javelin into the infinite spaces of the future.
What you may not know is how much of the technology we all use every day came right out of DARPA. I think about this every time I use my smartphone, because that's a beautiful, seamless integration of a whole host of technologies that were sparked many, many years ago by DARPA. So the chip in your cell phone that talks to the cell tower is based on materials and electronics technology that was developed for communications and radar systems. The chip that knows when you've rotated your phone is MEMS technology that had huge early support from DARPA. But also Siri and other intelligent agents are based on the artificial intelligence research that was done. But what fueled this wave of incredible innovation at DARPA? Well, actually, war. DARPA is a very American concept. In nineteen fifty-seven, the Soviet Union put the first artificial satellite in orbit. That was Sputnik. There was a lot of excitement, because human beings had never done that before, but of course also quite a shock for the United States at the height of the Cold War. Sputnik was a reminder that in addition to working on the problems that you knew you had, you also needed to have people who came to work every day to think about those kinds of technological surprises. And so DARPA was started as a reaction to that technological surprise of Sputnik. Its mission for sixty years has been to create those kinds of technological surprises, and its history is one in which it's accomplished that mission. DARPA is known as the place that made the early stages of each revolution in artificial intelligence, and of course most memorably for starting the ARPANET and writing the protocols that became the Internet that we have today. Think about that: the technology that forms the architecture of our daily lives
in the twenty-first century, the Internet, was created by a defense agency whose mission was to outthink the Soviet Union. And in some sense, all of the technologies we've looked at so far in Sleepwalkers are the outgrowth of DARPA's work. What really enabled this was DARPA's decision to allow its technology into the US private sector and to let entrepreneurs build on top of it. Absolutely as vital were the companies and the entrepreneurs and the investors who saw that you could turn those raw research results into this seamless, beautiful product that we all now live with all the time. Kara, it's amazing to think just how much Silicon Valley really stands on the shoulders of DARPA. But even outside of what we think of as the tech world, there are plenty of examples of military technology living with us. For example, microwaves, which were an accidental byproduct of radar technology, and then the good old Roomba, which was originally developed as mine-sweeping technology and is still a mine-sweeping technology in my house. Yeah, you know, there's this concept of dual-use technology, which we've talked about a few times. It's the idea that technology developed for the military can have civilian applications and vice versa. We talked about this with Blink Identity and facial recognition. Do you remember last year, the Project Maven walkout? Yeah, at Google, right. So that was a project for the Pentagon. Over three thousand employees protested that they didn't want to develop that technology, and Google pulled the plug on the project. The problem is, though, that you can say you're developing AI and target recognition for the military, or you can say you're developing AI to recognize what's happening in images. But it's the same technology, and once it gets into the wild, anyone who wants to can use it. And that's actually something that Arati spoke about, about innovation traveling from DARPA to the private sector.
Yeah, and now DARPA actually has this younger sibling called the Defense Innovation Unit, DIUx, whose job is actually to invest in and incubate technologies from the private sector that could be helpful for defense. So this is basically the bridge from Silicon Valley to DC, which is taking things from the consumer space and applying them for military use. And one of these technologies is called Halo. They're basically headphones that electrify your brain in order to stimulate growth. That's right. So I went to Connecticut to test them out with former Navy SEAL John Wilson.

I was a SEAL for twelve years. I served in Iraq multiple times, Afghanistan, I went to Mogadishu, and then South America. As you can imagine, drug warfare is still a really big issue, so we have military units down there to combat the cartels. We were gone three hundred days out of the year training, and then we were deployed for months on end. So that's what we lived and breathed day in and day out. We weren't going home at night. These days, John is back with his family, and he's especially interested in how new technology can help SEALs past and present. One such technology is the Halo headset, which DIUx invested in. The headset uses an electric current to prime the brain for so-called neuroplasticity, in other words, the ability to learn and learn quickly. For us, I recognized that when we do a pistol draw, that movement is a repetitive movement that we've done thousands and thousands and thousands of times, over and over again. And what this does is it primes the brain to learn those repetitive movements faster. And can you genuinely notice the difference? When I first came across this, I got a bunch of SEALs together and went out to the range. We neuro-primed and we started shooting, so we got the baseline. Did this for a month, looked at our scores, and our scores were light years better.
And by light years, I mean milliseconds, right? But milliseconds on the battlefield equate to life or death. I may be as far away from being a military person as anyone could be, but John has kindly agreed to lead me through a Navy SEAL workout. So what we have here, it looks like a Beats headset, right, with some strange nodules? Yeah, so what we have on the top, as mentioned, are these little nodes. What those do is send a current into the cortex, the frontal cortex, into your brain. Is that safe? That's a great question. I'm gonna bypass that question. It's safe. Turns out it is safe. It's been tested by DARPA and others. So I had to put it on, and John agreed to help me use the headset to neuro-prime before putting me through my paces. All right, let's crank it up. So that's probably good. I don't know if you can feel a difference there, but we've got twenty minutes of neuro-priming. It feels like something pinching my head a bit. Yeah, it's not a comfortable feeling, right? No, and you never quite get used to it. But it's worth it, right? What is that uncomfortable sensation? What's happening there? That's electricity. Yeah, so it's going into your brain right now, and it's getting your brain into a state of neuroplasticity. Hyper-learning, essentially, is what that state allows you to do, and it just allows you to learn faster and learn more information. After a warm-up, which was in fact a lot more intense than my normal workout (two, three, come on, up, drive up and hold; still the warm-up? yes, it's still the warm-up), it was time to take the headset off and start the real thing. Or so I thought. All right, ready to keep going? You can do twenty reps. Oh my god. Oh god. What's hell week like? It has to be worse than this. Yeah, why do you ask? Thankfully, the workout came to an end, and without any injury.
Though, to be honest, I couldn't tell if the neuro-priming had worked for me. Because Halo enhances the brain's ability to learn, studies show best results when it's used over time. In other words, if I wore the headset before every workout, I might start to notice a difference in how quickly I performed. But somehow I trusted John, talking about the draw time for his weapons.

So the battlefields: Afghanistan, Iraq, Syria. It's my happy place. Yeah, who can understand that? It's my happy place because I'm around people that I know would do anything for me, that love me, and I love them. For John, being on the battlefield wasn't the hardest part. Leaving was. Transition is not an easy thing. It's the hardest thing I've ever done. When we transition, we do it by ourselves. We're trying to solve a complex problem, which we love to do. But normally when that takes place, you have your team around you, and you're going to figure it out, because you know you'll never let the person to the left and right of you down. When you're trying to do this by yourself, you have nobody to talk to when it starts getting hard, and you go dark, is what we call it. So we go into our shell. We kind of ostracize ourselves from society, and that's when bad stuff starts happening. John recalls a recent narrow escape for one of his SEAL comrades. He was a SEAL Team Six guy and ended up going through a divorce. He had a newborn that he had to stay home and take care of, so he went to a really dark place. He just called me, and unbeknownst to me, he had just put his son to bed and he was sitting in the car with a pistol. I didn't know that. I just picked up the phone and asked him how he was doing, and he said he's doing okay, but he needs some help. I said, we got you, brother. That's all I said.
And that was enough for him to put that gun away, go back inside, and take care of that little guy. Just me saying, we got you.

That camaraderie saved John's friend's life, but returning veterans need something more than community. They need a purpose, a mission, and that can be hard to find in civilian life. You don't know where you fit, you don't know your role in the family, in your tribe, you don't know your role in society, and you're just trying to get by, to put food on the table. And there's a void there now. And according to John, that's where Halo comes in. Just because we're SEALs doesn't mean we all want to end up doing security work. There's a lot of people that want to maybe go into finance or be a lawyer. Halo allows us to succeed and accelerate that process. So if people want to go back to school, putting a Halo headset on before you study; people maybe wanting to get a job that requires multiple languages, you can throw on the Halo and pick up those languages at an accelerated pace. Do you think it's more powerful as something to believe in, like, if I put this headset on, I'll achieve my goals? Or is it more powerful as a technological solution? Or is it somewhere in between those two things? I think it's probably both, right. Our strength and conditioning coach in the SEAL teams, he had a bottle, and PE is what he wrote on it, which stood for placebo effect. It was just water, but guys would come over saying they're hurt, and he would just spray a little bit of this water on them, and every single time they would walk away like, oh, I feel better, thanks, coach. My point is, the research there supports that this actually works. But who gives a shit even if it didn't, because people are going to believe in themselves, and that's part of it. That's the bottom line, in my opinion.
If it takes a headset to get there, then great. But we know that this headset works and is going to help accelerate you toward that goal.

We've talked about dual use in terms of military technology that enters the civilian world and vice versa, and Halo is just that: a consumer product that is also used by the armed forces. But it has a much more profound dual use. It can save soldiers' lives twice. The first time on the battlefield, where shaving milliseconds off reaction time can mean the difference between life and death, and the second time when they return home. Halo can help them develop new skills and, perhaps even more importantly, give them the hope they need to keep going. When we come back, we return to DARPA and how to ensure that we design new military technologies with worst-case scenarios top of mind.

One of the central contentions of Sleepwalkers is that our creations reflect us, and knowing this, we need to be deliberate about how we tell them to behave. We talked in episode three of this series, The Watchman, about automation bias, the very human habit of treating the output of computers as infallible even while ignoring the inputs that we've given them. And recognizing this, Arati made it a central tenet of her tenure at DARPA to argue that technology is not inevitable. There's a tendency to give the active role to the technology. It's what the AI will do to us. I want to keep bringing us back to the fact that these technologies are our creations. They're built by human beings. We have this enormous privilege that we get to work on the powerful technologies that can shape the progress of our societies. That privilege comes with a responsibility to ask, what could possibly go wrong? What could possibly go wrong? It's a legitimate question, but there's also a reason it's become a meme. It's notoriously hard to answer.
This is especially true in times of war, when new technologies are often rushed into action without being fully understood. Here's Paul Scharre again. World War One is a wonderful, a terrible, example of what can happen when we see new technologies change warfare in ways that policymakers were not prepared for. You know, the Industrial Revolution brought not just locomotives, but also cars, tanks, airplanes, machine guns, that then were used to industrialize warfare in a totally new way that dramatically changed the scale and the speed of killing that was possible. The Gatling gun: people still had to crank the gun, but then it automated the process of loading and firing bullets. We began this episode talking about the new dangers posed by automated weapons. Well, the Gatling gun was actually one of the world's first, and as Paul told us, its invention had a ripple effect that its inventor could not have foreseen. The inventor, Richard Gatling, did this to save lives. He was looking at people coming back wounded and killed from the American Civil War. He wanted to build a machine that could reduce the number of soldiers that were needed on the battlefield as a way to save lives. And that sounds like, you know, a very well-meaning idea, and in practice, as the Gatling gun evolved into the machine gun in World War One, we saw a scale of killing that was just unprecedented, and a whole generation of European men wiped out on the battlefield. And so I think it's an important cautionary tale for our ability to predict how this technology will be used. The name of this podcast, Sleepwalkers, is borrowed from a book called The Sleepwalkers: How Europe Went to War in 1914, written by the historian Christopher Clark. And one of the big questions I've been asking is, are we at a moment like we were on the eve of World War One, when we haven't fully understood the implications of our new technology?
I asked Richard Danzig, the former Navy Secretary, about the analogy. There is an analogy from World War One. European military leaders developed mobilization plans to increase their own capabilities in the event of an attack, and they underestimated the degree to which that created rigidities and interactions, so that in the end, the railroad timetables generated a war that perhaps no one intended to be engaged in. People think that they're driving the cart, and in reality, the horses of technology are frequently pulling us in directions that we don't anticipate and can't control. So are we better placed now to understand the implications of AI and new technology for global conflict than Europe was in 1914? I don't believe our understanding of AI is greater than their understanding at that point. We will make these mistakes too. I cannot estimate their significance or their frequency, but I'm rather confident we will lose control, that we will make mistakes of that kind and cause unintended consequences. So to me, the interesting question is not, can I predict their frequency? The interesting question is, what can I do in advance if I recognize that? One answer is well represented by DARPA's Safe Genes project. That government agency is saying, if people edit genes, but it turns out they escape into the environment and proliferate, how do we program them to begin with so that we can shut them off? When it's so difficult to predict how new technologies will be used and misused, it's hugely important that we build in precautions while they're still being researched and developed. It's difficult, but we have to do our best to anticipate the future dangers of a technology long before potential deployment on the battlefield. Thankfully, that philosophy governed Arati Prabhakar's work at DARPA. What we developed was a way of grappling with the ethical implications of these technologies.
It started 576 00:33:13,280 --> 00:33:16,640 Speaker 1: by being open with ourselves, not just about our hopes 577 00:33:16,680 --> 00:33:20,200 Speaker 1: for the technology, but also our fears, and looking 578 00:33:20,200 --> 00:33:22,080 Speaker 1: each other in the eye and saying, here's what we 579 00:33:22,160 --> 00:33:25,360 Speaker 1: think really is possible, but also here's what could really 580 00:33:25,400 --> 00:33:28,360 Speaker 1: go wrong. Were there any specific programs that you were 581 00:33:29,320 --> 00:33:32,360 Speaker 1: tempted by as a technologist, but in the end you 582 00:33:32,400 --> 00:33:35,600 Speaker 1: had to kill because they didn't meet your ethical standards? 583 00:33:36,120 --> 00:33:38,640 Speaker 1: I don't really have anything to say about that. The 584 00:33:38,640 --> 00:33:40,680 Speaker 1: answer is yes, but I can't give you an example 585 00:33:40,720 --> 00:33:44,680 Speaker 1: because it was classified. Kara, we've talked about the design 586 00:33:44,800 --> 00:33:47,680 Speaker 1: phase and thinking from R and D onwards about making 587 00:33:47,680 --> 00:33:50,960 Speaker 1: new weapons systems safe. But it doesn't always work out 588 00:33:51,000 --> 00:33:53,200 Speaker 1: that way. Yeah, the genie does have a habit of 589 00:33:53,200 --> 00:33:56,280 Speaker 1: getting out of the bottle. We've talked about dual 590 00:33:56,360 --> 00:34:01,320 Speaker 1: use before. Even seemingly benign technologies can be hugely destructive. 591 00:34:02,040 --> 00:34:03,560 Speaker 1: The one that blows my mind is the story of 592 00:34:03,680 --> 00:34:07,440 Speaker 1: Arthur Galston, a plant biologist who discovered, while 593 00:34:07,480 --> 00:34:10,560 Speaker 1: he was a graduate student, a compound that helps soybeans 594 00:34:10,560 --> 00:34:13,680 Speaker 1: flower faster. He also learned that if this compound were 595 00:34:13,680 --> 00:34:16,560 Speaker 1: applied in excess, it would cause the plant to 596 00:34:16,600 --> 00:34:20,960 Speaker 1: shed its leaves. And when Galston discovered this defoliant effect, 597 00:34:21,000 --> 00:34:25,080 Speaker 1: it was seized on by biological warfare scientists, who would 598 00:34:25,120 --> 00:34:28,279 Speaker 1: then go on to develop Agent Orange. I just got 599 00:34:28,320 --> 00:34:31,000 Speaker 1: back from a trip to Vietnam actually, where the effects 600 00:34:31,000 --> 00:34:34,000 Speaker 1: of Agent Orange are still being felt. It's actually a 601 00:34:34,080 --> 00:34:37,840 Speaker 1: genotoxin, which causes deformities through the generations. So that 602 00:34:37,920 --> 00:34:40,759 Speaker 1: is a truly horrific one, Kara. And it makes you 603 00:34:40,800 --> 00:34:43,520 Speaker 1: think: it wasn't those chemists who were releasing Agent Orange 604 00:34:43,520 --> 00:34:46,920 Speaker 1: over Vietnam, it was the US military. So the idea 605 00:34:46,960 --> 00:34:49,240 Speaker 1: that the person who creates the technology gets to control 606 00:34:49,239 --> 00:34:51,759 Speaker 1: what happens to it is simply not the case, and 607 00:34:51,800 --> 00:34:53,480 Speaker 1: so we need to move forward with the assumption that 608 00:34:53,600 --> 00:34:57,120 Speaker 1: AI weapons will leave the laboratory and exist in the world. 609 00:34:57,640 --> 00:35:00,400 Speaker 1: And the central question is how helpful it is to have 610 00:35:00,480 --> 00:35:04,120 Speaker 1: a human in the loop.
Well, according to Richard Danzig, 611 00:35:04,239 --> 00:35:10,160 Speaker 1: humans are of increasingly limited utility. I think there are 612 00:35:10,200 --> 00:35:14,680 Speaker 1: circumstances where human control is useful, but I don't think 613 00:35:14,719 --> 00:35:18,440 Speaker 1: that's the most useful approach. And the reason for that 614 00:35:18,640 --> 00:35:22,440 Speaker 1: is because the power is in the machine. So many 615 00:35:22,480 --> 00:35:26,600 Speaker 1: decisions that we care about are made at extraordinarily high speed, 616 00:35:27,120 --> 00:35:30,799 Speaker 1: and there just isn't time for the humans to assimilate 617 00:35:30,840 --> 00:35:35,320 Speaker 1: them and make correct judgments. Even the president, deciding whether or 618 00:35:35,400 --> 00:35:40,200 Speaker 1: not to declare nuclear war and respond, might have fifteen minutes to 619 00:35:40,239 --> 00:35:44,360 Speaker 1: make a judgment. In other words, the whole idea of 620 00:35:44,400 --> 00:35:47,279 Speaker 1: a human in the loop making the final call is 621 00:35:47,360 --> 00:35:50,400 Speaker 1: something of an illusion. At the very least, it relies 622 00:35:50,400 --> 00:35:54,120 Speaker 1: on us making wise decisions at lightning speed and under pressure. 623 00:35:54,640 --> 00:35:57,560 Speaker 1: And there's another problem. All of the information people like 624 00:35:57,680 --> 00:36:01,160 Speaker 1: the president use to make decisions has already been filtered 625 00:36:01,160 --> 00:36:05,520 Speaker 1: through several computer systems. So when the president reviews information 626 00:36:05,560 --> 00:36:08,520 Speaker 1: to make a tactical choice, he or she is already 627 00:36:08,560 --> 00:36:13,240 Speaker 1: relying on automation. He's extraordinarily dependent on what the machines 628 00:36:13,280 --> 00:36:17,440 Speaker 1: are telling him, what the sensors have interpreted for him, 629 00:36:17,440 --> 00:36:21,279 Speaker 1: and what the algorithms say about the trajectories of missiles that 630 00:36:21,320 --> 00:36:25,520 Speaker 1: have been launched. So realistically, he's on the cart being 631 00:36:25,560 --> 00:36:30,359 Speaker 1: pulled by the horses of these technologies. If that's 632 00:36:30,360 --> 00:36:32,680 Speaker 1: true for the president, think what it's like for the 633 00:36:32,719 --> 00:36:38,640 Speaker 1: person who's a sergeant in the field manning a Patriot missile battalion. 634 00:36:39,440 --> 00:36:43,759 Speaker 1: When it shows incoming missiles, he has seconds, or at 635 00:36:43,840 --> 00:36:47,759 Speaker 1: best minutes, to respond and has to make decisions. We 636 00:36:47,920 --> 00:36:51,319 Speaker 1: know how fallible we are as decision makers, and we 637 00:36:51,400 --> 00:36:54,080 Speaker 1: know how dependent we already are on computer systems to 638 00:36:54,080 --> 00:36:57,640 Speaker 1: guide our decisions. So what can we do to prevent 639 00:36:57,680 --> 00:37:01,080 Speaker 1: ourselves from stumbling into a conflict that no one wants? 640 00:37:02,080 --> 00:37:05,440 Speaker 1: I think we need to recognize that science is 641 00:37:05,480 --> 00:37:09,600 Speaker 1: now diffused around the globe, and we need a common 642 00:37:09,760 --> 00:37:13,520 Speaker 1: kind of understanding about how to reduce these risks, and 643 00:37:13,560 --> 00:37:17,040 Speaker 1: then we need some joint planning for the contingency that 644 00:37:17,080 --> 00:37:20,800 Speaker 1: these do escape.
What do we do if a newly 645 00:37:20,920 --> 00:37:26,239 Speaker 1: engineered genetic system gets out there into the wild? Well, 646 00:37:26,320 --> 00:37:28,719 Speaker 1: that's not just a problem for the Chinese if it 647 00:37:28,760 --> 00:37:33,759 Speaker 1: happens to happen in China. Technology spreads, it gets modified, copied, 648 00:37:33,800 --> 00:37:36,200 Speaker 1: and hacked, and once something is out of the lab, 649 00:37:36,560 --> 00:37:39,640 Speaker 1: it's anyone's guess as to what happens. And countries are 650 00:37:39,719 --> 00:37:43,359 Speaker 1: slowly trying to establish standards for AI. But as Paul 651 00:37:43,400 --> 00:37:47,759 Speaker 1: Scharre argues, creating a global framework governing AI in warfare 652 00:37:47,920 --> 00:37:50,920 Speaker 1: is a tall order. It's a very hard area to 653 00:37:50,960 --> 00:37:54,640 Speaker 1: think about. How do we mitigate that risk, because countries 654 00:37:54,640 --> 00:37:56,879 Speaker 1: are not going to talk about the things that they're 655 00:37:56,920 --> 00:38:01,760 Speaker 1: doing in cyberspace. In the financial sector, they've installed 656 00:38:01,760 --> 00:38:04,879 Speaker 1: things like circuit breakers that would take stocks offline if 657 00:38:04,920 --> 00:38:07,880 Speaker 1: the price moves too quickly. Well, there's no referee to 658 00:38:07,920 --> 00:38:10,440 Speaker 1: call time out in warfare. So if we're going to 659 00:38:10,520 --> 00:38:13,400 Speaker 1: manage those risks in the military space, there have to 660 00:38:13,400 --> 00:38:17,160 Speaker 1: be circuit breakers, if you will, that people build 661 00:38:17,160 --> 00:38:20,600 Speaker 1: into our own weapons systems: limits on them, ways to 662 00:38:20,680 --> 00:38:24,319 Speaker 1: interject human control to maintain control if things begin to 663 00:38:24,719 --> 00:38:28,600 Speaker 1: move in unexpected ways. And it's worth acknowledging really upfront 664 00:38:28,640 --> 00:38:32,040 Speaker 1: that there's a trade-off there: every time that 665 00:38:32,480 --> 00:38:37,040 Speaker 1: a military puts guardrails on a weapon system or 666 00:38:37,320 --> 00:38:40,279 Speaker 1: inserts a human in the loop as a check, that's 667 00:38:40,320 --> 00:38:44,919 Speaker 1: potentially slowing down the effectiveness of their weapon. And there's 668 00:38:44,960 --> 00:38:47,120 Speaker 1: a risk that they are going to be afraid that 669 00:38:47,160 --> 00:38:49,120 Speaker 1: an adversary might not do that and might get an 670 00:38:49,239 --> 00:38:52,200 Speaker 1: edge on them. And that dynamic is really the crux 671 00:38:52,239 --> 00:38:55,360 Speaker 1: of the problem here. It's hard to get to a 672 00:38:55,360 --> 00:38:59,640 Speaker 1: place where countries trust each other enough to engage 673 00:38:59,640 --> 00:39:02,920 Speaker 1: in mutual restraint. But we may not have a choice. 674 00:39:03,520 --> 00:39:06,080 Speaker 1: Up until now, our defense policy has been based on 675 00:39:06,120 --> 00:39:10,279 Speaker 1: the assumption of technical superiority, and as Arati Prabhakar argues, we 676 00:39:10,280 --> 00:39:13,360 Speaker 1: can no longer rely on that. You have a model 677 00:39:13,400 --> 00:39:16,879 Speaker 1: that is based on owning all the technology and knowing 678 00:39:16,920 --> 00:39:18,920 Speaker 1: that no one else can have access to it for 679 00:39:18,960 --> 00:39:23,160 Speaker 1: two or three decades, and today those assumptions simply don't hold.
680 00:39:23,800 --> 00:39:26,719 Speaker 1: We are not the only people who can innovate right now. 681 00:39:27,880 --> 00:39:31,360 Speaker 1: The history of new technology and warfare is frankly disturbing. 682 00:39:31,840 --> 00:39:34,239 Speaker 1: When we create new weapons, we tend to use them. 683 00:39:34,840 --> 00:39:37,400 Speaker 1: We've talked about the atomic bomb in Japan, and about 684 00:39:37,400 --> 00:39:40,160 Speaker 1: poisonous gas in World War One, and even about the 685 00:39:40,200 --> 00:39:44,000 Speaker 1: Gatling gun, one of the world's first automated weapons, designed 686 00:39:44,040 --> 00:39:46,600 Speaker 1: to reduce the number of combatants required to wage war. 687 00:39:47,080 --> 00:39:51,160 Speaker 1: It decimated an entire generation in Europe. We haven't yet 688 00:39:51,200 --> 00:39:54,560 Speaker 1: seen what happens when AI weapons systems begin to interact 689 00:39:54,560 --> 00:39:58,440 Speaker 1: with one another, but chances are we will in our lifetime. 690 00:39:59,239 --> 00:40:01,800 Speaker 1: So the temptation in all of this can be to 691 00:40:01,920 --> 00:40:05,040 Speaker 1: desperately try to hit pause on new technology, but as 692 00:40:05,160 --> 00:40:09,879 Speaker 1: Arati argues, that would be the wrong approach. Historically, we 693 00:40:09,920 --> 00:40:15,920 Speaker 1: are drawn forward by the enormous potential that these technologies 694 00:40:15,960 --> 00:40:18,879 Speaker 1: have to enhance our lives, and at the same time 695 00:40:18,960 --> 00:40:24,160 Speaker 1: we're repelled by the consequences that we understand could be 696 00:40:24,280 --> 00:40:29,080 Speaker 1: fundamentally wrong. So I think that's the tension. But in 697 00:40:29,200 --> 00:40:33,279 Speaker 1: aggregate and over time, I do think that technology has 698 00:40:33,320 --> 00:40:36,759 Speaker 1: lifted us up, has advanced us. You know, when you 699 00:40:36,800 --> 00:40:39,960 Speaker 1: play the parlor game of asking your friends what period 700 00:40:39,960 --> 00:40:42,719 Speaker 1: in history they would rather live in: I might want 701 00:40:42,760 --> 00:40:45,120 Speaker 1: to visit, but there's no other time in history I 702 00:40:45,160 --> 00:40:47,600 Speaker 1: want to live in. And I think the future is 703 00:40:47,600 --> 00:40:50,200 Speaker 1: going to be fraught with problems, and it's still going 704 00:40:50,239 --> 00:40:52,080 Speaker 1: to be a better place than the one that we're in. 705 00:40:53,960 --> 00:40:56,759 Speaker 1: It's true that technology has made so many parts of 706 00:40:56,760 --> 00:41:00,920 Speaker 1: our lives easier, healthier, and safer, and it's also true 707 00:41:00,960 --> 00:41:03,600 Speaker 1: that the technology we create has the potential to be 708 00:41:03,719 --> 00:41:07,880 Speaker 1: ever more destructive. We've talked about dual use in this episode, 709 00:41:08,200 --> 00:41:11,000 Speaker 1: and as a matter of fact, many DARPA programs have 710 00:41:11,040 --> 00:41:15,440 Speaker 1: found their way into revolutionizing medicine. In the next episode, 711 00:41:15,480 --> 00:41:17,800 Speaker 1: we look at some of the incredible applications of AI 712 00:41:17,960 --> 00:41:21,120 Speaker 1: in the world of healthcare, from accurately predicting time of 713 00:41:21,160 --> 00:41:26,000 Speaker 1: death to decoding the human genome. I'm Oz Veloshin. See you 714 00:41:26,040 --> 00:41:41,280 Speaker 1: next time.
Sleepwalkers is a production of I Heart Radio 715 00:41:41,400 --> 00:41:45,640 Speaker 1: and Unusual Productions. For the latest AI news, live interviews, 716 00:41:45,680 --> 00:41:48,600 Speaker 1: and behind-the-scenes footage, find us on Instagram at 717 00:41:48,640 --> 00:41:54,600 Speaker 1: Sleepwalkers Podcast or at sleepwalkers podcast dot com. Sleepwalkers is 718 00:41:54,640 --> 00:41:57,200 Speaker 1: hosted by me, Oz Veloshin, and co-hosted by me, 719 00:41:57,360 --> 00:42:00,279 Speaker 1: Kara Price. We're produced by Julian Weller with help from 720 00:42:00,320 --> 00:42:03,920 Speaker 1: Jacopo Penzo and Taylor Chacogne. Mixing by Tristan McNeil and 721 00:42:04,000 --> 00:42:07,480 Speaker 1: Julian Weller. Our story editor is Matthew Riddle. Recording assistance 722 00:42:07,600 --> 00:42:11,080 Speaker 1: this episode from Chris Hambroke and Jackson Bierfeld. Sleepwalkers is 723 00:42:11,120 --> 00:42:15,319 Speaker 1: executive produced by me, Oz Veloshin, and Mangesh Hattikudur. For more 724 00:42:15,360 --> 00:42:17,959 Speaker 1: podcasts from I Heart Radio, visit the I Heart Radio app, 725 00:42:18,040 --> 00:42:20,960 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.