Welcome to Stuff to Blow Your Mind from HowStuffWorks.com.

Hey, welcome to Stuff to Blow Your Mind. My name is Robert Lamb, and I'm Joe McCormick. Today we're going to be talking about an issue in the future of technology; actually, not just the future, the present of technology. And it's going to have to do with how we assess threats based on different types of technological regimes that exist in the world. You and I, Robert, are aware that very often we fear the wrong things, right?

Oh yes. We fear things that are completely disproportionate to the level of threat they represent in our lives. And this is obvious in some ways, because we have phobias of things that are totally harmless. Some people are afraid of balloons or something. I guess balloons aren't harmless to birds and fish, but they're harmless to us. And some people are afraid of, I don't know, public speaking, something that is in some ways genuinely threatening to your reputation, maybe, but is not a threat to your body, to your bodily integrity.

Yeah, and we have a lot of these, what some people call paper tigers. In fact, we have a whole classic episode of Stuff to Blow Your Mind about paper tigers, where something that is ultimately not life-threatening, or not even going to cause you actual physical injury, gets built up in our minds to the level where it's really on par with some sort of large predatory creature from our primordial past. We give it the same credence we would give a tiger leaping out of the bushes at us.

So those are the obvious ways that we fear the wrong things: we fear stuff that isn't even literally going to hurt us. But then people are also just wrong in the ways they assess the relative dangers of actual physical threats, like attacks by animals.
You know, people are more afraid of, say, being attacked and killed by a shark. But sharks kill almost nobody; they kill fewer than ten people per year on average worldwide. You are extremely unlikely to die by shark attack, even if you swim a lot. Meanwhile, animals that don't command nearly as much fear, that don't grip our minds in the same way, kill far more people. Dogs kill tens of thousands of people every year, and mosquitoes, which spread diseases, literally kill hundreds of thousands of people annually. Nobody's got mosquito phobia. I mean, I guess maybe some people do.

Yeah, it's true, and at least some people have dog phobias. You can basically look to horror cinema and see this played out, right? How many dog horror films can you think of? You can think of some key ones here and there, mainly Cujo, and a few others of note. And then when you think of shark horror films, there's just an endless supply. You could spend the rest of your life, I think, watching terrible shark movies. But when it comes to mosquitoes, you basically just have Mansquito, and that's not even a straight-up mosquito movie; it's a human-mosquito hybrid.

Well, part of it has to do with the kinds of imagery that excite our brain. It's just hard to make a mosquito look scary. Mosquitoes look irritating. Yeah, but then, on the other hand, ticks are horrifying looking and carry illnesses, and as we've pointed out in the past, there's really only one tick horror film, and it's terrific, but there's only one of them. You're talking about the one with Seth Green and Clint Howard, Ron Howard's brother. Yeah, I'll make sure that we link to the trailer talk video episode on the landing page for this episode of Stuff to Blow Your Mind, in which we discuss this cinematic jewel. Well, if you must.
But people are also totally off base in the way they assess the relative dangers of, say, travel threats. Everybody's heard the statistics about travel methods, right? Oh yeah, fear of flying is a huge one. Driving is considerably more dangerous, and yet it's flying that fills so many of us with varying levels of anxiety. I can personally relate to this, and I'm continually fascinated and frustrated by the way this deeper, irrational fear can overpower, or at least sufficiently overpower, my rational understanding of the risks.

Yeah, I know exactly what you're talking about. I have had various levels of fear of flying in the past, and yet I totally understand; I've read all the statistics about how, per mile traveled, you are so much safer in a commercial airliner than you are, say, driving yourself somewhere. Yeah, as a rational human being, you think, well, I can educate myself out of this fear. With the airplane thing, I think back to the escape pod episode, where we talked about why there are no escape pods on airplanes. And it doesn't make sense, right? It doesn't make sense. And also, if you are going to encounter a dangerous situation on an airplane, it's far more often going to be at takeoff or landing. And yet I'll run through these facts, I'll run through the material that we've researched, and it still doesn't quite penetrate the deeper-set anxiety that sometimes kicks in while flying.

Yeah. And this disconnect between what we fear and the actual threat is kind of troubling, but it actually gets worse, because you can point out some scenarios where fearing the wrong thing has direct consequences in reality. So I was reading a short article on Edge.org by the psychology professor David G. Myers about how humans just do not accurately gauge the real threats they face.
For example, people tend to be very afraid of terrorist attacks, and of course that makes sense; terrorist attacks are straightforwardly terrifying. They are a horrible thing, but statistically they are very unlikely to harm you. Just as one example, Myers points to how much more likely you are to be harmed by riding in a motor vehicle, by being in a car accident, than by being the victim of a terrorist attack. And yet terrorism is designed exactly to make us afraid of threats in an outsized way. That's sort of the purpose of it, right? It's to grip your mind with a horrifying image that will force you to behave irrationally in response.

Yeah, it's kind of a forced recategorization of a safe, or reasonably safe, place. And of course that can be psychologically damaging. That's again the whole point of terrorism.

Yeah, exactly. Terrorism plays on our psychology, and Myers gives an example of how this can work exactly against our interests. After the September 11 attacks in America, a lot of people were very worried about terrorist attacks on airplanes, and as a result, fewer people were flying. But Myers did the math on the effects of this, given the statistics about the relative safety of these travel methods, and he found that if people in general flew twenty percent less, and didn't just stop traveling, but instead made all those same trips and covered the same amount of ground by driving, then their risk of fatal injury actually went up, because scheduled airline flights, as we know, are much safer per mile than surface travel by car. So if Americans flew twenty percent less in 2001 and made the same trips by car, we could expect about 800 more people to die in auto fatalities that year.
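To make that arithmetic concrete, here is a rough back-of-the-envelope sketch in Python of the kind of calculation Myers describes. The fatality rates and the miles-shifted figure below are stand-in assumptions chosen for illustration, not his exact inputs; with numbers in this ballpark, the answer comes out in the same several-hundred-deaths range.

```python
# Sketch of a Myers-style risk-shift calculation. The rates are
# assumed stand-ins: US driving is commonly cited at roughly 1.5
# deaths per 100 million vehicle-miles, and scheduled airlines at
# far below 0.05 deaths per 100 million passenger-miles.
DRIVING_DEATHS_PER_MILE = 1.5e-8    # assumed
FLYING_DEATHS_PER_MILE = 0.05e-8    # assumed

def extra_road_deaths(miles_shifted: float) -> float:
    """Expected additional deaths if miles_shifted passenger-miles
    move from scheduled airlines onto the roads."""
    return miles_shifted * (DRIVING_DEATHS_PER_MILE - FLYING_DEATHS_PER_MILE)

# Hypothetical: a 20 percent drop in flying pushes 50 billion
# passenger-miles per year into cars.
print(f"{extra_road_deaths(50e9):,.0f} expected extra road deaths")
```

With these assumed inputs, it prints about 725 extra deaths, the same order of magnitude as the roughly 800 Myers estimated.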
And Myers reports that later a German psychologist named Gerd Gigerenzer checked this prediction against travel data for the year following the attacks, and Gigerenzer determined that an estimated 1,500 Americans died in traffic accidents while trying to avoid the dangers of air travel. So Myers writes, quote, long after 9/11, the terrorists were still killing us. And in a way this is true: they were able to present people with horrifying images that made them behave irrationally, made them worry about the wrong things, and actually led to more dangerous decisions that hurt more people. And of course this is limited merely to people's choice of how to travel. There are obviously other ways you can say terrorism leads to negative consequences that actually harm the people who are seeking security: political decisions and things like that. Yeah, basically affecting, at various levels, how you live your life. Yeah, trying to get people to operate on the basis of terror management rather than rational decision making. And people don't make good decisions when they're terrified.

Part of the reason is a principle exploited by terrorism, the cognitive bias known as the availability heuristic. We've talked about this on the show before, but basically what it means is that items and events which you have easily accessible in your memory are given undue weight in your considerations. So when we try to think about what's dangerous, what are the things we should worry about and protect ourselves against, we actually end up not thinking about what is statistically the most relevant threat; we end up thinking about what's the most scary. And these are two very different things. Scary images stick in the mind, and real threats very often go unnoticed.
Myers writes, quote: Thus we remember and fear disasters, tornadoes, air crashes, attacks, that kill people dramatically, in bunches, while fearing too little the threats that claim lives one by one. We hardly notice the half million children dying quietly each year from rotavirus; the equivalent, Bill Gates once observed, of four 747s full of children every day. And we discount the future and its future weapon of mass destruction, climate change.

And so Myers goes on to quote the American security and privacy expert Bruce Schneier, who says, quote: If it's in the news, don't worry about it. The very definition of news is something that hardly ever happens. Now, of course, that's not always the case, because you could see news reports about things that are real things to be concerned about. But I think what he's getting at is that if you detect you're in a situation of "if it bleeds, it leads," you should do your best not to let the scary spectacle of what you're seeing become overrepresented as a threat in your brain.

And of course it's also the case with false news, which we've talked about recently, in which case this is something that never happened at all, and it can end up affecting the way we live our lives or govern ourselves. I mean, think of such moral panics as, say, the Satanic Panic, or, to a lesser degree, the poisoned candy myth, the idea that Halloween candy is going to be tampered with and then handed out to children. These are things that did not happen but became pervasive ideas.
Especially in the case of the Satanic Panic: this was the fiction that there is, or was, an organized effort among secret occultists to ritually abuse children, a fantasy that resulted in manufactured trauma, ruined lives, and a legacy of superstitious persecution and violence in parts of the world, including parts of Africa, where you can see the echo effect of the Satanic Panic in Western nations, predominantly the United States in the 1980s.

Yeah, but we should remember, of course, while we talk about the myth, to talk about the causal reality. I think what a lot of people now understand was going on, when people were coming up with stories of Satanic ritual abuse in the eighties, was that children were being led in interviews, prompted by police and therapists and people who had these preexisting ideas in their heads, and the children were just sort of going along with what they perceived the adults to want them to say.

Right. And of course it wasn't just the children, though the children were a major part of it. You also had adults with these supposed memories that they were recovering through therapy, memories that supposedly revealed past Satanic abuse. But yeah, it all amounted to a manufactured fear that so many people bought into, and the thing that everyone was afraid of did not exist and has never existed. It still floors me. There's an older episode of Stuff to Blow Your Mind about it; we ought to link to that. It's a fascinating psychological event in history, and it makes you realize yet again that it's the vividness of the imagery, and what the threat suggests as an idea, that captures people's fear, not so much the present reality of the threat. Yeah, and it drives home just how easy it is, psychologically, to have an unbalanced fear of flying or dogs or terrorism or what have you.
And so for today, I wanted to take this principle, that we don't exactly fear the right things, and try to redirect it into one particular area, which is the kind of thing we're usually worried about when it comes to AI and autonomous weaponry and the dangers of technological weapons. I think we are worried about the wrong stuff. Or, more specifically, we're worried about stuff that we should be worried about, but we're worried almost exclusively about the smaller threat rather than the larger threat: the smaller, more dramatic one.

And I think this is something that came up when you were listing the examples earlier: a threat that you could, given the chance, physically run from. Fearing climate change, for instance: how do you run from that? How do you hide behind a bush? But a tornado, on the other hand, you could conceivably run from, into a bunker, right? You can attach a lot of anxiety to it, but you could conceivably see it and react in real time to its threat. But if you're in a place where tornadoes don't happen very often, say, and you are investing a lot of energy and resources into preparing for a tornado threat, instead of investing that energy and those resources into something that will definitely affect you in the future, like climate change, which will affect most people wherever you are in one way or another, especially with all of its secondary downstream effects; if you're investing exclusively in the lesser threat, then you are doing yourself and your future descendants a disservice. Absolutely.

All right, well, let's take a quick break, and when we come back, we will talk about different types of technological weapon threats.

Thank you, thank you. All right, we're back. So, Joe, I know that when we talk about technological weapon threats, what we're really talking about here is, of course, ED-209.
We're talking about the Terminator, right? We're talking about Chopping Mall, the killer robots of the shopping mall that may just go out of control and start hunting our teenagers down in the malls of the future. Chopping Mall is a fantastic piece of eighties trash cinema. But no, I mean, that is the thing that captures our mind, because you can run from it. Yeah, there's a robot with a gun on it; that is the standard image. Okay, what is the AI autonomous weapon threat of the future? It is a Terminator. It is those killer robots from The Matrix. It is an embodied robot that is coming to, you know, point a gun at you and make you do something, or hunt you down for robot sport or something like that.

And I want to be clear that this is absolutely a real thing worth discussing. Not so much the Terminators, maybe, but the idea of conventional autonomous weapons; I'm not saying that is not worth discussing, because it is. It's been the focus of a lot of international conversation about the ethics of warfare. We're actively trying to figure out how to regulate this, and there are, for example, United Nations conventions discussing autonomous conventional weapons and what we should do about them.

Yeah, the ideas of ED-209, Terminators, what have you, even Chopping Mall: sci-fi has always spoken to our anxieties and our fears, and our hopes, about technology and where it's going to get us, and it has given us vivid imagery to feed our availability heuristic for these types of threats. Yeah, and of course it gives us something that we can run from, something we can battle; you can fight back against the machines in Chopping Mall. But to get serious for a minute, we are more and more seeing actual semi-autonomous weapon technology being introduced into military arsenals around the world. Absolutely.
I mean, you'd have to be under a rock to have avoided any coverage of the US military's use of drone strikes in recent decades. And that's just the tip of the iceberg, too. I was reading a little bit about autonomous weapons in Max Tegmark's recent book Life 3.0, and he points out that the US military has the Phalanx system for its Aegis-class cruisers, which automatically detects, tracks, and attacks threats such as anti-ship missiles and aircraft. You may have seen images of these big, dome-looking devices with the weapon on the front. Oh yeah. Or think about other types of automatic missile defense, like the Israeli Iron Dome system.

Yeah. But in particular, this one system has been in service since 1980, and it's still in service today. And it actually figured in the 1988 downing of Iran Air Flight 655, a civilian Iranian passenger jet; all 290 people on board were killed, and it caused international outrage. There was, however, a human in the loop on this system who made the error. And that's one of the key distinctions between some of these contemporary and past autonomous weapons systems and the possible future of autonomous weapons systems: is there a human being who is at some point weighing in on the decision, or having to make a final decision?
And in pretty much all conventional autonomous weapons systems I can think of today, they're not fully autonomous; they're semi-autonomous. There's still a human command structure, a human override. They're still basically being controlled by humans, but they have some kind of automatic assistance function. Right, there will be a drone pilot somewhere, or in some of these models, I believe, you'll have a drone pilot who is dealing with multiple drones, but there's still a human in the loop on that particular weapons system.

And just a reminder that Russia is believed to be testing its first autonomous nuclear torpedo. The idea with this is that it would be guided largely by AI to strike the United States even if it lost all communications with Moscow. A frightening weapon concept, to be sure, and one that was originally proposed back in the 1960s by Andrei Sakharov. Some analysts call this a doomsday weapon, and with good reason. It's a terrifying concept. It certainly is. I mean, one at least hopes that there's still a human command structure there. But these types of weapons, all these things we've been talking about, are part of this debate that the international community is having about the ethics of autonomous weapons systems, especially if they're not just weapons systems used to, say, shoot down incoming enemy missiles.
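Since the human-in-the-loop distinction is doing a lot of work in this discussion, here is a deliberately abstract sketch of the difference in Python. Every name, class, and threshold here is hypothetical and has no connection to any real weapons system; the point is only to show where the human sits in each decision path.

```python
from dataclasses import dataclass

@dataclass
class Track:
    """A hypothetical detected object; purely illustrative."""
    track_id: int
    threat_score: float  # output of some automated classifier

def human_confirms(track: Track) -> bool:
    """Stand-in for the human in the loop: a person reviews the
    automated recommendation and makes the final call."""
    answer = input(f"Engage track {track.track_id} "
                   f"(score {track.threat_score:.2f})? [y/N] ")
    return answer.strip().lower() == "y"

def semi_autonomous_decision(track: Track) -> bool:
    # Automation detects, tracks, and recommends, but a human
    # makes the final engagement decision.
    return track.threat_score > 0.9 and human_confirms(track)

def fully_autonomous_decision(track: Track) -> bool:
    # No human anywhere in the decision path: this design is what
    # the treaty debates are chiefly concerned with.
    return track.threat_score > 0.9
```

The two functions differ by a single call, which is part of why the line between semi-autonomous and fully autonomous systems is so easy to cross in practice.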
I mean, you sort of see the difference there, right? Like, you can imagine a missile defense kind of thing is different from something that will be aiming at people, or aiming at places where people could be, even though even a missile defense system could, of course, as we've seen, go awry. Right. I mean, it becomes increasingly complicated when you start thinking about military engagements, say, in a city, if you're having to deal with civilians, or combatants who are not in uniform, that sort of thing, or any variety of morally complex standoff situations. How do you program for that?

Yeah, and so that is a very important question. But I wanted to focus today on a potentially even more dangerous class of weapons that can actually hurt many more people, a class of weapons that in fact already exists in some form today. And we have a pretty clear vision of how much more advanced and how much more dangerous they will continue to get in the very near future, even without access to heavy weapons or manufacturing capabilities. So I got the idea to have this discussion when I read an article in Undark magazine. Robert, do you ever read Undark? Yeah, though I don't think I'd read this one before. When you mentioned the title, my mind immediately went to various horror fiction publications; it's got a horror kind of name. I think it's intentionally named after a type of radium paint that was used back in the day, early twentieth century, I think, before people realized what the risks of it were. And that's sort of what the magazine explores. It's a lot of good long-form science writing, exploring the good and the bad that science has to offer. But anyway, this article was published in July of this year by a science journalist named Jeremy Hsu, and it's called "Forget Killer Robots: Autonomous Weapons Are Already Online."
So I was reading this, and I started thinking it would be a good topic for us to talk about today. Hsu starts by discussing the ways the problem of autonomous weaponry has really captured people's attention worldwide in the conventional weapons sense, and as we've been talking about, there's a good reason for this. If we're going to be increasingly using robots and AI programs capable of delivering lethal force on the battleground, or, I guess to be less euphemistic, if we're going to have machines that can kill people without a human taking responsibility and directly making the machine do it, we really need to be having serious conversations about the ethics of this technology. Should there be international treaties governing what kinds of autonomous weapons we allow each other to make, and so forth? I mean, we've got international treaties about nuclear weapons. Shouldn't there be international treaty agreements on autonomous weapons?

Oh yeah, I mean, it makes sense. And to your point, we have treaties about nuclear weapons, we have treaties about biological and chemical agents as well, and we have treaties about just the way one wages traditional warfare. There are various non-nuclear, non-biological weapons that are also banned under international treaty. So it makes sense that we would also have treaties dealing with autonomous systems.

Yeah, it totally makes sense, and these conversations, as we've been saying, are ongoing. Hsu writes about how the issue was discussed at a United Nations convention in Geneva in April of this year, and this was part of an ongoing series of conversations within the UN's Convention on Certain Conventional Weapons, which has a kind of clunky name, but that's a useful kind of forum to be having right now.
But while we're having this debate about the role of AI and autonomous programming in conventional deadly weapons, we really tend to miss the fact that autonomous weapons are already being used widely in warfare around the world. And we're talking about cyber warfare: not autonomous guns and bombs and missiles, but little pieces of computer code that autonomously attack systems around the world, or in targeted places. Plus, with software, there's always the worry of replication. Yeah. No matter how worried you are about AI-controlled nuclear torpedoes, you usually don't have to fret about them mating with each other and producing torpedo offspring. Exactly. It introduces a totally different dynamic of threat, and certainly a different dynamic of proliferation, and that's something that makes it a whole different ball game when we're considering how much of a threat it is and what kind of limits we should put on it in international agreements.

So in much the same way that we debate how to make air travel safer while people are dying by the thousands on the road in traffic accidents, we're debating autonomous conventional weapons, which does matter, while autonomous cyber weapons are already here, already fighting, and potentially capable of doing far more harm and killing more people than a robot with a gun could. But I think the availability heuristic is at play again here, because the damage done by autonomous digital weapons like these, computer viruses and worms and malicious bits of code, is less visceral. It's harder to picture, and it definitely has fewer movies about it, and thus it doesn't get the boost from the availability heuristic that you get with pictures of Terminators. That's right. When you think of an infrastructure-wrecking malicious program, it's really hard to think of any horror films that line up with that concept.
The best I can come up with is Stephen King's Maximum Overdrive, which is of course King's own film. So that is the most realistic picture of the future technology threat. Yeah, it's strange how this works, right? We were talking about this a little before we started rolling. It's been said that any sufficiently advanced technology is indistinguishable from magic, right? And you can point to various bits of mythology, the myths of, say, humans creating other rational beings. In the past this was pure fantasy; now, in the twenty-first century, it is far more realistic when we look at models of AI and genetic engineering, et cetera. Likewise, Maximum Overdrive was ridiculous in the 1980s, but as we enter into this age that is increasingly defined by research into autonomous vehicles and the Internet of Things, Maximum Overdrive is suddenly not so bonkers.

It is crazy. In the 1980s, people might have said, okay, Maximum Overdrive is a silly fantasy about the future technology threat, and The Terminator is a more realistic movie about the future technology threat. And looking at the lay of the land today, I'd say, obviously Maximum Overdrive is cartoony, but of the two general pictures they outline, Maximum Overdrive might be the more realistic threat. Any sufficiently advanced technology is indistinguishable from Maximum Overdrive.

But if you're lost right now: okay, why do we keep saying Maximum Overdrive? It is, by the way, basically just a ridiculous eighties Stephen King movie where trucks and appliances, every electronic thing, and sometimes just cars and stuff, start trying to kill people. Mostly it's the trucks; these really badass trucks come to life and start killing things, but also, I think, toaster ovens and lawn mowers and whatnot.

So what autonomous cyber weapons have already been deployed? We said this is already a thing that exists.
So in his article, Hsu quotes Scott Borg, director and chief economist of the US Cyber Consequences Unit, and Borg says, quote: malicious computer programs that could be described as intelligent autonomous agents are what steal people's data, build botnets, lock people out of their systems until they pay a ransom, and do most of the other work of cybercriminals. So he's talking about the fact that generally, when there's cybercrime going on, it's not a cybercriminal sitting at a computer manually doing stuff to you. They've created a program that does it autonomously. They just let it go, it does its work, and then they reap the benefits.

But there have been plenty of examples of this type of warfare in actual international relations. There's the Stuxnet worm, a malicious computer worm that attacked computers controlling nuclear centrifuges in Iran, and it's believed to have slowed down the development of the Iranian nuclear program. I think this happened around 2010, and no one has admitted responsibility, but it appears this was a cyber weapon created by the United States and Israel to try to prevent Iran from acquiring nuclear weapons.

Another example would be the WannaCry ransomware worm, which shuts down computers and demands payment of money before allowing the computers to be made functional again. And WannaCry has caused real damage. It attacked computer systems in hospitals of the UK's National Health Service, among a bunch of other important infrastructure machines. Plus, I think you could reasonably argue that there was psychological damage inflicted by that attack as well. Maybe not as much as a full-blown cyber terrorism event, but it certainly captured headlines, and it was the kind of threat that's central to the appeal of a terroristic act: it makes everyone feel like they could be a potential victim.
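Borg's point that the attacker just lets the program go is, at bottom, a point about replication dynamics. Here is a toy, purely illustrative simulation of self-replicating code saturating a network of hosts; every parameter is invented, and the model captures only the growth curve, not the mechanics of any real worm.

```python
import random

# Toy epidemic-style model of self-replicating code on a network.
N_HOSTS = 10_000        # machines on the hypothetical network
PROBES_PER_STEP = 8     # addresses each infected host probes per step
P_VULNERABLE = 0.2      # assumed chance a probed host is exploitable

random.seed(42)
infected = {0}          # patient zero

for step in range(1, 11):
    newly_infected = set()
    for host in infected:
        for _ in range(PROBES_PER_STEP):
            target = random.randrange(N_HOSTS)
            if target not in infected and random.random() < P_VULNERABLE:
                newly_infected.add(target)
    infected |= newly_infected
    print(f"step {step:2d}: {len(infected):6,d} hosts infected")
```

Run it and the infected count roughly doubles or triples each step until the network saturates, which is why a released worm quickly stops being something its author meaningfully controls.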
Yeah, totally. And in fact, for a lot of people, you really could be a potential victim. I mean, this is a thing that's worth being concerned about, and worth thinking about what defensive measures we could put in place. Again, no one has plausibly admitted responsibility for the WannaCry attack, but a lot of analysts think the signs point to it being a project of the North Korean government.

Now, these examples demonstrate that, like any weapon, an autonomous cyber weapon can be used for various types of conflicts, right? Like, in the case of the Stuxnet worm, whatever you think about the US and Israel, I think most people would probably agree that they're glad somebody figured out a nonviolent way to stop an authoritarian regime from making nuclear weapons. But then again, the same technology could be used by the same actors, or by others, to attack things that would get less sympathy from people around the world: to attack critical infrastructure anywhere. Power, water, security, hospitals, telecommunications, media, banks and financial systems, transportation. All of these things are increasingly connected now.

Yeah, and we've only seen partial use of these cyber weapons, certainly nothing of the magnitude of a full-scale cyber war, something that some futurists and cybersecurity analysts have written about and discussed: what would this look like? Though many of them stress that even these limited uses can build legitimacy for the development of such weapons, as well as for the development of national cyber response teams. And it's inevitable, too, that such attacks will lean increasingly on machine learning and the use of AI. So what we've seen so far is sadly just the tip of the iceberg, unfortunately. Yeah, and I want to talk more about that, especially the role of machine learning and AI, later on.
But one of the things I really want to stress is that these days, in the modern world, attacks on infrastructure are not necessarily just inconveniences. It's not like thinking, oh well, I just lost internet for a day, and it was actually kind of nice because I went outside and took a walk. Yeah. I mean, you might have lost power in your neighborhood before, and it was an inconvenience or something, but you were fine. But at a large enough scale, attacks like these on infrastructure, attacks that are totally plausible in the world we live in today, are pretty much guaranteed to result in people facing serious material loss, injury, and death.

As an example for comparison, you could look at the tragedy that happened in Puerto Rico following the landfall of Hurricane Maria in September 2017. When Puerto Rico was hit by the hurricane, it effectively knocked out Puerto Rico's electrical power grid and temporarily put a stop to lots of services in its aftermath, including electricity, sewage treatment, some types of health and medical care, clean tap water, and so forth. And in a civilization that's built on the assumption of continued access to services like power and clean water, sudden interruption of those services is devastating and genuinely lethal. And while we can't be sure of the exact number of people who died as a result of the devastation caused by the hurricane, there have been some estimates, and it seems like a lot of people survived the initial storm but died in the weeks and months afterward from complications in the aftermath, a lot of these possibly due to interruptions in services, healthcare, disease, and so forth.
A Harvard study published in the New England 574 00:32:30,840 --> 00:32:34,200 Speaker 1: Journal of Medicine used survey data from households in Puerto 575 00:32:34,240 --> 00:32:37,360 Speaker 1: Rico to try to estimate the human impact, 576 00:32:37,440 --> 00:32:40,040 Speaker 1: and they found that among respondents there was a mortality 577 00:32:40,160 --> 00:32:44,000 Speaker 1: rate of about fourteen point three deaths per thousand persons 578 00:32:44,200 --> 00:32:48,240 Speaker 1: from September twentieth through the end of 2017, and this 579 00:32:48,320 --> 00:32:52,800 Speaker 1: yielded somewhere between seven hundred ninety-three and eight thousand, 580 00:32:53,040 --> 00:32:57,120 Speaker 1: four hundred ninety-eight excess deaths above the normal rate for 581 00:32:57,160 --> 00:32:59,560 Speaker 1: this period. The midpoint of those numbers would be 582 00:32:59,600 --> 00:33:03,040 Speaker 1: around four thousand six hundred people, and that 583 00:33:03,120 --> 00:33:07,120 Speaker 1: range is the confidence interval. So the authors think the death 584 00:33:07,160 --> 00:33:09,920 Speaker 1: toll could actually be higher, because the survey data is 585 00:33:09,920 --> 00:33:13,720 Speaker 1: contaminated by survivor bias. Right, people who died were not 586 00:33:13,840 --> 00:33:16,640 Speaker 1: able to respond to the survey. Now, we don't know 587 00:33:16,760 --> 00:33:18,840 Speaker 1: for sure if that estimate is accurate. There could be 588 00:33:18,880 --> 00:33:21,320 Speaker 1: flaws in the method. But if it's accurate, that's equivalent 589 00:33:21,600 --> 00:33:24,600 Speaker 1: to a sixty-two percent increase in the mortality rate 590 00:33:24,640 --> 00:33:27,880 Speaker 1: as compared to the same period in 2016. And we 591 00:33:27,920 --> 00:33:30,040 Speaker 1: should acknowledge there's been a lot of anger about the 592 00:33:30,040 --> 00:33:34,280 Speaker 1: way the US government handled the Hurricane Maria aftermath, with 593 00:33:34,400 --> 00:33:37,160 Speaker 1: charges that it basically didn't do enough to get essential 594 00:33:37,200 --> 00:33:41,240 Speaker 1: infrastructure and services back online fast enough, and that people 595 00:33:41,280 --> 00:33:44,240 Speaker 1: actually suffered and died as a result of not having 596 00:33:44,240 --> 00:33:47,240 Speaker 1: this infrastructure online, arguably in a manner that would have 597 00:33:47,240 --> 00:33:49,840 Speaker 1: been far different had this been, uh, you know, the 598 00:33:50,240 --> 00:33:54,480 Speaker 1: aftermath of a terrorist attack or something that was 599 00:33:54,880 --> 00:33:58,680 Speaker 1: a little more centralized, personified, or tied in 600 00:33:58,720 --> 00:34:01,880 Speaker 1: with these more pervasive fears. Yeah, there does appear 601 00:34:01,920 --> 00:34:05,280 Speaker 1: to be a weird psychology in the way societies respond 602 00:34:05,360 --> 00:34:08,200 Speaker 1: to threats, and all different kinds of biases can 603 00:34:08,239 --> 00:34:12,680 Speaker 1: provide differential motivation in how much effort we put into 604 00:34:12,719 --> 00:34:16,080 Speaker 1: fixing the problem afterwards and helping people. So, yeah, it's 605 00:34:16,080 --> 00:34:19,120 Speaker 1: been a horrible tragedy there. But we should 606 00:34:19,120 --> 00:34:22,360 Speaker 1: think about that: this kind of tragedy is what happens 607 00:34:22,400 --> 00:34:25,839 Speaker 1: due to a random attack by the weather.
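(To make the excess-mortality arithmetic above concrete, here's a rough back-of-the-envelope sketch in Python. This is not the study's exact method: the 2016 baseline rate is simply implied by the sixty-two percent increase, and the population figure and window length are round-number assumptions for illustration.)

    # Back-of-the-envelope version of the excess-death arithmetic above.
    # Assumptions (not the study's method): a 2016 baseline implied by the
    # 62% increase, a rough population figure, and a ~102-day window.
    rate_2017 = 14.3 / 1000        # annualized post-Maria mortality rate
    rate_2016 = rate_2017 / 1.62   # implied baseline, roughly 8.8 per 1,000
    population = 3_300_000         # rough Puerto Rico population (assumption)
    window_days = 102              # September 20 through December 31

    excess = (rate_2017 - rate_2016) * population * (window_days / 365)
    print(f"rough excess deaths: {excess:,.0f}")
    # prints roughly 5,000 -- the same ballpark as the study's adjusted
    # midpoint of about 4,600, inside its 793-to-8,498 confidence interval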
One can 608 00:34:25,880 --> 00:34:29,160 Speaker 1: only wonder what it would look like if, say, infrastructure 609 00:34:29,200 --> 00:34:31,640 Speaker 1: was attacked not randomly by the weather, but by a 610 00:34:31,719 --> 00:34:35,719 Speaker 1: malicious party intentionally trying to do as much damage as 611 00:34:35,800 --> 00:34:39,040 Speaker 1: possible with digital weapons. Yeah, I mean, obviously a number 612 00:34:39,080 --> 00:34:44,160 Speaker 1: of possibilities instantly come to mind. Targeting infrastructure during the 613 00:34:44,239 --> 00:34:48,600 Speaker 1: very depths of winter, for example. Right, or during key, 614 00:34:48,760 --> 00:34:52,839 Speaker 1: politically sensitive times, like during an election. Yeah. Um, yeah, 615 00:34:52,880 --> 00:34:55,919 Speaker 1: I know I've read that there's a belief that there 616 00:34:56,120 --> 00:35:00,919 Speaker 1: were Russian autonomous digital agent attacks against, say, 617 00:35:01,120 --> 00:35:06,040 Speaker 1: Ukrainian power infrastructure during times of political upheaval and civil unrest, 618 00:35:06,080 --> 00:35:08,800 Speaker 1: which, you know, the timing 619 00:35:08,840 --> 00:35:11,560 Speaker 1: of those attacks only makes things worse. And a scary part is that we're 620 00:35:11,600 --> 00:35:15,200 Speaker 1: constantly connecting more and more devices to the Internet. Yeah, 621 00:35:15,239 --> 00:35:20,040 Speaker 1: like our refrigerators, cars, home appliances, medical devices, 622 00:35:20,080 --> 00:35:24,120 Speaker 1: and even medical implants. Every year the connectedness does not shrink, 623 00:35:24,160 --> 00:35:27,680 Speaker 1: it grows, and thus every year the world's connectedness and 624 00:35:27,760 --> 00:35:32,240 Speaker 1: vulnerability to cyber attacks, you could argue, becomes even greater, 625 00:35:32,719 --> 00:35:36,040 Speaker 1: almost as if we really want Maximum Overdrive to happen, 626 00:35:36,120 --> 00:35:38,439 Speaker 1: as if we were saying, you know, you know that movie, 627 00:35:38,480 --> 00:35:40,120 Speaker 1: that movie that came out in the eighties that was 628 00:35:40,160 --> 00:35:42,840 Speaker 1: so ridiculous? Let's do that. Let's just go ahead 629 00:35:42,840 --> 00:35:45,680 Speaker 1: and put our refrigerator and our toaster on the Internet 630 00:35:45,680 --> 00:35:47,440 Speaker 1: and have them talk to each other. I mean, it's 631 00:35:47,440 --> 00:35:50,160 Speaker 1: hard to predict exactly what attacks would look like, though 632 00:35:50,160 --> 00:35:51,520 Speaker 1: there has been a lot of work on this, and 633 00:35:51,560 --> 00:35:53,600 Speaker 1: maybe if we revisit this topic in the future, we 634 00:35:53,600 --> 00:35:58,480 Speaker 1: could more explicitly explore scenarios that have been talked 635 00:35:58,480 --> 00:36:01,759 Speaker 1: about by cybersecurity experts. Yeah, in particular the 636 00:36:02,280 --> 00:36:05,080 Speaker 1: concept of full-blown cyber warfare: what that would look 637 00:36:05,120 --> 00:36:08,120 Speaker 1: like if you had nation states actively and at least 638 00:36:08,160 --> 00:36:12,600 Speaker 1: semi-openly engaging in attacks and counterattacks against each other, 639 00:36:12,800 --> 00:36:16,680 Speaker 1: with the possibility then of actual military attacks on top 640 00:36:16,719 --> 00:36:19,560 Speaker 1: of that. Yeah, totally.
But I mean, one thing I 641 00:36:19,640 --> 00:36:21,680 Speaker 1: really want to drive home, if you take anything away 642 00:36:21,719 --> 00:36:24,719 Speaker 1: from this episode, I want it to be that attacks by 643 00:36:24,760 --> 00:36:29,320 Speaker 1: autonomous digital agents, just computer viruses basically, in the common understanding, 644 00:36:29,360 --> 00:36:31,520 Speaker 1: worms and stuff like that, things that don't have a 645 00:36:31,560 --> 00:36:34,120 Speaker 1: gun or a missile or an explosive attached to them, 646 00:36:34,480 --> 00:36:37,840 Speaker 1: can be just as dangerous, just as deadly, and probably 647 00:36:37,960 --> 00:36:42,280 Speaker 1: more so, than conventional weapons can be. Absolutely. And remember, 648 00:36:42,320 --> 00:36:44,319 Speaker 1: another thing is that most of these attacks can, at 649 00:36:44,360 --> 00:36:49,040 Speaker 1: least in theory, function without ongoing human input or direction. Right, 650 00:36:49,239 --> 00:36:51,680 Speaker 1: it's the set-it-and-forget-it model of warfare. 651 00:36:52,120 --> 00:36:55,840 Speaker 1: Like, you make an autonomous digital agent, a computer virus, 652 00:36:55,880 --> 00:36:59,080 Speaker 1: a worm, whatever, to attack infrastructure out there, and you 653 00:36:59,120 --> 00:37:01,040 Speaker 1: might be able to design it so that you don't 654 00:37:01,080 --> 00:37:03,399 Speaker 1: need to go back and do anything to it later. 655 00:37:03,560 --> 00:37:07,279 Speaker 1: You don't need to maintain it. Undirected, it functions 656 00:37:07,360 --> 00:37:10,319 Speaker 1: on its own. That's what autonomous means. Yeah, it's 657 00:37:10,360 --> 00:37:13,719 Speaker 1: basically the more perfect version of the nuclear torpedo that 658 00:37:13,760 --> 00:37:15,959 Speaker 1: I talked about earlier. Like, the idea there, of course, 659 00:37:16,000 --> 00:37:18,799 Speaker 1: is that it's out there, it's not communicating back, and 660 00:37:18,840 --> 00:37:22,759 Speaker 1: then it strikes its target, detonates, 661 00:37:22,800 --> 00:37:26,480 Speaker 1: and causes, you know, lasting radioactive damage to a particular 662 00:37:26,520 --> 00:37:30,160 Speaker 1: area, and loss of life, 663 00:37:30,280 --> 00:37:33,879 Speaker 1: and damage to the infrastructure, etcetera. But we're talking about 664 00:37:33,880 --> 00:37:37,200 Speaker 1: digital agents. A digital agent would be able to achieve many of 665 00:37:37,239 --> 00:37:40,719 Speaker 1: those same goals, but without the same risk of 666 00:37:40,760 --> 00:37:44,240 Speaker 1: components aging and the torpedo 667 00:37:44,280 --> 00:37:47,880 Speaker 1: itself eventually dying. Right. And the one thing about what 668 00:37:47,960 --> 00:37:50,040 Speaker 1: you just said does make me want to emphasize: I'm 669 00:37:50,040 --> 00:37:54,600 Speaker 1: not suggesting that cyber warfare is, say, worse than a nuclear 670 00:37:54,640 --> 00:37:57,360 Speaker 1: attack would be. I don't think that's true. I mean, obviously, 671 00:37:57,680 --> 00:38:00,760 Speaker 1: a full-scale attack by conventional or nuclear weapons 672 00:38:01,080 --> 00:38:04,000 Speaker 1: would be a worse outcome than a cyber attack.
But 673 00:38:04,040 --> 00:38:07,000 Speaker 1: I think a cyber attack is a threat 674 00:38:07,040 --> 00:38:09,400 Speaker 1: that we need to be even more concerned about, because 675 00:38:10,040 --> 00:38:13,799 Speaker 1: it can be very, very realistically destructive, and it's very 676 00:38:13,920 --> 00:38:17,359 Speaker 1: likely to happen, because it's already happening. I mean, it's 677 00:38:17,360 --> 00:38:21,319 Speaker 1: something that you can very easily see being deployed, much 678 00:38:21,320 --> 00:38:24,960 Speaker 1: more so than you can imagine, say, nuclear war between 679 00:38:25,000 --> 00:38:29,120 Speaker 1: currently nuclear-armed countries. Right. Yeah. When we try to 680 00:38:29,160 --> 00:38:33,360 Speaker 1: imagine the anonymous use of a nuclear weapon, or even a 681 00:38:33,480 --> 00:38:38,040 Speaker 1: powerful enough conventional weapon, um, it's far less 682 00:38:38,120 --> 00:38:42,040 Speaker 1: likely compared to the anonymous use of a particularly volatile 683 00:38:42,280 --> 00:38:44,960 Speaker 1: cyber weapon, because we haven't really seen the former, and 684 00:38:45,000 --> 00:38:48,719 Speaker 1: we've definitely seen the latter. Exactly. So a lot 685 00:38:48,760 --> 00:38:50,640 Speaker 1: of this article that I mentioned earlier that got me 686 00:38:50,680 --> 00:38:54,600 Speaker 1: thinking about this, by Jeremy Hsu, is simply highlighting 687 00:38:54,600 --> 00:38:58,440 Speaker 1: the fact that governments and international organizations are not having 688 00:38:58,560 --> 00:39:02,120 Speaker 1: enough conversation about guidelines for how to control the threat 689 00:39:02,480 --> 00:39:06,560 Speaker 1: of autonomous cyber weapons. We are having some international conversations 690 00:39:06,600 --> 00:39:09,440 Speaker 1: about what to do about autonomous conventional weapons, the robot 691 00:39:09,440 --> 00:39:12,680 Speaker 1: with a gun. We are not having enough conversations about 692 00:39:12,680 --> 00:39:16,399 Speaker 1: what to do about controlling autonomous cyber weapons. And these 693 00:39:16,440 --> 00:39:19,759 Speaker 1: autonomous cyber weapons are already here and being used. They're 694 00:39:19,800 --> 00:39:23,520 Speaker 1: easier to create and deploy, and potentially in many cases 695 00:39:23,560 --> 00:39:27,600 Speaker 1: more destructive, than autonomous conventional weapons. One of the experts 696 00:39:27,680 --> 00:39:30,279 Speaker 1: that Hsu quotes in his article is Kenneth Anderson, a 697 00:39:30,320 --> 00:39:35,360 Speaker 1: professor of law at American University who specializes in national security. 698 00:39:35,400 --> 00:39:45,440 Speaker 1: And Anderson asks, where is 701 00:39:45,480 --> 00:39:49,360 Speaker 1: the 'Ban Killer Apps' NGO advocacy campaign demanding a 702 00:39:49,400 --> 00:39:53,600 Speaker 1: sweeping, total ban on the use, possession, transfer, 703 00:39:53,680 --> 00:39:56,640 Speaker 1: or development of cyber weapons, all the features found in 704 00:39:56,680 --> 00:40:00,879 Speaker 1: today's Stop Killer Robots campaign? I think that's a good question.
Yeah, 705 00:40:01,040 --> 00:40:03,600 Speaker 1: I absolutely agree, and I mean, I hope that 706 00:40:03,719 --> 00:40:05,840 Speaker 1: someone is putting it together. I would very much, especially 707 00:40:05,840 --> 00:40:09,040 Speaker 1: after this episode, like to support such a campaign. Yeah. 708 00:40:09,200 --> 00:40:12,480 Speaker 1: I mean, we've got nuclear non-proliferation agreements and stuff 709 00:40:12,520 --> 00:40:15,520 Speaker 1: like that. It seems more than reasonable to be trying 710 00:40:15,520 --> 00:40:18,600 Speaker 1: to work on a similar framework for cyber weapons. And 711 00:40:18,960 --> 00:40:23,160 Speaker 1: nuclear non-proliferation treaties have to a large extent worked. 712 00:40:23,440 --> 00:40:26,680 Speaker 1: I mean, for starters, we have not had a nuclear war, 713 00:40:27,000 --> 00:40:30,399 Speaker 1: which some commentators have said is nothing short of a miracle. Um, 714 00:40:30,960 --> 00:40:35,400 Speaker 1: we've seen the nuclear stockpiles of the United 715 00:40:35,400 --> 00:40:39,720 Speaker 1: States and Russia deplete over the decades, and hopefully 716 00:40:39,800 --> 00:40:43,120 Speaker 1: that will remain the trend. Again, we've seen similar scenarios 717 00:40:43,120 --> 00:40:46,520 Speaker 1: with biological and chemical agents as well. So something could 718 00:40:46,560 --> 00:40:50,960 Speaker 1: be done here if we act and actually, 719 00:40:51,000 --> 00:40:54,080 Speaker 1: you know, push for regulations to be made. You know, Robert, 720 00:40:54,080 --> 00:40:56,520 Speaker 1: one of the things you mentioned earlier is about the 721 00:40:56,600 --> 00:41:01,160 Speaker 1: difficulty in controlling the proliferation of digital agents as compared 722 00:41:01,200 --> 00:41:05,960 Speaker 1: to conventional weapons, right? Like, digital agents, a computer worm 723 00:41:06,239 --> 00:41:09,520 Speaker 1: or a virus, something like that, can replicate in the 724 00:41:09,560 --> 00:41:13,200 Speaker 1: wild in some scenarios. So I would say that this 725 00:41:13,320 --> 00:41:17,080 Speaker 1: is also a case where, just like with nuclear weapons, 726 00:41:17,120 --> 00:41:19,759 Speaker 1: if two countries with nuclear weapons go 727 00:41:19,800 --> 00:41:22,120 Speaker 1: to war, it's not just a problem for the people 728 00:41:22,200 --> 00:41:25,000 Speaker 1: in those two countries, it's a problem for the entire world. 729 00:41:25,600 --> 00:41:29,240 Speaker 1: And I would say that software-based digital warfare agents 730 00:41:29,280 --> 00:41:31,960 Speaker 1: that operate on the Internet are similarly a problem for 731 00:41:32,000 --> 00:41:35,000 Speaker 1: the entire world, because you don't know potentially who they 732 00:41:35,000 --> 00:41:37,719 Speaker 1: could harm on the sidelines. Yeah, I mean, with 733 00:41:37,719 --> 00:41:40,799 Speaker 1: nuclear, biological, and chemical agents, obviously, we all share 734 00:41:40,800 --> 00:41:44,719 Speaker 1: an atmosphere, we all share a global environment, and when 735 00:41:44,760 --> 00:41:49,400 Speaker 1: we wage devastating war within that environment, we run the 736 00:41:49,480 --> 00:41:54,240 Speaker 1: risk of destabilizing everything and harming ourselves in the process.
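(To make that replication point concrete, here's a toy sketch in Python, purely illustrative and not a model of any real worm; the network size, the number of links per host, and the per-link spread probability are all invented numbers. The point is just how quickly a self-replicating agent crosses every boundary on a randomly connected network.)

    import random

    # Toy spread model: 1,000 hosts split across 10 hypothetical regions,
    # each host linked to a few random peers regardless of region.
    random.seed(42)
    N = 1000
    region = [i % 10 for i in range(N)]
    links = {i: random.sample(range(N), 5) for i in range(N)}

    infected = {0}  # one initial infection, in region 0
    for step in range(8):
        newly = {peer for host in infected for peer in links[host]
                 if random.random() < 0.5}  # assumed 50% chance per link
        infected |= newly
        hit = len({region[h] for h in infected})
        print(f"step {step}: {len(infected):4d} hosts infected, {hit}/10 regions")
    # within a few steps, most or all regions are typically affected --
    # an attack "between two countries" becomes everyone's problem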
737 00:41:54,560 --> 00:41:58,560 Speaker 1: And with digital technology, we have created another environment 738 00:41:58,760 --> 00:42:02,760 Speaker 1: that we have made ourselves dependent upon, 739 00:42:02,840 --> 00:42:05,480 Speaker 1: and we run the risk of doing the same thing, 740 00:42:05,600 --> 00:42:09,319 Speaker 1: poisoning this new ocean that we've created, especially when we 741 00:42:09,400 --> 00:42:13,560 Speaker 1: consider the possibility of these autonomous agents becoming more adaptive. 742 00:42:14,160 --> 00:42:16,520 Speaker 1: And I think we should explore that after we take 743 00:42:16,560 --> 00:42:20,080 Speaker 1: a break. Thank you. Thank you. All right, we're back. 744 00:42:20,160 --> 00:42:24,319 Speaker 1: We're talking about the future, the dangers, the risks, and 745 00:42:24,360 --> 00:42:27,399 Speaker 1: how we should really handle the risk 746 00:42:27,520 --> 00:42:32,320 Speaker 1: and handle anxiety over the prospect of cyber warfare 747 00:42:32,320 --> 00:42:34,839 Speaker 1: in the future. So here's something I think is really 748 00:42:34,840 --> 00:42:38,160 Speaker 1: worth considering, and it's the convergence of cyber warfare and 749 00:42:38,200 --> 00:42:42,640 Speaker 1: cyber weapons with machine learning and artificial intelligence. Because we've 750 00:42:42,680 --> 00:42:46,560 Speaker 1: already got autonomous cyber weapons, these malicious bits of 751 00:42:46,560 --> 00:42:50,120 Speaker 1: software out there that, you know, enact warfare on 752 00:42:50,239 --> 00:42:53,640 Speaker 1: the infrastructure of opposing forces in the world, and we've got 753 00:42:53,719 --> 00:42:57,080 Speaker 1: machine learning and AI, and there's really no reason these 754 00:42:57,080 --> 00:43:00,320 Speaker 1: capabilities could not be combined. So this is a future 755 00:43:00,360 --> 00:43:04,520 Speaker 1: that combines the devastating capabilities of cyber warfare with the 756 00:43:04,600 --> 00:43:09,399 Speaker 1: attack dynamics of something like biological or germ warfare. This 757 00:43:09,480 --> 00:43:13,239 Speaker 1: is a future that should worry us: autonomous cyber weapons 758 00:43:13,280 --> 00:43:17,040 Speaker 1: that can learn, adapt, and change on their own. And 759 00:43:17,080 --> 00:43:19,080 Speaker 1: I think it's not hard to see how we could 760 00:43:19,239 --> 00:43:23,160 Speaker 1: potentially go down this road of developing dangerous AI cyber 761 00:43:23,200 --> 00:43:26,960 Speaker 1: weapons that alter themselves through machine learning and get out 762 00:43:26,960 --> 00:43:29,400 Speaker 1: of control. I thought of just a couple of scenarios 763 00:43:29,440 --> 00:43:32,239 Speaker 1: that seem plausible to me. At least one would be 764 00:43:32,280 --> 00:43:35,960 Speaker 1: cyber terrorism. You know, some forces are not rational actors 765 00:43:36,000 --> 00:43:39,760 Speaker 1: seeking to limit the harm they cause and preserve their interests; 766 00:43:39,760 --> 00:43:42,319 Speaker 1: some people are just simply interested in causing harm 767 00:43:42,320 --> 00:43:45,239 Speaker 1: and chaos. Imagine how bad things could be if 768 00:43:45,280 --> 00:43:48,840 Speaker 1: somebody like the Unabomber had the computer skills to wage 769 00:43:49,000 --> 00:43:52,040 Speaker 1: lone-wolf cyber warfare of this kind.
But then I 770 00:43:52,080 --> 00:43:56,040 Speaker 1: also think about dangerous autonomous AI that begins as a 771 00:43:56,160 --> 00:44:01,560 Speaker 1: defensive measure to protect against cyber attacks. So most harmful 772 00:44:01,600 --> 00:44:04,600 Speaker 1: military technologies, Robert, I bet you would agree, most of 773 00:44:04,600 --> 00:44:08,120 Speaker 1: these harmful technologies and strategies in world history have not 774 00:44:08,239 --> 00:44:12,239 Speaker 1: come from people claiming to be developing offensive weapons to 775 00:44:12,360 --> 00:44:17,160 Speaker 1: maliciously attack unsuspecting victims. They've been developed under a mindset, 776 00:44:17,239 --> 00:44:20,520 Speaker 1: whether this is really objectively fair or not, of defense. 777 00:44:20,840 --> 00:44:23,279 Speaker 1: People think, like, I'm under threat, I need to do 778 00:44:23,400 --> 00:44:26,239 Speaker 1: something to protect myself. Exactly. I mean, this 779 00:44:26,280 --> 00:44:29,359 Speaker 1: has been the arms race throughout history, right? 780 00:44:29,640 --> 00:44:31,920 Speaker 1: If the other side has a large slingshot, I need 781 00:44:31,960 --> 00:44:34,279 Speaker 1: an equally sized slingshot. I need something that 782 00:44:34,360 --> 00:44:37,640 Speaker 1: is a deterrent; otherwise they're just going to take advantage 783 00:44:37,640 --> 00:44:40,760 Speaker 1: of me. Yeah. So what looks like offensive, threatening 784 00:44:40,800 --> 00:44:45,399 Speaker 1: behavior from your perspective, from the other person's perspective is like, look, 785 00:44:45,440 --> 00:44:48,920 Speaker 1: I just got to defend myself. So, usually by 786 00:44:48,920 --> 00:44:50,920 Speaker 1: the time you recognize you've been the victim of a 787 00:44:50,960 --> 00:44:54,200 Speaker 1: cyber attack, a lot of damage is already done. So 788 00:44:54,280 --> 00:44:57,360 Speaker 1: what if, in the future, we decide we need autonomous, 789 00:44:57,520 --> 00:45:02,760 Speaker 1: adaptive defensive cyber weapons to protect us against offensive cyber weapons, 790 00:45:02,760 --> 00:45:07,200 Speaker 1: something like an immune system for our infrastructure, the equivalent 791 00:45:07,280 --> 00:45:10,239 Speaker 1: of deployed white blood cells, you know, T cells and 792 00:45:10,280 --> 00:45:13,839 Speaker 1: B cells and so forth, to autonomously detect, hunt, and 793 00:45:13,960 --> 00:45:17,720 Speaker 1: kill malicious autonomous cyberweapons, the same way that white blood 794 00:45:17,719 --> 00:45:20,719 Speaker 1: cells in your body behave kind of like an adaptive, 795 00:45:20,840 --> 00:45:24,960 Speaker 1: independent organism within the body, hunting and killing autonomously in 796 00:45:25,000 --> 00:45:27,440 Speaker 1: the bloodstream? Well, that sounds great, Joe, but now you're 797 00:45:27,560 --> 00:45:30,080 Speaker 1: right back to Skynet. Like, this sounds exactly like 798 00:45:30,200 --> 00:45:33,359 Speaker 1: Skynet, except Skynet is just 799 00:45:33,560 --> 00:45:36,680 Speaker 1: the clunkier metaphor for what the future may become. Well, 800 00:45:36,719 --> 00:45:39,480 Speaker 1: it's not Terminators holding guns, but it is a distributed 801 00:45:39,600 --> 00:45:42,520 Speaker 1: defense network.
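(Here's a minimal sketch of that "immune system" idea in Python, just a toy statistical detector with invented numbers, not any real security product; the baseline traffic model, the threshold, and the example rates are all assumptions. It also shows the failure mode the conversation turns to next: a legitimate spike that gets flagged anyway.)

    import random, statistics

    # Toy "immune system": learn what normal request rates look like,
    # then flag anything that strays too far from that baseline.
    random.seed(7)
    baseline = [random.gauss(100, 10) for _ in range(1000)]  # normal traffic
    mu = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)

    def looks_malicious(rate, threshold=3.0):
        """Flag rates more than `threshold` standard deviations from normal."""
        return abs(rate - mu) > threshold * sigma

    print(looks_malicious(300))  # True: an attack-scale spike is caught
    print(looks_malicious(140))  # True as well: a legitimate surge (say, a
                                 # big news day) is misidentified -- the
                                 # digital equivalent of an autoimmune response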
At this point, you're talking about, like, 802 00:45:42,560 --> 00:45:46,279 Speaker 1: an antivirus virus, right? Or the possibility of 803 00:45:46,320 --> 00:45:49,799 Speaker 1: an immune system turning against itself, turning against its host, 804 00:45:49,800 --> 00:45:52,879 Speaker 1: which of course we see in the biological world. Well, absolutely, yeah, 805 00:45:53,040 --> 00:45:54,960 Speaker 1: that's exactly where I was going with that. So you've 806 00:45:55,000 --> 00:45:57,920 Speaker 1: got autoimmune diseases. If you've got an immune system, you 807 00:45:58,000 --> 00:46:01,040 Speaker 1: run the risk of the immune system, in, say, cases 808 00:46:01,080 --> 00:46:05,760 Speaker 1: like arthritis or type one diabetes or MS, misidentifying friendly 809 00:46:06,480 --> 00:46:09,360 Speaker 1: tissue as something that needs to be defended against and 810 00:46:09,440 --> 00:46:13,560 Speaker 1: attacking its own body, you know, turning parts of 811 00:46:13,600 --> 00:46:16,799 Speaker 1: the body into innocent victims. Except the nature of the 812 00:46:16,800 --> 00:46:19,719 Speaker 1: internet and the connected world means that if one system 813 00:46:19,840 --> 00:46:24,600 Speaker 1: develops the digital equivalent of an autoimmune disease, potentially anybody 814 00:46:24,640 --> 00:46:27,520 Speaker 1: could catch it. I'd also pair this kind of 815 00:46:27,560 --> 00:46:31,120 Speaker 1: scary scenario with our episode last year about neurosecurity: the 816 00:46:31,160 --> 00:46:35,680 Speaker 1: increasing vulnerabilities we're accumulating as we make the connections between 817 00:46:35,800 --> 00:46:40,160 Speaker 1: our digital services and our nervous systems more robust. Indeed, 818 00:46:40,160 --> 00:46:42,520 Speaker 1: the idea that you could have your brain implant 819 00:46:42,600 --> 00:46:47,359 Speaker 1: hacked or your pacemaker device hacked, etcetera. So anyway, just 820 00:46:47,560 --> 00:46:50,719 Speaker 1: entertaining these scenarios and thinking about the risks posed makes 821 00:46:50,719 --> 00:46:55,440 Speaker 1: me think, should we have international cyber warfare non-proliferation 822 00:46:55,520 --> 00:46:58,920 Speaker 1: treaties the same way we've got nuclear non-proliferation treaties? 823 00:46:59,120 --> 00:47:01,520 Speaker 1: I mean, absolutely. As we've been driving home, these 824 00:47:01,880 --> 00:47:04,960 Speaker 1: are legitimate threats, and therefore, you know, we should 825 00:47:04,960 --> 00:47:08,719 Speaker 1: be taking steps to prevent them from happening. As 826 00:47:08,760 --> 00:47:11,600 Speaker 1: far as what countries around the world 827 00:47:11,640 --> 00:47:15,799 Speaker 1: could do to protect themselves, I mean, I wonder: is 828 00:47:15,800 --> 00:47:19,000 Speaker 1: there an option other than just trying to revert to 829 00:47:19,120 --> 00:47:23,440 Speaker 1: a world of decreased connectedness? And would societies ever do 830 00:47:23,520 --> 00:47:26,040 Speaker 1: that without being forced to by some kind of tragedy? 831 00:47:26,160 --> 00:47:29,680 Speaker 1: Decreased connectedness, of course, would mean fewer rather 832 00:47:29,760 --> 00:47:32,759 Speaker 1: than more systems can be accessed via the Internet.
You 833 00:47:32,760 --> 00:47:36,400 Speaker 1: might have crucial systems for controlling infrastructure kept offline 834 00:47:36,520 --> 00:47:39,680 Speaker 1: or on isolated networks that are not plugged into the Internet, 835 00:47:39,960 --> 00:47:41,880 Speaker 1: and so it would be a lot harder to infect 836 00:47:41,920 --> 00:47:46,000 Speaker 1: them with some kind of autonomous digital weapon. Though 837 00:47:46,000 --> 00:47:48,960 Speaker 1: I wonder if that's even possible. Would it take some 838 00:47:49,040 --> 00:47:53,520 Speaker 1: kind of visceral disaster, one that calls to mind images through 839 00:47:53,560 --> 00:47:56,800 Speaker 1: the availability heuristic, to make people think this is worth doing? 840 00:47:57,800 --> 00:47:59,800 Speaker 1: And it's a good point, because I mean, I've certainly 841 00:47:59,800 --> 00:48:02,840 Speaker 1: read predictions for this sort of digital future, both in 842 00:48:02,920 --> 00:48:05,520 Speaker 1: sci-fi and just in general futurism: the idea that 843 00:48:05,840 --> 00:48:09,880 Speaker 1: the Internet will become, say, more national, more regional, 844 00:48:09,920 --> 00:48:15,840 Speaker 1: more layered, and indeed less worldwide. 845 00:48:15,880 --> 00:48:18,319 Speaker 1: The alternative, of course, is a continuation of what 846 00:48:18,360 --> 00:48:21,960 Speaker 1: we've already been doing: essentially building what cybersecurity expert Bruce 847 00:48:22,239 --> 00:48:26,520 Speaker 1: Schneier referred to as a worldwide computer. We're 848 00:48:26,520 --> 00:48:30,960 Speaker 1: building a worldwide computer, and it's susceptible to 849 00:48:31,000 --> 00:48:34,600 Speaker 1: attack at every level, from baby monitors to 850 00:48:34,800 --> 00:48:37,319 Speaker 1: nuclear power plants. You know, I would hope 851 00:48:37,320 --> 00:48:40,319 Speaker 1: that we will grow into 852 00:48:40,360 --> 00:48:42,920 Speaker 1: this globally connected world, that we 853 00:48:42,960 --> 00:48:46,799 Speaker 1: will be the people deserving of such a worldwide computer. But 854 00:48:46,880 --> 00:48:49,719 Speaker 1: if not, then yeah, we deserve the regional model, I 855 00:48:49,719 --> 00:48:52,799 Speaker 1: guess. Well, I don't like the regional model. I mean, 856 00:48:52,840 --> 00:48:57,000 Speaker 1: I understand that some decrease in connectedness might be necessary 857 00:48:57,040 --> 00:49:00,880 Speaker 1: to prevent attacks, but then again, I like the world 858 00:49:00,880 --> 00:49:03,640 Speaker 1: connected model in terms of communication. I mean, all the 859 00:49:03,719 --> 00:49:06,759 Speaker 1: good stuff about connection between cultures. I don't think I 860 00:49:06,840 --> 00:49:09,440 Speaker 1: buy into the nationalist mindset that says we should only 861 00:49:09,480 --> 00:49:12,800 Speaker 1: be talking and interacting with people within our own 862 00:49:13,080 --> 00:49:16,560 Speaker 1: national boundaries or our own culture. That seems very limiting. 863 00:49:16,600 --> 00:49:18,239 Speaker 1: I mean, it's a wonderful thing to be able to 864 00:49:18,280 --> 00:49:21,680 Speaker 1: communicate across borders and with people all around the world.
865 00:49:21,719 --> 00:49:25,279 Speaker 1: That's something I love about what we get to do, right? 866 00:49:25,320 --> 00:49:27,160 Speaker 1: And so, I don't know, that sounds like a 867 00:49:27,200 --> 00:49:30,120 Speaker 1: horrible thing to do. But then again, I mean, I 868 00:49:30,160 --> 00:49:33,400 Speaker 1: wonder if there are ways to leave open the 869 00:49:33,440 --> 00:49:38,280 Speaker 1: good channels while preventing people from using digital exchange 870 00:49:38,320 --> 00:49:40,359 Speaker 1: to hurt one another. Well, I mean, part of this 871 00:49:40,400 --> 00:49:43,080 Speaker 1: goes back to our past discussions on just the 872 00:49:43,080 --> 00:49:46,000 Speaker 1: origins of the Internet. Some would argue that 873 00:49:46,160 --> 00:49:48,799 Speaker 1: one of the big problems is just 874 00:49:49,000 --> 00:49:51,600 Speaker 1: security issues with the Internet itself, and the idea that 875 00:49:51,640 --> 00:49:53,799 Speaker 1: you have this thing that was built as 876 00:49:54,000 --> 00:49:58,359 Speaker 1: essentially a private network for developers that has been 877 00:49:58,560 --> 00:50:01,360 Speaker 1: bloated out into this global system that it was really 878 00:50:01,400 --> 00:50:04,160 Speaker 1: never meant to be. So we need a new Internet, 879 00:50:04,200 --> 00:50:10,080 Speaker 1: and we need a new human race. Just for starters, 880 00:50:10,120 --> 00:50:13,000 Speaker 1: those are two things that would help. Yeah. Well, so 881 00:50:13,080 --> 00:50:15,440 Speaker 1: if there are any takeaways from this episode, I 882 00:50:15,480 --> 00:50:19,040 Speaker 1: would say I think it's that people should understand 883 00:50:19,120 --> 00:50:22,840 Speaker 1: the relative severity of different types of autonomous technological weapon 884 00:50:22,920 --> 00:50:27,280 Speaker 1: threats, like that cyber warfare is not just an inconvenience. 885 00:50:27,320 --> 00:50:29,839 Speaker 1: It's not just like, oh darn, the power went out 886 00:50:29,880 --> 00:50:31,160 Speaker 1: for a day, or oh, you know, there was a 887 00:50:31,239 --> 00:50:33,279 Speaker 1: DDoS attack on the website I wanted 888 00:50:33,320 --> 00:50:35,759 Speaker 1: to go to and it went down. These could be 889 00:50:35,880 --> 00:50:38,960 Speaker 1: really serious. This could be the warfare of the future, 890 00:50:38,960 --> 00:50:43,440 Speaker 1: every bit as serious as conventional warfare. 891 00:50:43,480 --> 00:50:45,920 Speaker 1: And so that's worth considering, and it's worth 892 00:50:46,000 --> 00:50:48,680 Speaker 1: promoting people who have good ways of thinking about this. 893 00:50:48,719 --> 00:50:52,120 Speaker 1: If you know of cybersecurity experts, the kind of 894 00:50:52,120 --> 00:50:54,399 Speaker 1: people who are doing the best thinking about this, coming 895 00:50:54,440 --> 00:50:57,879 Speaker 1: up with ways of thinking about defenses, especially, as we've 896 00:50:57,880 --> 00:51:00,440 Speaker 1: talked about, the kinds of defenses that don't lead to, 897 00:51:00,560 --> 00:51:04,560 Speaker 1: you know, a shutdown of international communication and 898 00:51:04,640 --> 00:51:07,719 Speaker 1: this worse-off world, shine a light on those people. 899 00:51:07,800 --> 00:51:10,560 Speaker 1: I want to know what our best options are. Who's 900 00:51:10,600 --> 00:51:12,759 Speaker 1: doing the best thinking about this right now?
Yeah, let 901 00:51:12,840 --> 00:51:14,719 Speaker 1: us know. We'll do our part to shine our light 902 00:51:15,040 --> 00:51:17,000 Speaker 1: on those people as well. All right, so there you 903 00:51:17,040 --> 00:51:19,839 Speaker 1: have it. As always, Stuff to Blow Your Mind 904 00:51:19,880 --> 00:51:21,959 Speaker 1: dot com, that is the mothership. That's where you'll find 905 00:51:22,120 --> 00:51:25,879 Speaker 1: this and other episodes of note; we've had a number 906 00:51:25,920 --> 00:51:28,960 Speaker 1: of them that have dealt with technology and warfare, 907 00:51:28,960 --> 00:51:31,759 Speaker 1: and likewise, there are a number of issues in 908 00:51:31,800 --> 00:51:34,680 Speaker 1: this episode that we could easily return to, such as 909 00:51:34,719 --> 00:51:39,520 Speaker 1: cyber warfare, even the more traditional autonomous weapon designs; 910 00:51:40,080 --> 00:51:42,319 Speaker 1: we could do multiple episodes on that as well. So, 911 00:51:42,640 --> 00:51:44,880 Speaker 1: as we said, I think one thing definitely worth exploring 912 00:51:44,880 --> 00:51:47,799 Speaker 1: would be the more specific nitty-gritty of the scenarios 913 00:51:47,840 --> 00:51:50,960 Speaker 1: imagined by cyber warfare experts, like what's the most 914 00:51:50,960 --> 00:51:55,480 Speaker 1: plausible thing that could happen and how could it be prevented? 915 00:51:55,600 --> 00:51:57,560 Speaker 1: So hey, yeah, check it out, Stuff to Blow Your 916 00:51:57,560 --> 00:51:59,719 Speaker 1: Mind dot com, and you'll find links to our 917 00:51:59,760 --> 00:52:02,000 Speaker 1: various social media accounts there as well. And if you 918 00:52:02,040 --> 00:52:05,440 Speaker 1: want to support the show, rate and review us wherever 919 00:52:05,560 --> 00:52:08,200 Speaker 1: you have the ability to do so. Big thanks as 920 00:52:08,239 --> 00:52:11,800 Speaker 1: always to our wonderful audio producers Alex Williams and Tari Harrison. 921 00:52:12,080 --> 00:52:13,680 Speaker 1: If you would like to get in touch with us 922 00:52:13,680 --> 00:52:16,319 Speaker 1: with feedback about this episode or any other, to let 923 00:52:16,400 --> 00:52:18,960 Speaker 1: us know a topic you'd like us to cover in the future, 924 00:52:19,120 --> 00:52:22,000 Speaker 1: or just to send us your thoughts, to say hi, 925 00:52:22,160 --> 00:52:23,799 Speaker 1: let us know where you listen from, how you found 926 00:52:23,800 --> 00:52:25,920 Speaker 1: out about the show, any of that, you can always 927 00:52:25,960 --> 00:52:28,720 Speaker 1: email us at blowthemind at howstuffworks 928 00:52:28,880 --> 00:52:40,480 Speaker 1: dot com. For more on this and thousands of other topics, 929 00:52:40,760 --> 00:53:00,759 Speaker 1: visit howstuffworks dot com.