Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and I love all things tech, and I am currently hard at work on an episode that's about cybersecurity, cyber warfare, AI, the AI arms race, and China. It's a lot of stuff, a lot of different parts. This was largely brought on because recently the Pentagon's chief software officer resigned and, in the process, left a very angry and detailed list of grievances that led to his decision to resign. So I'm working on an episode that really dives into all of that and explains what the landscape is, what the concerns are, and tries to examine how realistic certain threats are, or whether there might be other mitigating factors. As it turns out, these things get very, very complicated, not just because of the technology, but because of the way the rest of the world works. We can't divorce technology from the way things happen in the world, right? It obeys the same sorts of restrictions that the rest of us do.
Speaker 1: So anyway, long story short (too late), I'm still working on that piece. I want to make sure that it's as good as I can possibly make it before I publish it. So in the spirit of that piece, I thought we could listen to a classic episode of TechStuff. This one published way back in June 2009, and it is titled "Are We in Cyber War?" So this episode is more than a decade old. It's with me and original co-host Chris Pollette, and we just have this discussion. It's interesting to go back and listen to it, because obviously things have progressed a lot since 2009. The cyber threats have grown significantly since 2009; they were already significant then, but they're even more so now.
Speaker 1: So I think it's a great starting point to say, here's where we were more than a decade ago, and that will lead into what will be Friday's episode, which will be the deeper dive on the current landscape: why people in positions of authority in different tech departments within the United States are concerned, what's going on with China, and whether or not that's going to have a long-lasting impact. So let's go listen to this classic episode, and I'll be back at the end to chat a little bit more before we wrap up. Enjoy.

Speaker 1: Unfortunately, we have some serious things to talk about. Actually, we have some pretty scary stuff to talk about. This, I think, is even scarier than our zombie computers and Halloween shows combined. Really? Yeah, I think so. Okay, so we're going to talk today about cyber war. Not pirate war, cyber war. So we're not talking about Tron here, nor are we talking about WarGames, both of which are awesome movies, so put them at the top of your Netflix queue.
Speaker 1: No, we're talking about using computers to either spy upon, sabotage, or otherwise inflict some sort of harm upon a nation. And this can be done by any one of a dozen different entities. That's one of the scary things about cyber war. In classic warfare, you would usually talk about two different nations, or perhaps two different factions within a nation, fighting one another. Pretty easy to identify who the parties involved are, right? Normally, yeah, because guys are shooting at you. Right, and normally they have uniforms of some kind on, so you don't shoot your own guys. Yeah, there are some general rules that make it easier to know which guys are the ones you're supposed to be shooting. Cyber war is not quite that clean-cut. The problem with cyber war is that the attacks can come from anywhere. They can come from another country. They can come from patriots within another country who are acting on their own. They could come from, essentially, a mercenary: a hacker who's hired to do this sort of thing.
Speaker 1: They could come from someone who's just trying to cause mischief and doesn't have any other motive. So an attack can come from another country, or it can come from within the country that is being attacked. I mean, you're talking about a sort of cyber terrorism, in a way. Yeah. And as a matter of fact, it could be somebody sitting in his jammies in his living room at the computer. It doesn't need to be somebody out skulking around the streets or somewhere in a foxhole. Heck, it could be someone parked in your driveway, hacking into your WiFi. Good point, and that's why we're talking about how scary this is. And on another level, it's also scary because it takes so little, relatively speaking, to perform an effective cyber attack. When you're talking about a traditional attack from one nation on another, you're talking about billions of dollars' worth of equipment and personnel, the things that have to go behind a war machine. That's a huge investment.
Speaker 1: When you're talking about cyber attacks, you're talking about a computer and a connection. You might have a couple of other little bells and whistles to help you along, but you don't necessarily need them if you know what you're doing and you have the right software. So it's one of those things where, for a very small entrance fee, you could have a huge, huge impact. As a matter of fact, your computer could be used to carry out a cyber attack. Yes: if you've installed some kind of malware, like a virus or a worm, that can turn your machine into a zombie, someone else can direct your computer to send email in a denial-of-service attack, which basically floods computers with spam and other requests, if you will, for information.
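The economics described here can be sketched with a toy model: each zombie contributes only a trickle of junk requests, but together a botnet swamps the target's capacity, so legitimate traffic gets dropped. This is a minimal illustrative simulation; the function name and every number in it are invented for the example, not drawn from the episode:

```python
# A toy simulation of a distributed denial-of-service (DDoS) flood.
# All figures are illustrative assumptions, not real-world measurements.

def simulate_flood(capacity, legit_rate, zombies, zombie_rate):
    """Return the fraction of legitimate requests the server can still answer.

    capacity    -- requests the server can handle per second
    legit_rate  -- legitimate requests arriving per second
    zombies     -- number of compromised "zombie" machines in the botnet
    zombie_rate -- junk requests each zombie sends per second
    """
    total = legit_rate + zombies * zombie_rate
    if total <= capacity:
        return 1.0  # no congestion: every legitimate request is served
    # Under overload, requests are dropped indiscriminately, so legitimate
    # traffic gets through only in proportion to the server's capacity.
    return capacity / total

# A server that comfortably handles its normal load...
print(simulate_flood(capacity=10_000, legit_rate=2_000, zombies=0, zombie_rate=0))
# ...is overwhelmed once even a modest botnet of slow machines joins in.
print(simulate_flood(capacity=10_000, legit_rate=2_000, zombies=5_000, zombie_rate=20))
```

The point of the sketch is the asymmetry Jonathan and Chris describe: the attacker pays for none of this hardware, since the "zombies" are other people's computers.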
Speaker 1: The thing is, that doesn't require any cost on the part of the attacker at all, because all the machines are essentially donated, you know, by somebody else. Right. And to make matters worse, when anyone in authority tries to trace the source of the attack, they might come to your computer and never find the person who actually infected your computer in the first place. So then you become the person of interest, the person who's under suspicion for committing an attack, and the whole time you were completely unaware. Actually, that's another big, big issue with the cyber warfare problem. Even when you can detect an attack and trace it back, you can never be sure that the last place you trace it back to is in fact the original source of the attack, because there are things like proxy sites and zombie computers, and there's always the possibility that there's one more link you haven't found yet that will take you back even further.
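The attribution problem can be made concrete with a small sketch: an investigator can only follow the chain of hops as far as each machine keeps records, so the trail often ends at a zombie or a proxy rather than at the true origin. Everything here, the hop names and the logs dictionary, is hypothetical:

```python
# Hypothetical connection logs: each machine maps to the machine that
# connected to it, or None where no records survive.
logs = {
    "victim.example": "zombie-pc.example",    # the infected home computer
    "zombie-pc.example": "proxy.cn.example",  # a proxy abroad
    "proxy.cn.example": None,                 # records stop here...
    # ...even though the real attacker was somewhere else entirely.
}

def trace_back(start, logs):
    """Follow the chain of hops until the records run out."""
    chain = [start]
    while logs.get(chain[-1]) is not None:
        chain.append(logs[chain[-1]])
    return chain

print(trace_back("victim.example", logs))
# The trail dead-ends at the proxy; the actual origin never appears,
# which is exactly the "one more link you haven't found" problem.
```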
Speaker 1: So if you were to detect, say, an attack, and you say, well, we've traced it back to China, you can never be sure that the Chinese government was behind it. It could have been patriots in China who had the same sorts of goals as the government of China but were acting on their own. Or it could even have been people in a totally different country who just managed to use proxy sites in China to fool you into thinking that's where the attack came from. So it's really insidious. And you might wonder, well, how vulnerable are we to these sorts of attacks? I guess it really depends on which system you're talking about, because the Internet is a network of networks, right? Right. So any given network or any given computer could be the weak spot, and there are just tons of computers that are part of the Internet. Every time your computer is hooked up for Internet access, you become part of this giant cloud. And then there are the really sophisticated crackers; those are the really nasty hackers.
Speaker 1: Those are the ones who can find ways to manipulate a network in ways that most people don't think of. Right. And to give you an idea of how vulnerable certain systems can be: back in 1997, there was a secret experiment the Department of Defense commissioned, and it was called Eligible Receiver. I remember that. Yeah, this was kind of an eye-opener. Now, a lot of the Eligible Receiver mission remains classified, so we don't know all the details. But what we do know is that part of the experiment involved getting a group of hackers together, giving them some very basic computing hardware and software, and telling them to try to break their way into the Pentagon's computer system. And it took them three days, using basic computers and basic software. Three days, just for regular hackers.
Speaker 1: These aren't necessarily people who have an actual motive to break into the Pentagon, and there's the fact that they're part of an experiment, right? It's not like they have a government breathing down their necks saying, we need access to this information. So it's pretty sobering to think that within three days, one of the nation's most important computing systems was compromised, even though it was an inside job and an experiment. Right. Well, there have been attempts to shore that up since then, and in fact they conduct regular exercises in order to do that. In fact, there was one not that long ago. Every year, there are students from the Army, Navy, Air Force, Coast Guard, and Merchant Marine academies, as well as the Naval Postgraduate School and the Air Force Institute of Technology. And basically, the undergrads are given the opportunity to defend themselves from an attack by the NSA, and every year they undergo this exercise. West Point held out the longest, and the Army got to defend their title.
Speaker 1: But they were using Linux computers. But this is apparently a normal thing. The Defense Department is only graduating about eighty students a year from schools of cyber war in the United States, according to the New York Times article that I read about it. And if you're wondering, the attackers were the 57th Information Aggressor Squadron, based at Nellis Air Force Base, and they make a point of doing this test every year. It's one of those things where they are making a conscious effort to attack and defend computer networks. And apparently nerds are nerds everywhere, even at West Point. According to the way the article was written, they get a little ribbing for being the geeks of the group. But even the future officers who graduate from there know the importance of the computer network, because that's one of the very first things they deal with.
Speaker 1: They're about to deploy these guys to Afghanistan, as a matter of fact, and the first thing they're going to do is set up a secure Internet connection, and they have to be ready to defend themselves against denial-of-service attacks and other attacks. So they're coming right out of the service academies with knowledge of how to attack and protect computer networks, military computer networks. There's a bit more to go with our conversation about the state of cyber war in this classic episode, but before we get to that, let's take a quick break.

Speaker 1: Usually we call those sorts of exercises red team attacks, where a group is designated to play the part of an adversary; that's the red team. And the red team's job is to achieve their goals by whatever means necessary. So, in other words, you're not necessarily supposed to follow a certain protocol or rules.
Speaker 1: You're supposed to be inventive and creative and try to find new ways to really compromise or defeat the other team, because that's exactly what the enemy is going to do. The enemy is not necessarily going to play by rules, especially if you're talking about enemies you can't predict. I mean, they may not even be directly involved with any other government or official agency. And, you know, government websites and government web servers and systems aren't the only targets. One of the big targets in the United States, and it's been in the news quite a bit over the spring of 2009, is the electric grid. Part of the problem with that is that systems like the electric grid and some water and fuel systems are using software that directly ties into hardware, and if you just change a few settings, you can cause catastrophic damage to the equipment.
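The kind of failure being described here comes down to one uncomfortable fact: in these systems, the safety limits often live in the same software an intruder can reach. This toy sketch (not real SCADA code; the function, the flag, and the rpm numbers are all invented for illustration) shows how a single bypassed check separates normal operation from hardware damage:

```python
# A toy illustration of control software tied directly to physical hardware.
# The safe band and the enforce_limits flag are invented for the example.

SAFE_RPM_RANGE = (1500, 3600)  # made-up limits for an imaginary generator

def set_generator_speed(rpm, enforce_limits=True):
    """Return the speed actually applied to the (imaginary) hardware."""
    low, high = SAFE_RPM_RANGE
    if enforce_limits:
        rpm = max(low, min(high, rpm))  # clamp the request to the safe band
    if not (low <= rpm <= high):
        # In the toy model, leaving the safe band means physical damage.
        raise RuntimeError(f"{rpm} rpm is outside the safe band: hardware damage")
    return rpm

print(set_generator_speed(9000))  # clamped to 3600: the protection holds
# An intruder who can flip one setting bypasses the protection entirely:
# set_generator_speed(9000, enforce_limits=False) raises RuntimeError,
# the software stand-in for a wrecked generator.
```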
Speaker 1: There was a video on CNN for a while where some electric utility experts showed that with just a couple of tweaks, you could completely destroy a generator by changing some settings through the computer system; they essentially turned a generator into a pile of scrap metal. Yeah, it was very sobering to me to see that, because not that long ago the news broke that the United States electric grid, certain parts of it anyway, has been under attack by some cyber spies over the last several years. And we don't really know who it is. Right, right. They've traced them back mostly to China and Russia, but both China and Russia deny that they had anything to do with it. But, I mean, of course, wouldn't you? The thing is, those countries are gradually becoming more and more computer-centric, and it could be anybody.
Speaker 1: It could be that they are directly involved, or it could be that it's groups of individuals within those countries. Or, like we said, it could even be that the attacks are ultimately originating somewhere else, but we're only able to trace them back as far as Russia and China. So that's the other issue with the Internet: it is a global entity, and law enforcement officials only have so much authority to pursue cyber attacks. The attacks can cross borders easily on the Internet, but law enforcement can't; they don't necessarily have the authority to pursue an investigation beyond the borders of whatever their jurisdiction is. So that also makes life much more complicated when you're talking about fending off cyber warfare attacks. Yeah, you know, it wasn't even that long ago that some countries were complaining of real cyber attacks launched on their infrastructure, like Estonia not too long ago, and they were blaming the Russians for that attack. But that was back in 2007, all those years ago.
Speaker 1: Yeah, all those... both years ago. Yeah. Well, you know, they say that Internet time is sort of like dog years, so that would make it about fourteen years ago in Internet time. I guess so. And then, of course, there's the example of the Dalai Lama's office, the Tibetan office. They knew they were being watched; they were absolutely certain that their systems had been compromised, and they hired a Canadian firm to investigate. And the Canadian firm found that, indeed, there were programs installed on the Dalai Lama's computer systems, and the activity appeared to be coming from an offshore island off the coast of China. And the software even included controls that would allow people on the other end to activate audio and video software and hardware, so that if the computer had a webcam or a microphone, they could turn it on and turn it into a remote listening station; they could actually spy on the goings-on of these offices remotely. So, I mean, this is a very real problem worldwide.
Speaker 1: It's not just something that we have to worry about in the United States or any other specific nation. Pretty much, if you have computers, there's a good chance there's another party somewhere that's really interested in finding out what you know, what you don't know, and what you're up to. Yep. And there's even another component to it. I know we were going to stick mainly to talking about how you could use computers to launch computer attacks, but another facet of this that I think is interesting sort of relates to a blog post I wrote in early April on the TechStuff blog, about the Moldovan pro-democracy protesters. They weren't launching computer attacks, but what they were doing was using social networking sites like Twitter and Facebook to coordinate their efforts, sort of like flash mobs. They could use computer networks like those, and text messaging, to discuss where and when they were going to organize and meet and hold a demonstration.
Speaker 1: So that's, I mean, that's relying on the networks staying up rather than taking them down. But it's just kind of funny, because you think of Facebook and Twitter as something we use for fun or to keep up with people, and this is just another way you can use them. I mean, those could just as well have been used to coordinate a violent attack on someone. Say, you know, meet at this corner at one forty in the afternoon, and have everybody show up and start fighting. Well, if law enforcement is unaware of it, or the military forces are unaware of it, that could be a devastating attack, and it could be used by virtually anybody. Chris and I have a bit more to say about cyber war in general, and we'll get to that after this quick break.

Speaker 1: The dangers of these attacks go beyond just damaging a network or shutting down a system.
One of the big 323 00:20:00,400 --> 00:20:04,080 Speaker 1: fears that a lot of security folks have is: 324 00:20:04,160 --> 00:20:07,320 Speaker 1: what if you were to coordinate a physical attack with 325 00:20:07,359 --> 00:20:09,880 Speaker 1: a cyber attack? So what if you were to target 326 00:20:09,920 --> 00:20:13,520 Speaker 1: a major city, and first you bring down the city's 327 00:20:14,080 --> 00:20:16,920 Speaker 1: power grid through a cyber attack, and then you couple 328 00:20:17,000 --> 00:20:21,120 Speaker 1: that with an actual physical attack, like bombs or whatever? 329 00:20:21,760 --> 00:20:25,719 Speaker 1: Together, that would cause a real panic, 330 00:20:25,920 --> 00:20:29,080 Speaker 1: because suddenly you have an entire population that doesn't 331 00:20:29,240 --> 00:20:33,520 Speaker 1: have access to information the way they normally would, 332 00:20:34,080 --> 00:20:37,320 Speaker 1: and yet there is obviously chaos going on. And 333 00:20:37,480 --> 00:20:41,360 Speaker 1: that really is the true definition of terrorism: 334 00:20:41,520 --> 00:20:45,560 Speaker 1: you're inspiring terror in the victim. Now, would 335 00:20:45,600 --> 00:20:48,679 Speaker 1: this be nationwide? Probably not. For one thing, the electric 336 00:20:48,720 --> 00:20:51,080 Speaker 1: grid is pretty much a regional kind 337 00:20:51,080 --> 00:20:55,200 Speaker 1: of thing. But it's something that every region could 338 00:20:55,359 --> 00:21:02,600 Speaker 1: theoretically be vulnerable to without the right security measures in place. Now, 339 00:21:02,720 --> 00:21:04,919 Speaker 1: that sort of attack obviously would have to come from 340 00:21:04,960 --> 00:21:08,000 Speaker 1: a much more organized group.
It would have to 341 00:21:08,040 --> 00:21:11,639 Speaker 1: come from a country or organization that had a strong 342 00:21:11,760 --> 00:21:15,320 Speaker 1: financial backing to be able to fund the physical side 343 00:21:15,359 --> 00:21:19,080 Speaker 1: of the attack. So that narrows down the 344 00:21:19,119 --> 00:21:22,360 Speaker 1: list of possible suspects who could do that. But it's 345 00:21:22,400 --> 00:21:24,720 Speaker 1: still within the realm of possibility, and it's one of 346 00:21:24,720 --> 00:21:27,520 Speaker 1: those things that, you know, keeps security people up at night. 347 00:21:27,760 --> 00:21:33,320 Speaker 1: Sure, sure. And you know, I'm really not certain 348 00:21:34,000 --> 00:21:35,920 Speaker 1: what we're going to be able to do, short of 349 00:21:35,960 --> 00:21:39,560 Speaker 1: pulling all the plugs, to make it a 350 00:21:39,560 --> 00:21:42,960 Speaker 1: complete and utter impossibility that they could carry out 351 00:21:42,960 --> 00:21:45,639 Speaker 1: those kinds of attacks, because it's just going to 352 00:21:45,880 --> 00:21:50,399 Speaker 1: require constant monitoring and searching for vulnerabilities. That's why the 353 00:21:50,520 --> 00:21:55,800 Speaker 1: efforts of those who are participating in those 354 00:21:55,840 --> 00:22:00,880 Speaker 1: computer security war games, if you will, are 355 00:22:00,920 --> 00:22:03,399 Speaker 1: so important: they're actively searching for 356 00:22:03,400 --> 00:22:05,600 Speaker 1: those vulnerabilities in the system and trying to 357 00:22:05,640 --> 00:22:07,479 Speaker 1: find ways to patch them up before they 358 00:22:07,480 --> 00:22:11,000 Speaker 1: can be hacked into.
But you know, I think 359 00:22:11,119 --> 00:22:14,800 Speaker 1: that any time you update those systems, you're going 360 00:22:14,840 --> 00:22:18,320 Speaker 1: to open up new vulnerabilities and new problems. And you know, 361 00:22:18,440 --> 00:22:21,000 Speaker 1: it's just one of those things where the people 362 00:22:21,080 --> 00:22:23,160 Speaker 1: whose job it is to pay attention to it are 363 00:22:23,200 --> 00:22:26,399 Speaker 1: just going to have to stay constantly vigilant to prevent 364 00:22:26,440 --> 00:22:28,880 Speaker 1: something like that from happening. And it is even more 365 00:22:28,920 --> 00:22:31,600 Speaker 1: complicated when you think that, you know, not every system 366 00:22:31,680 --> 00:22:35,320 Speaker 1: runs on the same software or operating system or whatever. 367 00:22:35,840 --> 00:22:38,919 Speaker 1: Some of them are proprietary, and 368 00:22:38,960 --> 00:22:41,160 Speaker 1: so you might find something that works as a great 369 00:22:41,160 --> 00:22:44,000 Speaker 1: security measure for one system, but it's not at all 370 00:22:44,040 --> 00:22:47,000 Speaker 1: applicable to any other. So it is a huge challenge. 371 00:22:47,040 --> 00:22:49,639 Speaker 1: I mean, well, what's the response to that? Do you 372 00:22:49,800 --> 00:22:52,800 Speaker 1: go ahead and try and standardize everything so that hopefully 373 00:22:52,840 --> 00:22:55,239 Speaker 1: the same measures will work across the board? Because if 374 00:22:55,240 --> 00:22:57,919 Speaker 1: you do that and someone does find a vulnerability, suddenly 375 00:22:57,920 --> 00:23:01,400 Speaker 1: they've got a vulnerability that works across all systems. So 376 00:23:02,080 --> 00:23:03,920 Speaker 1: I mean, yeah, it's a double-edged sword, 377 00:23:04,000 --> 00:23:06,639 Speaker 1: and there are no easy answers.
We've got 378 00:23:06,720 --> 00:23:09,800 Speaker 1: people who are way smarter than I am working on 379 00:23:09,840 --> 00:23:13,680 Speaker 1: this, and I wish them the best, because this 380 00:23:13,760 --> 00:23:16,480 Speaker 1: is scary stuff. Now, are we all in 381 00:23:16,640 --> 00:23:21,080 Speaker 1: danger of something like this happening anytime soon? I don't know. 382 00:23:21,320 --> 00:23:23,080 Speaker 1: I don't think so. I mean, I'm 383 00:23:23,119 --> 00:23:26,280 Speaker 1: not staying up at night worrying that the next 384 00:23:26,359 --> 00:23:28,399 Speaker 1: day is going to be the day when the 385 00:23:28,400 --> 00:23:31,840 Speaker 1: cyber war attack happens. But I mean, 386 00:23:31,880 --> 00:23:35,280 Speaker 1: it is possible. It's just not necessarily something that, you 387 00:23:35,320 --> 00:23:38,920 Speaker 1: know, I'm gonna have to worry about on a 388 00:23:39,000 --> 00:23:42,160 Speaker 1: day-to-day basis. Well, as more systems come online 389 00:23:42,720 --> 00:23:45,000 Speaker 1: in more places around the world, I think it 390 00:23:45,000 --> 00:23:47,960 Speaker 1: becomes sort of like, you know, 391 00:23:48,000 --> 00:23:51,320 Speaker 1: aerial assaults after, you know, they became a real 392 00:23:51,400 --> 00:23:54,240 Speaker 1: possibility in the twentieth century. It's going to be 393 00:23:54,320 --> 00:23:58,680 Speaker 1: something that a well-planned military strategy is going to include. 394 00:23:59,080 --> 00:24:03,840 Speaker 1: You've got your ground troops, you know, air, sea, and Internet: 395 00:24:04,480 --> 00:24:07,440 Speaker 1: anything that can take down the computer network, 396 00:24:07,560 --> 00:24:11,720 Speaker 1: the communications network, the power grid, all at one time.
397 00:24:11,760 --> 00:24:14,160 Speaker 1: If you can do that, then you know you'll panic 398 00:24:14,240 --> 00:24:16,800 Speaker 1: the citizenry, and that just gives you a better chance. 399 00:24:17,160 --> 00:24:19,800 Speaker 1: I can pretty much guarantee that just about every modern 400 00:24:19,920 --> 00:24:22,199 Speaker 1: nation in the world has some sort of plan like 401 00:24:22,240 --> 00:24:24,600 Speaker 1: that in place, and I can also guarantee that 402 00:24:24,640 --> 00:24:27,000 Speaker 1: they're not going to share that, because that kind of 403 00:24:27,040 --> 00:24:30,560 Speaker 1: defeats the purpose of the plan. Yeah, but you know, 404 00:24:31,400 --> 00:24:34,760 Speaker 1: my Internet connection goes down plenty without anybody attacking it. 405 00:24:34,920 --> 00:24:37,760 Speaker 1: And I occasionally lose power if I sneeze too hard, 406 00:24:38,520 --> 00:24:41,080 Speaker 1: or maybe a blackout. It's one of the two, 407 00:24:42,119 --> 00:24:47,120 Speaker 1: either way. All right, then, I'm done. That's 408 00:24:47,119 --> 00:24:50,640 Speaker 1: all I have to divulge to the public. That wraps up 409 00:24:50,680 --> 00:24:53,320 Speaker 1: that classic episode of Tech Stuff. Like I said, you know, 410 00:24:53,359 --> 00:24:57,880 Speaker 1: a lot has happened in the, you know, twelve 411 00:24:58,000 --> 00:25:02,760 Speaker 1: years since we recorded that episode. Things have 412 00:25:03,000 --> 00:25:07,520 Speaker 1: evolved dramatically. We have all sorts of different threats 413 00:25:07,560 --> 00:25:10,760 Speaker 1: we have to be aware of, things like 414 00:25:10,800 --> 00:25:14,400 Speaker 1: supply chain threats, like we saw with the SolarWinds hack. 415 00:25:14,720 --> 00:25:20,520 Speaker 1: That's just one example. So when Friday's episode publishes, I'll 416 00:25:20,560 --> 00:25:26,760 Speaker 1: have a fuller discussion about cybersecurity in general.
As 417 00:25:26,800 --> 00:25:31,880 Speaker 1: well as why we're seeing the various departments within 418 00:25:32,080 --> 00:25:37,400 Speaker 1: the United States Defense Department lagging behind when it comes 419 00:25:37,440 --> 00:25:41,360 Speaker 1: to cybersecurity, what might be done about it, how 420 00:25:41,440 --> 00:25:45,600 Speaker 1: China factors into it, and more. So tune in to Friday's 421 00:25:45,600 --> 00:25:48,480 Speaker 1: episode for a deeper dive into all of that. I 422 00:25:48,520 --> 00:25:52,120 Speaker 1: appreciate your patience. This means we will not have a 423 00:25:52,160 --> 00:25:56,640 Speaker 1: classic episode on Friday, so today was your classic episode. 424 00:25:57,160 --> 00:26:00,840 Speaker 1: And as always, if you have suggestions for topics 425 00:26:00,840 --> 00:26:04,320 Speaker 1: I should cover on Tech Stuff, whether it's a specific technology, 426 00:26:04,640 --> 00:26:08,840 Speaker 1: a trend, a company, or maybe the history of a 427 00:26:09,000 --> 00:26:11,360 Speaker 1: tech that you want to know more about, reach out 428 00:26:11,400 --> 00:26:13,879 Speaker 1: to me on Twitter. The handle for the show is 429 00:26:14,000 --> 00:26:17,119 Speaker 1: TechStuffHSW, and I'll talk to you 430 00:26:17,160 --> 00:26:25,640 Speaker 1: again really soon. Tech Stuff is an I Heart 431 00:26:25,720 --> 00:26:29,480 Speaker 1: Radio production. For more podcasts from I Heart Radio, visit 432 00:26:29,520 --> 00:26:32,560 Speaker 1: the I Heart Radio app, Apple Podcasts, or wherever you 433 00:26:32,640 --> 00:26:34,000 Speaker 1: listen to your favorite shows.