Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and a love of all things tech. And recently, the Pentagon's former chief software officer, Nicolas Chaillan, resigned, and he did not go quietly into that good resignation. No, he posted an explanation of why he chose to quit on LinkedIn, and he cited lots of very valid reasons for frustration. He expressed anger at how he would have to try and chase down budgets in order to fund any sort of project or research or development and deployment, and just, you know, how hard that was, and exhausting and never-ending. He pointed out how the US military complex is still focused largely on traditional weapons systems, like fighter jets and guns and stuff, and not so much on the digital side of warfare. He argued that China is so far ahead of the United States in this regard, along with its advances in artificial intelligence, that the cyber war is effectively over already and China's already won.
Speaker 1: So today I thought I would really tackle these issues and examine them. Are his frustrations all valid? Has the United States lost the cyber war already? Well, the first thing we should do is acknowledge that cybersecurity in the United States, particularly at the federal level, is pretty lousy. I'm not talking about a specific company or a specific trend here, but federal cybersecurity as a whole, particularly for, you know, critical organizations like the Department of Defense. And there are many reasons why this is so. Some of those reasons are fairly intuitive, which means I'm still going to go over them, because that's how I roll. For example, we know that technology advances at a really rapid pace. Not all technology evolves at the same speed, you know, not all of it goes super fast. Gordon Moore observed that because of, you know, multiple factors, primarily market-based ones, the number of discrete components that we could cram on a square inch of silicon would double every two years or so.
Speaker 1: Now, that observation became Moore's law, and today we usually interpret it as meaning that a new computer's processor will be twice as powerful as the ones from two years before. So the computers of today are twice as fast, or can compute twice as much in the same amount of time, as the computers we produced two years ago, and so on. Now, for multiple reasons, we've had to find new methods to try and keep Moore's law active. It's not a guarantee. It was an observation, and now it's almost like a challenge. So there's nothing that says this trend will continue forever. In fact, you know, we might only be able to keep the trend going by fudging some of the definitions and changing up the way we do things. But my point is that this particular subcategory of tech has a really aggressive trend when it comes to advancement or evolution. Not everything advances that quickly. Battery technology is a great example of tech that has a much slower evolutionary path. But a lot of stuff does change pretty fast. The world of software is another example.
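To put rough numbers on that doubling trend, here's a quick back-of-the-envelope sketch in Python. This is my own illustration, not anything from the episode; the function name and the two-year doubling period are just assumptions for the sketch.

```python
def moores_law_factor(years, doubling_period=2.0):
    """Relative growth in component count (or rough compute capability)
    after `years`, assuming one doubling every `doubling_period` years."""
    return 2 ** (years / doubling_period)

# After a decade of doubling every two years, a chip would hold
# roughly 32 times as many components as one from today.
print(moores_law_factor(10))  # 32.0
```

Of course, as noted above, this is a trend line, not a law of physics, and keeping it going has required redefining what "doubling" even means.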
Speaker 1: Developers generate enormous amounts of code every single day. Some are working on mission-critical systems that play a part in important organizations like the Department of Defense; some are making the next version of Candy Crush. But the point is that software development happens super fast, and sometimes developers might overlook things that can lead to a potential vulnerability in the system. Like, they might be focused more on "I need to make this code work" and less on "is there some way that someone could leverage this code in a way I did not intend?" Now, in an ideal world, every developer out there would have plenty of time to test code thoroughly and conduct some penetration testing to make sure the code doesn't have any vulnerabilities in it before ever deploying it. But in the real world we run up against, you know, stuff like deadlines and budgets. These are things that mean that sometimes we have to push stuff out the door before we can do all the testing we would like.
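As a toy illustration of that "make it work" trap (my own example, not one discussed in the episode): imagine a developer who parses user-supplied numbers with Python's `eval()`. It works fine on honest input, but it also hands the user a way to run arbitrary code, exactly the kind of unintended leverage described above.

```python
# Hypothetical example: both functions "work" on honest input like "42",
# but the first can be leveraged in ways the developer never intended.

def parse_number_unsafe(text):
    # eval() evaluates ANY Python expression, not just numbers, so input
    # like "__import__('os').system(...)" would execute a shell command.
    return eval(text)

def parse_number_safe(text):
    # float() accepts only numeric literals and raises ValueError otherwise.
    return float(text)

print(parse_number_safe("42"))  # 42.0
```

The fix here took one line; the hard part, as the episode points out, is having the time and the habit of asking the "how could this be abused?" question before shipping.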
Speaker 1: So sometimes we see code deployed that does have gaps in it, gaps that a determined hacker might discover and exploit. Another important element for us to consider here is that all this development in tech, from hardware to software and all the related fields, is happening all the time. The world of tech really drives home what the Greek philosopher Heraclitus said way back around 500 BC, which is that the only thing constant is change. The tech world is fast-paced and fluid. Now let's talk about the world of policy and governments. Governments cannot be nearly as nimble as the tech sector. Governments move at a much slower pace, glacial, some might say, depending upon their point of view. So forming policy takes time. You have to have someone to propose it, for one thing, and then that person has to get buy-in from other parties in order to form a plan of action. There has to be a vote on that plan of action to make sure there's support for it. There needs to be a budget assigned to that plan of action. There's got to be someone in charge.
Speaker 1: There need to be deliverables assigned, and some means of holding the project accountable for achieving the goals that are set out by the policy. You need to have metrics. These steps all take a lot of coordination, cooperation, and time. Complicating matters is the fact that in the United States the average age of federal-level policy makers is pretty up there. So right now in the US, the average age in the US Senate is sixty-four point three years old. The average age in the House of Representatives is fifty-eight point four years old. Now, generally speaking, the average American is twenty years younger than the representative who, you know, represents them. Now, I don't want to engage in ageism here, especially since I'm forty-six. I'm no spring chicken. I don't want to make too many generalizations. However, there is something to be said about people who have spent a career in politics who might not be the most tech-savvy individuals out there.
Speaker 1: In fact, if you do a search on "how tech savvy is Congress," you're gonna find numerous pieces about how Congress is woefully behind the times when it comes to even a basic understanding of where tech is and what it is capable of. The gap in knowledge with regard to tech is a serious problem. And in case you think I'm being hyperbolic here, I'll point to one of those tech-savvy articles I just mentioned. This is something that I really feel you should read if you have the chance. It's titled "How Congress Got Dumb on Tech, and How It Can Get Smart." It was written by Grace Gedye and published in Washington Monthly. Grace's piece details how some members of Congress have never so much as sent out an email. Okay, so email, in case you're not aware, dates back to the early nineteen seventies. So let that sink in for a second. There are politicians who are at least theoretically representing the interests of citizens who live in those politicians' districts, and these politicians are unfamiliar with technology that was invented fifty years ago, half a century ago.
Speaker 1: So how up to speed do you think these same people are going to be when it comes to stuff like a distributed denial-of-service attack or state-sponsored hacker groups? Now I'll give you a specific example of an embarrassing lack of understanding. Recently, the governor of Missouri, Mike Parson, accused a reporter from the St. Louis Post-Dispatch, that's a newspaper, of being a hacker. Now, what the reporter had done was find out that the HTML code on a Missouri Department of Education website contained the private information of school teachers and administrators, including things like Social Security numbers. Now, that private information was not visible on the web page itself. If you were looking at the web page, it wasn't right out in front of everything. But if you looked at the HTML code, you could see it. So Governor Parson said the reporter was a hacker and should be prosecuted. Now, some of you all out there are probably already saying, "What? Excuse me?" So, if you are not aware, web browsers let you look at the underlying HTML code of a page.
Speaker 1: And this is a great tool if you're building a web page. Being able to switch between what you see in a browser and the HTML code can help you troubleshoot problems. If you're learning HTML, using this lets you see how any given web page is set up. You can actually look at the code and say, "Oh, so that's how they did that." More importantly, it's super easy to do. In Chrome, for example, in order to look at the HTML code for any web page, it doesn't have to be a web page you own, it's any web page, all you do is right-click on the page and choose "View page source," and it will give you the HTML code. Alternatively, you could just hold down the Control key and type the letter U. Boom, you've got the HTML code, no hacking involved. You can do it on any web page. And, you know, when you think about it, that's essentially the code that the web browser receives. Then the web browser says, "Oh, this code means that the page needs to look like this," and that's why you see it the way it is on your screen.
Speaker 1: So there shouldn't be anything in the HTML code that is bad, like, that shouldn't be there, because ultimately HTML is just instructions for your web browser so that it knows how to display the information there. And that's my point. Governor Parson didn't, or perhaps still to this day doesn't, understand that there's no hacking going on here. This is a browser tool working as intended. It's something anyone can do with no training, which means the State of Missouri was negligent and exposed the private information of a lot of people to anyone who just happened to look at the HTML source code. At best, you could say that Governor Parson was deflecting, attempting to shift the blame and hold the reporter responsible for an error that the state had made. At worst, you would have to say Governor Parson just plain doesn't understand web browsers, which doesn't necessarily fill you with confidence that he understands any other matters relating to technology. And seeing how tech plays such an important part in pretty much everything we do all the time these days, this is a huge challenge.
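To make that concrete, here's a small self-contained sketch (with made-up placeholder data, not the actual Missouri page) showing how information can sit in a page's HTML source without ever being rendered on screen. Python's standard `html.parser` only passes displayable text to `handle_data`, so data tucked into an HTML comment shows up in the raw source but not in the "rendered" text:

```python
from html.parser import HTMLParser

# Made-up page: the HTML comment never renders in a browser,
# but it is right there in the source for anyone who looks.
PAGE_SOURCE = """<html><body>
  <h1>Teacher directory</h1>
  <p>Jane Doe, District 12</p>
  <!-- record-id: 000-00-0000 -->
</body></html>"""

class VisibleText(HTMLParser):
    """Collects only the text a browser would actually display."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        # Comments go to handle_comment, not here, so they never
        # reach the "rendered" text.
        if data.strip():
            self.chunks.append(data.strip())

parser = VisibleText()
parser.feed(PAGE_SOURCE)
rendered = " ".join(parser.chunks)

print(rendered)                       # the record-id is nowhere in it
print("000-00-0000" in PAGE_SOURCE)   # True
```

That gap between "what the page shows" and "what the server sent" is exactly what View Page Source exposes, and it's why putting sensitive data anywhere in the HTML, visible or not, is the real security failure.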
Speaker 1: After all, it's Congress's job to form laws and regulations at the federal level, and at the state level we see the same sort of thing. I'm concentrating on the federal level because otherwise it starts getting really fragmented, but the same thing holds true in states too. If a legislature, at the federal or state level, doesn't have a full understanding of technology, how can we expect it to regulate that technology or to form policies that make sense? So we're literally seeing a growing chasm between the fundamental ways technology plays a part in our lives and politicians' understanding of technology. For some, you might as well substitute the word "magic" for "technology." You know, as Arthur C. Clarke said in his third law, any sufficiently advanced technology is indistinguishable from magic. Well, it appears that, at least for some politicians, the advance in technology doesn't have to be particularly impressive. Web browsers, they'll do the trick. Now, I'm not saying politicians all need to become engineers or computer scientists, and politicians can and do rely upon subject-matter experts to help them navigate unfamiliar territory.
Speaker 1: Whether the politicians grasp what those experts are saying is a matter for debate, but at the very least they do call upon experts. If you've ever watched any video or read any transcripts of these types of meetings, even if it's something like one of the times where, you know, Facebook representatives have to appear before Congress, you might walk away with a pretty low opinion of some of the politicians, at least when it comes to their, you know, tech savvy. You might also take issue with the saying that there's no such thing as a dumb question, because after reading one of those transcripts, you'd say, "I beg to differ." Now, this also means there can be a certain lack of urgency on the part of political leaders to address matters that relate to tech unless they feel that it's directly threatening them, and this has both good and bad consequences. One of the good consequences is that we don't frequently see US politicians rush to fund some tech initiative that is completely unproven or unsuitable. We're not seeing, you know, taxpayer money thrown at problems without due consideration.
Speaker 1: In this regard, most politicians are typically a bit cautious when it comes to authorizing money for something they don't actually understand, and in some cases this is a good thing. On the bad side, well, we don't see enough support for strategies that could do a lot of good, either. And there are threats out there, and they are here now. So for several reasons, politics moves at a different pace than tech. One other reason I didn't really cover is that, of course, we elect politicians, and these processes, because they can take so long, will sometimes span beyond one person's term of office. And when people change in offices, often we will see priorities change and we'll see support for stuff shift, and that can slow things down again, because, again, politics, you know, works on its own kind of time frame and its own cycles, and that can be disruptive for things like trying to put a cybersecurity policy in place. This timing issue is a real problem when it comes to creating a strong cybersecurity policy, because just getting the policy makers up to speed is a challenge.
244 00:15:30,720 --> 00:15:33,240 Speaker 1: Getting that buy in is hard, and there's always the 245 00:15:33,240 --> 00:15:36,720 Speaker 1: possibility that someone will have a misguided but compelling approach 246 00:15:36,800 --> 00:15:39,320 Speaker 1: and then we'll go down the wrong path. Like, you know, 247 00:15:39,360 --> 00:15:41,800 Speaker 1: if I were to appear in front of Congress and 248 00:15:41,840 --> 00:15:45,000 Speaker 1: if I made an impassioned appeal for a certain path 249 00:15:45,120 --> 00:15:48,600 Speaker 1: towards security, and it really did sound like I knew 250 00:15:48,600 --> 00:15:51,480 Speaker 1: what I was talking about, I might get support even 251 00:15:51,520 --> 00:15:54,560 Speaker 1: if I'm totally wrong, just because I sound like I'm 252 00:15:54,600 --> 00:15:58,600 Speaker 1: an authority. That's what we call an appeal to authority. 253 00:15:59,120 --> 00:16:02,120 Speaker 1: That's an argument which I claim that my credentials stand 254 00:16:02,160 --> 00:16:05,280 Speaker 1: as evidence that my argument is sound. The argument doesn't 255 00:16:05,320 --> 00:16:09,440 Speaker 1: have to stand on its own. I'm using my credentials 256 00:16:09,720 --> 00:16:13,240 Speaker 1: as if that is enough to say this is the 257 00:16:13,280 --> 00:16:15,840 Speaker 1: way we need to go. That's actually a fallacy. It's 258 00:16:15,880 --> 00:16:17,760 Speaker 1: the sort of thing that a debate club would jump 259 00:16:17,760 --> 00:16:19,520 Speaker 1: on right away, and in fact, a lot of people 260 00:16:19,520 --> 00:16:21,680 Speaker 1: in Congress would probably do it too, because a lot 261 00:16:21,720 --> 00:16:26,560 Speaker 1: of those folks started off in debate club. Well. 
Speaker 1: Even when things go well, when Congress is getting good guidance from thought leaders in cybersecurity, when they fashion policies that are effective and on point, the process can still be slow enough that by the time the policy becomes active, the field has changed enough that whatever protection was offered has been compromised in the process, because it no longer reflects best practices. Right? Like, by the time you finally get to the point where you can put the policy into place, best practices have evolved beyond it. That's not to say that the policies are completely worthless, but rather that they're always lagging behind the state of the art in technology. Now, when we come back, we'll look at another matter that affects US cyber strategies, and, well, it's all about the Benjamins, baby. Let's take a quick break.

Speaker 1: Okay, we're gonna move on to other issues that complicate matters when it comes to formulating an effective national cybersecurity strategy, and a huge one is organization. All right, so we often refer to the Pentagon as an organization, but it's really a massive building.
Speaker 1: The Pentagon is a structure, a physical structure. It is the headquarters for the United States Department of Defense. So when we say "the Pentagon," we're often actually referring to the DoD, the Department of Defense, not the physical building. So I will occasionally be using "Pentagon" in that sense in this podcast. The Department of Defense has three main sub-departments within it: the Army; the Navy, which also has the Marines in it; and the Air Force. And there are other agencies within the DoD. For example, there's the Defense Advanced Research Projects Agency, that's DARPA. You also have the National Security Agency, the NSA; that's part of the DoD too. And then each of the three main departments, the Army, the Navy, and the Air Force, all have multiple agencies and divisions under them. So if you had an org chart for the Department of Defense that was even just a few levels deep, it would be massively complicated. There would be interconnecting relationships and potentially a few cases where it might be confusing to see who reports to whom.
Speaker 1: Sometimes that can be confusing to the people in the organizations themselves. Now, within this organization of organizations, you've got different departments responsible for stuff like establishing, maintaining, and protecting networks. You've got different groups that are running on different pieces of hardware and software. Some might be locked into systems that no longer get support, and we call these legacy systems. This is a problem a lot of us encounter in technology, not just in the government sphere; it can happen in businesses too. So if you've ever bought a product, like, let's say that you bought a computer from a company, and later on down the line that computer company goes out of business, well, you might find yourself kind of stuck, because you're no longer going to get support from that company. Right? If they were previously releasing, like, firmware updates for your device, you're not going to get those anymore, because the company doesn't exist anymore.
315 00:19:41,080 --> 00:19:44,640 Speaker 1: So the stakes become higher, because now you're relying on 316 00:19:44,720 --> 00:19:49,240 Speaker 1: something that no longer has support from the manufacturer, but 317 00:19:49,320 --> 00:19:52,080 Speaker 1: you're still dependent upon it. I see this happen with 318 00:19:52,160 --> 00:19:55,080 Speaker 1: back end systems a lot, where a company will 319 00:19:55,119 --> 00:19:58,600 Speaker 1: invest a lot of money into a back end system 320 00:19:58,640 --> 00:20:02,119 Speaker 1: and build pretty much its entire infrastructure on top of 321 00:20:02,160 --> 00:20:07,800 Speaker 1: this back end system. That system will age out, but 322 00:20:07,880 --> 00:20:10,720 Speaker 1: to migrate everything off of that system onto something else 323 00:20:10,760 --> 00:20:16,400 Speaker 1: would be an enormous sink of money and 324 00:20:16,480 --> 00:20:22,879 Speaker 1: time and other resources, and it's a nightmare. So again, 325 00:20:22,920 --> 00:20:25,960 Speaker 1: this happens in companies, not just in governments, but when 326 00:20:26,000 --> 00:20:30,439 Speaker 1: it happens in governments, it's particularly rough. 327 00:20:31,240 --> 00:20:34,359 Speaker 1: The organization ends up building products on top of this, 328 00:20:34,680 --> 00:20:38,320 Speaker 1: and the underlying stuff is the foundation, and when it 329 00:20:38,359 --> 00:20:40,600 Speaker 1: does come time to migrate, you've got to figure out, well, 330 00:20:40,600 --> 00:20:44,399 Speaker 1: how can I do this without interrupting services? For a company, 331 00:20:44,440 --> 00:20:47,520 Speaker 1: that's important because you want to keep generating revenue. For 332 00:20:47,520 --> 00:20:50,600 Speaker 1: a government, that's important because you've got to keep governing.
Right, 333 00:20:51,000 --> 00:20:53,880 Speaker 1: you can't just say, all right, y'all 334 00:20:53,960 --> 00:20:57,359 Speaker 1: just behave, we're gonna go away for 335 00:20:57,400 --> 00:21:02,520 Speaker 1: a month and migrate our systems onto the new network, 336 00:21:02,600 --> 00:21:05,040 Speaker 1: then we'll be back, and you know, whatever is not 337 00:21:05,080 --> 00:21:07,800 Speaker 1: on fire, I'm sure it's fine, and anything that is 338 00:21:07,840 --> 00:21:10,440 Speaker 1: on fire, we'll get to it. You can't do that, 339 00:21:10,720 --> 00:21:14,680 Speaker 1: so it's a huge challenge. So there are government offices 340 00:21:14,720 --> 00:21:17,600 Speaker 1: that are at least partly dependent upon legacy systems, 341 00:21:17,680 --> 00:21:21,080 Speaker 1: and these systems sometimes have vulnerabilities, just 342 00:21:21,119 --> 00:21:23,600 Speaker 1: like any other system. They can have spots where there 343 00:21:23,680 --> 00:21:27,879 Speaker 1: was an oversight and some intruder has figured out, here's 344 00:21:27,920 --> 00:21:31,920 Speaker 1: my entry point into this network. And without ongoing support, 345 00:21:32,400 --> 00:21:36,520 Speaker 1: those vulnerabilities go unaddressed. There's no patch for them. 346 00:21:36,560 --> 00:21:38,800 Speaker 1: So that means if you are someone who is determined 347 00:21:38,800 --> 00:21:41,760 Speaker 1: to exploit a system and you happen to know what 348 00:21:41,960 --> 00:21:45,800 Speaker 1: hardware or software is being used within that organization.
You 349 00:21:45,880 --> 00:21:49,919 Speaker 1: can perhaps formulate a plan of attack that leverages that 350 00:21:50,119 --> 00:21:53,040 Speaker 1: vulnerable system, and you can have a pretty decent level 351 00:21:53,040 --> 00:21:56,400 Speaker 1: of confidence that you'll be able to pull it off, 352 00:21:56,440 --> 00:22:00,480 Speaker 1: because chances are no one's patched that vulnerability. It's why 353 00:22:00,520 --> 00:22:03,199 Speaker 1: some experts call for a more modular approach when it 354 00:22:03,240 --> 00:22:07,919 Speaker 1: comes to planning network architecture. That way, administrators can 355 00:22:07,960 --> 00:22:13,160 Speaker 1: swap out modules within the network architecture if necessary, particularly 356 00:22:13,640 --> 00:22:17,040 Speaker 1: if they're working with stuff that's open source, where 357 00:22:17,160 --> 00:22:22,080 Speaker 1: the open source community finds and addresses vulnerabilities at a 358 00:22:22,200 --> 00:22:27,760 Speaker 1: very quick pace, so that you're constantly on the most secure, 359 00:22:27,880 --> 00:22:31,640 Speaker 1: most recent version of whatever it is you're working with. Okay. So, 360 00:22:32,119 --> 00:22:34,840 Speaker 1: there are also a few big authorities in the federal 361 00:22:34,880 --> 00:22:38,880 Speaker 1: government that are concerned with cybersecurity. For example, the Department 362 00:22:38,920 --> 00:22:43,840 Speaker 1: of Homeland Security has the Cybersecurity and Infrastructure Security Agency, 363 00:22:44,080 --> 00:22:48,640 Speaker 1: or CISA, C I S A. So, there 364 00:22:48,640 --> 00:22:51,960 Speaker 1: are some agencies that are important when it comes to stuff 365 00:22:52,000 --> 00:22:56,320 Speaker 1: like rolling out standards that all agency offices should follow, 366 00:22:57,080 --> 00:23:01,080 Speaker 1: and once you get past that, there's a much more 367 00:23:01,200 --> 00:23:05,000 Speaker 1: fractured landscape.
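That modular idea can be sketched in code. This is just a rough illustration of the principle, with entirely hypothetical module and class names, not anything any agency actually runs: each network function sits behind a stable interface, so one vulnerable implementation can be swapped out without rebuilding the rest of the stack.

```python
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class Module:
    """One swappable piece of the architecture (names are illustrative)."""
    name: str
    version: str
    handler: Callable[[str], str]

class NetworkArchitecture:
    def __init__(self) -> None:
        self.modules: Dict[str, Module] = {}

    def install(self, role: str, module: Module) -> None:
        self.modules[role] = module

    def swap(self, role: str, replacement: Module) -> Module:
        """Replace the module filling `role`; the rest of the stack is untouched."""
        old = self.modules[role]
        self.modules[role] = replacement
        return old

    def handle(self, role: str, request: str) -> str:
        return self.modules[role].handler(request)

arch = NetworkArchitecture()
arch.install("auth", Module("legacy-auth", "1.0", lambda r: f"legacy:{r}"))
# A vulnerability is reported in legacy-auth 1.0; swap in a patched module
# without touching anything else wired into the architecture.
arch.swap("auth", Module("patched-auth", "2.3", lambda r: f"patched:{r}"))
print(arch.handle("auth", "login"))  # patched:login
```

The point is only the shape: because callers depend on the role and its interface rather than on a specific product, a community-patched replacement can drop in quickly.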
A Senate report in two thousand nineteen recommended 368 00:23:05,000 --> 00:23:08,520 Speaker 1: a more coordinated approach to cybersecurity to try and bring 369 00:23:08,600 --> 00:23:12,760 Speaker 1: things into kind of a more focused effort, because what 370 00:23:12,840 --> 00:23:16,200 Speaker 1: we were seeing was a very patchwork approach towards the 371 00:23:16,440 --> 00:23:21,000 Speaker 1: design and implementation of cybersecurity measures. Then you've got stuff 372 00:23:21,000 --> 00:23:23,479 Speaker 1: like budgets to contend with. Every single department has its 373 00:23:23,520 --> 00:23:26,560 Speaker 1: own budget, and that gets funneled down into different sub 374 00:23:26,600 --> 00:23:31,480 Speaker 1: departments and projects. So when it comes to weapons systems development, 375 00:23:31,880 --> 00:23:35,600 Speaker 1: the typical approach is to spend thirty percent of your budget 376 00:23:35,720 --> 00:23:40,119 Speaker 1: on development and procurement, and the other seventy percent goes to sustaining 377 00:23:40,160 --> 00:23:43,639 Speaker 1: the weapons system and maintaining it. Now, that's according to 378 00:23:44,040 --> 00:23:48,760 Speaker 1: Heidi Shyu, who is the current Under Secretary of Defense 379 00:23:48,880 --> 00:23:53,000 Speaker 1: for Research and Engineering. She had previously served as 380 00:23:53,040 --> 00:23:57,680 Speaker 1: the Assistant Secretary of the Army for Acquisition, Logistics, and Technology, 381 00:23:57,760 --> 00:24:00,920 Speaker 1: so she has a history with the process of setting up 382 00:24:01,480 --> 00:24:04,760 Speaker 1: weapons systems and technology systems. So when we 383 00:24:04,800 --> 00:24:08,680 Speaker 1: talk about cybersecurity on a national scale, that actually does 384 00:24:08,760 --> 00:24:11,919 Speaker 1: overlap with weapons systems, both from an 385 00:24:11,960 --> 00:24:15,400 Speaker 1: attack perspective and a defense perspective.
So we're not talking 386 00:24:15,400 --> 00:24:18,000 Speaker 1: about traditional weapons. We're not talking about guns or tanks 387 00:24:18,080 --> 00:24:21,080 Speaker 1: or missiles or anything like that. And that's kind of 388 00:24:21,320 --> 00:24:25,080 Speaker 1: Shyu's point. She was saying that that thirty-seventy 389 00:24:25,119 --> 00:24:32,360 Speaker 1: split, thirty percent to procurement and deployment and seventy percent to sustaining, doesn't 390 00:24:32,359 --> 00:24:34,679 Speaker 1: really make sense when you're looking at it from a 391 00:24:34,760 --> 00:24:38,000 Speaker 1: cyber front, and that really we should flip that ratio 392 00:24:38,080 --> 00:24:42,800 Speaker 1: around, with seventy percent of budgets being dedicated to development and procurement 393 00:24:43,160 --> 00:24:48,640 Speaker 1: and thirty percent reserved to sustaining and maintaining weapons systems. So budgets, 394 00:24:48,680 --> 00:24:51,480 Speaker 1: by their nature, not only limit how much we can 395 00:24:51,560 --> 00:24:56,200 Speaker 1: spend on any given thing, but how we can spend 396 00:24:56,240 --> 00:24:58,960 Speaker 1: money on that thing. And if we adhere to the 397 00:24:59,000 --> 00:25:01,800 Speaker 1: older philosophy, we hinder our efforts to get up 398 00:25:01,800 --> 00:25:06,120 Speaker 1: to speed in the digital realm. As Nicolas Chaillan pointed 399 00:25:06,119 --> 00:25:10,080 Speaker 1: out in his resignation, getting those budgetary dollars is a 400 00:25:10,119 --> 00:25:13,560 Speaker 1: never ending pursuit. You have to get buy in from 401 00:25:13,600 --> 00:25:15,960 Speaker 1: the people who oversee the budgets. You have to make 402 00:25:16,000 --> 00:25:18,479 Speaker 1: your case that the money would be well spent on 403 00:25:18,480 --> 00:25:21,359 Speaker 1: a specific endeavor.
You have to provide a means to 404 00:25:21,400 --> 00:25:23,960 Speaker 1: show that the project is staying as close to being 405 00:25:24,040 --> 00:25:27,640 Speaker 1: on deadline and under budget as you can possibly manage. It's 406 00:25:27,680 --> 00:25:30,199 Speaker 1: really a game of numbers and politics, and meanwhile, you 407 00:25:30,240 --> 00:25:32,320 Speaker 1: still have those actual threats out there in the real 408 00:25:32,400 --> 00:25:35,760 Speaker 1: world to worry about. So budgets can also be seen 409 00:25:35,800 --> 00:25:39,000 Speaker 1: as an issue when it comes to attracting talent, and 410 00:25:39,080 --> 00:25:41,640 Speaker 1: just addressing that takes time in the world of policy. 411 00:25:41,880 --> 00:25:45,199 Speaker 1: So back in two thousand fourteen, the Department of 412 00:25:45,200 --> 00:25:50,240 Speaker 1: Homeland Security requested and got the authority to create a 413 00:25:50,320 --> 00:25:53,720 Speaker 1: new personnel system with the goal of attracting more talent, 414 00:25:53,880 --> 00:25:58,199 Speaker 1: specifically in the field of cybersecurity and cyber warfare. See, 415 00:25:58,320 --> 00:26:02,760 Speaker 1: that is a thriving field, and the private sector pays 416 00:26:03,200 --> 00:26:07,359 Speaker 1: really well for that kind of talent. So getting a 417 00:26:07,440 --> 00:26:09,919 Speaker 1: qualified person to agree to come and work for the 418 00:26:09,960 --> 00:26:14,000 Speaker 1: government at a salary and benefits that might be significantly 419 00:26:14,040 --> 00:26:16,719 Speaker 1: lower than what they would find elsewhere in the market, 420 00:26:17,240 --> 00:26:20,280 Speaker 1: that's a hard sell. Like, hi, I know that you 421 00:26:20,280 --> 00:26:23,040 Speaker 1: could make three times as much working for company X, 422 00:26:23,040 --> 00:26:26,640 Speaker 1: but why not come work for us?
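Going back to Shyu's ratio for a second: the flip she argues for is simple to put numbers on. Here's a minimal sketch, where the budget figure is purely illustrative and not a real program number:

```python
def split_budget(total: float, dev_fraction: float) -> tuple[float, float]:
    """Divide a budget between development/procurement and sustainment."""
    dev = total * dev_fraction
    sustain = total - dev
    return dev, sustain

budget = 100.0  # illustrative units, not an actual d o D figure

# Traditional hardware-style split: 30 percent development, 70 percent sustainment.
print(split_budget(budget, 0.30))  # (30.0, 70.0)

# The flipped ratio argued for on the cyber side: 70 percent development.
print(split_budget(budget, 0.70))  # (70.0, 30.0)
```

Same total, opposite emphasis: the flip moves most of the money to building and fielding new capability instead of maintaining what already exists.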
So the Department 423 00:26:26,640 --> 00:26:30,159 Speaker 1: of Homeland Security's goal was to streamline the process, to 424 00:26:30,280 --> 00:26:33,000 Speaker 1: knock down some of the requirements that applicants would need 425 00:26:33,040 --> 00:26:36,679 Speaker 1: to meet in order to be considered to work for 426 00:26:37,200 --> 00:26:40,840 Speaker 1: the federal government, and to improve things like 427 00:26:41,400 --> 00:26:44,480 Speaker 1: how much they would make. Even that process 428 00:26:44,520 --> 00:26:48,639 Speaker 1: took several years, with the department only recently actually 429 00:26:48,680 --> 00:26:52,840 Speaker 1: bringing this talent management program online. So the process 430 00:26:52,960 --> 00:26:55,480 Speaker 1: started in two thousand fourteen, but the program has 431 00:26:55,960 --> 00:26:58,840 Speaker 1: really only been active for a short while. 432 00:26:58,880 --> 00:27:03,680 Speaker 1: So we do see changes over time, but 433 00:27:03,920 --> 00:27:05,920 Speaker 1: now there's a lot of lost time to make up. 434 00:27:06,040 --> 00:27:08,639 Speaker 1: Now, let's talk about the d o D's 435 00:27:08,680 --> 00:27:11,119 Speaker 1: track record when it comes to actually following through on 436 00:27:11,200 --> 00:27:15,960 Speaker 1: cybersecurity projects, because they have set projects. Even with all 437 00:27:16,000 --> 00:27:19,760 Speaker 1: these challenges in place, the agencies within the d o 438 00:27:19,920 --> 00:27:24,280 Speaker 1: D have tried to set specific goals for cybersecurity. So 439 00:27:24,480 --> 00:27:28,479 Speaker 1: how do they do? Well, it ain't great. The Government 440 00:27:28,520 --> 00:27:32,240 Speaker 1: Accountability Office, or GAO, reviewed the d o D's 441 00:27:32,359 --> 00:27:36,600 Speaker 1: cyber hygiene initiatives.
These projects were meant to improve overall 442 00:27:36,640 --> 00:27:40,600 Speaker 1: cybersecurity practices and procedures within the Department of Defense, and 443 00:27:40,640 --> 00:27:43,320 Speaker 1: the GAO found that the status of many of 444 00:27:43,359 --> 00:27:48,639 Speaker 1: those projects was incomplete, and in at least some cases unknowable, 445 00:27:49,400 --> 00:27:52,439 Speaker 1: which is a big old yikes. So, for example, the 446 00:27:52,520 --> 00:27:56,000 Speaker 1: d o D created a Cyber Discipline Plan, and this 447 00:27:56,119 --> 00:28:02,439 Speaker 1: plan identified seventeen preventable vulnerabilities in various networks within the 448 00:28:02,480 --> 00:28:05,560 Speaker 1: Department of Defense, and these needed to be addressed, like 449 00:28:05,640 --> 00:28:08,159 Speaker 1: there needed to be a way to patch these vulnerabilities. 450 00:28:08,680 --> 00:28:13,880 Speaker 1: So the d o D Chief Information Officer's office became accountable 451 00:28:14,000 --> 00:28:19,960 Speaker 1: for ten of those seventeen identified preventable vulnerabilities, and the 452 00:28:20,000 --> 00:28:23,919 Speaker 1: goal was to have all of those tasks completed by the 453 00:28:24,000 --> 00:28:27,000 Speaker 1: end of the fiscal year two thousand eighteen. Now, 454 00:28:27,880 --> 00:28:31,800 Speaker 1: when that time came around, only six of those tasks 455 00:28:32,040 --> 00:28:36,240 Speaker 1: were implemented. Four had not been. So remember, ten were 456 00:28:36,359 --> 00:28:40,280 Speaker 1: assigned to this office. Six were complete, four were not. 457 00:28:40,800 --> 00:28:44,480 Speaker 1: So by my reckoning, that's a sixty percent completion of the tasks, 458 00:28:44,520 --> 00:28:47,600 Speaker 1: not one hundred percent. That's not good.
But it 459 00:28:47,680 --> 00:28:50,880 Speaker 1: does get worse, because remember I said that the total 460 00:28:51,000 --> 00:28:56,200 Speaker 1: number of preventable vulnerabilities that this program identified was seventeen. 461 00:28:56,720 --> 00:28:59,960 Speaker 1: Only ten of those were given to the CIO's office. 462 00:29:00,560 --> 00:29:04,600 Speaker 1: So what about the other seven? You know, that's a 463 00:29:04,720 --> 00:29:08,280 Speaker 1: really darn good question, and sadly it's a question that 464 00:29:08,400 --> 00:29:11,120 Speaker 1: we do not have an answer to. The GAO 465 00:29:11,320 --> 00:29:16,120 Speaker 1: found that no Department of Defense entity had been designated 466 00:29:16,280 --> 00:29:19,720 Speaker 1: to be in charge of or to report on those vulnerabilities, 467 00:29:20,080 --> 00:29:24,880 Speaker 1: so the status was unknowable. You could probably make a 468 00:29:24,920 --> 00:29:29,240 Speaker 1: good argument that the vulnerabilities probably went unaddressed, since no 469 00:29:29,320 --> 00:29:32,880 Speaker 1: one was assigned accountability for them. It would at least 470 00:29:32,920 --> 00:29:36,800 Speaker 1: be wisest to move forward under the assumption that 471 00:29:36,840 --> 00:29:39,240 Speaker 1: no one had done anything about them. But the 472 00:29:39,440 --> 00:29:41,560 Speaker 1: GAO was really saying, how can you hope to 473 00:29:41,600 --> 00:29:44,760 Speaker 1: improve cyber hygiene if you don't have any way to 474 00:29:44,840 --> 00:29:49,120 Speaker 1: measure or monitor progress on your goals?
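The accounting the GAO ran into is worth making concrete: you can only measure completion over tasks that actually have an accountable owner. Here's a toy sketch in Python, mirroring the seventeen-ten-six counts above; the task names and data structure are entirely hypothetical:

```python
# Seventeen identified vulnerabilities, initially with no assigned owner.
tasks = {f"vuln-{i:02d}": None for i in range(1, 18)}

# Ten were assigned to the CIO's office; six of those got done.
# The other seven were never assigned to anyone.
for i in range(1, 11):
    tasks[f"vuln-{i:02d}"] = {"owner": "CIO office", "done": i <= 6}

assigned = [t for t in tasks.values() if t is not None]
completed = sum(t["done"] for t in assigned)
unknowable = sum(t is None for t in tasks.values())

print(f"{completed}/{len(assigned)} assigned tasks done "
      f"({100 * completed // len(assigned)} percent); "
      f"{unknowable} tasks have unknowable status")
# 6/10 assigned tasks done (60 percent); 7 tasks have unknowable status
```

The seven `None` entries are the GAO's point in miniature: with no owner, there is nobody to report status, so the denominator of any progress metric silently shrinks.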
Another thing the 475 00:29:49,160 --> 00:29:52,240 Speaker 1: GAO found was that while d o D personnel 476 00:29:52,360 --> 00:29:55,760 Speaker 1: were to take cyber awareness training courses, a lot of 477 00:29:55,800 --> 00:29:59,920 Speaker 1: departments within the Department of Defense lacked any information about 478 00:30:00,200 --> 00:30:03,920 Speaker 1: who had or had not actually gone through the training. 479 00:30:04,360 --> 00:30:08,880 Speaker 1: And since network access was supposed to be contingent upon 480 00:30:09,000 --> 00:30:12,520 Speaker 1: taking the training, this meant that people who may not 481 00:30:12,680 --> 00:30:16,840 Speaker 1: have followed the required process continued to have access to 482 00:30:16,880 --> 00:30:21,120 Speaker 1: the system. So the GAO found that project administration, 483 00:30:21,200 --> 00:30:24,200 Speaker 1: accountability, and follow through were lacking in the d o 484 00:30:24,280 --> 00:30:27,720 Speaker 1: D when it comes to cyber hygiene. So there's yet 485 00:30:27,760 --> 00:30:30,560 Speaker 1: another problem on top of the ones we've already talked about. 486 00:30:30,840 --> 00:30:33,400 Speaker 1: So on the one hand, you could argue that 487 00:30:33,560 --> 00:30:36,840 Speaker 1: with bureaucracies that are as labyrinthine as the 488 00:30:36,840 --> 00:30:39,480 Speaker 1: Department of Defense, you can see how things can get 489 00:30:39,520 --> 00:30:41,640 Speaker 1: lost in the shuffle. But on the other hand, you 490 00:30:41,640 --> 00:30:44,400 Speaker 1: can also see the conditions that would lead someone to 491 00:30:44,480 --> 00:30:47,560 Speaker 1: resign in frustration when it's their job to try and 492 00:30:47,600 --> 00:30:50,280 Speaker 1: get things up to speed.
You just see the 493 00:30:50,360 --> 00:30:53,200 Speaker 1: mountain of work you have to do and the fact 494 00:30:53,240 --> 00:30:56,000 Speaker 1: that as you're addressing one problem, other problems could be 495 00:30:56,040 --> 00:30:59,760 Speaker 1: getting worse. It just becomes a never ending quest. Now, 496 00:31:00,720 --> 00:31:03,520 Speaker 1: we've seen some other efforts meant to try and get 497 00:31:03,520 --> 00:31:06,360 Speaker 1: a handle on things. The d o D's 498 00:31:06,440 --> 00:31:09,240 Speaker 1: Under Secretary of Research and Engineering put out a request 499 00:31:09,240 --> 00:31:12,200 Speaker 1: for information to federal agencies in order to lay out 500 00:31:12,240 --> 00:31:16,920 Speaker 1: a roadmap on creating effective cybersecurity strategies, not just for today, 501 00:31:16,960 --> 00:31:19,280 Speaker 1: but for the next couple of decades. So the goal 502 00:31:19,320 --> 00:31:21,720 Speaker 1: was to get a look at what the Pentagon's capabilities 503 00:31:21,760 --> 00:31:24,440 Speaker 1: are right now, as well as to create projections of 504 00:31:24,440 --> 00:31:27,320 Speaker 1: what future threats could look like. So there are definitely 505 00:31:27,360 --> 00:31:31,240 Speaker 1: people working on these problems; they just have really big 506 00:31:31,320 --> 00:31:33,960 Speaker 1: challenges in front of them. One thing the d o 507 00:31:34,040 --> 00:31:37,719 Speaker 1: D is implementing is what's called a zero trust model. 508 00:31:38,080 --> 00:31:41,600 Speaker 1: This is a system in which users must continuously verify 509 00:31:41,680 --> 00:31:45,680 Speaker 1: their identity, even within a session, to ensure that the 510 00:31:45,720 --> 00:31:48,760 Speaker 1: person who is accessing any given system actually has the 511 00:31:48,800 --> 00:31:52,440 Speaker 1: authority to do that.
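As a rough sketch of that zero trust idea: instead of trusting a login that happened once at the start of a session, every access re-verifies the caller's credential. The names, token lifetime, and structure below are all hypothetical, and real zero trust architectures check much more (device posture, per-request policy, and so on), but the core shift is visible:

```python
# Hypothetical zero-trust sketch: a credential is short-lived, and every
# access to a resource re-verifies it, even inside an ongoing session.

TOKEN_TTL = 300.0  # seconds a credential stays valid before re-verification

class Credential:
    def __init__(self, user: str, issued_at: float) -> None:
        self.user = user
        self.issued_at = issued_at

def verify(cred: Credential, now: float) -> bool:
    """Re-check the credential on every request, not just at login."""
    return (now - cred.issued_at) < TOKEN_TTL

def access(resource: str, cred: Credential, now: float) -> str:
    """Grant access only if the credential still verifies right now."""
    if not verify(cred, now):
        return f"denied: {cred.user} must re-authenticate for {resource}"
    return f"granted: {cred.user} -> {resource}"

cred = Credential("analyst", issued_at=0.0)
print(access("personnel-db", cred, now=60.0))   # granted: analyst -> personnel-db
print(access("personnel-db", cred, now=900.0))  # denied: analyst must re-authenticate for personnel-db
```

The contrast with the older perimeter model is that a stolen or stale credential stops working quickly, because nothing inside the network is trusted just for being inside.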
Some of that became really necessary 512 00:31:52,480 --> 00:31:55,760 Speaker 1: in the wake of COVID nineteen, because people would frequently 513 00:31:55,800 --> 00:31:58,800 Speaker 1: need to work from home, and granting access to critical 514 00:31:58,840 --> 00:32:03,040 Speaker 1: systems for remote workers comes with a big risk. So 515 00:32:03,160 --> 00:32:07,880 Speaker 1: implementing a system that requires frequent identity verification is 516 00:32:07,920 --> 00:32:11,920 Speaker 1: one way to kind of mitigate that risk. Okay, when 517 00:32:11,920 --> 00:32:15,000 Speaker 1: we come back, we're going to talk about cybersecurity threats 518 00:32:15,040 --> 00:32:18,760 Speaker 1: and Chaillan's belief that the US has already lost the 519 00:32:18,800 --> 00:32:32,600 Speaker 1: cybersecurity war against China. But first, let's take this quick break. Okay, 520 00:32:33,560 --> 00:32:37,920 Speaker 1: whither China? How are things looking when we 521 00:32:37,960 --> 00:32:42,520 Speaker 1: look at China as a potential threat with regard 522 00:32:42,600 --> 00:32:44,560 Speaker 1: to cyber warfare? Well, first of all, it's not just 523 00:32:44,640 --> 00:32:48,280 Speaker 1: a potential threat, it's a real threat. We have seen attacks, 524 00:32:48,640 --> 00:32:54,080 Speaker 1: including ransomware attacks and supply chain attacks, that link back to 525 00:32:54,960 --> 00:32:59,760 Speaker 1: hacker groups that either we know or suspect are backed 526 00:32:59,760 --> 00:33:05,440 Speaker 1: by China's government. So it is a clear and 527 00:33:05,520 --> 00:33:10,640 Speaker 1: present danger, if you will. China is definitely at least 528 00:33:10,760 --> 00:33:17,400 Speaker 1: funding efforts to penetrate various cyber systems within the United States. 529 00:33:17,440 --> 00:33:20,960 Speaker 1: So this is where we get into state sponsored hackers.
530 00:33:21,440 --> 00:33:26,360 Speaker 1: China funds companies and hacker organizations. There are companies in 531 00:33:26,480 --> 00:33:31,000 Speaker 1: China that have sort of a front that makes them 532 00:33:31,000 --> 00:33:36,760 Speaker 1: appear to be some relatively harmless organization, but 533 00:33:36,880 --> 00:33:40,360 Speaker 1: in truth, the whole purpose of the company is just 534 00:33:40,440 --> 00:33:46,840 Speaker 1: to attract hackers and then direct them toward efforts 535 00:33:46,840 --> 00:33:51,400 Speaker 1: to conduct things like espionage. So there are entire companies 536 00:33:51,400 --> 00:33:54,640 Speaker 1: in China that are really just fronts for hacker organizations. 537 00:33:54,640 --> 00:33:57,400 Speaker 1: And then there are, of course, your black hat hacker 538 00:33:57,560 --> 00:34:00,840 Speaker 1: groups that aren't quite that organized but are still 539 00:34:00,880 --> 00:34:05,320 Speaker 1: active in China, and China's government will fund a lot 540 00:34:05,320 --> 00:34:08,200 Speaker 1: of these. It allows China to have, you know, top 541 00:34:08,239 --> 00:34:13,240 Speaker 1: talent on hand without it formally being part of China's government. Like, 542 00:34:14,000 --> 00:34:16,239 Speaker 1: China has the same issues that the United States has, 543 00:34:16,360 --> 00:34:18,560 Speaker 1: in that if you want to make a lot of 544 00:34:18,600 --> 00:34:23,080 Speaker 1: money in China, then working for a company tends to 545 00:34:23,120 --> 00:34:26,560 Speaker 1: be a better approach than working for the government. So, 546 00:34:27,719 --> 00:34:32,280 Speaker 1: same sort of thing: China, rather than recruiting directly into 547 00:34:32,360 --> 00:34:38,360 Speaker 1: its ranks, is funding the efforts of companies that the 548 00:34:38,360 --> 00:34:42,200 Speaker 1: hackers work directly for.
It also means that China has 549 00:34:42,200 --> 00:34:46,680 Speaker 1: a little bit of plausible deniability, because those companies aren't 550 00:34:46,880 --> 00:34:50,319 Speaker 1: formally part of China's government, and hackers get a little 551 00:34:50,360 --> 00:34:53,080 Speaker 1: bit more leeway. Like, if they worked directly for the government, 552 00:34:53,120 --> 00:34:55,359 Speaker 1: they would have to do exactly what the government says, 553 00:34:55,440 --> 00:34:59,640 Speaker 1: but because there's this buffer, they have a bit more freedom. 554 00:34:59,719 --> 00:35:02,319 Speaker 1: This is both a good thing for them and a bad 555 00:35:02,360 --> 00:35:04,680 Speaker 1: thing for them. They still 556 00:35:04,719 --> 00:35:07,760 Speaker 1: have to achieve whatever it is that the government wants 557 00:35:07,800 --> 00:35:09,840 Speaker 1: them to do, like to infiltrate a system for the 558 00:35:09,880 --> 00:35:13,520 Speaker 1: purposes of espionage, for example, but they can also do 559 00:35:13,600 --> 00:35:16,800 Speaker 1: some other stuff, like they can try and steal things, 560 00:35:17,520 --> 00:35:20,520 Speaker 1: which frequently is how we find out about them, because 561 00:35:20,560 --> 00:35:24,120 Speaker 1: if an intrusion is done very well, you might not 562 00:35:24,200 --> 00:35:26,839 Speaker 1: be aware that the system has been compromised, but if 563 00:35:26,880 --> 00:35:30,920 Speaker 1: someone's messing around with stuff, it becomes apparent pretty quickly.
564 00:35:31,360 --> 00:35:35,280 Speaker 1: So in some cases this approach means that we become 565 00:35:35,320 --> 00:35:38,880 Speaker 1: more aware of these intrusions, which gives us the chance to 566 00:35:38,920 --> 00:35:42,080 Speaker 1: address those vulnerabilities and patch them out, and while the 567 00:35:42,160 --> 00:35:46,240 Speaker 1: damage might have already been done, it can help prevent 568 00:35:46,480 --> 00:35:52,840 Speaker 1: ongoing espionage projects, at least ones using that specific vulnerability. So 569 00:35:54,120 --> 00:35:58,240 Speaker 1: the fact that hackers can do these things 570 00:35:58,280 --> 00:36:02,160 Speaker 1: outside of their initial directives means that there's an additional 571 00:36:02,200 --> 00:36:04,560 Speaker 1: financial incentive for the hackers. If they can make money 572 00:36:04,600 --> 00:36:09,360 Speaker 1: by stealing, then that's an added bonus to them. However, 573 00:36:09,480 --> 00:36:12,200 Speaker 1: it does mean that we tend to catch them more frequently. 574 00:36:12,760 --> 00:36:15,480 Speaker 1: And in espionage, you just don't want 575 00:36:15,480 --> 00:36:18,480 Speaker 1: anyone to know that you're there. So it could be 576 00:36:18,520 --> 00:36:21,800 Speaker 1: a downside for China's goals in the long run, although 577 00:36:21,840 --> 00:36:23,720 Speaker 1: they can do, like I said, a lot of damage 578 00:36:23,719 --> 00:36:27,120 Speaker 1: even in the short term. But China itself is also 579 00:36:27,160 --> 00:36:31,240 Speaker 1: facing a cybersecurity talent shortage. This is according to China's 580 00:36:31,360 --> 00:36:36,040 Speaker 1: Ministry of Industry and Information Technology, and it ties into 581 00:36:36,120 --> 00:36:40,360 Speaker 1: another issue, which is China's approach to regulations with regard 582 00:36:40,480 --> 00:36:44,680 Speaker 1: to the tech sector in China.
This also will tie 583 00:36:44,760 --> 00:36:49,760 Speaker 1: into artificial intelligence. All of this is interconnected. So Chaillan 584 00:36:49,920 --> 00:36:53,080 Speaker 1: was saying that China's superiority in AI is part of 585 00:36:53,080 --> 00:36:57,080 Speaker 1: what is spelling doom in cyber warfare. So we're 586 00:36:57,080 --> 00:37:01,200 Speaker 1: gonna look at that in a second, but really quickly, 587 00:37:01,280 --> 00:37:04,520 Speaker 1: just to talk about regulations. For a long time, it 588 00:37:04,600 --> 00:37:06,840 Speaker 1: was the wild West in China as far as the 589 00:37:07,160 --> 00:37:10,120 Speaker 1: tech sector was concerned. Like, China was taking very much 590 00:37:10,120 --> 00:37:14,040 Speaker 1: a hands off approach and allowing companies to do things 591 00:37:14,080 --> 00:37:16,560 Speaker 1: that here in the United States or in places like 592 00:37:16,600 --> 00:37:19,239 Speaker 1: the European Union, companies wouldn't be able to get away 593 00:37:19,239 --> 00:37:24,399 Speaker 1: with, stuff that would clearly violate, say, people's privacy. Well, 594 00:37:24,400 --> 00:37:27,400 Speaker 1: in China, that was kind of fair game. You could 595 00:37:27,560 --> 00:37:30,520 Speaker 1: do that, and that ended up fueling a 596 00:37:30,560 --> 00:37:34,320 Speaker 1: lot of rapid growth in the tech sector. It fueled 597 00:37:34,320 --> 00:37:37,719 Speaker 1: a lot of consumerism in China, and it fueled a 598 00:37:37,719 --> 00:37:40,960 Speaker 1: lot of growth in AI in China. We're now starting 599 00:37:41,000 --> 00:37:44,480 Speaker 1: to see that kind of turn around. This also is 600 00:37:44,800 --> 00:37:50,160 Speaker 1: affecting things like the desire to go into tech fields, 601 00:37:50,239 --> 00:37:54,960 Speaker 1: because now we're seeing China start to push back against 602 00:37:55,040 --> 00:37:58,319 Speaker 1: the tech industry. Okay, so let's talk about AI.
One 603 00:37:58,320 --> 00:38:02,000 Speaker 1: way to measure how far along a country is with regard 604 00:38:02,040 --> 00:38:05,840 Speaker 1: to artificial intelligence is to look at how many papers, 605 00:38:05,840 --> 00:38:09,880 Speaker 1: how many scientific papers, are published on that subject within 606 00:38:09,920 --> 00:38:15,440 Speaker 1: that country, and how many patents relating to AI get filed 607 00:38:15,800 --> 00:38:19,239 Speaker 1: in that country. And by those metrics, China has 608 00:38:19,320 --> 00:38:22,799 Speaker 1: surged ahead. So in just twenty years, it went from 609 00:38:22,800 --> 00:38:26,600 Speaker 1: publishing less than five percent of all papers on the subject, 610 00:38:26,920 --> 00:38:30,719 Speaker 1: you know, scientific papers in the world per year, to 611 00:38:30,880 --> 00:38:36,080 Speaker 1: now around thirty percent, so almost a third of all scientific 612 00:38:36,160 --> 00:38:39,879 Speaker 1: papers about artificial intelligence are coming out of China. That 613 00:38:39,920 --> 00:38:45,280 Speaker 1: marks an enormous push in AI research and development. China 614 00:38:45,360 --> 00:38:49,320 Speaker 1: caught up and then started to run past everybody else. However, 615 00:38:50,520 --> 00:38:53,600 Speaker 1: as Harvard Business Review has pointed out, a lot of 616 00:38:53,600 --> 00:38:58,920 Speaker 1: this research has benefited from that very loose regulatory environment 617 00:38:58,920 --> 00:39:02,319 Speaker 1: in China, particularly when it comes to stuff like privacy.
618 00:39:02,360 --> 00:39:06,040 Speaker 1: So AI research could take advantage of the fact that, 619 00:39:06,560 --> 00:39:09,320 Speaker 1: you know, it was easy to collect enormous amounts of 620 00:39:09,440 --> 00:39:13,480 Speaker 1: data and to use that data when refining your artificial 621 00:39:13,480 --> 00:39:19,040 Speaker 1: intelligence for specific fields of AI, like speech and facial 622 00:39:19,080 --> 00:39:22,160 Speaker 1: recognition technologies. This is also where we have to remind 623 00:39:22,160 --> 00:39:25,560 Speaker 1: ourselves, when we say AI, you know, when we use 624 00:39:25,680 --> 00:39:30,000 Speaker 1: the phrase artificial intelligence, that that is an incredibly broad term. 625 00:39:30,239 --> 00:39:35,520 Speaker 1: It encompasses dozens of different disciplines. AI is not 626 00:39:35,600 --> 00:39:39,320 Speaker 1: so simple as saying their machines are smarter than our machines. 627 00:39:40,160 --> 00:39:44,319 Speaker 1: That's not an accurate representation of what's going on. 628 00:39:44,800 --> 00:39:49,560 Speaker 1: So you could make a very valid argument that China's 629 00:39:49,719 --> 00:39:57,560 Speaker 1: expertise in AI is incredibly advanced for some specific subcategories 630 00:39:57,600 --> 00:40:02,800 Speaker 1: of AI, but not necessarily across the board. Also, China, 631 00:40:02,880 --> 00:40:05,520 Speaker 1: as I mentioned, has more recently started to impose some 632 00:40:05,600 --> 00:40:11,160 Speaker 1: regulations on the tech sector. So they're starting to put 633 00:40:11,200 --> 00:40:13,160 Speaker 1: a little bit of a cap on the amount of 634 00:40:13,239 --> 00:40:15,960 Speaker 1: data and the types of data that companies can collect, 635 00:40:16,000 --> 00:40:19,359 Speaker 1: for example.
And I suspect that one cause of this 636 00:40:19,600 --> 00:40:23,799 Speaker 1: is that the Chinese government doesn't want companies to potentially 637 00:40:23,920 --> 00:40:28,000 Speaker 1: rival the power and authority of the government itself. Like 638 00:40:28,080 --> 00:40:30,880 Speaker 1: we've seen that China's government has been a little 639 00:40:31,040 --> 00:40:36,839 Speaker 1: uneasy with the growth and power of big companies within China, 640 00:40:36,920 --> 00:40:40,160 Speaker 1: and that there is this sort of push-pull relationship 641 00:40:40,320 --> 00:40:42,920 Speaker 1: that goes on occasionally. And right now it looks like 642 00:40:43,040 --> 00:40:47,239 Speaker 1: China is starting to pass more regulations. That could 643 00:40:47,239 --> 00:40:51,360 Speaker 1: mean that we might see this area of AI research 644 00:40:51,400 --> 00:40:54,960 Speaker 1: and development start to slow down a bit because the 645 00:40:55,040 --> 00:40:59,759 Speaker 1: access to the data itself is going to decrease. In 646 00:40:59,800 --> 00:41:03,279 Speaker 1: addition, most of the patents in AI in 647 00:41:03,440 --> 00:41:08,040 Speaker 1: China belong to universities. They're coming out of university research groups. 648 00:41:08,239 --> 00:41:12,480 Speaker 1: In the West, most AI patents are actually held by companies, 649 00:41:12,600 --> 00:41:16,799 Speaker 1: not by universities, and that means that the organizations that 650 00:41:16,800 --> 00:41:20,759 Speaker 1: can actually implement AI solutions like these are companies that 651 00:41:20,760 --> 00:41:24,320 Speaker 1: can turn them into products and sell them either to 652 00:41:24,400 --> 00:41:27,960 Speaker 1: consumers or to other businesses or what have you.
In China, 653 00:41:28,440 --> 00:41:31,920 Speaker 1: that knowledge is within the universities, and there is a 654 00:41:32,040 --> 00:41:37,719 Speaker 1: pretty weak technology transfer in China from universities to companies, 655 00:41:38,120 --> 00:41:42,640 Speaker 1: so actually making use of those patents in China is 656 00:41:42,680 --> 00:41:44,680 Speaker 1: not as straightforward as it is in other parts of 657 00:41:44,719 --> 00:41:48,880 Speaker 1: the world. The Harvard Business Review concluded that AI research 658 00:41:48,920 --> 00:41:52,360 Speaker 1: in China is largely in fields that lack original ideas 659 00:41:52,440 --> 00:41:57,000 Speaker 1: and breakthrough technologies. So again, the stuff that they're focusing on, 660 00:41:57,920 --> 00:42:01,279 Speaker 1: it's phenomenal the work they're doing, but it doesn't necessarily 661 00:42:01,320 --> 00:42:05,000 Speaker 1: mean that the AI technologies that are really going to 662 00:42:05,040 --> 00:42:08,759 Speaker 1: power cyber warfare in the future are the ones that 663 00:42:09,560 --> 00:42:15,640 Speaker 1: China is excelling at. So from a casual glance 664 00:42:15,960 --> 00:42:20,120 Speaker 1: at how kind of slipshod our cybersecurity is here in 665 00:42:20,160 --> 00:42:24,040 Speaker 1: the United States and the general progress of AI in China, 666 00:42:24,400 --> 00:42:26,279 Speaker 1: I could easily see where you would come to the 667 00:42:26,280 --> 00:42:30,080 Speaker 1: conclusion of the game's over. The United States has lost, 668 00:42:30,160 --> 00:42:34,960 Speaker 1: China has won, there's no point in saying otherwise. I 669 00:42:35,000 --> 00:42:39,279 Speaker 1: would argue the future is not yet written.
There is 670 00:42:39,320 --> 00:42:41,239 Speaker 1: a lot that needs to happen in the United States 671 00:42:41,280 --> 00:42:44,600 Speaker 1: in order for cybersecurity to get up to a level 672 00:42:44,960 --> 00:42:48,839 Speaker 1: that is even roughly equivalent to the threats that are 673 00:42:48,840 --> 00:42:50,560 Speaker 1: out there. And you have to keep in mind that 674 00:42:50,600 --> 00:42:54,080 Speaker 1: those threats are constantly evolving. After all, the threats, all 675 00:42:54,120 --> 00:42:57,000 Speaker 1: they're really doing is trying to find a way into systems. 676 00:42:57,280 --> 00:43:00,840 Speaker 1: So they just have to find a way in, whereas 677 00:43:01,200 --> 00:43:04,800 Speaker 1: we have to anticipate all the potential ways that hackers 678 00:43:04,800 --> 00:43:07,520 Speaker 1: could get into a system. It's very, very difficult. 679 00:43:08,400 --> 00:43:11,560 Speaker 1: And you know, if we look at our recent history, 680 00:43:11,719 --> 00:43:14,120 Speaker 1: we might say, well, what hope is there for us? 681 00:43:14,400 --> 00:43:17,799 Speaker 1: But I would argue we're constantly pushing to get better 682 00:43:18,239 --> 00:43:20,920 Speaker 1: and that that is something we have to take into account. 683 00:43:20,920 --> 00:43:24,120 Speaker 1: And I would also argue that we shouldn't fall into 684 00:43:24,120 --> 00:43:30,080 Speaker 1: the trap of overestimating the capabilities of any potential, 685 00:43:30,120 --> 00:43:33,760 Speaker 1: you know, rival out there, whether it's China or Russia 686 00:43:33,920 --> 00:43:37,759 Speaker 1: or Iran. We shouldn't dismiss the threats at all. 687 00:43:38,239 --> 00:43:42,200 Speaker 1: But we also shouldn't, you know, become fatalists and say, well, 688 00:43:42,239 --> 00:43:46,399 Speaker 1: we've lost, because I don't think it's as simple as that.
689 00:43:47,200 --> 00:43:52,799 Speaker 1: I think there are still opportunities, and that it's 690 00:43:52,920 --> 00:43:55,319 Speaker 1: not as simple as saying, well, we 691 00:43:55,440 --> 00:43:59,440 Speaker 1: left the country unguarded for too long and now there's 692 00:44:00,000 --> 00:44:05,280 Speaker 1: not anything left to guard. So this was a complicated topic, 693 00:44:05,520 --> 00:44:08,960 Speaker 1: like there was a lot to go through. And 694 00:44:09,120 --> 00:44:13,040 Speaker 1: obviously we've only touched on certain things a little bit 695 00:44:13,280 --> 00:44:16,080 Speaker 1: and could dive into much more detail. But I wanted 696 00:44:16,080 --> 00:44:18,960 Speaker 1: to cover it because it was in the news. It 697 00:44:19,040 --> 00:44:22,080 Speaker 1: was something that was really interesting to me, and I 698 00:44:22,080 --> 00:44:25,560 Speaker 1: wanted to get a better handle on exactly what are 699 00:44:25,560 --> 00:44:28,960 Speaker 1: we looking at here. I hope that this was 700 00:44:29,000 --> 00:44:31,480 Speaker 1: interesting to you. If you are someone who works in 701 00:44:31,520 --> 00:44:36,279 Speaker 1: the cybersecurity field and you found this interesting or you 702 00:44:36,320 --> 00:44:40,160 Speaker 1: have more to add, certainly reach out to me. Also, 703 00:44:40,239 --> 00:44:42,160 Speaker 1: if you just have a suggestion for a topic I 704 00:44:42,160 --> 00:44:44,400 Speaker 1: should cover in a future episode of tech Stuff, feel 705 00:44:44,440 --> 00:44:46,319 Speaker 1: free to reach out. The best way to do that 706 00:44:46,480 --> 00:44:49,720 Speaker 1: is over on Twitter. The handle we use is tech 707 00:44:49,719 --> 00:44:54,719 Speaker 1: stuff H S W, and I'll talk to you again really soon. 708 00:45:00,160 --> 00:45:03,160 Speaker 1: Tech Stuff is an I Heart Radio production.
For more 709 00:45:03,239 --> 00:45:06,600 Speaker 1: podcasts from I Heart Radio, visit the I Heart Radio app, 710 00:45:06,760 --> 00:45:09,920 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.