On this episode of Newt's World, we are fighting a new war with an invisible front line and an indiscernible enemy. The enemy can strike us at any time without warning, and we aren't doing enough to fortify and protect ourselves. This is America's secret cyber war, and we've been fighting it since the dawn of the Internet. It includes cyber attacks on companies, governments, and individuals, hacking, spreading propaganda through social media, intellectual property theft, and stealing military secrets. It is an invisible war we are fighting every day, and we as a nation need to do more to protect ourselves against these advanced adversaries. I'm pleased to introduce my guest, Chris Gore. Chris is CEO of D4C Global, a private counterintelligence firm based in Fairfax, Virginia, and a former Air Force OSI special agent.

We really underestimate the amount of criminal involvement in cyber, because many companies just pay them off and stay quiet; they don't want the publicity that they can be penetrated.
Is it your impression that there's probably actually more cyber crime than we know about, because there's a substantial pattern of not reporting?

If you have the ability to get onto a network, whether it's a corporate backbone, whether it's data or a power grid, to steal information, you have the ability to cause damage. So it's just a matter of a change of intent, from stealing to destruction. I can take you all the way back to two thousand and four, when I was investigating the intrusions around the Joint Strike Fighter, which is the largest contract in the history of the DoD, with thousands of subcontractors. It was being targeted all over the place, and at that time there was no requirement for defense contractors to report to the government that their unclassified networks were compromised, so we had to change policy, the federal acquisition requirements, and all kinds of things. And that kind of shift hasn't happened in the rest of the sectors in America. So the financial sector and those types of things don't have the same type of reporting requirements.
And I am personally familiar with a number of cases where corporations have been hit with ransomware, have paid the ransom, have brought in professional cybersecurity firms to negotiate with the bad guys and pay it, and then moved on as if nothing had happened. Some of these are substantial ransoms, and so there's definitely no motivation for the bad guys to stop.

American society is completely unaware of just how bad the criminal elements are. They've taken a lot of money from people. If you had to guess, what percent of this is criminal, what percent is government, and what percent of it is just individuals jerking around as a hobby?

There are different types of things happening. When you see ransomware happening, and organizations being blackmailed, where they go in, encrypt all of the servers, and then basically you have to pay a bunch of bitcoin to get your stuff undone, that is for the most part a criminal enterprise, but it can also blend into state-sponsored enterprise. The North Koreans are considered to be doing this to try to increase some revenue because of all the sanctions.
I would suggest that the targeting of US intellectual property that's been happening over the last five, six, seven years, to the tune of, by some government reporting, three hundred billion dollars a year in losses of intellectual property, that when you're going after a Lockheed Martin or a Northrop Grumman or a Caterpillar, those require substantial capability to defeat those organizations; that is state sponsored. The criminal stuff is going after your money and the blackmail, and then your individual in the basement can kind of weave in between there. There was a report of a teenager that compromised the director of the CIA's home account; that does happen. You do get some activist groups. Anonymous will do some things, and those may be politically motivated; they're doing it more to put a message up on a website, those types of things. So most of the lower-level actors, script kiddies if you will, are defacing websites and that type of thing.
The criminal enterprises are getting in and trying to go after money and doing blackmail, and then the state-sponsored organizations are stealing our trade secrets and positioning themselves, from an order-of-battle perspective, to have that strategic surprise. Can they knock out the lights, can they shut down the FAA, can they hurt NASDAQ, can they do those types of things that would cause a lot of turmoil for us? That's the way I would outline the tiering.

So, in a sense, this is the new cyber mafia, and it represents a totally different set of skills and a much higher profit margin than the traditional crimes.

Agreed. Ten years ago, the moneymaker on the cyber side was that you would steal data and then you had to go find an information broker to sell the data to, or you would create the actual exploits and sell those. So a zero day right now can get you about a million dollars for a good one, like if you can get through Windows or get onto an iPhone or something like that.
But if you're going to go hit a corporation and hit them for five million dollars in bitcoin, and you can use that same tool against five more companies, now you're at twenty-five million dollars that is somewhat untraceable in the way this is being done, because it goes through multiple iterations of different cryptocurrency providers and whatnot; it's a challenge for law enforcement. There have been United States municipal, city, and state police departments that have themselves been hit with ransomware and paid the ransom. It's partly a cyber hygiene issue on our side, where we're not backing things up as much as we should. We're not really preparing our communities and our organizations for the threat, and then it's very, very difficult, once you've been hit with that stuff, to get your data unlocked. The interesting thing about this is that, for the most part, with the criminals there is some honor there: once you do pay, you do get your stuff back.
Very rarely does it not work out that way, because within that criminal subculture, if you get a reputation for not following through with releasing the holdings, then they would expect that they wouldn't be paid in the future, so they want to keep that reputation up by honoring the bounty.

Sony produced a movie that made fun of Kim Jong-un, and they promptly had a cyber attack on Sony. I think everybody agrees it was the Koreans, but it's really hard to track down and prove it's the Koreans. And isn't that one of the problems, that you can have these attacks and really not know precisely where they're coming from?

It is definitely one of the challenges. To some degree, there are elements within the intelligence community that have better visibility and understanding than others. And in some cases it's still unknown.
The other challenge with this: if you're utilizing criminal elements or you're supporting them, you give yourself that plausible deniability, whether it's the Russians or the Chinese or the Koreans or the Iranians. If you can create that doubt and distance between yourself and an actual organized, uniformed group, like you had in the GRU, which I think most people in the community would agree is still under the control of or supported by the government, that is an issue. But even more so, from a political-will perspective, what happens when you actually do attribute it to the Russian government or the Chinese government or the North Koreans? The ability to dissuade them from doing something like that is also a challenge, both in terms of the political will to do it and then what you actually do. So it's a problem across multiple elements of national response.

After you even understand who did it, which is still a problem, how far would you go in responding? For example, in response to Sony, should we have tried to take down the North Korean system?
And even if we had, given how little electricity they use, would they even have noticed it?

We have the ability to be surgical as well. So maybe the response, if we have attribution or a strong enough agreement within the community, the intelligence community, is to neutralize the cyber capability of the people we believe are doing it, whether that's going right back after their machines, the quote-unquote hack back, or taking out the infrastructure that they're utilizing. That is a tactic that can be done and has been done from the law enforcement side. If you look at how botnets and some of the criminal enterprises are being taken down, there are coordinated efforts with Interpol and others to go take down botnet servers and controllers and some of that infrastructure. It takes a long time to build, millions of dollars in investment to get this stuff staged out, and taking that down dramatically impacts the bad guys' operations. That is something that can be done. Maybe it needs to be a little bit more public so people understand that this is happening.
In some cases, the United States government is aware of more than the general population, as you would expect, on a lot of things. So do they have the capability to do things? Does NSA and Cyber Command have the ability, from a cyber perspective, to disrupt and engage, and in some cases destroy, cyber targets? Absolutely. Do they have the political backing or charter to do that? That's where we're not there yet. So we're not sending Tier One guys into Beijing to kidnap hackers, we're not dropping bombs on buildings, and in most cases we're not doing a cyber operation to nuke their routers. I'm sure there are conversations happening in various pockets of the government on what to do; these conversations have been happening for ten years now, and we definitely have the capacity to do things. From my perspective, when I was in the government as an operator doing these types of things, the only thing we were lacking was the political will to do it. That's the threshold. If the military is ordered or given a green light, they have the capability to do a lot of things at various stages of escalation.
So that's where we have to get our political leadership on the same page, understanding the threat better, and then having a series of responses that are publicly known to the adversary: if you do this, these are going to be the consequences. Very similar to the mutually assured destruction doctrine during the Cold War. We definitely need to start moving towards that. Right now, we've just been getting hit, getting hit, getting hit, getting hit, and a small group of American citizens have been seeing this and trying to deal with it in pockets, but we haven't had a unified response, and that's the biggest challenge.

Next, a data breach at the Office of Personnel Management leaks the personal information and fingerprints of millions of federal employees.

In twenty fifteen, it was reported that hackers had gotten five million, six hundred thousand digital images of government employees' fingerprints, and had broken in and taken a huge volume of information about federal employees. That breach of the Office of Personnel Management and all of the background records for security clearances was a strategic hit.
That was a focused effort to go after those records. They have all of the information for every federal employee who has gone through a security clearance background check, so they'll cover the organization that you're in, your previous employment, your family, every place you've lived, your credit record, your fingerprints, all of it. So it was a major hit.

But in terms of the scale of constant aggressiveness, I think the Office of Personnel Management said that they get something like ten million attempted digital intrusions every month. Shouldn't we be much more militant about stopping the people who are doing all this? It seems to me that we're playing defense with no offense, and eventually they're going to break through.

I agree. If it's connected to the Internet, it's at risk, and it will always be at risk. I would suggest that ten million attempts a month is a lot of scanning, or systems and machines that are just probing for openings.
But when you get to an actual dedicated military intelligence unit, or the Chinese or Russian equivalent of the NSA, they will have a dedicated campaign where they'll go at a target like that from an insider perspective, from a human perspective, from a cyber perspective; they're going to get after it. And we haven't really come to the realization that some of these things are highly vulnerable and good targets. Look at the ramifications of what happened there. Initially, that breach went after the contractor that had the work to do the investigations. They had contracted personnel who would go out using laptops and do field interviews, and those laptops were plugging back into a corporate backbone, and they got in through that way. The US government basically blamed that company. That company basically went bankrupt; thousands of people lost their jobs. They tried to file an insurance claim about it, and the insurance policy said, this is a state-sponsored thing.
We're not going to protect you; you're on your own. Come to find out that the government itself was also compromised in this thing, and it had not complied with inspections and security audits saying you need to do some things. So on the corporate side, big impact; on the government side, I'm not sure anybody even lost a job. The bad guys got away with a strategic haul of valuable intelligence about every employee, all their backgrounds, and everything that you could hope to have, and as a response we really did nothing other than offer a couple of years of free credit checks to the employees. So I completely agree with you that we need to start changing our mindset and how we're responding to some of these things, because there is no fear, there is no cost of doing business from their side, there are no ramifications for things like this.

As I understand it, in terms of intellectual property theft, it's a different kind of problem. There was one report that China may account for as much as eighty-seven percent of the counterfeit goods that are seized coming into the US.
Doesn't this almost have to have the backing of the government of China to be on this scale?

As far as I'm concerned, yes. I mean, they control the Internet, they control your access in and out. I can tell you a story from all the way back in two thousand and one. You may recall that there was a Navy P-3 surveillance plane that was flying along the coast of China. China sent up two fighter jets to shadow it, and they ended up clipping wings, and we had to land our Navy plane on Chinese soil. It was a big international incident that kicked off a round of kind of patriotic hacking between the US and China. On our side, the FBI and others tried to track our guys down and tell them to stop. On the Chinese side, they started to watch this, and they saw that these patriotic hackers were going against the US, and they were allowed to continue. That created your first generation of what the Chinese called big bulls, or the strong hackers, and they started their own little hacking organizations.
And then ten years later, in twenty eleven, they literally had kind of a ten-year anniversary awards ceremony for these groups, and it was held in a Chinese Communist leadership cadre facility, a tacit approval from the government. So, without a doubt, these groups are going after intellectual property, and we're talking hundreds of terabytes of data that have been taken over the course of a couple of years. I think the Mandiant report that talked about 61398, the military unit that the FBI did indictments on, was talking about hundreds of terabytes, taken on a weekly basis, and that's just one unit. When you start talking about hundreds of terabytes, I don't think people understand what that really looks like. Fifteen terabytes, if you were to print it out, would equate to every piece of printed material in the Library of Congress, and you're talking hundreds and hundreds of those. So it's a massive amount of information that's been taken, and the Chinese are experts at recreating through imitation.
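As a rough sanity check of the scale described above, the comparison can be sketched in a couple of lines. This is a back-of-envelope illustration only: the fifteen terabytes per printed Library of Congress is the speaker's figure, and the three-hundred-terabyte haul is an assumed round number standing in for "hundreds of terabytes", not a reported total.

```python
# Back-of-envelope scale check for the "hundreds of terabytes" figure.
# Assumptions: 15 TB ~ all printed material in the Library of Congress
# (the speaker's figure); a 300 TB haul is illustrative, not reported.
LIBRARY_OF_CONGRESS_TB = 15
stolen_tb = 300

equivalent_libraries = stolen_tb / LIBRARY_OF_CONGRESS_TB
print(f"{stolen_tb} TB is roughly {equivalent_libraries:.0f} printed "
      f"Libraries of Congress")
```

Even at that conservative assumed total, the haul works out to about twenty printed Libraries of Congress, which is the kind of comparison the speaker is reaching for.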
So not only are they doing counterfeit goods, purses and shoes and t-shirts and those types of things, but they're also creating weapons systems. They've created the J-31, which looks and flies exactly like the F-35 Joint Strike Fighter. They have a drone that looks just like our Predator, and they're selling it in Africa and the Middle East. They've taken our technology, they've recreated it themselves, and now they're moving into the market and competing against us, not only in commercial goods but in weapons of war as well.

So there's a Commission on the Theft of American Intellectual Property, which made a report in twenty seventeen, and the range they estimate of how much intellectual property is stolen annually from the US economy was between one hundred and eighty billion and five hundred and forty billion dollars. Now, how can we know so little that we have a range of almost four hundred billion between the high and the low of it for one year?
Isn't there something wrong with our own systems if we can't get it narrower than that?

Well, it kind of goes back to your point: it's hard to get everybody to admit that it has happened. It's hard to quantify the volume of what's been taken and how you actually turn that into a dollar amount. Some of us in the community will just take a middle number, and it's three hundred billion a year. And if you do that across five years, you're at one point five trillion dollars in economic impact. We cannot sustain this. We cannot continue to compete economically, and eventually militarily. I can tell you, I personally briefed the CEO of Lockheed Martin when we were working a Joint Strike Fighter intrusion, and I had a big Analyst's Notebook chart out: here's kind of what we're doing, what we know, and where the case is going. He basically sat there, looked up, and said, I'm sick and tired of investing hundreds of millions of dollars in building this stuff to have it stolen in a matter of minutes. And that was just one company.
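The arithmetic behind the figures discussed above is simple enough to verify directly. This sketch just restates the numbers from the conversation: the commission's one hundred eighty to five hundred forty billion dollar annual range, the roughly three hundred billion round figure the speaker takes as a middle number, and the five-year cumulative total.

```python
# Arithmetic behind the figures quoted above: the commission's range,
# its spread and midpoint, and the five-year total at ~300B per year.
low_b, high_b = 180, 540           # annual IP-theft estimates, $ billions

range_width_b = high_b - low_b     # the "almost four hundred billion" spread
midpoint_b = (low_b + high_b) / 2  # 360; the speakers round to ~300
five_year_trillions = 300 * 5 / 1000

print(f"range width: {range_width_b}B")
print(f"midpoint: {midpoint_b:.0f}B")
print(f"five-year total at 300B/yr: {five_year_trillions:.1f}T")
```

The spread of the estimate is three hundred sixty billion dollars, which the host rounds to "almost four hundred billion", and five years at three hundred billion per year gives the one point five trillion dollar total the guest cites.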
322 00:19:18,240 --> 00:19:20,600 Speaker 1: Some of the other statistics around that report that you 323 00:19:20,680 --> 00:19:24,639 Speaker 1: mentioned is that one in five corporations has been hit or 324 00:19:24,680 --> 00:19:28,119 Speaker 1: will be hit, maybe as high as two fifths. So 325 00:19:28,200 --> 00:19:31,600 Speaker 1: that's a massive amount of intrusion. And then being able 326 00:19:31,640 --> 00:19:33,640 Speaker 1: to quantify it. So let's say you did. Let's say 327 00:19:33,680 --> 00:19:36,200 Speaker 1: you actually said, we've narrowed it down. We can tell 328 00:19:36,240 --> 00:19:39,560 Speaker 1: you it's three hundred and fifty billion dollars. The question 329 00:19:39,720 --> 00:19:42,439 Speaker 1: is then, if we know it's eighty seven percent 330 00:19:42,600 --> 00:19:45,360 Speaker 1: China and they took three hundred billion dollars a year 331 00:19:45,400 --> 00:19:48,200 Speaker 1: for five years, and we're at one point five trillion dollars 332 00:19:48,240 --> 00:19:51,959 Speaker 1: in economic warfare, what are we doing about it? Some 333 00:19:52,000 --> 00:19:54,119 Speaker 1: of the things that the current administration is doing is 334 00:19:54,320 --> 00:19:58,600 Speaker 1: saying enough is enough. The American public is not really, 335 00:19:58,640 --> 00:20:01,840 Speaker 1: as far as I can see, well informed or in 336 00:20:01,960 --> 00:20:04,520 Speaker 1: tune with this, because what you see on the news 337 00:20:04,520 --> 00:20:07,520 Speaker 1: on a daily basis is something else that's not really 338 00:20:07,560 --> 00:20:14,359 Speaker 1: focused on these strategic challenges that we're facing as a society. Next, 339 00:20:14,440 --> 00:20:17,240 Speaker 1: we reveal the lack of cyber hygiene around our senior 340 00:20:17,359 --> 00:20:20,080 Speaker 1: corporate and government leaders and what they need to do 341 00:20:20,119 --> 00:20:43,240 Speaker 1: to protect themselves. 
Chris, since there's a growing pattern of 342 00:20:43,840 --> 00:20:47,520 Speaker 1: going after individuals and their home systems, etc. What advice 343 00:20:47,560 --> 00:20:49,960 Speaker 1: do you have for people who want to deal with 344 00:20:49,960 --> 00:20:52,840 Speaker 1: the cyber threat in a way that's effective for them 345 00:20:52,840 --> 00:20:57,240 Speaker 1: as individuals? A couple of things. Cyber hygiene is important, 346 00:20:57,280 --> 00:21:01,040 Speaker 1: so kind of basic practices. If you are part of 347 00:21:01,040 --> 00:21:04,440 Speaker 1: a corporation and you've got a set of security policies 348 00:21:04,480 --> 00:21:07,919 Speaker 1: in place, changing your password quarterly, making sure that your 349 00:21:07,920 --> 00:21:12,560 Speaker 1: systems are updated and patched. Apply those same disciplines to 350 00:21:12,640 --> 00:21:17,680 Speaker 1: your home life. So change your password on your Gmail regularly, 351 00:21:18,320 --> 00:21:23,960 Speaker 1: make sure that your home network equipment is updated and patched, 352 00:21:24,200 --> 00:21:26,280 Speaker 1: and make sure your home computers 353 00:21:26,280 --> 00:21:30,880 Speaker 1: have basic antivirus. Those 354 00:21:30,920 --> 00:21:33,439 Speaker 1: things will make it a little bit more difficult for 355 00:21:33,960 --> 00:21:37,800 Speaker 1: the bad guys to move into your personal space. It 356 00:21:37,960 --> 00:21:41,080 Speaker 1: is a growing challenge. I will fully admit that this 357 00:21:41,200 --> 00:21:45,080 Speaker 1: is a challenge. The growth of the Internet of Things 358 00:21:45,119 --> 00:21:47,760 Speaker 1: and your refrigerator being able to call out and order 359 00:21:47,840 --> 00:21:50,360 Speaker 1: milk adds a level of risk to your home. 
What 360 00:21:50,400 --> 00:21:54,240 Speaker 1: we advise friends and family and clients is that you 361 00:21:54,359 --> 00:21:56,200 Speaker 1: kind of pay attention to this. So when you read 362 00:21:56,200 --> 00:21:58,600 Speaker 1: the news, pay attention to what's going on from a 363 00:21:58,600 --> 00:22:01,760 Speaker 1: cyber perspective as well. I mean, there's always a blurb 364 00:22:01,760 --> 00:22:04,280 Speaker 1: out there somewhere about what's going on, and see how 365 00:22:04,320 --> 00:22:08,639 Speaker 1: that might impact you. It's definitely a concern across a 366 00:22:08,720 --> 00:22:11,640 Speaker 1: number of things. So I do want to touch on 367 00:22:11,680 --> 00:22:15,800 Speaker 1: this for the executives. So if you are a corporate executive, 368 00:22:16,080 --> 00:22:19,000 Speaker 1: there's an entire trend of this whale phishing where people 369 00:22:19,000 --> 00:22:21,200 Speaker 1: will go in and grab your information, make it look 370 00:22:21,240 --> 00:22:23,560 Speaker 1: like it is you, and send an email to your chief 371 00:22:23,560 --> 00:22:25,800 Speaker 1: financial officer telling them to move money. It's a 372 00:22:25,800 --> 00:22:28,720 Speaker 1: whole scam that's been going around for a couple of years. 373 00:22:28,720 --> 00:22:31,439 Speaker 1: Targeting often happens at home, so if they can go 374 00:22:31,520 --> 00:22:33,840 Speaker 1: after your Gmail or your home router, they're going to 375 00:22:33,920 --> 00:22:36,840 Speaker 1: do that. What we find over and over again is 376 00:22:36,880 --> 00:22:39,840 Speaker 1: if you're a senior leader in a corporation, you're well 377 00:22:39,880 --> 00:22:43,120 Speaker 1: defended in your office, you have a team of security professionals, 378 00:22:43,160 --> 00:22:45,080 Speaker 1: you've got a lot of money invested, you've got the 379 00:22:45,160 --> 00:22:48,639 Speaker 1: latest and greatest security technology. 
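The whale phishing scam described above usually hinges on a lookalike sender address: the display name reads like an internal executive while the actual address does not. A minimal sketch of the kind of check a mail filter can apply, using only Python's standard library; the domain `example-corp.com` and both sample messages are hypothetical:

```python
from email import message_from_string
from email.utils import parseaddr

TRUSTED_DOMAIN = "example-corp.com"  # hypothetical corporate domain

def looks_spoofed(raw_message: str) -> bool:
    """Flag mail whose From address falls outside the trusted domain,
    regardless of how convincing the display name looks."""
    _name, address = parseaddr(message_from_string(raw_message).get("From", ""))
    domain = address.rsplit("@", 1)[-1].lower() if "@" in address else ""
    return domain != TRUSTED_DOMAIN

fraud = "From: Pat CEO <pat.ceo@examp1e-corp.com>\nSubject: Urgent wire\n\nMove the funds today."
legit = "From: Pat CEO <pat.ceo@example-corp.com>\nSubject: Q3 plan\n\nSee attached."
print(looks_spoofed(fraud), looks_spoofed(legit))  # True False
```

A real deployment would also check SPF/DKIM results rather than trust the header alone; this only illustrates the lookalike-domain pattern the scam depends on.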
As soon as you go home, 380 00:22:49,240 --> 00:22:52,800 Speaker 1: you are just like everybody else, and your corporate security 381 00:22:52,840 --> 00:22:55,600 Speaker 1: posture and visibility isn't coming to the home, because we 382 00:22:55,720 --> 00:22:59,639 Speaker 1: as Americans value that privacy. So it's beyond the remit of 383 00:23:00,080 --> 00:23:02,880 Speaker 1: the security team from a corporation or even the government 384 00:23:02,920 --> 00:23:06,000 Speaker 1: to protect officials at home. I can tell you that 385 00:23:06,040 --> 00:23:09,000 Speaker 1: I personally spoke to a member of Congress who gave 386 00:23:09,000 --> 00:23:12,120 Speaker 1: me their official business card with their Congressional seal on there, 387 00:23:12,440 --> 00:23:15,439 Speaker 1: and their email address was a Gmail address. That scares me 388 00:23:15,520 --> 00:23:18,800 Speaker 1: to death, because that Gmail is not being protected by 389 00:23:19,080 --> 00:23:22,520 Speaker 1: the United States government. That is an area where people need 390 00:23:22,560 --> 00:23:26,439 Speaker 1: to recognize that you are a target, especially if you're 391 00:23:26,440 --> 00:23:29,600 Speaker 1: in a position of political leadership or corporate leadership, and 392 00:23:30,080 --> 00:23:32,800 Speaker 1: either take the time to invest in your own security 393 00:23:33,200 --> 00:23:36,959 Speaker 1: or consider getting some consultation on that to protect yourself. There 394 00:23:36,960 --> 00:23:40,080 Speaker 1: are some simple, free things that can be done to 395 00:23:40,160 --> 00:23:44,040 Speaker 1: improve your posture. Training is a huge benefit. 
Understanding how 396 00:23:44,040 --> 00:23:47,760 Speaker 1: to maybe adjust some of the settings on your mobile devices, 397 00:23:48,359 --> 00:23:50,159 Speaker 1: making sure that you come in and just have a 398 00:23:50,200 --> 00:23:53,040 Speaker 1: basic assessment of what your posture is at home, is a 399 00:23:53,160 --> 00:23:55,840 Speaker 1: simple thing to do that either is low cost or 400 00:23:55,880 --> 00:23:58,040 Speaker 1: no cost. But let me draw a distinction. If you're a 401 00:23:58,080 --> 00:24:02,840 Speaker 1: business executive, shouldn't thinking through protecting you at home be 402 00:24:03,000 --> 00:24:06,960 Speaker 1: part of your corporate system? It should be. In some cases, 403 00:24:07,000 --> 00:24:10,520 Speaker 1: it is; in most cases it is not. How many 404 00:24:10,640 --> 00:24:13,440 Speaker 1: executives have you seen that walk around with two phones? They'll 405 00:24:13,480 --> 00:24:15,800 Speaker 1: have their corporate phone and they'll have their personal phone. 406 00:24:16,080 --> 00:24:19,960 Speaker 1: I've seen communications in the kind of the cyber underground, 407 00:24:20,119 --> 00:24:23,320 Speaker 1: you know, the deep and dark web, where people are 408 00:24:23,680 --> 00:24:28,080 Speaker 1: offering bounties for executives' personal email addresses and personal phone numbers. 409 00:24:28,080 --> 00:24:30,600 Speaker 1: Why? Because that's what they want to target. I've seen 410 00:24:30,720 --> 00:24:33,520 Speaker 1: kind of dialogues happening about, why would I bother to 411 00:24:33,560 --> 00:24:36,840 Speaker 1: hack or attack the general on his 412 00:24:36,920 --> 00:24:40,000 Speaker 1: dot mil account when he's protected by literally an army 413 00:24:40,040 --> 00:24:42,919 Speaker 1: of cyberdefenders. 
When I can attack the general on his 414 00:24:43,119 --> 00:24:46,520 Speaker 1: AOL account, where he's got no defenses other than AOL, 415 00:24:46,800 --> 00:24:48,639 Speaker 1: and AOL is not going to be able to stop 416 00:24:48,640 --> 00:24:52,359 Speaker 1: what's coming. There's been recent reporting that three hundred 417 00:24:52,359 --> 00:24:56,640 Speaker 1: to seven hundred thousand home routers in the United States 418 00:24:56,880 --> 00:25:00,800 Speaker 1: have been compromised by a suspected Russian APT group. And 419 00:25:00,840 --> 00:25:05,560 Speaker 1: so that's moving away from corporate enterprise and businesses into 420 00:25:06,000 --> 00:25:10,200 Speaker 1: the home space. That's an extremely troubling development, because there's 421 00:25:10,280 --> 00:25:15,120 Speaker 1: definitely a lower security posture in the home. So when 422 00:25:15,119 --> 00:25:18,280 Speaker 1: you start to look at global operations from a cyber 423 00:25:18,320 --> 00:25:21,080 Speaker 1: perspective, and you're talking about the Russian or 424 00:25:21,200 --> 00:25:24,960 Speaker 1: Chinese equivalents of the NSA, they have the ability to go very surgical, 425 00:25:25,160 --> 00:25:28,720 Speaker 1: right after an individual, or step back and try to 426 00:25:28,760 --> 00:25:31,480 Speaker 1: have a thousand points of light or a thousand points of 427 00:25:31,600 --> 00:25:34,280 Speaker 1: presence or a million points of presence around the world 428 00:25:34,320 --> 00:25:36,960 Speaker 1: to help them with their SIGINT operations. 
When you 429 00:25:37,000 --> 00:25:39,560 Speaker 1: start talking about a million points of presence, how much 430 00:25:39,560 --> 00:25:43,720 Speaker 1: of that's done just by using automatic devices, whether they're 431 00:25:43,800 --> 00:25:46,760 Speaker 1: bots or other things that self propagate? A lot of it 432 00:25:46,920 --> 00:25:51,280 Speaker 1: is, initially. So there'll be scanning of the Internet constantly, 433 00:25:51,400 --> 00:25:55,399 Speaker 1: looking for unpatched machines and vulnerabilities, and then they'll have 434 00:25:55,440 --> 00:25:59,920 Speaker 1: a library of exploits. When their machines are scanning across 435 00:26:00,040 --> 00:26:02,719 Speaker 1: Western Europe or across North America and they find an 436 00:26:02,760 --> 00:26:05,080 Speaker 1: IP address that's reporting back that it has a port 437 00:26:05,160 --> 00:26:08,359 Speaker 1: open or a vulnerability, then their exploit library will just 438 00:26:08,600 --> 00:26:10,840 Speaker 1: compromise that system, and then they can take control of 439 00:26:10,840 --> 00:26:12,800 Speaker 1: it and move on to the next one. That is 440 00:26:12,880 --> 00:26:15,440 Speaker 1: kind of a regular general noise that's happening on a 441 00:26:15,520 --> 00:26:18,000 Speaker 1: daily basis, which is what a lot of the cybersecurity 442 00:26:18,000 --> 00:26:20,520 Speaker 1: industry is kind of dealing with, where you're constantly having 443 00:26:20,520 --> 00:26:23,160 Speaker 1: to update and patch your machines and make sure your firewall 444 00:26:23,240 --> 00:26:26,119 Speaker 1: is up to date and your antivirus is good. Criminals 445 00:26:26,160 --> 00:26:29,840 Speaker 1: can do that, state sponsored organizations can do that. Teenagers 446 00:26:29,840 --> 00:26:32,720 Speaker 1: in their basement can do that. 
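The mass scanning described above boils down to millions of cheap connection attempts: does anything answer on this address and port? A minimal sketch of that single primitive in Python's standard library; the host and port in the usage line are placeholders, and a real scanner simply loops this over address ranges and feeds the hits to its exploit library:

```python
import socket

def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Try a TCP connect; a completed handshake means something is listening."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Connection refused, timed out, or unreachable: nothing to attack here.
        return False

# Placeholder usage: check whether a local web server is listening.
print(port_open("127.0.0.1", 80))
```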
But when you start 447 00:26:32,720 --> 00:26:36,399 Speaker 1: to move into the higher order advanced groups, state sponsored 448 00:26:36,440 --> 00:26:40,760 Speaker 1: with national level backing and funding, they're creating zero days, 449 00:26:40,880 --> 00:26:43,520 Speaker 1: which is an exploit, a piece of malware that has 450 00:26:43,640 --> 00:26:47,840 Speaker 1: no signature. So the way most antiviruses work is they're 451 00:26:47,840 --> 00:26:49,760 Speaker 1: based off of signatures, so it has to know 452 00:26:49,880 --> 00:26:52,480 Speaker 1: that this is malware, it creates a fingerprint for it, 453 00:26:52,520 --> 00:26:54,240 Speaker 1: and then it can look for it some other place. 454 00:26:54,760 --> 00:26:57,240 Speaker 1: A zero day would be something that has never been 455 00:26:57,280 --> 00:27:00,240 Speaker 1: detected before, and it can run for a long time 456 00:27:00,320 --> 00:27:03,640 Speaker 1: until it's actually identified, the fingerprint created, and then put 457 00:27:03,680 --> 00:27:07,560 Speaker 1: into your antivirus. When you have state sponsored organizations that 458 00:27:07,600 --> 00:27:11,919 Speaker 1: have dedicated funding, they're constantly looking and creating new zero days. 459 00:27:12,040 --> 00:27:15,280 Speaker 1: They have a library full of weapons, if you will, 460 00:27:15,400 --> 00:27:17,720 Speaker 1: that they can use to continue to maintain access in 461 00:27:17,760 --> 00:27:20,680 Speaker 1: places that they want. How would you change things if 462 00:27:20,680 --> 00:27:23,000 Speaker 1: you could get the President and the Congress to agree? 463 00:27:23,520 --> 00:27:25,679 Speaker 1: I think there's a couple of things we should 464 00:27:25,720 --> 00:27:29,760 Speaker 1: be doing; some are less provocative than others, and it can 465 00:27:29,840 --> 00:27:32,080 Speaker 1: kind of go on a scale. 
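The signature model described above can be reduced to a hash lookup: the scanner fingerprints a payload and checks the fingerprint against a database of known-bad samples. A toy sketch of that idea; the "signature database" below is a single made-up entry, not real malware data:

```python
import hashlib

# Hypothetical signature database: SHA-256 fingerprints of catalogued samples.
KNOWN_BAD = {hashlib.sha256(b"previously-catalogued malware sample").hexdigest()}

def signature_scan(payload: bytes) -> bool:
    """Return True if the payload's fingerprint matches a stored signature."""
    return hashlib.sha256(payload).hexdigest() in KNOWN_BAD

print(signature_scan(b"previously-catalogued malware sample"))  # True: known sample caught
print(signature_scan(b"never-before-seen payload"))             # False: a zero day walks past
```

Real products use fuzzier signatures and behavioral heuristics, but the core limitation is the same one the interview points at: something with no entry in the database produces no match.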
We should be taking 466 00:27:32,119 --> 00:27:35,400 Speaker 1: more advantage of encryption and encrypting our data at rest. 467 00:27:36,040 --> 00:27:39,280 Speaker 1: Then it's very, very difficult for it to be utilized, because 468 00:27:39,280 --> 00:27:40,960 Speaker 1: it's encrypted, and you've got to spend a lot of 469 00:27:40,960 --> 00:27:43,720 Speaker 1: time and resources to decrypt that stuff. As just a 470 00:27:43,720 --> 00:27:46,720 Speaker 1: fundamental policy, we're not doing as much as we could 471 00:27:46,720 --> 00:27:49,840 Speaker 1: there, just from a hygiene perspective. There would be 472 00:27:49,880 --> 00:27:53,160 Speaker 1: good cause at some point in time for the American 473 00:27:53,200 --> 00:27:57,600 Speaker 1: public to know more about what's going on. We have 474 00:27:57,640 --> 00:28:01,119 Speaker 1: a tendency to classify a lot of stuff, and some 475 00:28:01,200 --> 00:28:03,520 Speaker 1: of it absolutely, a hundred percent, needs to be classified. 476 00:28:03,960 --> 00:28:06,520 Speaker 1: There may be some arguments where some of it shouldn't 477 00:28:06,560 --> 00:28:08,840 Speaker 1: be and people need to know about it. If I 478 00:28:08,960 --> 00:28:12,800 Speaker 1: had the ability to make some changes now, I would 479 00:28:13,280 --> 00:28:16,080 Speaker 1: hope that we could get both sides of the aisle 480 00:28:16,760 --> 00:28:20,080 Speaker 1: to stop looking at each other as the enemy and 481 00:28:20,200 --> 00:28:24,320 Speaker 1: look outward at Russia and China and what they're doing 482 00:28:24,359 --> 00:28:28,439 Speaker 1: to us as a country. We're too busy pointing 483 00:28:28,480 --> 00:28:31,920 Speaker 1: a finger at each other rather than looking outward. 
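The point above about data at rest can be illustrated with a toy symmetric cipher built from the standard library: encrypt before storing, and the bytes an intruder exfiltrates are useless without the key. This SHA-256 counter-mode keystream is an unauthenticated teaching sketch only, not a vetted cipher; a real deployment should use an audited construction such as AES-GCM from a maintained crypto library.

```python
import hashlib
import secrets

def keystream_xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """XOR data against a SHA-256 counter-mode keystream (illustration only)."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

key, nonce = secrets.token_bytes(32), secrets.token_bytes(16)
plaintext = b"unclassified but sensitive design data"
ciphertext = keystream_xor(key, nonce, plaintext)            # what sits on disk
assert keystream_xor(key, nonce, ciphertext) == plaintext    # same operation decrypts
assert ciphertext != plaintext                               # stolen bytes alone are noise
```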
484 00:28:32,359 --> 00:28:34,760 Speaker 1: I mean, if you kind of look back over the 485 00:28:34,800 --> 00:28:36,920 Speaker 1: last few months, and the last year, and the whole 486 00:28:37,000 --> 00:28:41,320 Speaker 1: Russia thing, the Mueller Report talks about Russia doing some things, 487 00:28:41,400 --> 00:28:44,440 Speaker 1: but the sense you get from the way it's being 488 00:28:44,480 --> 00:28:47,880 Speaker 1: projected on a daily basis is still more of 489 00:28:47,920 --> 00:28:50,760 Speaker 1: a focus on the Trump administration than on what Russia 490 00:28:50,880 --> 00:28:54,440 Speaker 1: was actually doing and has been doing since the Cold War. 491 00:28:54,800 --> 00:28:58,160 Speaker 1: There are just as many or more Russian spies in 492 00:28:58,200 --> 00:29:00,680 Speaker 1: the United States now than there were at the height of the Cold War. 493 00:29:01,080 --> 00:29:03,400 Speaker 1: But that's not what we're talking about. We're talking about 494 00:29:03,400 --> 00:29:06,000 Speaker 1: the wrong things. So those would be the next steps. 495 00:29:06,160 --> 00:29:09,000 Speaker 1: One, let's make the data that sits on our 496 00:29:09,120 --> 00:29:12,760 Speaker 1: enterprises more difficult to capitalize on, because you encrypt it. 497 00:29:12,960 --> 00:29:16,600 Speaker 1: Two, let's get our political leadership to stop throwing stones at 498 00:29:16,600 --> 00:29:20,040 Speaker 1: each other and pay attention to what's impacting our country 499 00:29:20,040 --> 00:29:24,240 Speaker 1: from an external forces perspective. And then three, we need 500 00:29:24,280 --> 00:29:28,920 Speaker 1: to have the political resolve to escalate and for people 501 00:29:28,960 --> 00:29:32,800 Speaker 1: to understand why this is happening. 
I don't know what 502 00:29:32,840 --> 00:29:35,080 Speaker 1: the line is where you would start to go kinetic 503 00:29:35,120 --> 00:29:38,280 Speaker 1: on something like this, but one point five trillion dollars 504 00:29:38,320 --> 00:29:42,280 Speaker 1: over five years in economic theft is a huge number, 505 00:29:42,400 --> 00:29:44,920 Speaker 1: and something should be done about that. And there needs 506 00:29:44,920 --> 00:29:48,800 Speaker 1: to be some concern by these units that sit in 507 00:29:48,880 --> 00:29:53,160 Speaker 1: Moscow or Beijing and are basically operating with impunity. I 508 00:29:53,200 --> 00:29:56,360 Speaker 1: mean, Mueller doing an indictment on GRU officers and 509 00:29:56,520 --> 00:30:00,600 Speaker 1: naming them, saying this major in the GRU, at this time 510 00:30:00,960 --> 00:30:04,560 Speaker 1: in the morning, did this event, is telling them what 511 00:30:04,600 --> 00:30:08,680 Speaker 1: we know. But nothing's happening to these individuals or the 512 00:30:08,760 --> 00:30:11,680 Speaker 1: country or the government from that perspective. I don't have 513 00:30:11,720 --> 00:30:14,440 Speaker 1: an answer, other than I think having our 514 00:30:14,480 --> 00:30:19,560 Speaker 1: political leadership work together to come up with some better 515 00:30:19,600 --> 00:30:23,320 Speaker 1: solutions is the first step. And that's definitely not happening 516 00:30:23,520 --> 00:30:26,680 Speaker 1: right now, because we focus on four year political cycles 517 00:30:26,800 --> 00:30:30,080 Speaker 1: and the constant campaigning and that type of thing, so 518 00:30:30,120 --> 00:30:34,280 Speaker 1: we're losing the ability to actually protect against strategic 519 00:30:34,360 --> 00:30:37,800 Speaker 1: threats in the long term. 
Chris, I'm just very grateful 520 00:30:37,840 --> 00:30:40,560 Speaker 1: to you for taking the time and sharing all this 521 00:30:40,760 --> 00:30:50,080 Speaker 1: knowledge with us. Yes, sir, my pleasure. Newt's World is produced 522 00:30:50,400 --> 00:30:53,360 Speaker 1: by Gingrich three sixty and iHeartMedia. Our 523 00:30:53,400 --> 00:30:57,920 Speaker 1: executive producer is Debbie Meyers, our producer is Yarns Sloan, 524 00:30:58,320 --> 00:31:01,800 Speaker 1: and our researcher is Rachel Years. The artwork for the 525 00:31:01,840 --> 00:31:05,360 Speaker 1: show was created by Steve Pemman. Special thanks to the 526 00:31:05,360 --> 00:31:08,920 Speaker 1: team at Gingrich three sixty. If you've been enjoying Newt's World, 527 00:31:09,200 --> 00:31:12,000 Speaker 1: I hope you'll go to Apple Podcasts and both rate 528 00:31:12,120 --> 00:31:15,200 Speaker 1: us with five stars and give us a review, so 529 00:31:15,320 --> 00:31:19,000 Speaker 1: others can learn what it's all about. Right now, listeners 530 00:31:19,000 --> 00:31:22,040 Speaker 1: of Newt's World can sign up for my three free 531 00:31:22,080 --> 00:31:26,840 Speaker 1: weekly columns at Gingrich three sixty dot com slash newsletter. 532 00:31:27,480 --> 00:31:30,000 Speaker 1: I'm Newt Gingrich. This is Newt's World.