Speaker 1: On this episode of Newt's World: it's not often I get the opportunity to speak with someone who used to be a professional hacker for the CIA. My guest is Dr. Eric Cole. He is a renowned cybersecurity expert, entrepreneur, and best-selling author with over thirty years of experience in the industry. He is known for his work in advancing cybersecurity and his dedication to making the digital world a safer place. He has advised some of the world's top companies on reducing their digital threats and improving their cyber health. He's the author of the book Cyber Crisis: Protecting Your Business from Real Threats in the Virtual World. Eric, welcome, and thank you for joining me on Newt's World.
Speaker 2: My pleasure, and thank you for having me.
Speaker 1: I'm very curious. In December of last year, Chinese hackers breached a third-party vendor for the US Treasury Department to gain access to over three thousand unclassified files. How could this have happened, and what should the US government learn from this?
Speaker 2: The reality is, this is happening all the time.
Speaker 3: Most security vendors, most companies, most organizations have been compromised or penetrated by the Chinese, the Russians, or the Iranians, and we just didn't detect it. We didn't realize it, and we didn't know that it was happening. So this is a much bigger problem that has been brewing for a long time, and unfortunately there's not been a lot of awareness around just how bad the issue is. Governments and other organizations have to realize that the probability of third-party vendors or third-party sources having a vulnerability or a compromise is very high, and we need to redesign our systems, we need to redesign how we're configured to protect against it. And most importantly, we need federal laws on cybersecurity. There are a lot of state laws, with California leading the pack on privacy laws, but the United States is one of the few countries that doesn't have federal laws on cybersecurity or federal laws on data privacy.
Speaker 1: Why is that? Why are we behind?
Speaker 3: I believe the big issue is we always thought, and a lot of people still do, that cybersecurity gets in the way of freedom of speech, that cybersecurity gets in the way of the exchange of information, and that cybersecurity is not fit for a democracy. When most people think of cybersecurity, they think of North Korea, where citizens don't have internet access: they can't access the internet, they don't have email, they can't access information. Even in Russia, most people don't realize, a lot of the websites that we take for granted, a lot of the social media sites, are not accessible and not available. So people have always felt that cybersecurity is about limiting and reducing access to information. But that's just not correct. Cybersecurity is about how we protect and control our information so that only the people who need access to it have access to it. And I think that's why we've fallen behind: because we just haven't realized that cybersecurity is actually a complement to democracy, not adverse to it.
Speaker 1: Does anybody that you know of have an effective cybersecurity bill proposal?
Speaker 3: None that I'm aware of. I continually try to push it. The problem is, as you're probably very familiar, everything is so political, each side has to disagree with the other, so anything we try to push through on cybersecurity is met with aversion from the other side. And the reality is, we need to recognize cybersecurity as a nonpartisan issue. It's really something that impacts both sides. It impacts Democrats, it impacts Republicans, and it impacts America as a whole. One of the things I'm trying to do is figure out how we can break down those barriers and get both sides to agree that, okay, we can fight about some things, but on cybersecurity we need to get our act together. Because when the Founding Fathers wrote the Constitution and the Bill of Rights, they had no clue that we were going to be carrying cell phones with us.
Speaker 3: They had no clue that we'd have tracking devices on us twenty-four seven, and we need new laws that keep up with the digital frontier.
Speaker 1: I couldn't help but smile at that. As we get ready to celebrate our two hundred and fiftieth birthday, if you were to drop George Washington or Jefferson or Franklin into the current situation, all of it would have astounded them.
Speaker 2: Yeah, I think they would be amazed.
Speaker 1: It would be remarkable. One of the examples I've been thinking about a lot: Elon Musk and DOGE sent an email to federal workers saying, please reply to this email with approximately five bullets of what you accomplished this week, and carbon-copy your manager. Now, if people responded to that, are there any cybersecurity risks in responding to an email like that?
Speaker 3: There are huge cybersecurity risks, because you essentially have somebody who is not employed by that government agency asking for information about what you're doing on a daily basis. If I answered that accurately, I would be giving away a lot of critical information.
Speaker 3: If I had to go in and say, well, I'm working on this project, I'm working in this area, I'm working on this research, that's a lot of valuable information if it got into the wrong hands. And my question is: what email address is Elon Musk using to get replies? Where are those emails being stored? Because, at least as far as I can tell, Elon is not using government servers. I know that he's installed some of his own servers at Treasury and other areas. So now, if these emails from government employees, which potentially contain sensitive or even classified information, are stored on public servers, what happens if foreign adversaries get access to them? And I don't know if you saw this week, but the DOGE website got hacked. They don't even have proper security. So here he is setting up the website for DOGE, for government efficiency, and they got hacked, and they're expecting government employees to send all this sensitive information to servers that clearly have vulnerabilities.
Speaker 1: I'm thinking that this is one of those ideas which, when you hit the implementation phase, is a thousand times more complicated than the idea. And I think they really don't fully understand that.
Speaker 3: And this is one that resonates with me, because at the end of the day, Elon is a geek. He's not a businessman, he's not a cyber guy, he's a geek, and he's all about solving problems. He wants to solve the problem as quickly as possible and get access to information. The issue is, cybersecurity is always an afterthought. And back to your original question of why the United States is behind: it's because cybersecurity is always an afterthought. We're not thinking of cybersecurity. Elon didn't sit down and say, okay, how can we do this in a secure manner? What are the cybersecurity protocols we need in place to do this correctly? He basically just said, I need the data, we'll figure out cyber later. But the problem with digital information is that once your data is leaked out, once your data exists on servers, you can't get it back. It will exist forever.
Speaker 3: I know before the podcast I was talking with your producer about how she took her daughter to a hospital that got hacked, and her daughter's personal information was exposed. The reality is, that person now has to live the next fifty, sixty, seventy years of her life in a world where her personal information has been compromised, because once somebody has your Social Security number, you can't get it back.
Speaker 1: So to make clear how big a threat this is, you talk about a cyber war. You say, quote: "Our nation is currently at war, whether we realize it or not. We're in the middle of World War Three. The reason why many people don't recognize it is because it's a different type of world war. In this war, every single country is involved. Every single country is both being attacked and attacking other countries." Walk us through all that. I agree with you, but I think it'll be very helpful for people to hear what this cyber war is like.
Speaker 3: So most of us think of wars, we think of World War One, World War Two: tanks, planes, boats, missiles, and guns.
Speaker 3: But we're in a digital war, because we live in a digital world. And now it's not bullets, it's not weapons; it's packets, it's information, it's data, it's leakage. And the reality is, we're starting to see it with some of these breaches that come out. We saw Colonial Pipeline. I live on the East Coast, in Virginia, and when Colonial Pipeline, a large oil supplier on the East Coast, got hacked, our gas stations were closed for four days. People were actually walking; they were panicking, saying, are we actually going to be able to get gas, because of a cyber attack? Then we have SolarWinds, which you talked about earlier, where attackers broke into a vendor and compromised government systems. These attacks continue to happen and occur, but the reality is, they started five to ten years ago. The Chinese, the Russians, they're in our systems, we're in their systems, and it's sort of like the nuclear Cold War, where Russia could destroy the United States and we could destroy Russia, so neither side would actually launch a nuclear weapon.
Speaker 3: But we're in Russia's critical infrastructure, they're in our critical infrastructure, and neither side is going to do anything, because it would be mutually assured destruction. But what's happening when our information starts leaking out? What happens when our data is being compromised? The reality is that we're at war, and when you're at war, you have a different mentality. I've been over in Ukraine, and I was over in Iraq during the Iraq War. When you're in a war, people think differently, they act differently. They're more scared, they're more paranoid, they're more careful about what they're doing. The problem we have in the United States is that everybody on the Internet thinks we're in peacetime conditions. So they're sharing information, they're giving away their data, they're accessing whatever they want, they're posting pictures, they're putting everything out there. But the reality is, if they knew we were at war, they'd have wartime thinking. People need to be more paranoid, a little more scared, and a little more protective of their data. They need to be careful about who they're giving their information to.
Speaker 3: We need to start implementing security, because here's the great news: your banks, your e-commerce sites all have security built in, but it's turned off by default. It's not all turned on, because they don't think citizens are ready for it. So we need to start going into our apps, going into our devices, and start turning on security: turning on notifications, turning on two-factor authentication. The security is there; we just have to start implementing it. And the war we're facing today is not a visible war where there are huge explosions or banks are being taken down. It's a war of data leakage. Imagine we have a big bucket. Instead of somebody going in and emptying the bucket, they're just putting little holes in it. They're slowly leaking our data and leaking our information, and by the time the bucket's empty, most people don't even notice. A reality I see all the time is that most people's bank accounts or credit cards are compromised. But here's the reality: the attacker is stealing a dollar a month.
Speaker 3: Now, imagine somebody is taking a dollar from your bank account each month, or a dollar from your credit card. You probably wouldn't know. Most people don't look at their credit cards that closely. Most people don't look at their bank accounts closely enough that they would notice a missing dollar. But if you steal a dollar from every person every single month, that starts turning into a billion-dollar industry, which is what we have right now: cybercrime is costing America over fifty billion dollars on a regular basis.
Speaker 1: That's wild, and that's so much bigger than people think it is. If people want to protect themselves somewhat from their own devices, what should they do, and how do they do it?
Speaker 3: So the first thing they need to do is realize that when you buy a new iPhone or a new Android device, they are very secure. They are very locked down and protected. The problem is when we start installing all of these different apps. Free is not free, and basically a free app is tracking your location.
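Eric's dollar-a-month point a moment ago is easy to sanity-check with arithmetic. A back-of-the-envelope sketch; the account count and the one-in-ten victim rate below are illustrative assumptions, not figures from the interview:

```python
# Back-of-the-envelope only: inputs are illustrative assumptions.
accounts = 330_000_000          # roughly the U.S. population
skimmed = accounts // 10        # suppose 1 in 10 accounts is quietly skimmed
annual_haul = skimmed * 1 * 12  # $1 per month, 12 months

print(f"${annual_haul:,} per year")  # $396,000,000 per year
```

Even under these modest assumptions, a theft too small for any one victim to notice aggregates into hundreds of millions of dollars a year, which is the "holes in the bucket" dynamic he describes.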
Speaker 3: So first, if you have the choice between a free app and a paid version, use the paid version. If it's something you need to run your life, or it's critical for your life, you need to use the paid version, because the paid versions are a lot more secure than the free versions. Next, any app that you haven't used in forty-five days, delete it, delete it off your device. And I'm going to give you a challenge: I run my life on ten apps. If I go and download a new app, I only do it if I delete an old one. So instead of having fifty or seventy apps on your device that you're not using, realize that an app you install on your device but aren't using is actually spying on you. It's tracking your location, it's accessing your camera, it's accessing your information. So delete any apps that are not needed or required. Second, for any application you're using, you need to use what we call two-factor authentication.
Speaker 3: This is where, when you log in, you put in your password, then you're texted a one-time code to your cell phone, and you have to enter that one-time code. And I know people's initial response is, "Eric, that's annoying. Every time I need to log in, I have to enter a code, and that takes a couple of extra seconds." And my response is: you know what's really annoying? Your bank account getting hacked. You know what's really annoying? Your identity being stolen. So do you want a short-term annoyance with two-factor, or the long-term annoyance of being vulnerable? Next, turn on account notifications. Every time I use my credit card, every time I withdraw money from my bank, I get a text notification that says, "Eric, is this you? Did you actually do this transaction? Did you actually withdraw money from the account?" And the reality is, I get text messages at least one to two times a quarter for unauthorized transactions. So if I didn't have that turned on, those transactions would have occurred, and I would have been exploited and I would have been compromised.
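The one-time codes Eric describes arrive by text message; authenticator apps use a closely related time-based scheme called TOTP (RFC 6238). As a sketch of what happens under the hood, here is a minimal stdlib implementation; the secret shown is a made-up example, and real services layer rate limiting and replay protection on top:

```python
import base64
import hmac
import struct
import time

def totp(secret_b32, at=None, digits=6, step=30):
    """Derive an RFC 6238 time-based one-time code from a shared secret."""
    key = base64.b32decode(secret_b32)
    counter = int(time.time() if at is None else at) // step
    mac = hmac.new(key, struct.pack(">Q", counter), "sha1").digest()
    offset = mac[-1] & 0x0F  # dynamic truncation, per RFC 4226
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % 10 ** digits
    return f"{code:0{digits}d}"

def verify(secret_b32, submitted):
    """Server side: compare codes in constant time."""
    return hmac.compare_digest(totp(secret_b32), submitted)

secret = "JBSWY3DPEHPK3PXP"  # made-up example secret, base32-encoded
print(totp(secret))  # a fresh six-digit code every 30 seconds
```

Because the code is derived from a shared secret plus the current 30-second window, a stolen password alone isn't enough to log in, which is exactly the "short-term annoyance, long-term protection" trade-off described above. The `verify` helper uses `hmac.compare_digest` so the comparison doesn't leak timing information.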
Speaker 3: So once again: a small short-term annoyance, but long-term benefits. So turn on account activity notifications on all your systems. And then the last piece I always give is: under no circumstances should you ever click on a link. Don't ever click on a link, don't ever click on an attachment. This just happened to one of my friends who was traveling in Florida. They got a text notification that said, you ran one of the fast tolls in Florida, and you need to pay the fine or you're going to have huge issues; click on this link. And because they were in Florida, they thought it was legit. They clicked on the link, and it was a scam. So don't ever click on a link, don't ever open an attachment. "But Eric, what if my bank sends me a notification that says there's a problem, and there's a link?" Go to the app. Go to the app and log in using a valid app to access your bank account. But never click on a link, and never open an attachment.
Speaker 1: Well, I'm really curious, Eric: should you have a banking app on your phone?
Speaker 3: Yes.
Speaker 3: I know that's counter to what a lot of security people say; they're like, no, don't have anything on your phone. The reality is, our phone is a trusted device. It's something that we have with us and that we access. And here's the reality: apps are much more secure than websites. Apps are much more secure than clicking links. So if you're going to use your bank, if you're going to do online banking, if you're going to do e-commerce, it's much better to use the apps. The apps have a lot more security and a lot more protection than websites. So the best advice I can give you is: minimize your use of websites, maximize your use of trusted apps, and that's going to make you a lot more secure.
Speaker 1: We're really kind of in a free-for-all, where it's not like the Cold War, where there was one side and the other side. It's more like, between governments and private criminal groups, et cetera, it can be coming from anywhere at any time, and so you can't just focus on North Korea or focus on Russia.
Speaker 1: You almost have to focus on how you defend yourself against all the attacks, from every single vector.
Speaker 3: That's very true, because here's the reality: there are no national or international borders on the Internet. When you're on the Internet, I can access different countries, different areas, different locations, and there are no boundaries. There's nothing. Somebody in Russia can access servers and individuals in America without going through customs, without presenting a passport, without going through immigration. So the problem is, as I mentioned, the laws were written for physical boundaries. If somebody is physically in the United States, they have to abide by our laws. If they're physically in Russia, they abide by Russian laws. Well, on the Internet, you don't know where you are. I track it very closely, and I will tell you, when I'm surfing the web and doing daily activity just like you and anyone else, I'm frequently accessing servers in the Philippines, because there are a lot of data centers there.
In Singapore, 327 00:19:59,760 --> 00:20:02,400 Speaker 3: the Middle East has a lot of data centers, Dubai, 328 00:20:02,520 --> 00:20:06,120 Speaker 3: South America, and people don't realize that even when they're 329 00:20:06,160 --> 00:20:09,400 Speaker 3: going to e-commerce sites or banks or other areas 330 00:20:09,560 --> 00:20:13,359 Speaker 3: and giving away their information, those servers are often not 331 00:20:13,520 --> 00:20:16,800 Speaker 3: in the United States, which means your data and your 332 00:20:16,800 --> 00:20:20,000 Speaker 3: information is not in the United States, which means even 333 00:20:20,040 --> 00:20:24,040 Speaker 3: if we had privacy laws, they might not apply to 334 00:20:24,080 --> 00:20:27,879 Speaker 3: your data and information if it's outside of our country. So 335 00:20:28,119 --> 00:20:32,199 Speaker 3: people just don't really understand the complexities: the Internet 336 00:20:32,600 --> 00:20:34,240 Speaker 3: is really one world. 337 00:20:34,480 --> 00:20:35,600 Speaker 2: There's no boundaries. 338 00:20:35,960 --> 00:20:39,199 Speaker 3: Servers can exist, data can exist anywhere, and until we 339 00:20:39,240 --> 00:20:42,600 Speaker 3: get international laws where we all cooperate and say, okay, 340 00:20:42,640 --> 00:20:44,760 Speaker 3: we're all going to work together, it's going to be 341 00:20:44,800 --> 00:20:48,360 Speaker 3: real difficult. This just happened this morning: I'm 342 00:20:48,359 --> 00:20:52,000 Speaker 3: working on an investigation and we found a hacking group 343 00:20:52,720 --> 00:20:56,840 Speaker 3: in Russia. We know who they are, we know where 344 00:20:56,880 --> 00:20:59,840 Speaker 3: they're located, we have their physical address. They're a company, 345 00:21:00,000 --> 00:21:04,920 Speaker 3: an incorporated company in Russia. But here's the problem.
They're 346 00:21:04,960 --> 00:21:08,399 Speaker 3: not breaking any laws in Russia, and there's no extradition 347 00:21:08,480 --> 00:21:12,560 Speaker 3: treaty with Russia, so we know who's hurting us, who's 348 00:21:12,640 --> 00:21:16,359 Speaker 3: hurting Americans, who's stealing our information. But because there's no 349 00:21:16,440 --> 00:21:21,120 Speaker 3: international laws, there's really little we can do to stop them, which, 350 00:21:21,160 --> 00:21:24,960 Speaker 3: as you said, means every individual has to realize they're a 351 00:21:25,040 --> 00:21:27,480 Speaker 3: target and we need to start putting measures in place 352 00:21:27,520 --> 00:21:31,520 Speaker 3: to protect us, because unfortunately, until there's global laws, the 353 00:21:31,600 --> 00:21:33,560 Speaker 3: laws aren't going to be able to protect us or 354 00:21:33,600 --> 00:21:35,080 Speaker 3: keep us safe. 355 00:21:35,200 --> 00:21:39,040 Speaker 1: Can we reverse it and use our cyber capabilities to 356 00:21:39,119 --> 00:21:41,240 Speaker 1: go back in and attack the people who are doing 357 00:21:41,040 --> 00:21:46,520 Speaker 3: this? We absolutely can. That's another area that we've seen 358 00:21:46,960 --> 00:21:49,840 Speaker 3: the recent presidents actually do a really good job on. 359 00:21:50,320 --> 00:21:54,280 Speaker 3: Trump, in his first term, was actually the first 360 00:21:54,320 --> 00:22:00,200 Speaker 3: president that actually allowed the Department of Defense to launch cyber 361 00:22:00,280 --> 00:22:02,440 Speaker 3: attacks without executive approval. 362 00:22:03,240 --> 00:22:05,560 Speaker 2: Prior to Trump passing 363 00:22:05,240 --> 00:22:08,800 Speaker 3: that executive order in his first term, if the Department 364 00:22:08,840 --> 00:22:13,000 Speaker 3: of Defense wanted to launch a cyber attack and offensive operation, 365 00:22:13,600 --> 00:22:19,520 Speaker 3: they needed presidential approval.
Now, China doesn't require that, Russia 366 00:22:19,560 --> 00:22:20,520 Speaker 3: doesn't require that. 367 00:22:20,800 --> 00:22:22,560 Speaker 2: Iran and Iraq don't require 368 00:22:22,240 --> 00:22:26,000 Speaker 3: that, so we were really hamstrung in that capability. So yes, 369 00:22:26,160 --> 00:22:28,960 Speaker 3: we have to start getting more aggressive. But the other 370 00:22:29,000 --> 00:22:31,520 Speaker 3: thing we need to do is get a lot more 371 00:22:31,680 --> 00:22:39,040 Speaker 3: partnership between government and commercial organizations. In China, the Chinese 372 00:22:39,119 --> 00:22:44,520 Speaker 3: government spies and steals information from US companies for the 373 00:22:44,600 --> 00:22:49,400 Speaker 3: benefit of Chinese companies. In the United States, we don't 374 00:22:49,400 --> 00:22:53,000 Speaker 3: have that capability. In the United States, the Department of Defense 375 00:22:53,280 --> 00:22:57,840 Speaker 3: can't steal corporate information and give it to US 376 00:22:57,960 --> 00:23:00,080 Speaker 3: companies because, once again, 377 00:23:00,240 --> 00:23:02,040 Speaker 2: that violates our laws. 378 00:23:02,320 --> 00:23:07,040 Speaker 3: But if other countries, their governments are working on behalf 379 00:23:07,119 --> 00:23:10,239 Speaker 3: of local companies to help and support them, we need 380 00:23:10,280 --> 00:23:11,560 Speaker 3: to do the same thing. We need to have a 381 00:23:11,640 --> 00:23:15,600 Speaker 3: much closer partnership where we can launch offensive operations and 382 00:23:15,640 --> 00:23:19,000 Speaker 3: then the government can share that information with US companies 383 00:23:19,119 --> 00:23:20,719 Speaker 3: to help make them more competitive.
384 00:23:21,320 --> 00:23:25,680 Speaker 1: As you think through this continuous cyber war, as I 385 00:23:25,760 --> 00:23:31,040 Speaker 1: understand it, North Korea is almost entirely government-run cyber war, 386 00:23:31,760 --> 00:23:36,240 Speaker 1: but Russia has a huge amount of criminal operations. Nigeria, 387 00:23:36,320 --> 00:23:39,320 Speaker 1: I think, has a lot of criminal operations. And then 388 00:23:39,480 --> 00:23:45,040 Speaker 1: China has a mixture of government and free-enterprise entrepreneurs. 389 00:23:45,400 --> 00:23:46,840 Speaker 1: Is that really true all around the world, that there 390 00:23:46,840 --> 00:23:47,520 Speaker 1: are different 391 00:23:47,240 --> 00:23:52,040 Speaker 3: patterns? Absolutely, and you nailed it. In North Korea, 392 00:23:52,160 --> 00:23:54,040 Speaker 2: there's really no corporations. 393 00:23:54,200 --> 00:23:57,400 Speaker 3: The government is the country and basically runs everything, so 394 00:23:57,480 --> 00:24:01,280 Speaker 3: everything is run and controlled by the government. 395 00:24:01,840 --> 00:24:07,000 Speaker 3: In China, it's very cooperative, where companies and the government 396 00:24:07,320 --> 00:24:11,840 Speaker 3: work very closely together, so the government is going to 397 00:24:12,000 --> 00:24:14,960 Speaker 3: do attacks on behalf of companies and vice versa. 398 00:24:15,240 --> 00:24:18,520 Speaker 3: Now when you get into countries like Russia and Nigeria, 399 00:24:18,560 --> 00:24:25,320 Speaker 3: it's interesting: the commercial criminals actually help and support the government. 400 00:24:25,680 --> 00:24:30,480 Speaker 3: So these commercial elements are actually supporting and involve 401 00:24:30,480 --> 00:24:32,000 Speaker 2: a lot of government 402 00:24:31,600 --> 00:24:35,800 Speaker 3: officials in Nigeria, a lot of government officials in Russia.
403 00:24:35,840 --> 00:24:39,920 Speaker 3: They're actually involved and sit on the board of these 404 00:24:39,960 --> 00:24:45,520 Speaker 3: cybercrime or criminal companies, so they're actually supporting, helping them, 405 00:24:45,680 --> 00:24:50,280 Speaker 3: and they're helping and supporting the government in return. Imagine 406 00:24:50,760 --> 00:24:54,640 Speaker 3: in the United States if we had generals and government 407 00:24:54,680 --> 00:24:59,840 Speaker 3: officials actually sitting on commercial boards that are doing offensive 408 00:25:00,080 --> 00:25:03,480 Speaker 3: operations to help the company but also help the country. 409 00:25:03,760 --> 00:25:05,320 Speaker 2: It's a total mind shift. 410 00:25:05,840 --> 00:25:09,199 Speaker 3: But the reality is, until we start thinking and acting 411 00:25:09,240 --> 00:25:11,920 Speaker 3: like the adversary and start doing what the adversary does, 412 00:25:12,119 --> 00:25:17,360 Speaker 3: we're at a disadvantage. Because these other countries have commercialized cybercrime. 413 00:25:17,760 --> 00:25:21,160 Speaker 3: They're making tons of money on it. They've legalized cybercrime, 414 00:25:21,640 --> 00:25:24,320 Speaker 3: and because in the United States it's illegal, we're at 415 00:25:24,359 --> 00:25:28,960 Speaker 3: a huge disadvantage in terms of offensive operations and protecting ourselves. 416 00:25:29,520 --> 00:25:31,840 Speaker 1: And some of these things are really big. If I'm 417 00:25:31,880 --> 00:25:36,560 Speaker 1: remembering correctly, the twenty fifteen Office of Personnel Management breach 418 00:25:37,119 --> 00:25:40,680 Speaker 1: was a huge failure. Did we learn anything from it? 419 00:25:41,600 --> 00:25:45,320 Speaker 2: Unfortunately, very little. And the reality that 420 00:25:45,920 --> 00:25:49,480 Speaker 3: it brought up, that we have to recognize, is social 421 00:25:49,520 --> 00:25:53,040 Speaker 3: security numbers are no longer private information. 422 00:25:53,760 --> 00:25:54,800 Speaker 2: We have this term. 423 00:25:55,080 --> 00:26:00,919 Speaker 3: I'm sure you've heard it: PII, personally identifiable information, or PHI, 424 00:26:01,119 --> 00:26:06,840 Speaker 3: personal healthcare information. And our social security number, our driver's 425 00:26:06,840 --> 00:26:11,359 Speaker 3: license are all considered private information. And if somebody knows 426 00:26:11,400 --> 00:26:16,120 Speaker 3: my social security number, my date of birth and my driver's license, 427 00:26:16,640 --> 00:26:20,040 Speaker 3: they can open bank accounts, they can open credit cards, 428 00:26:20,280 --> 00:26:23,360 Speaker 3: they can access information, they can access data. But as 429 00:26:23,400 --> 00:26:26,840 Speaker 3: you said, in that breach and in other breaches, a 430 00:26:27,000 --> 00:26:31,800 Speaker 3: large percentage of American social security numbers has been compromised. 431 00:26:32,520 --> 00:26:35,520 Speaker 3: A large number of social security numbers is now public information. 432 00:26:35,960 --> 00:26:38,959 Speaker 3: So now we're living in a world where personally 433 00:26:39,119 --> 00:26:43,719 Speaker 3: identifiable information is actually public. Our social security number is public, 434 00:26:44,080 --> 00:26:45,560 Speaker 3: our driver's license is public. 435 00:26:45,600 --> 00:26:47,040 Speaker 2: Yet that's what we're 436 00:26:46,920 --> 00:26:50,480 Speaker 3: using to authenticate and verify. So, to come back 437 00:26:50,520 --> 00:26:53,560 Speaker 3: to the federal laws, we actually need to come up 438 00:26:53,600 --> 00:26:58,840 Speaker 3: with new unique identifiers for American citizens that are actually secure, 439 00:26:59,320 --> 00:27:03,520 Speaker 3: protected, and not compromised.
Something along the lines of biometrics, 440 00:27:03,920 --> 00:27:07,320 Speaker 3: where we're actually tying it to, like, your fingerprint or your 441 00:27:07,359 --> 00:27:10,520 Speaker 3: facial ID or something that's much more difficult for somebody 442 00:27:10,560 --> 00:27:13,159 Speaker 3: to steal. But the reality is what we're using as 443 00:27:13,240 --> 00:27:18,199 Speaker 3: personal information is actually public and exposed and available to 444 00:27:18,240 --> 00:27:18,800 Speaker 3: many people. 445 00:27:19,680 --> 00:27:22,080 Speaker 1: I mean, should people despair? Or how do you function 446 00:27:22,960 --> 00:27:25,240 Speaker 1: in the kind of wide-open world you're describing? 447 00:27:25,960 --> 00:27:31,560 Speaker 3: The reality is it's sort of two things. One is awareness: 448 00:27:32,480 --> 00:27:36,160 Speaker 3: recognizing the reality. Don't be afraid of it, don't 449 00:27:36,160 --> 00:27:39,480 Speaker 3: be terrified. I work in cybersecurity, and people are like, Eric, 450 00:27:39,520 --> 00:27:40,520 Speaker 3: how are you in a good mood? 451 00:27:40,560 --> 00:27:41,520 Speaker 2: How are you not depressed? 452 00:27:41,520 --> 00:27:43,919 Speaker 3: And I'm like, because I'm aware and I understand it, I 453 00:27:43,960 --> 00:27:48,240 Speaker 3: embrace it, and then it's just doing simple things, doing 454 00:27:48,359 --> 00:27:52,840 Speaker 3: cyber hygiene. But the reality is, because technology came on 455 00:27:53,000 --> 00:27:57,359 Speaker 3: so quick, most of us were not trained in school. 456 00:27:57,359 --> 00:27:59,960 Speaker 3: When I went to school, the World Wide Web didn't exist. 457 00:28:00,080 --> 00:28:03,280 Speaker 3: There weren't cell phones, there weren't computers. They didn't teach 458 00:28:03,320 --> 00:28:04,639 Speaker 3: me about cyber hygiene. 459 00:28:04,920 --> 00:28:05,640 Speaker 2: But now my 460 00:28:05,720 --> 00:28:08,600 Speaker 3: kids are going to school and they're still not teaching 461 00:28:08,640 --> 00:28:09,840 Speaker 3: them about cyber hygiene. 462 00:28:10,000 --> 00:28:11,639 Speaker 2: So to me, it's a lot of simple things. 463 00:28:11,640 --> 00:28:15,720 Speaker 3: One is just recognize and know that you're a target, 464 00:28:16,160 --> 00:28:20,160 Speaker 3: and understand where is your information, where is your critical data? 465 00:28:20,880 --> 00:28:24,520 Speaker 3: And then understand that passwords are a thing of the past. 466 00:28:25,520 --> 00:28:29,440 Speaker 3: Passwords are no longer strong. I can crack any password. 467 00:28:29,720 --> 00:28:32,600 Speaker 3: You give me an account that uses a password, and 468 00:28:32,640 --> 00:28:35,320 Speaker 3: I'll break into it. And we need to really embrace 469 00:28:35,400 --> 00:28:38,080 Speaker 3: what we call two-factor, or multi-factor, authentication, and this 470 00:28:38,120 --> 00:28:40,320 Speaker 3: is where you get an alert to your phone, you 471 00:28:40,400 --> 00:28:42,880 Speaker 3: type in a code. Start doing that. The other 472 00:28:42,920 --> 00:28:46,520 Speaker 3: thing we have to realize is free apps are not free. 473 00:28:47,320 --> 00:28:49,440 Speaker 3: Those free apps that you have on your cell phone, 474 00:28:49,880 --> 00:28:52,440 Speaker 3: they're spying on you. I always love doing this.
If 475 00:28:52,440 --> 00:28:55,560 Speaker 3: we were in person, with your permission, I would ask 476 00:28:55,600 --> 00:28:57,800 Speaker 3: to look at your phone and go under advanced settings 477 00:28:58,280 --> 00:29:02,560 Speaker 3: and go under tracking and camera, and you would probably 478 00:29:02,600 --> 00:29:05,920 Speaker 3: be shocked at how many apps are tracking your location 479 00:29:06,440 --> 00:29:09,239 Speaker 3: and how many apps are accessing your camera, or how 480 00:29:09,280 --> 00:29:13,040 Speaker 3: many apps are accessing your microphone. And the reality is 481 00:29:13,240 --> 00:29:16,600 Speaker 3: we can turn that off if we're aware. Most people 482 00:29:16,680 --> 00:29:19,200 Speaker 3: just are not aware of how bad the threat is 483 00:29:19,240 --> 00:29:21,920 Speaker 3: and how open and exposed our data is. 484 00:29:22,720 --> 00:29:26,120 Speaker 1: I'm sort of being sobered up just thinking about it. 485 00:29:26,520 --> 00:29:29,560 Speaker 1: Let me ask you specifically about North Korea, because several 486 00:29:29,600 --> 00:29:32,600 Speaker 1: people have said to me that a large part of 487 00:29:32,600 --> 00:29:37,080 Speaker 1: the North Korean military operation is actually subsidized by cybercrime, 488 00:29:37,520 --> 00:29:40,800 Speaker 1: and that if we were really serious about putting pressure 489 00:29:40,800 --> 00:29:43,239 Speaker 1: on North Korea, we would find ways to sort of 490 00:29:43,240 --> 00:29:46,880 Speaker 1: isolate them from a cyber theft standpoint. I mean, is 491 00:29:46,920 --> 00:29:47,520 Speaker 1: that accurate?
492 00:29:48,680 --> 00:29:52,160 Speaker 3: It is accurate, and not just for North Korea, but 493 00:29:52,280 --> 00:29:57,520 Speaker 3: also Russia and Nigeria and Argentina and a lot of 494 00:29:57,560 --> 00:30:02,959 Speaker 3: these countries. They're realizing that competing with the United 495 00:30:03,040 --> 00:30:08,360 Speaker 3: States in traditional business is really hard. It's really difficult, 496 00:30:08,800 --> 00:30:11,280 Speaker 3: and I hate to say it: you've heard the phrase 497 00:30:11,920 --> 00:30:16,160 Speaker 3: crime pays. It is real easy to commit cybercrime. I 498 00:30:16,200 --> 00:30:19,320 Speaker 3: often joke with my friends and family that if I 499 00:30:19,360 --> 00:30:22,200 Speaker 3: didn't have ethics and morals and I didn't love this country, 500 00:30:23,000 --> 00:30:25,520 Speaker 3: I could be a lot richer if I moved to 501 00:30:25,560 --> 00:30:28,280 Speaker 3: South America and basically was a cyber criminal. 502 00:30:28,560 --> 00:30:31,400 Speaker 2: It is just unfortunately so easy and 503 00:30:31,360 --> 00:30:35,200 Speaker 3: simple to break into these different companies, steal information, 504 00:30:35,600 --> 00:30:40,360 Speaker 3: hold them for ransom: ransomware attacks, where they break in and 505 00:30:40,520 --> 00:30:43,600 Speaker 3: steal the data unless you pay the ransom. Most people don't 506 00:30:43,640 --> 00:30:48,640 Speaker 3: realize that last year, in two thousand and twenty four, ransomware 507 00:30:48,680 --> 00:30:53,640 Speaker 3: attacks just in the United States against US companies were 508 00:30:53,800 --> 00:30:56,960 Speaker 3: over forty two billion dollars. 509 00:30:57,440 --> 00:30:58,080 Speaker 1: Good grief. 510 00:30:58,760 --> 00:31:01,880 Speaker 3: Now take twenty billion of that, give that to 511 00:31:01,960 --> 00:31:05,640 Speaker 3: North Korea. Take another ten billion, give that to Russia.
512 00:31:05,800 --> 00:31:10,200 Speaker 3: So yes, imagine now a country like North Korea is 513 00:31:10,280 --> 00:31:15,680 Speaker 3: making twenty billion dollars a year on cybercrime, and they're 514 00:31:15,800 --> 00:31:20,800 Speaker 3: increasing their capabilities because, guess what, it's working. We can't 515 00:31:20,840 --> 00:31:24,720 Speaker 3: stop them, and they're continuing to get more advanced in 516 00:31:24,760 --> 00:31:28,400 Speaker 3: their capabilities. I always laugh that we're trying to stop 517 00:31:29,040 --> 00:31:33,200 Speaker 3: North Korea from having nuclear weapons. But the reality is, 518 00:31:33,960 --> 00:31:39,080 Speaker 3: without realizing it, North Korea has built cybersecurity nuclear weapons 519 00:31:39,080 --> 00:31:42,360 Speaker 2: that are hurting and harming us, and we don't even realize it. 520 00:31:41,920 --> 00:32:04,880 Speaker 1: With the whole system you're describing, we really have 521 00:32:04,960 --> 00:32:10,320 Speaker 1: to reconceptualize how we're approaching this. It's so much bigger, 522 00:32:10,480 --> 00:32:13,400 Speaker 1: so much more powerful, it has so many more threats. 523 00:32:13,800 --> 00:32:16,520 Speaker 1: You almost need to start from ground zero and try 524 00:32:16,520 --> 00:32:20,200 Speaker 1: to imagine both what would a secure, effective system be 525 00:32:20,320 --> 00:32:24,560 Speaker 1: like and what would the right kind of offensive system 526 00:32:24,640 --> 00:32:27,240 Speaker 1: be to make people decide it was too expensive and 527 00:32:27,280 --> 00:32:29,680 Speaker 1: too painful to do things to us. I mean, does it 528 00:32:29,680 --> 00:32:32,280 Speaker 1: not require a whole new way of thinking about 529 00:32:32,280 --> 00:32:33,520 Speaker 1: the system's architecture? 530 00:32:34,640 --> 00:32:35,360 Speaker 2: Absolutely. 531 00:32:35,960 --> 00:32:40,720 Speaker 3: We've talked over the last several years about infrastructure.
There 532 00:32:40,800 --> 00:32:44,920 Speaker 3: was the trillion-dollar Infrastructure Bill to sort of rebuild 533 00:32:45,240 --> 00:32:49,520 Speaker 3: the US infrastructure because it's old and it's outdated, it's antiquated. 534 00:32:50,160 --> 00:32:54,440 Speaker 3: We need a trillion-dollar bill on rebuilding our cyber infrastructure. 535 00:32:54,800 --> 00:32:59,640 Speaker 3: Because the reality is the United States created the Internet. 536 00:33:00,240 --> 00:33:02,800 Speaker 3: If you go back to the sixties and seventies, there 537 00:33:02,840 --> 00:33:07,160 Speaker 3: was ARPANET, which was the original research project of the 538 00:33:07,200 --> 00:33:11,680 Speaker 3: Defense Advanced Research Projects Agency that actually built 539 00:33:11,680 --> 00:33:15,560 Speaker 3: out the Internet. Well, what happened is the infrastructure of 540 00:33:15,600 --> 00:33:18,880 Speaker 3: the Internet and the United States have now become one, 541 00:33:19,280 --> 00:33:22,000 Speaker 3: which means we don't have any boundaries, we don't have 542 00:33:22,040 --> 00:33:27,280 Speaker 3: any protection. North Korea can disconnect from the Internet. They 543 00:33:27,360 --> 00:33:31,120 Speaker 3: know where they're connected to the Internet. Russia has done 544 00:33:31,120 --> 00:33:35,480 Speaker 3: this: twice a year, Russia disconnects from the Internet for 545 00:33:35,520 --> 00:33:38,960 Speaker 3: twenty-four hours to show that they can run independently. 546 00:33:39,400 --> 00:33:43,080 Speaker 3: The problem is, in the United States, the Internet is 547 00:33:43,120 --> 00:33:47,040 Speaker 3: the United States. We can't disconnect, we can't isolate, we 548 00:33:47,120 --> 00:33:49,960 Speaker 3: can't protect. So you are spot on where we need 549 00:33:50,720 --> 00:33:55,560 Speaker 3: a huge revamping, where we need to rebuild the cyber infrastructure.
550 00:33:55,640 --> 00:33:58,520 Speaker 3: We need to rebuild how we're connected to the Internet, 551 00:33:58,720 --> 00:34:02,000 Speaker 3: and we need to be able to isolate our country just like 552 00:34:02,080 --> 00:34:05,200 Speaker 3: Russia and North Korea, where we could protect, secure, and 553 00:34:05,280 --> 00:34:08,360 Speaker 3: limit who can access and what can access our information. 554 00:34:08,680 --> 00:34:11,480 Speaker 3: But until we sort of redesign our infrastructure on the 555 00:34:11,480 --> 00:34:15,440 Speaker 3: Internet and have a new cyber infrastructure, this is going 556 00:34:15,480 --> 00:34:17,839 Speaker 3: to continue to be a problem, because we're trying to 557 00:34:17,880 --> 00:34:19,400 Speaker 3: fix a broken model. 558 00:34:20,120 --> 00:34:23,640 Speaker 1: Seems to me what you have is something which grew up 559 00:34:23,680 --> 00:34:27,239 Speaker 1: ad hoc over a long period of time and gradually 560 00:34:27,960 --> 00:34:31,160 Speaker 1: began to attract more and more bad actors. And now 561 00:34:31,200 --> 00:34:34,640 Speaker 1: you have bad actors who have very modern technologies and 562 00:34:34,760 --> 00:34:38,319 Speaker 1: very modern approaches kind of raiding a system, much of 563 00:34:38,360 --> 00:34:41,680 Speaker 1: which is obsolete. This has really got to be one 564 00:34:41,680 --> 00:34:46,560 Speaker 1: of the profound infrastructure challenges for the Trump administration to 565 00:34:46,640 --> 00:34:47,520 Speaker 1: take this head on. 566 00:34:48,120 --> 00:34:51,520 Speaker 3: I agree, and that's one where I love what's going 567 00:34:51,560 --> 00:34:55,400 Speaker 3: on now with government efficiency and DOGE and cutting spending. 568 00:34:55,960 --> 00:34:59,839 Speaker 3: But my concern is, are we focused on the right 569 00:34:59,840 --> 00:35:03,360 Speaker 3: problem right now? Government efficiency is an issue.
570 00:35:04,480 --> 00:35:06,200 Speaker 2: It's an issue that we need to address. We need 571 00:35:06,239 --> 00:35:07,000 Speaker 2: to limit spending. 572 00:35:07,640 --> 00:35:11,719 Speaker 3: Cybersecurity is a problem that we have to stop ignoring. 573 00:35:12,000 --> 00:35:15,080 Speaker 3: So you really summarized it so well that this administration, 574 00:35:15,719 --> 00:35:19,000 Speaker 3: to me, if they want to go down in history and sort 575 00:35:19,000 --> 00:35:22,759 Speaker 3: of be remembered and have a legacy, the legacy is 576 00:35:22,800 --> 00:35:25,080 Speaker 3: not going to be in government efficiency. It's not going 577 00:35:25,120 --> 00:35:27,239 Speaker 3: to be in cutting spending. It's going to be: could 578 00:35:27,239 --> 00:35:32,480 Speaker 3: this be the first administration that actually passes federal cyber 579 00:35:32,640 --> 00:35:35,680 Speaker 3: security laws? Could this be the first administration that passes 580 00:35:36,040 --> 00:35:41,879 Speaker 3: a trillion-dollar cyber infrastructure bill that rebuilds our cyber infrastructure? 581 00:35:42,000 --> 00:35:42,480 Speaker 2: But you're right. 582 00:35:42,560 --> 00:35:45,400 Speaker 3: Until we start taking this seriously, and Congress and the 583 00:35:45,440 --> 00:35:49,040 Speaker 3: White House and everyone start realizing that cybersecurity is the 584 00:35:49,200 --> 00:35:51,919 Speaker 3: number one problem, we're going to continue to have these 585 00:35:51,960 --> 00:35:53,880 Speaker 3: issues and continue to be vulnerable. 586 00:35:54,320 --> 00:35:57,920 Speaker 1: This is exactly right. And I'm really delighted that we 587 00:35:58,000 --> 00:36:00,600 Speaker 1: had this conversation, because I think you put your finger 588 00:36:00,600 --> 00:36:03,600 Speaker 1: on one of the great challenges of the next ten years. 589 00:36:03,880 --> 00:36:06,080 Speaker 1: And I want to thank you for joining me.
Your book, 590 00:36:06,760 --> 00:36:10,040 Speaker 1: Cyber Crisis: Protecting Your Business from Real Threats in the 591 00:36:10,120 --> 00:36:14,280 Speaker 1: Virtual World, is available now on Amazon and in bookstores everywhere. 592 00:36:14,520 --> 00:36:16,200 Speaker 1: We're going to feature a link to buy it on 593 00:36:16,239 --> 00:36:18,440 Speaker 1: our show page, and I want to let our listeners 594 00:36:18,480 --> 00:36:21,759 Speaker 1: know they can follow your recent work by visiting your 595 00:36:21,800 --> 00:36:25,959 Speaker 1: website at doctor Eric Cole dot org. Thank you so much for 596 00:36:25,880 --> 00:36:28,400 Speaker 2: being here. My pleasure, thank you for having me. 597 00:36:31,760 --> 00:36:34,120 Speaker 1: Thank you to my guest, doctor Eric Cole. You can 598 00:36:34,160 --> 00:36:36,840 Speaker 1: get a link to buy his new book, Cyber Crisis: 599 00:36:36,880 --> 00:36:39,840 Speaker 1: Protecting Your Business from Real Threats in the Virtual World, 600 00:36:40,120 --> 00:36:43,799 Speaker 1: on our show page at newtsworld dot com. Newt's World is 601 00:36:43,800 --> 00:36:48,240 Speaker 1: produced by Gingrich three sixty and iHeartMedia. Our executive producer 602 00:36:48,320 --> 00:36:52,320 Speaker 1: is Guernsey Sloan. Our researcher is Rachel Peterson. The artwork 603 00:36:52,360 --> 00:36:55,919 Speaker 1: for the show was created by Steve Penley. Special thanks 604 00:36:55,960 --> 00:36:58,200 Speaker 1: to the team at Gingrich three sixty. If you've been 605 00:36:58,239 --> 00:37:01,520 Speaker 1: enjoying Newt's World, I hope you'll go to Apple Podcasts and 606 00:37:01,600 --> 00:37:04,040 Speaker 1: both rate us with five stars and give us a 607 00:37:04,040 --> 00:37:07,360 Speaker 1: review so others can learn what it's all about.
Right now, 608 00:37:07,600 --> 00:37:10,320 Speaker 1: listeners of Newt's World can sign up for my three free 609 00:37:10,560 --> 00:37:14,960 Speaker 1: weekly columns at Gingrich three sixty dot com slash newsletter. I'm 610 00:37:15,040 --> 00:37:17,040 Speaker 1: Newt Gingrich. This is Newt's World.