Speaker 1: Set Ingram, fifty-five KRC, the talk station.

Speaker 2: Six thirty on a Friday. That means it's time for Tech Friday with Dave Hatter, brought to you by his company Intrust IT. You can find them online at intrustit dot com. And you want to find him if you're a business and you have computers, which I think is all of them. You may need Dave's help to get you out of trouble, or maybe even to set your systems up engaging best practices. Bottom line is, when it comes to business computing, Intrust IT are the best. You know, take my word for it; the Business Courier says they're the best. Welcome back, Dave Hatter. Appreciate your willingness to come on the program and wake us all up every Friday. And good to hear from you, brother.

Speaker 1: Oh, it's my pleasure, Brian. I look forward to it. And as I always say, hopefully we're doing some good out there helping people avoid the scams that are rampant.

Speaker 2: Indeed. Well, there's this company Stryker, and I think we talked about this a little bit a week or so ago.
Reportedly, an Iranian threat group called Handala had hacked into this Stryker system, wreaking havoc across the globe. So terrorist actions don't have to be in the form of bombs or, you know, driving through crowds of people, when they can do this kind of damage. Let's talk about it, Dave.

Speaker 1: Yeah, Brian, well, that's exactly right. And this is the point I've been trying to make to businesses, and really organizations of any sort, for a long time. Especially a lot of smaller organizations think, well, this will never happen to me, I don't have anything worth stealing. You know, they're going to go after the Krogers and the P&Gs and the Fifth Thirds of the world. And that's true to an extent. But you also have to understand, and you'll see how I'm going to connect this to this Stryker story: these large companies have deep pockets, lots of resources. They'll typically have cybersecurity experts on staff.
They have all the resources to hire as many nerds like me as they want, right? Yeah. And they often, because they've got these experts, understand the risks that they're up against. Many medium and small organizations, businesses, nonprofits, local governments don't. Just this past week, another city in California was hit: thirty-three thousand residents, much of the city down through a devastating ransomware attack. These smaller organizations don't understand the risk. And the point I'm always trying to make is, first off, honestly, there is nothing you can do to make your organization completely impervious. I don't care how much you spend, I don't care what you do. If someone is telling you they can make you completely impervious to an attack, unless you just completely get off all technology, it's not possible. But what you can do is make yourself a much more difficult target, and this has never been more important. The second part of this: improve your defenses and build systems that are resilient, so that you can recover quickly in the event of an outage for whatever reason.
It doesn't have to be a cyber attack. Systems fail, right? Things go down. So Stryker, huge company, according to the reporting I've seen, somewhere around fifty-six thousand employees in sixty-one countries, they get hit with this wiper attack. So a wiper is kind of like ransomware, but instead of just encrypting your data and then saying I'll let you decrypt it so you can use your systems again... And as a reminder, folks, if I encrypt your systems, they won't run. The data is still there, but it's all scrambled up; computers and systems won't operate, okay? So a ransomware attack is typically a two-pronged attack. I encrypt your data, which shuts your systems down, and I steal, slash, exfiltrate your data. I copy the data, so that if you have built resilient systems, if you've listened to some nerd like me, and you can wipe the systems, restore your data, and get back to work without paying the ransom, I say, well, that's nice, but now I'm going to start to release your data on the internet.
Could be, you know, sensitive employee information, sensitive customer information, trade secrets, information protected by NDA, contractual information with another company, who knows what the consequence is. You know, information that's protected through regulation, HIPAA for example. Right? Stryker: big company, deep pockets. The reporting is that over two hundred thousand devices were basically shut down. Think about this, two hundred thousand devices. I know I said this the last time we talked about it: even if you have resilient systems and they can be restored, if it took ten minutes to restore each system, think how long that would take.

Speaker 2: Yeah, yeah, and I think that's the point you brought up previously. But you said this is a wiper attack. This isn't what you just described. This is just go in and delete, right?

Speaker 1: Right. It's even worse than ransomware, because there's no intention for you to be able to pay a ransom to get your data back. They've just basically destroyed the data in some form. Could be encrypted, could be deleted, whatever. So now, at best, you have to recover all these systems.
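A quick back-of-envelope calculation makes that rhetorical question concrete. The ten-minutes-per-device figure is the show's own hypothetical; the technician count below is an assumption added purely for illustration, not anything from the reporting:

```python
# Back-of-envelope restore-time estimate, using the show's hypothetical
# figures (200,000 devices, 10 minutes per restore). The technician
# count is an assumed number, purely for illustration.
devices = 200_000
minutes_per_device = 10

total_minutes = devices * minutes_per_device      # 2,000,000 minutes
serial_days = total_minutes / 60 / 24             # restoring one at a time
print(f"One tech, serially: {serial_days:,.0f} days (~{serial_days / 365:.1f} years)")

technicians = 100                                 # assumed parallel workforce
parallel_days = serial_days / technicians
print(f"{technicians} techs in parallel: {parallel_days:.0f} days")
```

Even with a hundred people restoring machines in parallel around the clock, the hands-on work alone runs about two weeks, which is consistent with the multi-week disruption the reporting describes.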
Speaker 1: So here's the latest reporting, Brian, because I know we'll run out of time. "Stryker manufacturing mostly restored after cyberattack." This story is from yesterday. The attack occurred on the eleventh, so for somewhere around fifteen days they've had no or limited capacity. What do you think it costs for fifty-six thousand employees to sit around doing nothing? And what do you think it costs to restore all of this? So my bigger point is, businesses and organizations of all sizes need to be thinking about: what would it cost me for extended downtime, and how can I build systems that, if they're down for whatever reason, are resilient and can recover quickly, so I don't end up out of business? Right? Because there's so much damage I can't recover, or it takes so long to recover that I go out of business. And again, this is a company that makes advanced medical devices. Now, apparently none of the devices they make for their customers were impacted. But again, imagine if fifty-six thousand employees had nothing to do for fifteen days.
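The payroll side of that question can be sketched the same way. The average fully loaded cost per employee below is a pure assumption chosen to illustrate the order of magnitude, not a figure from the reporting:

```python
# Hypothetical idle-payroll estimate for a roughly 15-calendar-day
# outage. The $65,000 average fully loaded annual cost per employee
# is an assumed figure, used only to show the order of magnitude.
employees = 56_000
avg_annual_cost = 65_000          # assumed, per employee, fully loaded
work_days_per_year = 250
outage_work_days = 11             # ~15 calendar days of disruption

daily_cost = avg_annual_cost / work_days_per_year   # $260 per day
idle_payroll = employees * daily_cost * outage_work_days
print(f"Idle payroll alone: ${idle_payroll:,.0f}")
```

Under these assumptions, idle payroll alone lands in the nine-figure range, before counting recovery labor, lost revenue, or reputational damage.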
Speaker 1: Could your organization survive a fifteen-day outage? I don't know, maybe. But we've got to get serious about our defense. We've got to get serious about resiliency. It has never been more important. There's going to be more of this, you can count on it, and we have to start thinking about building systems that are robust and resilient, because you're going to see more of this, and it's only going to cause more societal impact as we get more and more reliant on this digital technology.

Speaker 2: Yeah, get Intrust IT and back up your system. Apparently there are nefarious characters out there. Dave Hatter will continue our conversation. We'll bring it back and talk about this grandmother who was jailed after being improperly identified by artificial intelligence as a fraudster. More with Dave Hatter after I mention Rhino Shield, serving the greater Cincinnati community.

Speaker 1: From the talk station.

Speaker 2: Six forty-one at fifty-five KRC, the talk station. Love this time of the week, talking with Dave Hatter, intrustit dot com.
Thank you for sponsoring a segment and bringing all this great information to us and warning us about the dangers of AI. Got two articles, this one and the next segment. Let's start with a grandma in Tennessee, Dave.

Speaker 1: Yeah, this is a really disturbing story, Brian. A grandma in Tennessee is arrested, which is wild, because of some AI system, some facial recognition. Right, we're talking facial recognition specifically. You and I have been talking about facial recognition for a long, long time, and many of the issues with it: bias in the algorithms, bias in the data, problems like this. So this woman, Angela Lips, spent nearly six months in jail after AI software linked her to North Dakota bank fraud. It's really difficult for me to understand exactly how, as you read through the reporting on this, the police weren't able to do some of the required legwork to see that, frankly, it could not have been her.

Speaker 2: There's always that. But...

Speaker 1: Yeah, so, you know, it says:
She remained in jail for nearly four months without bail while awaiting extradition, charged with four counts of unauthorized use of personal identifying information and four counts of theft. Officers used facial recognition to identify the suspect as Lips. The detective wrote in court documents that she appeared to match the suspect based on facial features, body type, and hairstyle. Lips said no one from the Fargo Police Department contacted her before her arrest. Yeah, and this is where it gets crazy. So then her attorney, Jay Greenwood, said: if the only thing you have is facial recognition, don't you want to dig a little deeper? Yeah, I'd say that's good thinking on her attorney's part. So the reporting says Lips was later released on Christmas Eve, after Greenwood obtained her bank records and presented them to investigators. The records showed Lips was more than twelve hundred miles away in Tennessee at the time the investigator said the fraud occurred. It's difficult for me to understand how it could take almost six months to get that information, and why her attorney would have to do it.
Speaker 2: I am with you one hundred percent. That exact same question was swirling around my melon when you were talking about this.

Speaker 1: But here's the really disturbing part. It goes on to say it gets worse. Oh yeah, it gets worse. Lips is now back home, but the experience had lasting consequences. While in jail she was unable to pay her bills; she lost her home, her car, and her dog. Oh no. She never even got an apology from the Fargo police. Now, this is not the only time something like this has happened. The same story goes on to say that, apparently in October, an AI system mistook a Baltimore high school student's bag of Doritos for a firearm and called the police. Police show up, pull guns on this kid, make him get on his knees, handcuff him, search him, find nothing. And then, Brian, before we run out of time, let me read another headline to you: leaked emails suggest Ring plans to expand quote search party unquote surveillance beyond dogs. This is the thing that caused all the controversy at the Super Bowl, where these awesome Ring cameras are going to help you find your lost dog.
Right? But here you go, leaked emails. I'm reading, quoting from this article: Ring's CEO told staff the feature is quote first for finding dogs unquote, indicating plans to expand. Now couple that: your neighbor's friendly Ring camera drops a dime on you through its bad facial recognition, and the next thing you know, you're going to Fargo, to jail, for some crime you didn't commit, because the Ring AI thinks it's you. Wow. Yeah. So, you know, I'm not a fan of the Internet of Things, aka IoT devices, as you know, Brian. And this is just one more concrete example where I'm trying to help people connect the dots about why we really need to slow down with this stuff. And despite the fact that it may seem like it's good for society, it brings convenience, it's going to catch all these criminals and so forth, we just see example after example. Now, I'm sure Miss Lips will probably end up suing someone, and they'll have to pay, which is going to cost the taxpayers. Right? The taxpayers are the ones that are going to pay her if this happens. And I mean, the whole thing is just sad.
We really need to slow down with this stuff. We need to think about the long-term consequences of it. It's nowhere near perfect, as we see. And, you know, here's yet another example of really negative consequences for a person.

Speaker 2: Law enforcement taking the easy way out. AI recognition of her, or presumption that it is her, was just one data point, one element in what might be probable cause. But you know what, maybe we should check and find out where she was. Maybe we should give her a call and see if she's got an alibi. Maybe she's not the person. That's just lazy law enforcement right there.

Speaker 1: Absolutely lazy, and a little hard to understand. Maybe we hit this Ring thing next week, because I think it's crazy and disturbing. And again, people are buying these devices and plugging them in with little thought to the potential consequences to them and others, because they're cool and they bring convenience.
And yet we see, time and time again, these problems that frankly are often not reported on outside the tech press. And I think as more people start to understand the potential risks and consequences, I'm hoping we will slow down with these things.

Speaker 2: Hey man. Six forty-six at fifty-five KRC, the talk station. Intrustit dot com is where you find Dave Hatter. My wife pointed out our Ring camera is called Liam: Liam, the perpetually barking Doberman. Dave Hatter, real quick here, maybe you can address it next week. I was kind of freaked out by this, going back to your Internet of Things device comment. I don't want to take up too much time, because I know we've got another story to talk about. But the FCC earlier this week said wireless routers sold in the United States should be avoided if they contain anything from foreign nations, a national security determination: routers produced in a foreign country, regardless of the nationality of the producer, pose an unacceptable risk to the national security of the United States of America and to the safety and security of US persons.
And of course, the article points out that all the routers are basically made in foreign countries anyway, so we're left without really a solution to the problem they identified, Dave.

Speaker 1: It's a tricky problem, Brian. And you know, we've talked many times in the past, and yeah, I think we should dig into this more next week, because it's something I've been talking about a lot. You know, people work remotely now, right? A lot. So even if your company is doing all the right things, if people are working remotely, in many cases maybe they've got a corporate-issued notebook or something, but the equipment at these remote sites, whether it's a hotel room, their home office, or Panera, you don't know what that is, right? And the point they're making here is, so much of this equipment comes from offshore. It could have back doors in it. We've talked many times over the years about warnings of Chinese Communist Party hackers in the telecom system, and the potential chaos and destruction that could cause.
You know, if you turn off the telecom system, you might as well just turn off the power grid too, because all this stuff runs over the telecom systems. Right? Yes, it does. So the point they're making here is, you know, we've got to get serious about building stuff that is secure by design, that is coming from trusted manufacturers, where you don't have software with back doors in it that would not only allow spying, but would let the device be remotely shut off. I mean, imagine if a foreign adversary could just suddenly shut off all of the routers that came from their country, or all of the cars made in their country, or whatever it is. Right? We've got to make reshoring of advanced manufacturing critical. We've got to think critical infrastructure. We've got to think national security. And because everything is digital, because this stuff is embedded in everything now, you know, we've got to get focused on this stuff. You know, this has been a concern of mine for a long time. I'm glad that we're starting to try to raise awareness on this.
You know, the Biden administration had come out with a program similar to Energy Star, and I'm drawing a blank on what it is. We talked about it; I can't think of the name. But the idea is that American manufacturers would start to build things, and they would get a sort of certification, like Energy Star, telling you that it's safe. Yeah, as opposed to buying, you know, something you found at Best Buy or Walmart or whatever, that has who knows what in it. So this is going to be costly to fix, this problem. But, you know, again, we have gotten addicted in America to cheap garbage. Yes. And we've got to start thinking, now that everything is quote unquote smart, ha ha, everything is an Internet of Things device. It's all hackable, it's all got software, and the software can have vulnerabilities.
The hardware can have vulnerabilities by design, Brian, by design, to spy on you and to, frankly, at some future point, have a kill switch that could be flipped remotely. And then what? You know, imagine... I don't know how many routers, Wi-Fi routers, switches, that sort of thing, there are out there, the non-enterprise-grade stuff that small businesses are using, that individuals are using, but obviously it's in the tens to hundreds of millions. Imagine if all of those were just remotely shut off one day.

Speaker 2: It's certainly a possibility. You've identified this as a real threat for years now, Dave. And you know, I'm sitting here looking at this article about how we need to reshore this and get safe routers manufactured here, so they're not loaded up with back doors, as you point out. Where's Elon Musk? The man's got a trillion dollars. He has a guaranteed money-making business here, because he's got the money to create a US router-manufacturing company that could be declared safe, and everyone and his brother will line up to buy one because of this.
Speaker 1: In my mind, Brian, one of the single most important things we have to do as a country is reshore advanced manufacturing, so that anything that has software in it can be, and is, made in the United States by trusted manufacturers. Doesn't mean it won't be hacked, doesn't mean it can't be hacked, but at least you don't have to have the fear: is there a back door built into this that would allow a foreign adversary to control it, including turning it off at an inopportune time for you?

Speaker 2: Maybe we've got to get serious about this. Get serious about it; it's critical. Maybe we can dive into this in more detail next week. Maybe you'll have a recommendation on a router out there in the world that's safe. Right now, Dave Hatter, real quick... All right, I'm sorry, brother. Next week. Thanks. Signal ninety-nine and Corey Bowman coming up next: violence in downtown Cincinnati on opening day. Dave Hatter, God bless you, sir. Intrustit dot com. Right back.

Speaker 1: We put the app