Speaker 1: Fifty-five KRC, the Talk Station.

Speaker 2: Fifty-five KRC, the Talk Station. Brian Thomas right here, always looking forward to this moment in time during the fifty-five KRC Morning Show, because it's time for Tech Friday, brought to you and sponsored by a great company, the best in the business in terms of dealing with businesses' computer needs and getting them out of trouble. It's Intrust IT, that is Dave Hatter's company, and intrustit.com is where you find them online. Welcome back, Dave Hatter. Your name came up like eight times yesterday talking to my mom, who's having more struggles with apps. Mom thinks apps are the answer, and I just said, listen, just start from the proposition that you are at a technological disadvantage: you don't understand the tech. Then couple that with the fact that there is a giant cluster of criminals and nefarious types out there who want to exploit you one way or another, either by stealing your money or getting your data. That's what apps help them do. Don't do it. I mean, just don't do it.

Speaker 1: It's a simple default rule.

Speaker 2: I said, unless you can come up with enough pros for downloading the app to negate those two starting points, then it's probably not worth it.

Speaker 3: Well, first off, as always, thanks for having me on. And secondarily, you know, I agree with your basic premise. I tell people this all the time: despite thirty years in tech, more than twenty-five years as a programmer building software, apps if you will, I have the minimum number of apps on my phone. I have an Apple phone. Apple tends to be more privacy- and security-friendly than Android, but they're not without their flaws, folks. It's just a question of can I get into a more privacy- and security-friendly situation, and that's primarily driven by their business model. Apple sells hardware and software; Google basically buys and sells your data.
Speaker 3: As a matter of fact, just to segue, while I was loading something up to get ready for this: "Google to pay $8.25 million to settle lawsuit alleging children's privacy violations." There's just a headline for you that I happened to see. So, back to my phone: I have the apps that came on my Apple phone, and I've probably installed maybe six apps that did not come on the phone. Most of them are related to my job, and I have things like LinkedIn, which I use on a regular basis, primarily related to my job as well. So yes, fewer apps is better, generally speaking, for most people, especially if you're not real privacy- and tech-savvy.

Speaker 2: So there you go. Thank you very much. Mom, I told you, and I knew, Dave, you'd double down on that. I'm just trying to keep her out of trouble, that's all. And you know, there's always another source for information out there that doesn't involve loading your phone up with something, like a simple Internet search: what's Cincinnati weather going to be like over the next several days? Anyway, artificial intelligence, crazy stuff. We used ChatGPT this morning; I had Joe churn out a country music song about a naked guy with a harp that was from the Stack of Stupid. In about two seconds, he had a three-minute song, four verses and a chorus. It was hilarious. He's going to post it on the Morning Show page. But we have deep fakes in the form of explicit content. They're using these AI programs to create pornography, sometimes involving children. Now lawmakers apparently want to do something about it. Dave, can they?

Speaker 3: Well, that's a really good question, thank you. So, when it comes to AI, you and I have been talking about deep fakes for a long, long time now, because it was really one of the first places where you could see significant potential problems coming from this technology. Right, deep fakes have been around for a long time.
Speaker 3: It's the idea that I can create a photo, I can create a video, I can create audio, or some combination of all of the above, that increasingly looks unbelievably real or sounds unbelievably realistic. So again, this concept is not new, but the technology gets better and better and better. And to kind of segue off into a different article here, Wired ran a story recently: Google and OpenAI's chatbots can strip women in photos down to their bikinis. So kind of related, right? Because a deep fake can be: I just want to edit one thing in an existing photo to change its context, change its meaning. Like this WCPO story where they show Ohio State Representative Adam Mathews, who's apparently a Republican, standing near a sign that says JD Vance, and then AI was used to manipulate that photo, and now he's standing next to a Harris sign. And honestly, looking at the photos, there's no way I would be able to tell the difference between the two just looking at them.

Speaker 2: Right. So how do lawmakers come up with legislation that's going to allow them to determine what's real and what's not?

Speaker 3: Well, I don't know, Brian. That's a really good question. I mean, theoretically, you could say that you have to put some kind of watermark in something if you use AI on it. But here's the thing: bad people, especially bad actors offshore in other countries who want to sow chaos and dissension in this country, aren't going to follow those laws. Could you pass a law that says you could be penalized for this? Could you pass a law that tries to hold these companies responsible? You get into the same kind of situation, although it's a little different, because, let's just say, Facebook, TikTok, whatever, isn't necessarily creating certain content; it's allowing people to post it. Whereas these tools are actually creating things from whole cloth. But offhand, other than requiring some kind of disclaimer, watermark, something that would say "this came from some AI platform," I don't really know how you would do that.
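The watermark idea Dave floats, and the reason he doubts it, can be made concrete with a toy example. Here is a minimal Python sketch using Pillow; the file names and label key are hypothetical, not any platform's real provenance scheme. It stamps an "AI-generated" note into a PNG text chunk, then shows his objection in code: a non-compliant party removes the label just by resaving the file.

```python
# Toy sketch of a metadata-style "AI disclaimer" label on an image.
# Hypothetical names throughout; not a real provenance standard.
from PIL import Image
from PIL.PngImagePlugin import PngInfo

img = Image.new("RGB", (640, 480), "gray")   # stand-in for an AI-generated image

meta = PngInfo()
meta.add_text("ai-provenance", "generated by ExampleModel v1")  # the "watermark"
img.save("labeled.png", pnginfo=meta)

print(Image.open("labeled.png").text)   # {'ai-provenance': 'generated by ExampleModel v1'}

# The enforcement problem: anyone ignoring the rule can drop the label
# simply by resaving the pixels without the metadata chunk.
Image.open("labeled.png").save("stripped.png")
print(Image.open("stripped.png").text)  # {} -- the provenance note is gone
```

A label that lives beside the pixels rather than in them survives only as long as everyone handling the file cooperates, which is exactly the point being made about offshore bad actors.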
Speaker 2: Yeah, well, I think we've reached that moment in time we've been talking about, Dave, where, you know, you're not going to believe your own eyes anymore. You can't believe your eyes.

Speaker 3: We're past that. Yeah, those days are over. I mean, you can't assume that anything you see or hear at this point hasn't either been altered or created entirely by AI. And the technology just gets better.

Speaker 2: Well, much in the same way you can't control some idiot from putting a headline on a newspaper saying that ICE detained a five-year-old boy in an ICE raid when that never even happened. People are going to make stuff up, and they're going to do it for their own interests, nefarious or political, or whatever other reason they've got. So just start not believing what you see, and quit taking for granted that everything is real. We'll bring Dave Hatter back: hundreds of millions of audio devices need a patch. Fifty-five KRC, the Talk Station.

Speaker 2: Brian Thomas with intrustit.com's Dave Hatter. Nice figure, Dave Hatter. Joe Strecker put up the link for this discussion on the blog page at fifty-five KRC dot com.

Speaker 1: That... that is your figure, isn't it?

Speaker 3: Are we talking about the photo of me? I haven't looked at your blog yet.

Speaker 2: Yeah?

Speaker 3: I know.

Speaker 2: Now I'm just teasing you and my listening audience. That actually is a real picture of Dave Hatter, you can tell.

Speaker 3: I can't wait to see it. I'll hold it up here while we're chatting.

Speaker 1: Yeah.

Speaker 2: Also, trying to generate traffic over to the fifty-five KRC dot com page so they check you out and can hear what you have to say.

Speaker 1: Anyhow, sorry to take you down that road.
Speaker 2: Hundreds of millions of audio devices apparently need a patch. What's this one all about? Is this kind of a Bluetooth thing?

Speaker 3: Yes, it is. And really it gets back, to some extent, to the point you made about your mom. One of the reasons why the fewer apps you have, and the fewer so-called smart devices you have, the less risk you have is exposure through these kinds of flaws. Okay, so just as a reminder, folks: an app is nothing but software, and a so-called smart device has software in it. You've heard me, and if you pay any attention to this sector at all, people like me, talk about patching your software over and over and over: Patch Tuesday for Microsoft, all those sorts of things. If something has software in it (your car, your doorbell, your refrigerator, your thermostat, whatever), guaranteed, because human beings wrote that software, it will have bugs. It will have flaws that allow bad guys to potentially exploit the software to attack the device, and then potentially other devices. So again, the less of this stuff you have, the less exposure you have to this kind of risk. So in this case, it's Google. Now, as you know, Brian, I generally try to avoid things from Google, so this isn't really a problem for me. But apparently they created something called Fast Pair, which is a protocol that lets you connect your Bluetooth devices to Android devices, Chromebooks, that sort of thing, basically with one tap. And this is the biggest part of this problem in my mind: so much of this stuff does make your life easier and more convenient, to a point, as you and I discuss. I mean, I've been on this earth for fifty-plus years; most of it I didn't have any of this stuff, and I've done just fine. So I understand the wow factor of some of this stuff. I understand the convenience factor of some of this stuff.
Speaker 3: And again, you know, do I want to have to go through fifteen steps to hook up my wireless headphones? No, not if I can just press one button. But pressing the one button requires more software, and more software guarantees more bugs, more flaws, more patching, et cetera. And so my bigger point in all of this is, again: your Ring doorbell, your Nest thermostat, all of these things have software in them. If you don't know how to configure it correctly, if you don't know how to update it, and, probably of equal importance, when they stop putting out updates for it because they've moved on to a new product, there's risk for you. So in this particular case, again, Google Fast Pair: there are millions of devices out there that use this, and apparently there's a fairly significant flaw. Numerous outlets have written about this, so let me just read it: to start the Fast Pair process, a seeker (a phone) sends a message to the provider (an accessory) indicating it wants to pair. The Fast Pair specification states that if the accessory is not in pairing mode, it should disregard the message. Some devices fail to check this, allowing unauthorized devices to start the pairing process. After receiving a reply, an attacker can finish the Fast Pair procedure, establishing a regular Bluetooth pairing.
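The flaw being read out boils down to a missing state check in the accessory's firmware. A minimal Python sketch of that logic, with hypothetical names and none of the real Fast Pair cryptography or Bluetooth stack:

```python
# Minimal model of the described flaw: a provider (accessory) that skips
# the "am I actually in pairing mode?" test answers pairing requests from
# any nearby seeker. Names are hypothetical, for illustration only.
from dataclasses import dataclass

@dataclass
class Accessory:
    in_pairing_mode: bool
    enforces_spec: bool  # compliant firmware checks its own state first

    def handle_pairing_request(self, seeker_id: str) -> str:
        # Per the spec, the request must be ignored unless the accessory
        # has been deliberately put into pairing mode.
        if self.enforces_spec and not self.in_pairing_mode:
            return "ignored"
        # Flawed firmware falls through and replies, letting an
        # unauthorized seeker complete a regular Bluetooth pairing.
        return f"pairing established with {seeker_id}"

compliant = Accessory(in_pairing_mode=False, enforces_spec=True)
flawed = Accessory(in_pairing_mode=False, enforces_spec=False)
print(compliant.handle_pairing_request("attacker-phone"))  # ignored
print(flawed.handle_pairing_request("attacker-phone"))     # pairing established...
```

One missing conditional is the whole bug, which is why it can sit unnoticed in millions of third-party headphones until a vendor ships a firmware patch.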
Speaker 3: Now, is this the greatest flaw in the world? Is a hacker going to be able to sit down the street from you, potentially attack your devices, get into your computer, get into your work environment or your bank, and steal all your money? Probably not, because Bluetooth has a real range limit: you have to be fairly close for a Bluetooth device to work. In most cases it's around thirty-three feet, maybe a little longer, depending on all kinds of environmental conditions. The big thing for me, though, is that this is just another example that shows you: do you think the average person even knows this flaw exists? And if they do, do you think they know their devices might have it? And even if they do, do you think they know how to fix it, or will the device even support fixing it? And the longer you go without fixing a flaw in your devices, whatever that might be, the more exposure you have, because more bad actors will learn about the flaw, and depending on what it is and how it can be accessed, it creates risk for you and your family, your business, your work, et cetera. So if you have Google devices, I suggest you look into this; it would be good to patch these flaws. But again, in many cases this is some sort of third-party headphones or something, so it's a tricky one. But it gets to the heart of what I keep trying to tell people all the time: all of this stuff does not favor the consumer, right? It's speed to market, market share, ease of use. The stuff goes end of life soon, meaning it doesn't get updates from the vendor. You don't know how to set it up right, you don't know how to keep it updated, you don't know when it goes end of life. The less of this stuff you have, until the business model changes, the better off you will be. And this is just yet another of dozens and dozens and dozens of these examples of why the fewer so-called smart devices, apps, et cetera, you have, the better off you are.

Speaker 2: Amen. Underscore, in bold, underline. Tell your friends and neighbors. Six forty-seven right now. One more with Intrust IT's Dave Hatter, then we hear from Corey.

Speaker 2: Six fifty-two, fifty-five KRC, the Talk Station. Brian Thomas with Dave Hatter. Intrustit.com is where you find him, and also on LinkedIn: go to LinkedIn dot com and just search for Dave Hatter. He has all the articles he talks about during this segment of the program, and it's a worthy endeavor to stay up with him throughout the week as well. Chinese hackers, the Chinese Communist Party, up to no good again, Dave Hatter.

Speaker 3: Yeah, I know, this is a shocker, Brian.
Speaker 3: And also, you know, all this stuff we talk about on here I'm sharing on X too, if X is easier for people. I'm trying, throughout the week, to put out as much information about this stuff as I think will be helpful, to help people understand the risks and hopefully make smarter decisions. Because at the end of the day, Brian, if you make a risk-informed decision, you understand what you're signing up for with all this stuff, you read the terms of service and the privacy policy (all of which, as you know, will be eighty pages of legal mumbo-jumbo confusopoly), and you decide to proceed? Well, you're an adult, okay, have at it. That's exactly right. But my issue with all of this stuff at the end of the day, again, is that it does not favor the consumer, as I mentioned in the last segment, and most people don't really understand the risk, both the cybersecurity risk and the privacy risk, of this stuff. So with all of that said, yeah, here we are again. Probably for the last five or six years, you and I have discussed this pretty much relentlessly, because it just keeps hitting the news. So, headline: "Chinese hackers targeting high-value North American critical infrastructure, Cisco says." For people who don't know, Cisco is one of the largest and oldest major tech companies out there. They make a lot of the equipment that powers the Internet and the networking infrastructure that makes all of this work: switches, routers, et cetera. So these are people who know what they're talking about, is my point. Again, Cisco has been around for a long time, and in the business, from an enterprise-grade, professional standpoint, they're one of the vendors chosen most often. So we've heard DHS warn about this, we've heard the FBI warn about this, and we've heard CISA warn about this: CISA, the Cybersecurity and Infrastructure Security Agency, part of DHS.
Speaker 3: I have been trying to explain to people for a long time: when the Internet was originally designed, back in the late sixties and early seventies, the underlying protocols, the technologies that make it work, were not designed with security in mind. They were just glad they could make it work at all. Over time, it expanded, people loaded more and more stuff onto the Internet, and now we basically live in this digital world, all powered by the technology that drives the Internet. I mean, you don't even really hear people say "the Internet" per se anymore. And Internet really just means internetwork: a network connected to another network connected to another network. That's really all it is, all these different networks that are connected. They all use the same protocols, rules, to talk to each other, and all of that stuff, again, was designed back at a time when security was not a concern. So fast forward to now: you've got all this mission-critical stuff, our entire society runs off of this, and while you can apply security to it, typically you're trying to retrofit it in. That creates friction, it creates costs, it causes problems. Right? And you've got a situation, too, where as you stack more and more stuff on this (going back to the point I made before, more and more software, more and more flaws), it creates opportunities for people who really understand how this stuff works to try to exploit these flaws, to try to exploit the inherent vulnerabilities in this stuff, if it's not patched and it's not set up correctly. And this is just another example. So, from this article: Chinese hackers successfully breached multiple critical infrastructure organizations in North America over the last year, using a combination of compromised credentials and exploitable servers, researchers at Cisco Talos found. Talos is their security-focused group: they research this stuff, they look for flaws, they try to help other vendors fix these problems. Now, there's two key points there. Exploitable servers: that's "it's not patched," "it doesn't have the latest updates," "it's not set up correctly," "it has an inherent flaw that can't be patched because it's too old and needs to be replaced." But compromised credentials: this gets back to people using bad passwords. They use the same bad passwords across multiple systems. They don't use multi-factor authentication or other technologies that can make it more difficult to log in; hence, account takeover attacks.
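The multi-factor authentication Dave keeps naming can be as simple as a time-based one-time password alongside the password. A minimal sketch of a standard TOTP check (RFC 6238) in plain Python follows; the secret and parameters are illustrative defaults, not anything from the Cisco report.

```python
# Minimal RFC 6238 TOTP: the six-digit code an authenticator app shows,
# derived from a shared secret and the current 30-second time window.
# A stolen or reused password alone no longer passes this check.
import base64, hashlib, hmac, struct, time

def totp(secret_b32, step=30, digits=6, at=None):
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if at is None else at) // step)
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                     # dynamic truncation (RFC 4226)
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

def verify(secret_b32, submitted):
    # Constant-time compare so the check itself doesn't leak information.
    return hmac.compare_digest(totp(secret_b32), submitted)

secret = "JBSWY3DPEHPK3PXP"            # illustrative base32 secret
print(totp(secret))                    # what the authenticator app displays
print(verify(secret, totp(secret)))    # True only within the current window
```

The code changes every thirty seconds, so credentials phished or bought from a breach dump go stale almost immediately, which is exactly why the article singles out accounts without MFA.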
Speaker 3: So even though I talk about this stuff until I'm blue in the face, you see these examples out in the real world. Now, is it bad, Brian, if hackers do an account takeover because you have a bad password and no MFA, get into your company email, and eventually steal all your money and put you out of business? Yeah, that's pretty bad. But is it even worse when they can get into the electrical grid, or the water systems, or, let's say, some railroad somewhere, and cause a train full of chlorine gas to derail? Yeah, that's a lot worse.

Speaker 2: We call that a rhetorical question. Dave, we're out of time, my friend. You want to sum it up? The critical infrastructure is critical.

Speaker 3: We've got to fix it, Brian. Got to fix it.

Speaker 2: Priority number one for all elected officials, from local to federal. That's the bottom line. Yes. Dave Hatter, intrustit.com, where you find him and his crew for your business computer needs. Find him at LinkedIn dot com, and follow him on X to get all the articles and documents, and tune in every Friday at six-thirty for, uh, well, some sound advice. Dave, thank you very much, have a great weekend, stay safe, and we'll look forward to next Friday. Again, coming up: Corey Bowman with, well, stuff and