1 00:00:00,200 --> 00:00:07,040 Speaker 1: Chuck Ingram, fifty five KRC, the talk station. Six thirty 2 00:00:07,080 --> 00:00:08,959 Speaker 1: on a Friday. It's that time of the week I 3 00:00:09,000 --> 00:00:11,160 Speaker 1: always look forward to. I always call it appointment listening, 4 00:00:11,160 --> 00:00:13,520 Speaker 1: and I beg my listeners, please, dear God, listen to 5 00:00:13,560 --> 00:00:15,680 Speaker 1: what Dave Hatter from Intrust IT has to say. 6 00:00:16,360 --> 00:00:18,200 Speaker 1: Intrust IT dot com's where you find Dave and the 7 00:00:18,239 --> 00:00:22,279 Speaker 1: crew for your business computer needs, and he's on the 8 00:00:22,280 --> 00:00:24,439 Speaker 1: program every Friday to try to keep us out of trouble. 9 00:00:24,520 --> 00:00:26,560 Speaker 1: I'm so happy. Dave, welcome back to the Morning Show, 10 00:00:26,560 --> 00:00:30,440 Speaker 1: Happy Friday. You have the seven thousand robot vacuum article. 11 00:00:30,480 --> 00:00:32,000 Speaker 1: I brought this up early in the week and I 12 00:00:32,240 --> 00:00:34,280 Speaker 1: mentioned you right out of the gate. I said, see, 13 00:00:34,520 --> 00:00:37,280 Speaker 1: Dave Hatter's right, listen to him. Don't go Internet of 14 00:00:37,320 --> 00:00:40,400 Speaker 1: Things stuff, because they suck. Here's a terrible illustration of 15 00:00:40,440 --> 00:00:43,000 Speaker 1: exactly what you're talking about. Fortunately, at least this guy 16 00:00:43,120 --> 00:00:45,680 Speaker 1: was honest about it and didn't use it for nefarious purposes. 17 00:00:45,720 --> 00:00:47,560 Speaker 1: Let my listeners know. Dave, welcome back. 18 00:00:48,240 --> 00:00:52,760 Speaker 2: Yeah, Brian, great, great intro, as always, thanks for having me, 19 00:00:52,800 --> 00:00:55,640 Speaker 2: and yeah, I've been on this particular story now for 20 00:00:55,680 --> 00:00:59,200 Speaker 2: several days, posted it all over my social media, and 21 00:00:59,320 --> 00:01:02,400 Speaker 2: you know, I will frequently try to share these stories 22 00:01:02,440 --> 00:01:05,280 Speaker 2: and use them to show people why the Internet of 23 00:01:05,319 --> 00:01:09,920 Speaker 2: Things, aka so-called smart devices, which are actually 24 00:01:10,040 --> 00:01:13,440 Speaker 2: rather dumb, as evidenced yet again by this story, to 25 00:01:13,560 --> 00:01:16,039 Speaker 2: just point out why you don't want this stuff. So 26 00:01:16,160 --> 00:01:18,880 Speaker 2: before I get into the details, let me just simplify 27 00:01:18,920 --> 00:01:22,399 Speaker 2: this for everyone. Almost all of these so-called smart 28 00:01:22,400 --> 00:01:25,600 Speaker 2: devices come from China. They're full of software that no 29 00:01:25,640 --> 00:01:27,880 Speaker 2: one has vetted. They could have back doors and, if 30 00:01:27,880 --> 00:01:33,000 Speaker 2: nothing else, all kinds of security vulnerabilities. The software typically will 31 00:01:33,040 --> 00:01:36,680 Speaker 2: not be managed or maintained anyway for more 32 00:01:36,720 --> 00:01:38,440 Speaker 2: than a couple of years as they move on to 33 00:01:38,520 --> 00:01:40,800 Speaker 2: a new product. So you know, we've been doing this 34 00:01:40,840 --> 00:01:42,279 Speaker 2: show for over ten years, Brian. 35 00:01:42,640 --> 00:01:43,480 Speaker 3: It's not just me. 36 00:01:43,680 --> 00:01:47,360 Speaker 2: Look at the DHS, FBI, Microsoft, Google, Apple, anyone will 37 00:01:47,360 --> 00:01:48,080 Speaker 2: tell you.
38 00:01:48,080 --> 00:01:49,760 Speaker 3: You know, software vulnerabilities are 39 00:01:49,680 --> 00:01:51,200 Speaker 2: one of the ways bad guys get in. You have 40 00:01:51,200 --> 00:01:53,520 Speaker 2: to patch your software. Well, when they stop making updates 41 00:01:53,520 --> 00:01:55,480 Speaker 2: for it, how do you patch it, right? Now 42 00:01:55,480 --> 00:01:58,200 Speaker 2: you're in the legacy bucket. People don't understand this. They 43 00:01:58,200 --> 00:02:00,160 Speaker 2: don't know how to configure these things correctly. But the 44 00:02:00,200 --> 00:02:02,640 Speaker 2: big problem, besides the fact that almost all of this 45 00:02:02,840 --> 00:02:06,680 Speaker 2: is coming from China, is that it's not designed with 46 00:02:06,760 --> 00:02:08,760 Speaker 2: your privacy and security in mind, and in fact, in 47 00:02:08,800 --> 00:02:13,639 Speaker 2: many cases it's the opposite. It's designed with speed to market, 48 00:02:13,680 --> 00:02:16,200 Speaker 2: market share, ease of use, and data collection in mind. So 49 00:02:16,280 --> 00:02:19,480 Speaker 2: this, this particular guy, again, Popular Science magazine. This isn't 50 00:02:19,520 --> 00:02:21,760 Speaker 2: like tin foil hat magazine 51 00:02:21,320 --> 00:02:22,079 Speaker 3: or something like that. 52 00:02:22,520 --> 00:02:24,800 Speaker 1: Listen, somebody sent me a meme about this. 53 00:02:24,840 --> 00:02:26,400 Speaker 1: I didn't believe it at first, because I don't believe 54 00:02:26,400 --> 00:02:29,760 Speaker 1: anything anymore. So I immediately started searching and found, yes, 55 00:02:29,840 --> 00:02:31,799 Speaker 1: this is actually a genuine thing. And one of the 56 00:02:31,840 --> 00:02:34,200 Speaker 1: reasons is I gave Popular Science at least credit for some 57 00:02:34,240 --> 00:02:36,639 Speaker 1: measure of credibility. They were the ones reporting on it. Anyway, 58 00:02:36,639 --> 00:02:37,200 Speaker 1: go ahead, Dave. 59 00:02:37,760 --> 00:02:40,960 Speaker 2: So this particular guy is a software engineer, so he's 60 00:02:41,000 --> 00:02:43,600 Speaker 2: got some technical skill, right. He bought a, now wait 61 00:02:43,639 --> 00:02:46,480 Speaker 2: for it, two thousand dollar robot vacuum. I can't imagine why 62 00:02:46,480 --> 00:02:48,799 Speaker 2: you'd pay two thousand dollars for one of these things. 63 00:02:49,080 --> 00:02:50,880 Speaker 1: People who want a six thousand dollar toilet, 64 00:02:50,960 --> 00:02:51,160 Speaker 3: Dave. 65 00:02:51,560 --> 00:02:54,000 Speaker 2: Yeah, you know, Brian, I wouldn't pay two dollars for 66 00:02:54,040 --> 00:02:57,320 Speaker 2: one of these. But nevertheless, he buys one of these 67 00:02:57,600 --> 00:03:00,200 Speaker 2: and he decides that it would be fun as a 68 00:03:00,240 --> 00:03:03,680 Speaker 2: little side project for himself to try to steer the 69 00:03:03,720 --> 00:03:07,640 Speaker 2: thing around using a video game controller. Right, right. So 70 00:03:07,760 --> 00:03:10,200 Speaker 2: he's going to reverse engineer it. And as he gets 71 00:03:10,240 --> 00:03:13,120 Speaker 2: into it and starts to fool with it, he realizes 72 00:03:13,200 --> 00:03:16,480 Speaker 2: that the robot is talking to the company that makes it, 73 00:03:16,600 --> 00:03:20,000 Speaker 2: I've never heard of them, talks to their cloud servers, right? 74 00:03:20,040 --> 00:03:24,000 Speaker 2: So it's connected to the internet through his Wi-Fi.
Obviously, 75 00:03:24,000 --> 00:03:26,839 Speaker 2: the thing moves around, it doesn't have a cord necessarily, 76 00:03:27,440 --> 00:03:30,320 Speaker 2: and it's communicating back to their servers. Now, one 77 00:03:30,360 --> 00:03:33,320 Speaker 2: could ask why does the robot vacuum, which is just 78 00:03:33,320 --> 00:03:36,480 Speaker 2: sweeping, need to communicate with their servers? Well, you know, 79 00:03:36,640 --> 00:03:39,600 Speaker 2: one answer, one legitimate answer: well, you want it 80 00:03:39,640 --> 00:03:42,480 Speaker 2: to get software updates. But you know, it might be 81 00:03:42,560 --> 00:03:44,600 Speaker 2: mapping your house. Who knows what it's doing? Does it 82 00:03:44,680 --> 00:03:46,920 Speaker 2: have a camera and a microphone? Is it listening to you? 83 00:03:46,960 --> 00:03:47,480 Speaker 2: Who knows? 84 00:03:47,760 --> 00:03:47,960 Speaker 3: Right? 85 00:03:48,480 --> 00:03:52,600 Speaker 2: So he gets into it and he discovers that, now 86 00:03:52,640 --> 00:03:54,960 Speaker 2: I'm reading directly from the article: while building his own 87 00:03:54,960 --> 00:03:58,800 Speaker 2: remote control app, this guy reportedly used an AI coding 88 00:03:58,800 --> 00:04:01,720 Speaker 2: assistant to help reverse engineer how the robot communicated with 89 00:04:01,760 --> 00:04:04,680 Speaker 2: their cloud servers. He soon discovered that the same credentials, 90 00:04:05,040 --> 00:04:07,800 Speaker 2: username and password, this is a common problem, 91 00:04:07,880 --> 00:04:10,800 Speaker 2: in the business we call it secrets embedded in the code, a username 92 00:04:10,840 --> 00:04:13,560 Speaker 2: and password in the code so that it can 93 00:04:13,560 --> 00:04:16,000 Speaker 2: talk to their servers. He discovers the same credentials that 94 00:04:16,040 --> 00:04:19,080 Speaker 2: allow him to see and control his own device also 95 00:04:19,160 --> 00:04:23,800 Speaker 2: provided access to live camera feeds, microphone audio, maps, and 96 00:04:23,880 --> 00:04:28,600 Speaker 2: status data from nearly seven thousand other vacuums across twenty 97 00:04:28,640 --> 00:04:33,720 Speaker 2: four countries. Now this guy, he reports it. Imagine if 98 00:04:33,760 --> 00:04:37,360 Speaker 2: this were a hacker who wanted to steal your 99 00:04:37,400 --> 00:04:41,320 Speaker 2: information or wanted to use their access to this device 100 00:04:41,400 --> 00:04:44,240 Speaker 2: to get to your other devices or possibly eventually to 101 00:04:44,279 --> 00:04:45,640 Speaker 2: your bank or to your work. 102 00:04:45,760 --> 00:04:47,919 Speaker 1: Well, you can mention all that, but they have access 103 00:04:47,960 --> 00:04:50,760 Speaker 1: to the cameras on the things. They could spy on you. 104 00:04:50,800 --> 00:04:53,600 Speaker 1: Maybe some creepy pedophile wants to look at your twelve 105 00:04:53,680 --> 00:04:54,880 Speaker 1: year old daughter or something. 106 00:04:55,960 --> 00:05:00,799 Speaker 2: Yes, or again, imagine if this weren't just a robot vacuum. 107 00:05:01,160 --> 00:05:05,800 Speaker 2: Imagine if it were, for example, a car that adversarial 108 00:05:05,839 --> 00:05:09,320 Speaker 2: hackers, a nation state actor like China, could take over, 109 00:05:09,680 --> 00:05:11,839 Speaker 2: and the day that they decide to launch an attack 110 00:05:11,920 --> 00:05:15,040 Speaker 2: on Taiwan, they make every one of those cars missiles.
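To make the "secrets embedded in the code" point concrete, here is a minimal, hypothetical sketch in Python, not the vendor's actual code; the endpoint, credential values, and function names are all invented. The first function shows the anti-pattern Dave describes, one shared username and password baked into every unit's software, which is why a single extracted credential could reach thousands of other devices. The second shows the textbook alternative, a per-device secret that never appears in the source code.

import os
import requests  # any HTTP client would do; used here only for illustration

# Anti-pattern: one shared secret hardcoded into every unit's software.
# Anyone who pulls these strings out of the firmware or companion app can
# authenticate to the cloud API as ANY device in the fleet.
HARDCODED_USERNAME = "robot-fleet"       # hypothetical value
HARDCODED_PASSWORD = "factory-default"   # hypothetical value

def fetch_camera_feed_insecure(device_id: str) -> bytes:
    resp = requests.get(
        f"https://cloud.example-vendor.com/devices/{device_id}/camera",  # hypothetical endpoint
        auth=(HARDCODED_USERNAME, HARDCODED_PASSWORD),
        timeout=10,
    )
    resp.raise_for_status()
    return resp.content

# Safer sketch: a credential unique to this one device, provisioned at setup
# and read from the environment or a secure element, never from source code.
# Leaking it exposes one vacuum, not seven thousand.
def fetch_camera_feed_safer(device_id: str) -> bytes:
    token = os.environ["DEVICE_API_TOKEN"]
    resp = requests.get(
        f"https://cloud.example-vendor.com/devices/{device_id}/camera",
        headers={"Authorization": f"Bearer {token}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.content

The specific API here is made up; the point is that a credential shared across every unit and recoverable from the code gives whoever finds it the same fleet-wide access the engineer stumbled onto.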
111 00:05:15,800 --> 00:05:20,360 Speaker 1: They could, I guess, drive them into power stations, right? Anything. 112 00:05:20,839 --> 00:05:24,159 Speaker 2: This article, and this is not the first time something 113 00:05:24,240 --> 00:05:26,559 Speaker 2: like this has come up, this is just the most 114 00:05:26,600 --> 00:05:29,440 Speaker 2: recent time. When I share these things, I'll usually come 115 00:05:29,520 --> 00:05:32,160 Speaker 2: up with some sort of snarky comment like reason number 116 00:05:32,200 --> 00:05:36,360 Speaker 2: one million and seven why you don't want this garbage. Okay? So, 117 00:05:37,279 --> 00:05:41,880 Speaker 2: yet again, yet again, we have an example of these 118 00:05:41,880 --> 00:05:46,840 Speaker 2: things being built in a way that they are inherently insecure. 119 00:05:47,120 --> 00:05:50,839 Speaker 2: I mean, even the most basic person, even the most basic 120 00:05:50,920 --> 00:05:56,440 Speaker 2: cybersecurity person, knows you should not store credentials, usernames, passwords, tokens, 121 00:05:56,800 --> 00:06:00,159 Speaker 2: anything that would give you access like that, in the code. 122 00:06:00,440 --> 00:06:02,599 Speaker 2: This is like one oh one type stuff, right? But 123 00:06:02,880 --> 00:06:06,080 Speaker 2: it comes up over and over again. Now, I'm not 124 00:06:06,200 --> 00:06:09,280 Speaker 2: suggesting in this case this was done in a nefarious way. 125 00:06:09,839 --> 00:06:13,360 Speaker 2: The problem is many software engineers are not trained on 126 00:06:13,400 --> 00:06:16,560 Speaker 2: security at all. They don't care about security. They care 127 00:06:16,560 --> 00:06:19,520 Speaker 2: about, again: Did I get it done on budget? Did 128 00:06:19,560 --> 00:06:22,120 Speaker 2: I get it done on time? These companies do not 129 00:06:22,279 --> 00:06:25,520 Speaker 2: care about your privacy or security. They care about selling 130 00:06:25,520 --> 00:06:28,960 Speaker 2: you their cheap, garbage spyware. And yet here is another 131 00:06:29,080 --> 00:06:32,320 Speaker 2: example of it. Well documented, you, your own para... 132 00:06:33,200 --> 00:06:35,360 Speaker 1: Gotta take a break. We're way over time and Joe 133 00:06:35,360 --> 00:06:37,960 Speaker 1: Strecker is yelling in my headset. We'll continue with Dave Hatter, 134 00:06:38,040 --> 00:06:41,320 Speaker 1: Intrust IT dot com. Chimney Care Fireplaces, though, I 135 00:06:41,320 --> 00:06:43,480 Speaker 1: always say they do literally everything related to chimneys and 136 00:06:43,480 --> 00:06:45,080 Speaker 1: fireplaces, to, you know, a lot of the second. 137 00:06:44,839 --> 00:06:46,159 Speaker 3: The talk station. 138 00:06:48,560 --> 00:06:51,280 Speaker 1: Fifty five KRC, the talk station. We call it 139 00:06:51,320 --> 00:06:53,760 Speaker 1: Tech Friday with Dave Hatter. Listen to what Dave has 140 00:06:53,800 --> 00:06:57,440 Speaker 1: to say, including when he says no, I'm a no. 141 00:06:57,480 --> 00:07:00,520 Speaker 1: There's a quote.
I saw this article about the mayor 142 00:07:00,560 --> 00:07:03,120 Speaker 1: of Fort Wright, and they were discussing the contract with 143 00:07:03,240 --> 00:07:07,600 Speaker 1: Flock Safety, which aggregates your personal Ring camera doorbells and 144 00:07:07,640 --> 00:07:10,240 Speaker 1: consolidates them for the purposes of safety, and the mayor 145 00:07:10,320 --> 00:07:13,240 Speaker 1: over there, some guy named Dave Hatter, said, I'm a 146 00:07:13,240 --> 00:07:15,760 Speaker 1: no. Nothing you say will convince me. Unless you all 147 00:07:15,800 --> 00:07:19,120 Speaker 1: have a super majority, this is dead. I will veto this. 148 00:07:19,240 --> 00:07:21,240 Speaker 1: I think we're talking about the same kind of concept here, 149 00:07:21,240 --> 00:07:22,840 Speaker 1: aren't we, Dave Hatter, mayor? 150 00:07:23,760 --> 00:07:26,320 Speaker 2: Oh Brian, we are talking about the same kind of concept, 151 00:07:26,920 --> 00:07:29,720 Speaker 2: which is my issue with this. So you know, Flock 152 00:07:29,760 --> 00:07:32,160 Speaker 2: cameras have popped up all around the country. They've come 153 00:07:32,240 --> 00:07:35,200 Speaker 2: under a lot of fire. In fact, you may recall 154 00:07:35,280 --> 00:07:37,880 Speaker 2: after the Super Bowl, Flock and Ring had some 155 00:07:38,000 --> 00:07:40,200 Speaker 2: kind of partnership too, and then the Ring ad 156 00:07:40,200 --> 00:07:44,000 Speaker 2: that freaked everyone out with the lost dog finding capability 157 00:07:44,520 --> 00:07:47,120 Speaker 2: has thrown cold water on that. And you know, I've 158 00:07:47,200 --> 00:07:49,920 Speaker 2: done a lot of research on this over time, but 159 00:07:51,360 --> 00:07:54,200 Speaker 2: I have fundamental issues with the concept of this and 160 00:07:54,240 --> 00:07:58,239 Speaker 2: the idea of this Orwellian digital surveillance state we're building, 161 00:07:58,320 --> 00:08:00,520 Speaker 2: all under the guise of convenience and safety. 162 00:08:01,360 --> 00:08:03,240 Speaker 3: I know we'll run out of time because 163 00:08:02,960 --> 00:08:04,880 Speaker 2: I can get ranting on this, just like I did 164 00:08:05,000 --> 00:08:07,840 Speaker 2: the last segment, and it's a great segue from the last 165 00:08:07,840 --> 00:08:09,720 Speaker 2: segment about the Internet of Things. I mean, these are 166 00:08:09,760 --> 00:08:13,520 Speaker 2: Internet of Things devices, and in general, folks, anything that 167 00:08:13,640 --> 00:08:15,920 Speaker 2: is smart, anything that is connected to the Internet, is 168 00:08:15,960 --> 00:08:18,800 Speaker 2: an Internet of Things device in the most general sense. Right? 169 00:08:20,200 --> 00:08:24,560 Speaker 2: There's a guy, I think it was in Wired magazine, 170 00:08:24,560 --> 00:08:26,640 Speaker 2: I don't have it in front of me, who went out 171 00:08:26,720 --> 00:08:28,680 Speaker 2: and was able to hack one of these things and 172 00:08:28,800 --> 00:08:30,840 Speaker 2: actually see himself standing in 173 00:08:30,760 --> 00:08:31,560 Speaker 3: front of the camera. 174 00:08:32,160 --> 00:08:33,920 Speaker 2: And I'm pretty sure it's in 175 00:08:33,960 --> 00:08:38,160 Speaker 2: Wired magazine. I'll post it in my notes today.
But 176 00:08:38,480 --> 00:08:40,440 Speaker 2: the way these things work and all the back end 177 00:08:40,520 --> 00:08:44,280 Speaker 2: surveillance and so forth, I totally understand the argument of 178 00:08:44,320 --> 00:08:45,960 Speaker 2: why people think these are good. 179 00:08:46,320 --> 00:08:46,520 Speaker 3: Right. 180 00:08:47,000 --> 00:08:50,200 Speaker 2: Hey, it might help us find a lost person, it 181 00:08:50,240 --> 00:08:51,760 Speaker 2: may help us capture a criminal. 182 00:08:52,200 --> 00:08:54,040 Speaker 3: I get all of that. Okay. 183 00:08:54,559 --> 00:08:57,880 Speaker 2: However, I'm going to go back to our boy Benjamin Franklin, 184 00:08:58,160 --> 00:09:00,400 Speaker 2: who said, you know, he who gives up 185 00:09:00,440 --> 00:09:03,120 Speaker 2: liberty for safety deserves neither. I'm paraphrasing. 186 00:09:03,520 --> 00:09:04,120 Speaker 3: I mean, we are. 187 00:09:04,120 --> 00:09:07,800 Speaker 2: We are building a digital surveillance state, and people like 188 00:09:07,920 --> 00:09:11,520 Speaker 2: the ACLU, the EFF, most people who have spent any 189 00:09:11,559 --> 00:09:18,280 Speaker 2: time really studying these things are not fans, right? Now, again, 190 00:09:18,320 --> 00:09:20,360 Speaker 2: I get it, a lot of your listeners may not like the ACLU. 191 00:09:20,440 --> 00:09:22,480 Speaker 2: I don't agree with them on virtually anything. Well, just 192 00:09:22,520 --> 00:09:25,000 Speaker 2: go read what they tell you about this, okay? Again, 193 00:09:25,080 --> 00:09:27,000 Speaker 2: take any of the politics out of it for a second. 194 00:09:27,240 --> 00:09:30,880 Speaker 2: Electronic Frontier Foundation, just read what they have to say 195 00:09:30,880 --> 00:09:34,800 Speaker 2: about these things. So this came up, you know. Typically, 196 00:09:35,240 --> 00:09:37,760 Speaker 2: I am very fortunate that on our city council we 197 00:09:37,840 --> 00:09:40,920 Speaker 2: have six members and we all generally see things the 198 00:09:41,000 --> 00:09:43,400 Speaker 2: same way. We have a simple objective here in Fort 199 00:09:43,480 --> 00:09:45,120 Speaker 2: Wright: for the most part, we want to leave 200 00:09:45,160 --> 00:09:46,960 Speaker 2: you alone and let you keep your own money. That's what 201 00:09:46,960 --> 00:09:48,920 Speaker 2: we're trying to do here. That's what we've been doing 202 00:09:48,960 --> 00:09:51,760 Speaker 2: for a long time. For anyone that's 203 00:09:51,800 --> 00:09:53,600 Speaker 2: looking to live in a place where you'll be left 204 00:09:53,640 --> 00:09:55,959 Speaker 2: alone and get to keep most of your own money, Fort 205 00:09:56,280 --> 00:09:58,200 Speaker 2: Wright is your place. Our tax rate is the lowest 206 00:09:58,240 --> 00:09:59,400 Speaker 2: it's been since two thousand and eight, Brian. 207 00:09:59,440 --> 00:10:03,440 Speaker 1: A libertarian dream right there. We trusted with Zippra, 208 00:10:03,480 --> 00:10:06,120 Speaker 1: we're trusted with your money, we'll leave you alone. Welcome 209 00:10:06,200 --> 00:10:06,880 Speaker 1: to Fort Wright. 210 00:10:07,400 --> 00:10:11,480 Speaker 2: Yeah, exactly. So this came up again. I understand why 211 00:10:11,600 --> 00:10:13,640 Speaker 2: some people were interested. So you know, I did a 212 00:10:13,679 --> 00:10:15,720 Speaker 2: lot of research. I sent out a lot of information.
213 00:10:16,200 --> 00:10:17,640 Speaker 2: One of the things, and I think you and I 214 00:10:17,679 --> 00:10:19,559 Speaker 2: have talked about this over the years, and again I'll 215 00:10:19,559 --> 00:10:23,240 Speaker 2: put all this in my show notes: Frank Church, the 216 00:10:23,320 --> 00:10:27,040 Speaker 2: Church Committee back in the seventies, okay? He was a 217 00:10:27,120 --> 00:10:30,959 Speaker 2: US senator. There were some potential abuses of the surveillance 218 00:10:30,960 --> 00:10:34,080 Speaker 2: state at that time, and there was this big committee. 219 00:10:34,320 --> 00:10:37,080 Speaker 2: Now there's a video from the seventies of him talking 220 00:10:37,080 --> 00:10:39,120 Speaker 2: about this, and this is, it's all well documented if 221 00:10:39,160 --> 00:10:41,240 Speaker 2: you know where to look. Most people don't know about this. 222 00:10:42,440 --> 00:10:44,920 Speaker 2: There's a video, and the subject of the video is 223 00:10:45,000 --> 00:10:48,040 Speaker 2: Senator Frank Church. Quote: there is nowhere to hide from 224 00:10:48,040 --> 00:10:51,959 Speaker 2: government surveillance, from the government's surveillance state, unquote. That was 225 00:10:52,040 --> 00:10:56,199 Speaker 2: nineteen seventy five, long before any of this stuff. Now, think 226 00:10:56,280 --> 00:10:58,720 Speaker 2: about the capabilities today. Think about what was exposed by 227 00:10:58,840 --> 00:11:01,440 Speaker 2: Edward Snowden. And before we run out of time, there's 228 00:11:01,480 --> 00:11:03,959 Speaker 2: a great quote from Snowden. I use it all the time. 229 00:11:04,720 --> 00:11:06,920 Speaker 2: Arguing that you don't care about privacy because you have 230 00:11:07,000 --> 00:11:09,080 Speaker 2: nothing to hide is no different than saying you don't 231 00:11:09,120 --> 00:11:11,559 Speaker 2: care about free speech because you have nothing to say. 232 00:11:12,040 --> 00:11:17,520 Speaker 2: Edward Snowden. I am just fundamentally against this stuff. Again, 233 00:11:17,679 --> 00:11:21,000 Speaker 2: all the arguments for it, I understand. But when I 234 00:11:21,040 --> 00:11:23,000 Speaker 2: weigh it out in my mind and I see where 235 00:11:23,040 --> 00:11:25,679 Speaker 2: we're going, and then we see yet another example of 236 00:11:25,760 --> 00:11:29,479 Speaker 2: Internet of Things devices being used in ways you didn't anticipate, 237 00:11:29,600 --> 00:11:34,800 Speaker 2: being hacked, being controlled by foreign nation state actors. Imagine 238 00:11:34,840 --> 00:11:38,720 Speaker 2: if, again, the day China decides to attack Taiwan, 239 00:11:39,240 --> 00:11:42,280 Speaker 2: in addition to all kinds of other attacks on the grid, 240 00:11:42,360 --> 00:11:45,040 Speaker 2: attacks on the IoT, they were able to use these 241 00:11:45,080 --> 00:11:48,640 Speaker 2: cameras for surveillance purposes, to launch other types of attacks 242 00:11:48,640 --> 00:11:49,200 Speaker 2: of some sort. 243 00:11:49,400 --> 00:11:51,440 Speaker 1: Certainly, I mean, we can envision that easily. 244 00:11:52,080 --> 00:11:54,680 Speaker 2: We're throwing so much of this stuff into our environment 245 00:11:54,880 --> 00:11:57,560 Speaker 2: with little thought to privacy or security. Much of 246 00:11:57,600 --> 00:12:00,679 Speaker 2: it is garbage, and for all of those reasons, I'm like, 247 00:12:00,880 --> 00:12:05,000 Speaker 2: you know, sorry, guys, there's no convincing me. I'm against this.
248 00:12:05,360 --> 00:12:07,640 Speaker 2: We're not doing it as long as I have a say. 249 00:12:07,480 --> 00:12:08,400 Speaker 3: Well, end of story. 250 00:12:08,600 --> 00:12:11,320 Speaker 1: In the name of security, we're giving up security. There's 251 00:12:11,320 --> 00:12:13,760 Speaker 1: an interesting test on this. Let's continue with Dave Hatter 252 00:12:13,800 --> 00:12:17,640 Speaker 1: and find out, if you want to. Fifty five KRC, the talk station. 253 00:12:17,679 --> 00:12:19,360 Speaker 1: Fifty five KRC, the talk station. 254 00:12:19,440 --> 00:12:23,360 Speaker 1: Happy Friday. One more segment with Dave Hatter. Intrust IT dot 255 00:12:23,400 --> 00:12:25,000 Speaker 1: com. Get Dave and the crew on the job for your 256 00:12:25,000 --> 00:12:27,440 Speaker 1: business computer needs. Business Courier says they are the best 257 00:12:27,440 --> 00:12:29,679 Speaker 1: in the business. Dave Hatter, we've got one more thing 258 00:12:29,720 --> 00:12:32,760 Speaker 1: to talk about, and that is artificial intelligence making work 259 00:12:32,800 --> 00:12:33,480 Speaker 1: more efficient. 260 00:12:34,559 --> 00:12:37,840 Speaker 2: Yeah, so real quick, Brian, I just want to say, 261 00:12:38,080 --> 00:12:41,000 Speaker 2: when you understand how to use these AI tools that 262 00:12:41,040 --> 00:12:43,439 Speaker 2: are out there, and you understand what they can and 263 00:12:43,520 --> 00:12:46,400 Speaker 2: can't do, right, the hallucinations and all that sort of 264 00:12:46,400 --> 00:12:52,280 Speaker 2: thing, you can vastly improve your productivity individually. As 265 00:12:52,280 --> 00:12:56,199 Speaker 2: an organization, you know, we use these things internally at 266 00:12:56,200 --> 00:13:01,960 Speaker 2: Intrust in a carefully designed way. You know, we're not just 267 00:13:02,200 --> 00:13:05,080 Speaker 2: throwing everything at it and hoping that it produces correct 268 00:13:05,080 --> 00:13:07,559 Speaker 2: results and that sort of thing. Again, you've got hallucinations, 269 00:13:07,559 --> 00:13:09,760 Speaker 2: you have privacy issues with the data you're putting in 270 00:13:09,800 --> 00:13:13,680 Speaker 2: these things. These things, to some extent, can be solved. 271 00:13:13,760 --> 00:13:15,760 Speaker 2: And again, I just want to be clear. When you 272 00:13:16,440 --> 00:13:18,640 Speaker 2: understand how to use these tools, which, by the way, 273 00:13:18,679 --> 00:13:21,920 Speaker 2: is something we can help you with, it can be 274 00:13:21,960 --> 00:13:25,559 Speaker 2: a huge productivity booster. But there's so much hype and 275 00:13:25,640 --> 00:13:27,840 Speaker 2: hyperbole around all of this, and has been now for 276 00:13:27,880 --> 00:13:29,560 Speaker 2: a couple of years. It's hard to know who to 277 00:13:29,600 --> 00:13:32,680 Speaker 2: believe, what to think about any of this stuff. And 278 00:13:32,720 --> 00:13:35,040 Speaker 2: then, you know, recently some of the bloom has come 279 00:13:35,080 --> 00:13:37,840 Speaker 2: off the rose on some of these things. For example, 280 00:13:38,040 --> 00:13:40,720 Speaker 2: here was an article from Inc. magazine. It turns out 281 00:13:40,760 --> 00:13:44,480 Speaker 2: AI agents can't replace experienced humans. That's just one of 282 00:13:44,520 --> 00:13:48,920 Speaker 2: many articles.
IBM recently announced they're going to triple their 283 00:13:49,040 --> 00:13:52,439 Speaker 2: entry level hiring, with the idea that, because one of 284 00:13:52,480 --> 00:13:54,680 Speaker 2: the arguments has been, okay, entry level jobs are going 285 00:13:54,679 --> 00:13:57,480 Speaker 2: away because of this stuff, IBM's theory is we're going 286 00:13:57,520 --> 00:13:59,920 Speaker 2: to triple this and we're going to augment these people 287 00:14:00,200 --> 00:14:03,160 Speaker 2: with AI tools in ways that make sense and work. 288 00:14:05,800 --> 00:14:09,080 Speaker 2: So again, there's so much hype around this. Here's another 289 00:14:09,240 --> 00:14:12,400 Speaker 2: article from Inc. magazine. A new report says AI layoffs 290 00:14:12,400 --> 00:14:16,320 Speaker 2: are backfiring and half of companies will start rehiring. The 291 00:14:16,480 --> 00:14:18,600 Speaker 2: article, I think Joe found it, or maybe I sent it to him, 292 00:14:18,600 --> 00:14:20,960 Speaker 2: I don't remember which, from the Wall Street Journal: CEOs say 293 00:14:21,000 --> 00:14:24,040 Speaker 2: AI is making work more efficient, employees tell a different story. 294 00:14:24,720 --> 00:14:26,840 Speaker 2: And then, you know, they've got some results in here 295 00:14:26,840 --> 00:14:31,120 Speaker 2: from a study that was done. So, you know, I 296 00:14:31,160 --> 00:14:34,880 Speaker 2: think the bottom line is, if you understand, or your 297 00:14:35,000 --> 00:14:39,320 Speaker 2: organization understands, how to use these tools, which ones are 298 00:14:39,320 --> 00:14:42,440 Speaker 2: good at which things, put some guardrails around them. Again, 299 00:14:42,560 --> 00:14:46,280 Speaker 2: think about things like privacy and security. Are employees putting 300 00:14:46,400 --> 00:14:51,880 Speaker 2: in sensitive data, whether it's trade secrets or client information that 301 00:14:51,920 --> 00:14:55,920 Speaker 2: would be covered under some sort of compliance or contractual 302 00:14:56,000 --> 00:14:59,000 Speaker 2: arrangement with a customer or partner or whatever? If you're 303 00:14:59,040 --> 00:15:01,120 Speaker 2: just dumping these into the free online tools, you've 304 00:15:01,240 --> 00:15:04,200 Speaker 2: got a real problem. But there are ways to use 305 00:15:04,280 --> 00:15:07,640 Speaker 2: private versions of these things, offline versions of these things. 306 00:15:08,440 --> 00:15:11,080 Speaker 2: You know, there's this whole agentic AI, where they can 307 00:15:11,400 --> 00:15:14,080 Speaker 2: take some action on their own. That's caused a bunch 308 00:15:14,080 --> 00:15:15,960 Speaker 2: of controversy and a bunch of concerns too. 309 00:15:17,440 --> 00:15:19,240 Speaker 3: These tools can be used in a way 310 00:15:19,080 --> 00:15:23,720 Speaker 2: where you can get significant productivity boosts in some ways, 311 00:15:23,760 --> 00:15:27,400 Speaker 2: in certain cases. So again, it's hard to know what 312 00:15:27,480 --> 00:15:29,880 Speaker 2: to think about this stuff at this point, because you 313 00:15:29,920 --> 00:15:33,280 Speaker 2: have all these divergent viewpoints as well of people who 314 00:15:33,320 --> 00:15:35,600 Speaker 2: know way more about this than me, people who've worked 315 00:15:35,640 --> 00:15:38,200 Speaker 2: in the field building this stuff for years.
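On the guardrails point Dave made a moment ago, here is one way to picture it: a minimal, hypothetical sketch in Python of a check that screens a prompt for obviously sensitive strings before it is ever sent to an outside AI service. The patterns and the send_to_external_ai_tool function are invented placeholders, not any particular product; a real deployment would rely on a proper data loss prevention tool and policies tuned to the organization.

import re

# Deliberately simple, illustrative screens for sensitive-looking data.
SENSITIVE_PATTERNS = {
    "Social Security number": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit card number": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "email address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "long token or key": re.compile(r"\b[A-Za-z0-9_\-]{32,}\b"),
}

def check_prompt(prompt: str) -> list[str]:
    """Return the categories of sensitive-looking data found in the prompt."""
    return [label for label, pattern in SENSITIVE_PATTERNS.items() if pattern.search(prompt)]

def send_to_external_ai_tool(prompt: str) -> None:
    # Placeholder for whatever AI service the organization has actually approved.
    print("Prompt sent to the approved AI tool.")

def send_if_clean(prompt: str) -> None:
    findings = check_prompt(prompt)
    if findings:
        # Block the request rather than shipping company data to an outside service.
        print(f"Blocked: prompt appears to contain {', '.join(findings)}.")
        return
    send_to_external_ai_tool(prompt)

if __name__ == "__main__":
    send_if_clean("Summarize the notes from this morning's planning meeting.")
    send_if_clean("Customer SSN 123-45-6789 is disputing invoice 4471.")

The design choice this illustrates is simply that the screening happens on your side, before any data leaves the building, which is the same idea behind running private or offline versions of these tools.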
You know, 316 00:15:38,400 --> 00:15:41,960 Speaker 2: some are saying the large language model based AI tools 317 00:15:41,960 --> 00:15:44,560 Speaker 2: we see now, the ChatGPTs of the world and so forth, 318 00:15:45,160 --> 00:15:47,880 Speaker 2: have more or less peaked. They can't get better, 319 00:15:48,080 --> 00:15:50,160 Speaker 2: really, because of the way they work. They'll never get 320 00:15:50,160 --> 00:15:54,400 Speaker 2: to sentient artificial intelligence. And there are others that say, 321 00:15:54,440 --> 00:15:58,160 Speaker 2: you know, that's going to happen tomorrow. So it's really 322 00:15:58,240 --> 00:15:59,880 Speaker 2: hard to know what to think, Brian. All I know 323 00:16:00,280 --> 00:16:02,160 Speaker 2: is when you see, you know, many of the studies 324 00:16:02,200 --> 00:16:04,720 Speaker 2: that have come out, and you see these stories where 325 00:16:04,760 --> 00:16:07,640 Speaker 2: now companies are starting to rethink, hey, we've gotten rid 326 00:16:07,680 --> 00:16:09,920 Speaker 2: of a bunch of people because we're going to use 327 00:16:09,960 --> 00:16:10,600 Speaker 2: these tools. 328 00:16:11,080 --> 00:16:13,280 Speaker 3: And then perhaps they don't work the way we thought 329 00:16:13,320 --> 00:16:14,000 Speaker 3: they would. 330 00:16:15,280 --> 00:16:17,640 Speaker 2: You know, I think the bottom line, in my opinion, 331 00:16:17,680 --> 00:16:19,680 Speaker 2: based on what I see out there with 332 00:16:19,840 --> 00:16:25,119 Speaker 2: our customers and what people are doing: again, as an organization, 333 00:16:25,760 --> 00:16:28,520 Speaker 2: if you work with people that actually know how to 334 00:16:28,640 --> 00:16:32,840 Speaker 2: use these things in a secure, responsible, productive way, you 335 00:16:32,920 --> 00:16:36,520 Speaker 2: can, when you arm your people with these tools, get 336 00:16:36,520 --> 00:16:41,560 Speaker 2: some significant productivity increases. But if you just say, oh, 337 00:16:41,800 --> 00:16:45,640 Speaker 2: here's an account, go use this, you not only are 338 00:16:45,680 --> 00:16:48,160 Speaker 2: potentially not going to get a lot of value out 339 00:16:48,200 --> 00:16:50,880 Speaker 2: of it, per these studies, but you also are setting 340 00:16:50,920 --> 00:16:54,920 Speaker 2: yourself up, again, for privacy and compliance issues if people 341 00:16:54,960 --> 00:16:59,080 Speaker 2: don't understand how these things work, where their data might be, 342 00:16:59,240 --> 00:17:01,200 Speaker 2: where the data they input might be going, and 343 00:17:01,320 --> 00:17:04,480 Speaker 2: ultimately what the consequences might be from, again, a legal 344 00:17:04,480 --> 00:17:08,880 Speaker 2: or regulatory perspective. So it's really interesting to see 345 00:17:08,880 --> 00:17:11,720 Speaker 2: all the hype around this, but there are ways to 346 00:17:11,800 --> 00:17:14,760 Speaker 2: use this stuff that will make you more efficient and 347 00:17:14,800 --> 00:17:15,440 Speaker 2: more productive. 348 00:17:15,800 --> 00:17:18,120 Speaker 1: Well, possibly anyway. You just got to know what you're doing, 349 00:17:18,119 --> 00:17:20,000 Speaker 1: and therein lies the challenge: finding people who know 350 00:17:20,040 --> 00:17:21,399 Speaker 1: what they're doing and the limitations of 351 00:17:21,440 --> 00:17:22,240 Speaker 3: AI. That's what we 352 00:17:22,200 --> 00:17:25,160 Speaker 1: have you for.
Intrust IT dot com's where you find 353 00:17:25,240 --> 00:17:27,440 Speaker 1: Dave and the crew for your computer needs and business needs. 354 00:17:27,560 --> 00:17:29,600 Speaker 1: They can teach your business how to use AI. Apparently 355 00:17:29,680 --> 00:17:32,000 Speaker 1: that's what he just said. And find him online for 356 00:17:32,040 --> 00:17:34,440 Speaker 1: all the resource material he uses for this segment every week. 357 00:17:34,480 --> 00:17:37,680 Speaker 1: It's LinkedIn dot com, search Dave Hatter, and you will 358 00:17:37,680 --> 00:17:39,200 Speaker 1: find him, or just look for the mayor of Fort 359 00:17:39,320 --> 00:17:41,439 Speaker 1: Wright. Dave, thank you so much for coming on the 360 00:17:41,440 --> 00:17:42,440 Speaker 1: program every Friday. In sponsor