Speaker 1: Brought to you by Intrust IT. Every business has a computer, presumably, unless they're a Luddite business not making any money. Computers mean potential problems: you need proper setup, you need proper best practices, and you need to get out of problems when you run into them. That's what Intrust IT is all about, the best in the business according to the Business Courier. And the man behind Intrust IT, find him online at intrustit.com: Dave Hatter. Welcome, man. The Verizon outage impacted quite a few people, and it lasted a hell of a lot longer than people realize. And it was at that point they all realized how much we depend upon, and our lives revolve around, our cell phones. Welcome back, my friend. It's always great having you on.

Speaker 2: Brian, always good to be here. Yeah, how many times have you and I, in the last ten-plus years, talked about our ever more digital society, how we depend on this technology more and more all the time, and how fragile it is? That's the point I try to make all the time, and here's yet another example.
Speaker 2: You know, this is not unique to Verizon, and it's not the only time it's happened. But the bottom line is, in general, I don't think the average person really realizes how much network traffic goes through these cellular carriers' networks at this point. So it's not just that your cell phone doesn't work; other things may not work as well, especially if you could knock out one of these carriers' entire network. Because it looks like this Verizon thing did not affect all users, and it does not appear to be a cyber attack. From everything I've read, it looks like they had a problem with a server. Now again, this is still early, and sometimes when these incidents happen, as time wears on, you get more detail and you find out whatever was originally reported wasn't right. I'm not saying that's necessarily because they're lying about it. I'm saying that, you know, you have to do some forensics to get to the bottom of things to really understand what went on. But apparently, for roughly ten hours, a bunch of Verizon customers had issues with their phones. I'm a Verizon customer.
Speaker 2: It did not impact me, for example. But, you know, kind of going back to the second part of my point: it's not just cell phone traffic necessarily going through this stuff. You see the sheer number of people that are potentially suddenly unable to do the things they need to do in their daily lives because they rely on these phones, right? Whether it's making a doctor's appointment, whether it's trying to get an Uber, or working as an Uber driver, whether it's, you know, trying...

Speaker 1: ...to do two-factor authentication. You know, on the days when I forget my phone at home, I come in and I can't get into my email or the iHeart system, because, of course, for good reason, I use two-factor authentication. That's an important safety tool you've talked about time and time again. Sorry, sucks to be you: you're not going to get your six-digit message that allows you access into your stuff.
Speaker 2: Yeah, that's another interesting angle, because even if your computer is working, because it's connected to a different network, right, it's not using Verizon's cellular network, or AT&T's, or whomever's (again, we're specifically talking about Verizon in this case), yeah, you might not be able to get your MFA code. Now, in some cases, Brian, if you have a cell phone, I would just remind folks: all new ones that I'm aware of support Wi-Fi. If your cellular carrier is down, you could still theoretically use some of the phone's functionality. You might not be able to make phone calls, for example, and texts may or may not work, but you might be able to at least still do certain things because you're connected to the Wi-Fi. But the bigger point in my mind: now, Verizon has apologized for this, and they said they're going to issue credits and that sort of thing, which I think is good. And I think even more important, from all initial reporting, this does not appear to be a cyber attack of any kind.
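A side note on the MFA point above: a six-digit code delivered by text message rides on the carrier's network, but codes from an authenticator app (TOTP, defined in RFC 6238) are computed entirely on the device from a shared secret plus the clock, which is why app-based MFA can keep working over Wi-Fi during a carrier outage. A minimal sketch of that standard computation (not anything specific to the show or any particular app):

```python
# Minimal TOTP sketch (RFC 6238): an authenticator code is derived offline
# from a shared secret plus the current time, so it needs no SMS and no
# cellular network at all.
import hashlib
import hmac
import struct
import time


def totp(secret: bytes, at=None, step=30, digits=6) -> str:
    """Compute a TOTP code for `secret` at Unix time `at` (default: now)."""
    counter = int((time.time() if at is None else at) // step)
    msg = struct.pack(">Q", counter)                # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10**digits).zfill(digits)


# Standard RFC 6238 test vector: ASCII secret "12345678901234567890",
# time = 59 seconds, 8 digits -> "94287082"
print(totp(b"12345678901234567890", at=59, digits=8))
```

Both sides (the app and the website) run this same math against the same secret, so the code can be verified without any message being sent to the phone at all.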
Speaker 2: You know, I've sent you all kinds of articles, we've talked about it on here, and I've shared many times in my social media stories where the FBI, DHS, etc. are warning about Chinese Communist Party infiltration of the telecom networks, you know, backdoors and that stuff. There's the possibility that this stuff could be knocked out on purpose. Again, this doesn't seem to be anything like that, I'm happy to say. But the bigger point in my mind, and I just keep telling people this all the time: we must, we must, we must get serious about protecting this infrastructure. You know, thirty years ago, I don't know, maybe it's been twenty-five years, whenever I first got a cell phone, it was a convenience for me. It was a novelty for me. It was not nearly as important to me then as it is now, and...

Speaker 1: ...it weighed three and a half pounds.

Speaker 2: Oh yeah, yeah, a big old brick, right? But I mean, if it stopped working for a couple of hours, it wasn't that big of a deal. Landlines were everywhere, pay phones were everywhere. Most people didn't have a cell phone.
Speaker 2: They weren't trying to call my cell phone. It was mostly, you know, a business-related thing, and a convenience, and a cool thing. Well, now I don't even have a landline anymore, Brian.

Speaker 1: I mentioned that earlier.

Speaker 2: Yeah, yeah, many people don't. So again, now if my cell phone doesn't work, it's much more problematic for me. And I don't live out of my phone like my kids do, or like other people I know do. You know, I would still get on the computer. Now, you mentioned MFA; that may or may not work through another connection. But the bigger point is, we have got to, at every level of our society, realize how all this digital technology is impacting us, how fragile it is, and, you know, we've got to get this stuff fixed before some catastrophe happens. And my last point, because I know we'll be out of time: as you and I have discussed many times, one of the many reasons why I'm completely and totally against any sort of cashless society is, imagine all three carriers go down.
Speaker 2: Imagine the telecom system is down entirely. How do you bank? How do you get money? How do you pay your bills?

Speaker 1: At least keep a stash of cash on hand in the event of those contingencies; have something like a fire plan in the event cell phone connectivity goes away for any period of time. Pause; we'll bring Dave Hatter back to talk about his op-ed piece on the benefits of the Kentucky Consumer Data Protection Act. Dave Hatter, opinion writer, still keeping his intrustit.com hat on, is talking about the benefits of the Kentucky Consumer Data Protection Act. We can all read the article in the Northern Kentucky Tribune, Dave, but tell us about this. What is the Kentucky Consumer Data Protection Act?

Speaker 2: Yeah, Brian, for the first time in probably my entire life, Kentucky has actually beaten Ohio to something useful in the tech space, I'm happy to say. You know, Ohio honestly is generally way ahead of Kentucky in anything related to tech.
Speaker 2: I think we've talked on your show about Ohio House Bill 96, which requires local governments in Ohio to take steps toward protecting their assets in the cybersecurity space, and some other stuff. So Ohio generally, in all honesty, really has their act together, and I'm constantly trying to convince my friends here in Kentucky to follow suit. But back in twenty twenty-four, Kentucky passed the Kentucky Consumer Data Protection Act, and it's not perfect, but it's a great first step toward protecting you as a consumer. And one of the reasons why I'm glad you wanted to talk about this, despite the fact that obviously you have a large audience in Kentucky and I live in Kentucky, is that I'm hoping Ohio will follow suit. So the International Association of Privacy Professionals has a website, and I've linked to it from this article, that shows you which states have a consumer data protection law. Currently there are nineteen. Kentucky is one of the most recent. Indiana has one. Many states surrounding Ohio either have one in place or are working on one.
Speaker 2: From what I can tell, it looks like Ohio's attempt at this sort of stalled out, and I can't find any evidence that it's been revived. So all my Ohio friends, you might want to reach out to your legislators; you can send them this article, or there's plenty of research on this. This went into law January first of this year, and it has two prongs. I'm going to write another two articles on this: one for Kentucky businesses on what they need to know, because I touch on it briefly in here, but here I'm really trying to focus on why you should care about this as a consumer; and then another article about how, as a consumer in Kentucky, to use this thing. But basically, what it does is it says if you're a Kentucky business and you collect consumer data, once you hit certain thresholds, then there are penalties built into the bill if you don't protect people's data. So there's a cybersecurity angle to this, right? It puts teeth in the law, so that businesses now are incentivized to want to protect your data, to stop all these data breaches that we see.
Speaker 2: You know, I bet you, Brian, everyone in your audience has gotten at least one, probably many, notices from some company somewhere, probably some they haven't even done business with, that their data has been breached. Right? So, if you're doing business in Kentucky and you have Kentucky consumers' data (there are exceptions; again, there are thresholds), now as a consumer you're essentially forcing businesses to take this seriously. The big knock on this is that it's not nearly as stringent as California's law. The California Consumer Privacy Act is by far the most stringent law in the United States. It's kind of loosely based on the GDPR out of the EU; it's much more rigorous, with much bigger penalties. But again, in my mind, this is a great first step. It's something that got done. The main knock on it is that you have no private right of action if you believe you are a victim, if one of these things has, you know, been, uh... I don't know, I can't talk this morning, Brian.
Speaker 2: If a company runs afoul of one of these things and you believe you're a victim, you can't sue them outright. You have to go through the Kentucky Attorney General's office. A lot of experts, you know, have knocked it for that. But what it does do is a whole lot of things. For example, you now have a right to go to a company that has your data, data brokers, for example (now, this is not easy; that's why I'm going to do another article to explain how to work through this), and say: I want to see what data you have about me. I want to know, is this data accurate? I want you to correct this data, and in some cases potentially delete the data. I want you to give me my data so I can take it somewhere else. So without a law like this, often you have no idea who has your data. You have no way to know if that data is correct. You and I have talked about this many times; it's one of the reasons why I don't like the Internet of Things and apps. All this data is being collected. Is it even right?

Speaker 1: You don't know. You have no access to it.
Speaker 1: This is basically creating a legal Freedom of Information Act for people, to allow them to ask for and demand the information that is being collected about them. I love that. And once you're armed with that information, you can determine the accuracy of it. And then maybe, if Kentucky's law becomes the norm, or California's law becomes the norm for America, you'll have some right to legal recourse. I think it's a step in the right direction. I hate the idea that they're selling our information to third parties all the time, and I guess the FTC recognized that too. They have banned GM from disclosing all the information it's been collecting about our driving habits for the next five years, and I presume that's going to be extended into a broader application.

Speaker 2: Yes, because keep in mind, right, there is no national privacy law; you've got nineteen states that have built this patchwork. Now again, is this law perfect? Because I've already gotten feedback that this is no good. It's way better than what we had, which was nothing.
Speaker 2: You don't have to be a Kentucky business: if you're a business doing business in Kentucky right now, you're subject to this law.

Speaker 1: That makes it stretch all around the nation, like California CAFE standards forcing everybody to fall in line with California. Pause; we'll bring Dave Hatter back, and he's going to tell us the worst products of the Consumer Electronics Show, probably an endless list. One more segment with Dave here on fifty-five KRC, the talk station. intrustit.com to find Dave Hatter and the crew for all your business computer needs, and for sponsoring the Tech Friday segment; thank God for them. Doing it all right now, the fun and games begins. Can you narrow down a list of the worst products of the CES show that you can fit in the remaining four or five minutes we've got?

Speaker 2: Well, yeah, Brian, that's a tall charge there. You know, it's funny: so this is an AP article, "Worst in show CES products include AI refrigerators, AI companions, and AI doorbells."
Speaker 2: I'll post this and all the other links to all the other stuff we talked about this morning, so people can go see these things for themselves. But I love it: the very first picture they start out with in the article, yes, is Internet of Things coffee makers that conveniently not only have a screen on them, but the Wi-Fi symbol, telling you that you can plug this thing in and, you know, wreck your whole life, your business, et cetera, because probably in two years or less there'll be no software updates coming for this thing. That's how it works. And as you and I have discussed so many times before, you know, Internet of Things, aka "smart" devices, most of which are really stupid, are dumpster fires for your privacy and security. The show was full of this stuff: robots, AI, AI refrigerators. We've talked about the Samsung refrigerator before, with a screen that now serves up ads.

Speaker 1: Let me just add, let me just... sure, get advertising pumped into your home via your refrigerator.
Speaker 2: You paid for a refrigerator that is going to send you ads. I mean, think of the drivel that's coming out of your average gas pump now, and now you can bring that into your kitchen. Are we not entertained, Brian? So they say here, now again, this is a group of privacy experts from folks like the Electronic Frontier Foundation and other groups. They go to this thing and then, you know, they collectively vote on these things. Right, so it's tinfoil-hat people like me. But shouting at a bespoke AI fridge that also hawks grocery products: this is from the article. Samsung's Bespoke AI Family Hub refrigerator received the overall Worst in Show from the group of consumer privacy advocates. And they basically go on to say, for example, they talk about trying to talk to the thing and it was too loud. But that's not the thing that gets me. It's the idea.
Speaker 2: Now again, I'm reading from the article, right? That was just part of the complications and reliability concerns Samsung added to an appliance that's supposed to have an important job: keeping food cold. Right? They talk about how it could try to order food for you. Well, if it can order food for you, it means it knows what you're eating, it knows how much you're eating, and it's probably selling that to God knows who, like your insurance company. You know, I've brought this up many times talking to you, Brian, and I know in the past people were like, well, that's crazy. But think about this: if this can order your food, then it could know that you eat, let's say, eight pounds...

Speaker 1: ...of bacon a week, how many twelve-packs you've got in the fridge. That's exactly right, on and on and on. It is scary.
Speaker 2: And while it might be convenient that it could order that for you (unless, of course, the Verizon network is down), it could order that for you, right, and then sell that to your insurance company, who might decide that your premium should go up, or maybe that they should cancel you, because you're eating in a way that is not conducive to their insurance premiums and their payouts.

Speaker 1: Just like Ford collecting your driving data, which they sell to insurance companies, who will cancel you or otherwise jack your premium up. This is all a vicious circle leading to a bad place for the consumer.

Speaker 2: Yeah. Again, I am not against the Internet of Things in principle. I'm against the Internet of Things because, as I've said so many times to anyone that will listen, the incentives are wrong for you as a consumer. That's one of the reasons why the Kentucky Consumer Data Protection Act is helpful to you as a consumer. These things are cheap garbage, usually coming from China, possibly full of backdoors.
332 00:15:54,080 --> 00:15:56,200 Speaker 3: They're designed for speed to market, ease of use, and 333 00:15:56,280 --> 00:15:59,680 Speaker 3: market share. They don't get software updates after a certain 334 00:15:59,680 --> 00:16:01,880 Speaker 3: period of time. People do not know how to configure 335 00:16:01,920 --> 00:16:05,360 Speaker 3: them correctly, and they're sucking up all of your data 336 00:16:05,720 --> 00:16:08,640 Speaker 3: and selling it to God knows who, leaking it, breaching it, 337 00:16:08,720 --> 00:16:10,640 Speaker 3: et cetera. Right, so I know we're about out 338 00:16:10,640 --> 00:16:14,360 Speaker 3: of time. Amazon's doorbells once again ring privacy alarms. Right, 339 00:16:14,720 --> 00:16:18,360 Speaker 3: so again, reading from the article: that includes facial recognition, a new 340 00:16:18,400 --> 00:16:20,800 Speaker 3: set of Ring features, and includes mobile surveillance towers that 341 00:16:20,840 --> 00:16:23,120 Speaker 3: can be deployed at parking lots and other places, and 342 00:16:23,200 --> 00:16:25,240 Speaker 3: includes an app store that's going to let people develop 343 00:16:25,240 --> 00:16:27,360 Speaker 3: even sketchier apps for the doorbell than the ones 344 00:16:27,360 --> 00:16:31,760 Speaker 3: Amazon provides. Now again, not my commentary. This is coming 345 00:16:31,800 --> 00:16:35,480 Speaker 3: from Cindy Cohn, executive director of the Electronic Frontier Foundation, 346 00:16:35,840 --> 00:16:38,520 Speaker 3: someone that knows about this stuff. So this article covers 347 00:16:38,520 --> 00:16:39,360 Speaker 3: a bunch of other things. 348 00:16:39,400 --> 00:16:40,000 Speaker 2: I'll post it. 349 00:16:40,600 --> 00:16:44,760 Speaker 3: Folks, stop buying this stuff.
Stop buying this stuff, 350 00:16:45,080 --> 00:16:48,440 Speaker 3: at least until there are laws that force these companies 351 00:16:48,440 --> 00:16:51,440 Speaker 3: to get serious about protecting your privacy and security. Otherwise 352 00:16:51,880 --> 00:16:54,720 Speaker 3: you are setting yourself up for problems. 353 00:16:55,760 --> 00:16:59,920 Speaker 1: Underscore, amen, bold, and underline on that. Get to LinkedIn 354 00:17:00,080 --> 00:17:03,640 Speaker 1: at LinkedIn dot com. All 355 00:17:03,680 --> 00:17:05,240 Speaker 1: you need to do is type in Dave Hatter. You 356 00:17:05,280 --> 00:17:07,280 Speaker 1: will find Dave Hatter and the stories that he refers 357 00:17:07,320 --> 00:17:10,120 Speaker 1: to here on the fifty five KRC Morning Show. Very 358 00:17:10,160 --> 00:17:13,480 Speaker 1: interesting article, and I just shake my head 359 00:17:13,480 --> 00:17:15,800 Speaker 1: and wonder why anybody thinks they need any of this 360 00:17:15,840 --> 00:17:18,440 Speaker 1: stuff, Dave. But then again, I've been thoroughly convinced by you, 361 00:17:18,600 --> 00:17:22,000 Speaker 1: and I know my life has been fine, swimmingly decent, 362 00:17:22,200 --> 00:17:24,880 Speaker 1: uh, without this stuff my entire life. I'm not going 363 00:17:24,920 --> 00:17:28,120 Speaker 1: to add it to my list of things that I want, 364 00:17:28,280 --> 00:17:30,120 Speaker 1: and I will not buy it either. That's why 365 00:17:30,119 --> 00:17:32,399 Speaker 1: we've got you, Dave, to get more people on the same page. 366 00:17:32,440 --> 00:17:34,679 Speaker 1: We'll do this again next Friday. Thanks again to your company, 367 00:17:34,720 --> 00:17:37,280 Speaker 1: Intrust IT, online at intrust it dot com, for 368 00:17:37,359 --> 00:17:40,240 Speaker 1: sponsoring this valuable segment. Appreciate what you do. Have a 369 00:17:40,280 --> 00:17:41,600 Speaker 1: wonderful weekend, my friend.
370 00:17:42,400 --> 00:17:44,560 Speaker 3: Always my pleasure, Brian, Thanks to you and Joe and 371 00:17:44,560 --> 00:17:46,120 Speaker 3: I look forward to chatting with you next week. 372 00:17:46,200 --> 00:17:47,359 Speaker 1: Thanks brother from the Hudson