Speaker 1: The talk station. Six oh one. If you've got KRC, the talk station, on a Friday, this time of the week is appointment listening: Tech Friday with Dave Hatter, brought to you by Intrust IT. That's Dave's company, which the Business Courier says is the best in the greater Cincinnati area and the Tri-State for helping businesses manage and deal with their computer needs. So businesses rely on Intrust IT; you'd be in great hands with Dave Hatter. Dave Hatter, welcome back to the morning show. I understand you're not feeling real good today, Dave.

Speaker 2: A little bit of a cold. I'm all right. I just sound a little crazy this morning.

Speaker 1: All right. Well, we'll wish you the best, and then we'll find out in the next segment whether you'd trust an AI bot to figure out what the hell's wrong with you. But before that, I was reminded, Dave, on a comical level, and I asked Joe Strecker if he knew what the joke was about Ruru. Of course my dad had told him that joke, and Dad told me that joke too. Remember death by Ruru? Also death by Unga Bunga.
We have not Ruru but "review." So you can either acknowledge your understanding of that joke, or we can just dive into the subject matter, Dave Hatter.

Speaker 2: Yeah, that is funny. And there's also a takeoff on that, if you ever watched Futurama: death by Snu Snu.

Speaker 1: There you go. Okay, we're all on the same page.

Speaker 2: Yeah. So this is an interesting story. Headline: Wi-Fi signals reveal human activities through walls by mapping body key points. And this "review" you're talking about is an open-source AI system, basically just using ordinary Wi-Fi. Keep in mind, your Wi-Fi router is throwing out a wireless signal that allows the devices within range to connect to it: your computer, your phone, your smart TV, whatever other devices you have that can pick up a Wi-Fi signal. And I'm sure many of your listeners out there have fought in the past with, well, the signal's not strong enough in this room, or sometimes it drops off at this time of day, or whatever.
But my point is, you're throwing out a radio frequency, right, typically in a three-hundred-and-sixty-degree arc or circle, that all these devices are trying to connect to. And this is not new, exactly; researchers have shown in the past that you could do this. But now, with this new system, it basically shows you how, as the radio signals are either bouncing off of things or absorbed by things, think of the way radar works, basically, without any kind of cameras or anything, someone that has the right equipment and the right system can pick up that Wi-Fi and see where people are in a room, through the walls, with no windows, no cameras, or anything like that. It's pretty wild, really. It says in the article: turning ordinary Wi-Fi infrastructure into a through-the-wall human sensing platform, detecting body poses, vital signs, and movement patterns without a single camera, raising urgent security and surveillance concerns. Yeah, I'd say. Yeah. And they say:
Researchers and developers have long theorized that ambient radio signals could be weaponized for passive surveillance. And this "review," built by developer Reuven Cohen and available on GitHub, implements WiFi-DensePose, a sensing technique originally pioneered by Carnegie Mellon University, as a practical, low-cost edge system that reconstructs full-body human poses through walls using only standard Wi-Fi signals. Wow. So yeah, it just goes to show you again: all of this technology can be used in ways that are often unintended and potentially negative.

Speaker 1: This sounds like something our American military would have developed in order to do, like, breaches of, you know, rooms, so they know exactly where people are before they kick a door in.

Speaker 2: Well, maybe they did, Brian, and you just haven't heard about their version of it. You know, again, this idea of using Wi-Fi signals to be able to, like, map an area and so forth isn't new, but this "review" application is fairly new.
It goes on to say: when a human body moves within a wireless environment, it distorts signal paths across dozens of OFDM subcarriers. This software is a signal-processing pipeline: it captures these disturbances, extracts the amplitude and phase variations, feeds them through a modified model, and essentially will show you, again, where a person is, how they're positioned, and so forth, simply because their body is blocking or interfering with the signals.

Speaker 1: Frightening. It doesn't surprise me that the technology exists that can do that, but that the technology exists that can do that is just, in and of itself, kind of frightening. So it's the death of privacy by "review."

Speaker 2: Certainly one more step towards it. And it goes on to say: unlike cameras, which are regulated under GDPR and CCPA, you know, privacy laws, which again vary all around the world, and there isn't a single privacy law in the US; unlike cameras, which are regulated under GDPR, CCPA, and physical installation laws, passive Wi-Fi sensing is invisible and requires no physical access to the target environment.
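For listeners who want a concrete picture of the amplitude-and-phase extraction being described, here is a minimal sketch. It is an illustrative toy with simulated data, not the actual WiFi-DensePose code; the array shapes and the motion metric are invented for the example:

```python
import numpy as np

# Illustrative sketch only, not the real WiFi-DensePose pipeline.
# Wi-Fi channel state information (CSI) is a grid of complex numbers:
# one per time sample per OFDM subcarrier. A body moving through the
# signal paths changes both the amplitude and the phase of those numbers.

rng = np.random.default_rng(42)

# Simulated CSI: 100 time samples across 30 subcarriers (shapes invented).
csi = rng.normal(size=(100, 30)) + 1j * rng.normal(size=(100, 30))

amplitude = np.abs(csi)                   # received signal strength
phase = np.unwrap(np.angle(csi), axis=0)  # phase, unwrapped over time

# A crude "disturbance" feature: how much the amplitude on each
# subcarrier varies over time. Motion in the room raises this.
per_subcarrier_variance = amplitude.var(axis=0)
print(per_subcarrier_variance.shape)  # (30,)
```

The real system feeds features like these into a trained pose-estimation model; the point of the sketch is simply that ordinary channel measurements already carry motion information.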
Legal analysis has noted, quote, it's quite difficult to ask pedestrians for permission in advance, unquote, and consent frameworks collapse entirely when the sensing is passive.

Speaker 1: Yeah. Well, let's face it: even if there's a law on the books outright prohibiting it, and subjecting you to criminal penalty if you do it, nefarious actors don't follow the law, Dave. So I don't think that, for want of a regulation, this is going to go anywhere. All right, well, keep that in mind: if you have Wi-Fi, someone could hack in and use this, is basically what it boils down to.

Speaker 2: Yes, that is what it boils down to.

Speaker 1: Yeah, right. Get those Internet of Things devices to make it really easy for people to break into your Wi-Fi system.

Speaker 2: More on that, dude.

Speaker 1: Maybe AI can figure out what's wrong with Dave this morning. We're going to find that out coming up next. First: fifty-five KRC, the talk station.
This is my theme song. Sunday Money and Tuesday Joe, and of course Dave Hatter, feeling slightly under the weather this morning. IntrustIT.com is sponsoring the Tech Friday segment here with Dave Hatter; of course, they're the ones to rely on for all your business computer needs. Feeling a little under the weather, are you? Have you consulted an AI bot, Dave Hatter?

Speaker 2: Now, Brian, I'm gonna have you guess. Do you think that I would, A, put any sensitive information into any sort of AI bot? No. And then, B, take its advice as, uh, as gospel? Because, you know, we have an ever-increasing number of cases showing that these things will literally just make things up and then try to convince you they're correct.

Speaker 1: Or tell you to kill yourself because you have a head cold.

Speaker 2: Yeah. Wow. Yeah, you might not...

Speaker 1: You're dying, you're dying. You may as well just fast-forward and get it over with. It's the right thing to do.

Speaker 2: Yeah, some sort of nasal cancer; you'd better go off yourself. No, folks have to understand.
And this is something you and I have talked about off and on, not on this specific topic, but the idea that the more information you put out there about yourself, right, the more that information will get correlated, sold, et cetera. And is it possible that at some point you won't get a job, or your insurance premium will go up, or you won't be able to rent an apartment, because there's data out there about you which might be incorrect? I mean, for example, you may enter your symptoms about something, and that data gets sold, and then, you know, some AI or some person draws a wrong conclusion, or it gets mixed in with other data and they draw a wrong conclusion. And some AI algorithm somewhere, that someone has sold to some company to help them supposedly determine whether you're a good renter, or a good employee, or a good insurance risk, or whatever it is, is making wrong decisions based on that information, which is not correct to begin with. Yeah, that's a real threat. And I'll just tell you right now, you know, it's one thing
if you have a standalone AI setup, like Ollama or something. You know, I still wouldn't really put my information in there. I would not take its advice. Might I ask it a question, see what it says, and then compare that to other resources? Yes, maybe. I probably wouldn't, just because that's the way I am. However, if you're using these free online tools, you need to understand, first off, unless you've read and understood the privacy policy and/or the terms of service, which is probably going to be, at minimum, an eighty-page pile of confusing mumbo jumbo that you can't understand, you don't know where your data is going to go, you don't know who they might sell it to, and you don't know how it could be used in ways that you would not appreciate or approve of if you understood. So yeah, I'm telling you all straight up now: there is absolutely no chance I would use any sort of free online AI tool, ChatGPT, Perplexity, Claude, Grok, fill in the blank.
I would not use any of them as a platform where I'm going to enter sensitive information about myself or my family, especially health-related information, knowing that there's no control over that once I hit the button.

Speaker 1: Okay, and I get all that. And, you know, you often find out, after crimes have been committed, they go back and look at your Internet searches and see if you've looked up, like, how to dispose of a body, or whatever is obviously incriminating relative to the crime scene. But, you know, I haven't used the AI platforms like ChatGPT, but I have looked up certain diseases. Like, a friend is diagnosed with something, and I'm like, what the hell is that? I'll go in and look it up. That doesn't mean it applies to me; I'm just curious about it. The Mayo Clinic website, for example; it's got all kinds of information. So it could be presumed, by the folks out there collecting information about my IP address, that I'm the one struggling with that, and I'm the one suffering from whatever disease has been looked up.
Speaker 2: Yes. How do they know that you're not the person that has the potential disease that you're looking at?

Speaker 1: How do they know that I don't have it and I'm just looking it up out of curiosity?

Speaker 2: Well, they don't. But my point is, once that data gets sold, somewhere down the line, can someone infer that maybe Brian Thomas has fill-in-the-blank because he looked it up twice? Wow. You know? So...

Speaker 1: Yeah, I think I'd like a hearing on that, a little due process, before people reach erroneous conclusions, perhaps without my knowledge.

Speaker 2: That is...

Speaker 1: That's scary, Dave. It really is.

Speaker 2: Yeah. Well, I'm just telling you that, at the end of the day, when you use anything that's online, you're being tracked constantly. And, you know, a lot of these medical sites are some of the worst offenders with trackers and fingerprinting and all that sort of thing. We've talked about that before, and people can go research that for themselves. But again, yes, you may have just some random curiosity.
You saw an ad on TV: what does this drug do? What is this disease? Right? Because they're constantly trying to sell you this stuff. But yes, understand: when you go look that up, it's traceable directly to you, and through things like browser fingerprinting, it's increasingly difficult to stop that. You know, if you have the right tools and you know what you're doing, and you use a VPN and all that, you can make it much more difficult to attribute that to you specifically.

Speaker 1: Okay, that's what my question was going to be. Using a VPN that masks my personal IP address, it then gives me one of the ones that's generically used by anybody else with that VPN. So you've got NordVPN; they think you're in Chicago, or Poughkeepsie, or, you know, Venezuela, or wherever the servers are. They can't trace that directly to you individually; it's the VPN that they're connecting it with.

Speaker 2: Well, they can't, yes; the endpoint is not going to be able to trace that back to you. They'll just see whatever VPN endpoint you came out of.
So, to your point, you know, I could sign into my Proton VPN today and come out of, let's say, Austria, or, you know, wherever I want, so it's going to appear that I'm there. But here's the thing that most people still don't really understand: your web browser, in conjunction with the operating system on your device, can be fingerprinted in a way that is, if not unique, very, very close to unique. And here's what I mean by that. The exact way your computer, your phone, whatever, is configured is unlikely to be exactly the same as mine, even if we both have Windows eleven, for example. And this information, not all of it, but much of it, is sent with every web request. And these apps that people are using can collect even more information than a web browser can. So even if you're using a VPN, I can potentially capture a unique fingerprint, or something that is very likely to be unique to you, right? Like, maybe you're one in one hundred people in the whole world.
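The fingerprinting idea being described can be sketched in a few lines. This is a toy illustration, not any real tracker's code; the attribute names and values are invented, and a real script would read them from the browser and operating system:

```python
import hashlib

# Toy illustration of browser fingerprinting. The attribute values are
# invented; a real tracker would collect many more from the browser/OS.
attributes = {
    "user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "screen": "2560x1440x24",
    "timezone": "America/New_York",
    "language": "en-US",
    "fonts": "Arial;Calibri;Segoe UI;Tahoma",
}

def fingerprint(attrs):
    # Join the attributes in a stable order and hash them. Any single
    # differing value produces a completely different fingerprint.
    blob = "|".join(f"{k}={v}" for k, v in sorted(attrs.items()))
    return hashlib.sha256(blob.encode()).hexdigest()

fp = fingerprint(attributes)
print(fp[:16])

# The same configuration always yields the same ID; change one
# attribute and the ID changes completely.
other = dict(attributes, timezone="Europe/Vienna")
print(fingerprint(other) != fp)  # True
```

The point it illustrates: a VPN changes your IP address but none of these configuration attributes, so the resulting hash, and therefore you, remain recognizable across sites.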
And even with things like blocking cookies, even with things like a VPN, it is possible to more or less uniquely identify someone. So I'm just warning folks out there: if you think that you can have anonymity simply because you're using a VPN, and you can go online and look up things that you don't want other people to know about, whether it's porn, or diseases, or whatever prurient interest you might have, it is possible that you will be uniquely identified. And again, you've got to understand, Brian, most of these companies that are providing all these so-called free services to you, free medical advice, whatever it is, right, they're making money off collecting your data. They're making money off serving up ads to you. They're potentially making money off selling your data so that other people can sell ads to you, or know, or think, that you have disease X, so they can now send you the latest medical information, the latest pharmaceutical data, or whatever it is. You know,
they are heavily incentivized to want to uniquely identify you and track you across the Internet, because there's power in that data, right? There's power in the profiling of you and me and Joe and everyone else out there.

Speaker 1: Yeah, yeah. To figure out how you...

Speaker 2: Think, what you do, what you're interested in, and sell you more stuff. That's the fundamental reason for it. Now there are these other downstream consequences.

Speaker 1: All right, fair enough. Yeah, yeah.

Speaker 2: Don't enter your health information in a chatbot. Don't do it.

Speaker 1: [Station break] Six forty-nine at fifty-five KRC, the talk station. OdorXit gets rid of the smells regularly surrounding you. Six fifty-two at fifty-five KRC, the talk station. Tech Friday with Dave Hatter, brought to you by Intrust IT. Okay, so they're deliberately degrading consoles through software upgrades. It's like the products these days have, like, an internal self-destruct mechanism. Nothing lasts as long as it used to these days. You've got to go out and buy a new one.
Is this done, I guess, the same thing, via software?

Speaker 2: Well, apparently that is the case. So Tom's Hardware has this article. Uh, am I allowed to say this word, Brian?

Speaker 1: No. Joe said... I don't even know what word you're talking about. Yes. Okay, so, "enswortification"? We can call it that, yes.

Speaker 2: Which, I love this word, by the way, because I think it is a perfect description. And Cory Doctorow wrote a whole book about this: the idea that these so-called smart products are increasingly making the world less good. Now, you know, I've been on this for a long time, Brian, really primarily because of the privacy and security aspects of these devices. But I'm just going to quote directly from this article. Okay, so the headline, again, I'll skip the word: Norwegian government watchdog calls out, quote, the "enswortification" of video games, connected devices, and others; names hardware deliberately degraded after purchase. And they say, and this is a direct quote from the article: well, companies can degrade the functionality of your car, or effectively destroy your connected washing machine, with software updates.
The report goes on to call out printer ink cartridges; smart home devices that lose features or require subscriptions post-purchase; and connected devices where functionality is gated or removed over time, such as Tesla's self-driving feature, which switched to a subscription-only service as of February fourteenth. The report also, yeah, the report also describes how freemium games use forced ad breaks and in-game virtual currencies to convert what were once single-purchase titles into recurring revenue streams. You know, the idea that I bought a new car, and if I want to use the heated seats, because it's all software-based, right, it's all software-based, they can turn off your seats until you want to pay the subscription fee, or any other feature. But the bigger concern I have about this, you know, and this goes back to lawsuits against John Deere, where they've said, oh, you can't fix your own equipment, because we own the intellectual property of the software and this sort of thing. And it's not unique to John Deere; I'm not trying to single them out.
I'm just saying that, in general, 328 00:16:54,840 --> 00:16:58,320 Speaker 2: there have been lawsuits against companies in the past because 329 00:16:58,400 --> 00:17:01,720 Speaker 2: of the right to repair. That's sort of the idea 330 00:17:01,800 --> 00:17:05,399 Speaker 2: that I bought something, it's mine now, and if I 331 00:17:05,520 --> 00:17:07,639 Speaker 2: want to repair it myself, I should be able to. 332 00:17:08,119 --> 00:17:10,879 Speaker 2: But as we get into this increasingly digital world of 333 00:17:11,040 --> 00:17:14,240 Speaker 2: Internet of Things, so-called smart devices, well, do you 334 00:17:14,359 --> 00:17:16,359 Speaker 2: really own it if it has software in it that 335 00:17:16,440 --> 00:17:18,760 Speaker 2: can control it, and, you know, one or more features 336 00:17:18,760 --> 00:17:20,840 Speaker 2: can be shut off? And let me give you an 337 00:17:20,880 --> 00:17:24,040 Speaker 2: example of why I despise this stuff, Brian, and I 338 00:17:24,119 --> 00:17:26,960 Speaker 2: keep trying to explain to people, I'm not inherently against it. 339 00:17:27,520 --> 00:17:29,440 Speaker 2: I'm against the fact that we are now in a 340 00:17:29,520 --> 00:17:32,920 Speaker 2: world where everything has software in it. People don't understand 341 00:17:33,000 --> 00:17:34,960 Speaker 2: the risks, they don't know how to configure it correctly, 342 00:17:35,040 --> 00:17:36,800 Speaker 2: they don't know how to update it, and in fact, 343 00:17:36,880 --> 00:17:39,080 Speaker 2: in many cases the companies that make it will not 344 00:17:39,200 --> 00:17:42,959 Speaker 2: provide updates after a certain period of time. They can 345 00:17:43,040 --> 00:17:46,120 Speaker 2: turn things off, and it can be hacked. Here's an example 346 00:17:46,240 --> 00:17:50,840 Speaker 2: headline: cyber attack leaves Maine drivers with breathalyzer test systems 347 00:17:50,960 --> 00:17:54,200 Speaker 2: unable to start vehicles.
A cyber attack shut down a 348 00:17:54,280 --> 00:17:57,720 Speaker 2: national breathalyzer test system found in vehicles of OUI offenders, 349 00:17:58,160 --> 00:18:02,600 Speaker 2: impacting thousands of drivers in forty five other states. Then 350 00:18:02,600 --> 00:18:04,840 Speaker 2: I'm reading from the article: once the device is installed, 351 00:18:04,960 --> 00:18:07,159 Speaker 2: drivers have to pass the breathalyzer test before they can 352 00:18:07,200 --> 00:18:09,760 Speaker 2: start their vehicle. It won't start if your blood 353 00:18:09,760 --> 00:18:12,000 Speaker 2: alcohol is point zero two or higher. But since the cyber 354 00:18:12,080 --> 00:18:15,720 Speaker 2: breach shut its entire system down on Saturday, anyone with 355 00:18:15,840 --> 00:18:19,080 Speaker 2: one of the devices cannot start their car. So you 356 00:18:19,160 --> 00:18:21,760 Speaker 2: may end up losing your job because your car won't 357 00:18:21,800 --> 00:18:25,480 Speaker 2: start due to some sort of Internet-connected device. Or 358 00:18:25,760 --> 00:18:28,399 Speaker 2: better yet, Brian, now that your car is a rolling computer, 359 00:18:28,720 --> 00:18:31,520 Speaker 2: what happens when an adversarial nation just turns your car 360 00:18:31,600 --> 00:18:35,720 Speaker 2: off for you, or when you violate some, you know, 361 00:18:35,920 --> 00:18:38,280 Speaker 2: theoretical law or whatever, and now you can't use your 362 00:18:38,359 --> 00:18:42,520 Speaker 2: car because the system has decided that you 363 00:18:42,640 --> 00:18:47,560 Speaker 2: can't use the car. This is where the word 364 00:18:47,680 --> 00:18:53,400 Speaker 2: comes in that we've been describing, the enshittification, right.
We're 365 00:18:53,440 --> 00:18:57,240 Speaker 2: headed into a crazy place if people do not get 366 00:18:57,280 --> 00:18:59,080 Speaker 2: wise to the fact that all of these Internet of 367 00:18:59,160 --> 00:19:03,600 Speaker 2: Things devices, where we're at today, are not good. They're hackable. 368 00:19:03,640 --> 00:19:05,960 Speaker 2: Here's yet another example. I keep trying to point these 369 00:19:06,000 --> 00:19:08,840 Speaker 2: out real quick, and yeah, the less of these we 370 00:19:08,920 --> 00:19:09,360 Speaker 2: have, the better. 371 00:19:09,880 --> 00:19:12,480 Speaker 1: The less of these we have, the better. Continuing theme with 372 00:19:12,560 --> 00:19:15,040 Speaker 1: Tech Friday's Dave Hatter, intrust It dot com. Get 373 00:19:15,080 --> 00:19:17,359 Speaker 1: in touch with Dave. You're in the best hands with 374 00:19:17,560 --> 00:19:20,000 Speaker 1: your business computer needs. Dave, thank you very much every week 375 00:19:20,040 --> 00:19:21,800 Speaker 1: for the segment. We'll talk again next Friday. I hope 376 00:19:21,800 --> 00:19:22,639 Speaker 1: you have a wonderful weekend. 377 00:19:22,680 --> 00:19:24,560 Speaker 2: Brother, always my pleasure. 378 00:19:24,600 --> 00:19:26,880 Speaker 1: Thanks. Six fifty six. Coming up in seven: fifty 379 00:19:26,880 --> 00:19:29,960 Speaker 1: five KRC, the Talk Station. Signal ninety nine, full hour