1 00:00:01,400 --> 00:00:04,000 Speaker 1: Fifty-five KRC, the talk station. It's Friday, and I love 2 00:00:04,040 --> 00:00:06,640 Speaker 1: this time of week: Tech Friday with Dave Hatter. Such 3 00:00:06,680 --> 00:00:09,560 Speaker 1: an important segment on my program. He's doing a public 4 00:00:09,600 --> 00:00:12,960 Speaker 1: service, and you know what, Intrust IT sponsors this segment, 5 00:00:13,320 --> 00:00:16,439 Speaker 1: providing all this wonderful information. And yeah, it's an ad campaign, 6 00:00:16,480 --> 00:00:19,080 Speaker 1: but it's working. They're the best business computer service, I say. Listen, 7 00:00:19,360 --> 00:00:22,200 Speaker 1: for you businesses out there: everybody's got computers. If you 8 00:00:22,239 --> 00:00:25,400 Speaker 1: need computer assistance, best practices, getting you out of a jam, 9 00:00:25,440 --> 00:00:29,360 Speaker 1: solving your problems: intrust IT dot com. They're the best 10 00:00:29,400 --> 00:00:31,639 Speaker 1: game in the business. And God bless you, Dave Hatter, 11 00:00:31,720 --> 00:00:33,720 Speaker 1: for doing that and coming on the program and saving 12 00:00:33,760 --> 00:00:36,080 Speaker 1: us from ourselves. Welcome back, my friend. Happy Friday. 13 00:00:37,000 --> 00:00:37,640 Speaker 2: Thanks Brian. 14 00:00:37,720 --> 00:00:40,280 Speaker 3: Always good to be here, and as I always like 15 00:00:40,360 --> 00:00:42,519 Speaker 3: to say, hopefully we're doing some good out there. 16 00:00:42,560 --> 00:00:45,440 Speaker 1: You are. People can choose to disregard you, man, but 17 00:00:45,479 --> 00:00:47,120 Speaker 1: they do it at their own peril. Real quick here: 18 00:00:47,880 --> 00:00:51,240 Speaker 1: Dick Durbin was a victim of AI, 19 00:00:51,360 --> 00:00:53,680 Speaker 1: making himself look like an idiot. He pulled up a 20 00:00:54,160 --> 00:00:58,600 Speaker 1: doctored, AI-doctored photo of Alex Preddy, the man who 21 00:00:58,600 --> 00:01:02,160 Speaker 1: got shot by Border Patrol agents. Whoops. Oh yeah, there 22 00:01:02,200 --> 00:01:05,040 Speaker 1: is no head on that agent. It was an enhanced 23 00:01:05,040 --> 00:01:07,959 Speaker 1: image from very grainy video footage that looked crystal 24 00:01:08,040 --> 00:01:10,240 Speaker 1: clear, like it got taken with a high-def camera, 25 00:01:10,280 --> 00:01:12,600 Speaker 1: and he thought it was wise to bring that one 26 00:01:12,600 --> 00:01:15,520 Speaker 1: out to show how badly the agents acted. And of 27 00:01:15,600 --> 00:01:18,200 Speaker 1: course it was a source of much embarrassment. I'm sure 28 00:01:18,240 --> 00:01:20,600 Speaker 1: somebody lost their job over sending Dick out into the 29 00:01:20,600 --> 00:01:23,520 Speaker 1: world with that photo. But that's an example of artificial intelligence, 30 00:01:24,000 --> 00:01:25,400 Speaker 1: not that it has anything to do with the first 31 00:01:25,400 --> 00:01:29,120 Speaker 1: topic: AI image geolocation. What is that, and why is 32 00:01:29,160 --> 00:01:30,280 Speaker 1: it now easier than before, Dave Hatter? 33 00:01:30,400 --> 00:01:35,440 Speaker 3: So, Brian, in the past we've talked about metadata, 34 00:01:35,840 --> 00:01:39,959 Speaker 3: and in particular EXIF data in photos. Most modern cameras, 35 00:01:40,520 --> 00:01:43,880 Speaker 3: probably every modern digital camera, I can't say for sure every, 36 00:01:43,920 --> 00:01:49,240 Speaker 3: but most have the capability to stamp data, metadata, data 37 00:01:49,280 --> 00:01:51,480 Speaker 3: you don't see, into a picture. So behind the scenes 38 00:01:51,520 --> 00:01:54,320 Speaker 3: in your photos there's data that would tell you, like, 39 00:01:54,400 --> 00:01:57,320 Speaker 3: the GPS coordinates where it was taken, and various other 40 00:01:57,440 --> 00:02:00,600 Speaker 3: data that describes the picture. When you hear a nerd like 41 00:02:00,640 --> 00:02:04,920 Speaker 3: me say "metadata," metadata is essentially data that describes other data. 42 00:02:05,360 --> 00:02:08,120 Speaker 3: So if you've ever dug into, like, a Word file 43 00:02:08,160 --> 00:02:11,560 Speaker 3: or an Excel document, there's metadata in there that will 44 00:02:11,600 --> 00:02:14,040 Speaker 3: tell you that it's an Excel file, when it was created, 45 00:02:14,080 --> 00:02:16,959 Speaker 3: who created it, that sort of thing, right? So a 46 00:02:17,000 --> 00:02:19,680 Speaker 3: lot of times people don't realize this metadata exists, and 47 00:02:19,680 --> 00:02:21,680 Speaker 3: they get themselves in trouble because they don't know what's 48 00:02:21,720 --> 00:02:25,560 Speaker 3: there, and they think they're doing something that's untrackable or whatever. 49 00:02:25,600 --> 00:02:27,880 Speaker 3: There's a famous story of a hacker who was taunting 50 00:02:27,919 --> 00:02:31,440 Speaker 3: the FBI. He knew about metadata. He would post pictures 51 00:02:31,440 --> 00:02:34,959 Speaker 3: where the EXIF data was removed, and one time slipped 52 00:02:35,040 --> 00:02:37,200 Speaker 3: up and posted a picture where he forgot to remove the 53 00:02:37,880 --> 00:02:41,200 Speaker 3: EXIF data, which included the location, the geo coordinates, of 54 00:02:41,240 --> 00:02:43,760 Speaker 3: where that picture was taken, and they eventually caught him 55 00:02:43,760 --> 00:02:48,080 Speaker 3: as a result. So this idea of being able to 56 00:02:48,639 --> 00:02:51,120 Speaker 3: take a photo and then use the metadata to figure 57 00:02:51,120 --> 00:02:53,960 Speaker 3: out where it came from is not new. However, this 58 00:02:54,160 --> 00:02:58,200 Speaker 3: article is basically pointing out that now AI can take 59 00:02:58,240 --> 00:03:02,160 Speaker 3: a photo, even without the metadata, okay, look at 60 00:03:02,160 --> 00:03:06,400 Speaker 3: the photo itself, and, you know, using things like databases, 61 00:03:06,560 --> 00:03:10,440 Speaker 3: Google Earth, things like that, figure out where that photo 62 00:03:10,520 --> 00:03:10,880 Speaker 3: was taken. 63 00:03:10,919 --> 00:03:12,840 Speaker 1: Oh yeah, yeah. 64 00:03:12,919 --> 00:03:16,600 Speaker 3: So again, even if you are savvy and you understand 65 00:03:16,600 --> 00:03:21,160 Speaker 3: things like metadata, stripping it out is not necessarily a 66 00:03:21,200 --> 00:03:25,160 Speaker 3: guarantee that these newer AI tools wouldn't be able to 67 00:03:26,040 --> 00:03:30,000 Speaker 3: identify where that photo was taken. 68 00:03:31,320 --> 00:03:34,560 Speaker 1: So they say it's just like facial recognition: there's enough 69 00:03:34,600 --> 00:03:37,600 Speaker 1: information, photographs of the globe, out there that they can 70 00:03:37,640 --> 00:03:40,160 Speaker 1: match whatever photograph you took, as long as it's sort 71 00:03:40,160 --> 00:03:42,920 Speaker 1: of, like, outdoors or something like that, or even indoors.
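To make the EXIF point concrete, here is a minimal sketch, not from the segment, of how the hidden GPS tags can be read out of a photo. It assumes Python with the Pillow imaging library installed, and the filename is hypothetical.

```python
# Minimal sketch: read the GPS EXIF tags a camera may have stamped into a photo.
# Assumes the Pillow library (pip install Pillow); "vacation.jpg" is a hypothetical file.
from PIL import Image
from PIL.ExifTags import GPSTAGS

GPS_IFD = 0x8825  # standard EXIF pointer to the GPS sub-directory


def gps_tags(path):
    """Return the photo's GPS EXIF tags as a readable dict (empty if none)."""
    with Image.open(path) as img:
        ifd = img.getexif().get_ifd(GPS_IFD)
        return {GPSTAGS.get(tag, tag): value for tag, value in ifd.items()}


def to_decimal(dms, ref):
    """Convert EXIF degrees/minutes/seconds plus an N/S/E/W reference to decimal degrees."""
    degrees, minutes, seconds = (float(x) for x in dms)
    value = degrees + minutes / 60 + seconds / 3600
    return -value if ref in ("S", "W") else value


tags = gps_tags("vacation.jpg")
needed = {"GPSLatitude", "GPSLatitudeRef", "GPSLongitude", "GPSLongitudeRef"}
if needed <= tags.keys():
    lat = to_decimal(tags["GPSLatitude"], tags["GPSLatitudeRef"])
    lon = to_decimal(tags["GPSLongitude"], tags["GPSLongitudeRef"])
    print(f"This photo says it was taken at {lat:.5f}, {lon:.5f}")
else:
    print("No GPS metadata found (it may have been stripped, or never recorded).")
```

Run against a typical phone photo taken with location services on, a sketch like this will usually print the exact spot the picture was taken, which is the hidden data being described here.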
72 00:03:44,280 --> 00:03:47,560 Speaker 1: Multiple millions of photographs in the world, and this AI program, 73 00:03:47,600 --> 00:03:50,040 Speaker 1: it can easily go through all that, scrape all 74 00:03:50,120 --> 00:03:52,640 Speaker 1: that, and compare your photo with what exists in the 75 00:03:52,640 --> 00:03:55,600 Speaker 1: world and say, no, that was taken in downtown Cincinnati, right? 76 00:03:55,640 --> 00:03:56,400 Speaker 1: I mean, that kind of thing. 77 00:03:57,200 --> 00:03:58,920 Speaker 2: Yeah, here's an example. 78 00:03:59,040 --> 00:04:03,560 Speaker 3: So there's a picture of an anchor, a very large anchor, 79 00:04:03,600 --> 00:04:07,480 Speaker 3: probably from, like, a freighter or battleship or something, propped 80 00:04:07,560 --> 00:04:10,200 Speaker 3: up on a street, and all you see really is 81 00:04:10,200 --> 00:04:12,680 Speaker 3: the anchor and a cobblestone street and a stone 82 00:04:12,440 --> 00:04:13,120 Speaker 2: building behind it. 83 00:04:13,200 --> 00:04:14,600 Speaker 3: I mean, there's no way I could look at this 84 00:04:14,640 --> 00:04:17,520 Speaker 3: and even begin to guess where this was taken. And 85 00:04:17,839 --> 00:04:21,680 Speaker 3: apparently it's from Helsinki, Finland. And they said in the 86 00:04:21,839 --> 00:04:24,920 Speaker 3: article the tool identified this, though it also might 87 00:04:24,960 --> 00:04:28,000 Speaker 3: be in other locations. Still, this is amazingly impressive, because 88 00:04:28,000 --> 00:04:32,040 Speaker 3: there's so little to go off of. So, you know, basically 89 00:04:32,120 --> 00:04:34,279 Speaker 3: they give you some tips here. Again, if you're going 90 00:04:34,320 --> 00:04:37,560 Speaker 3: to post photos online, turn off the location access on 91 00:04:37,600 --> 00:04:41,520 Speaker 3: your camera, strip the EXIF data. Don't post pictures of your kids, 92 00:04:41,920 --> 00:04:44,640 Speaker 3: don't post pictures of your house or landscape. You know, 93 00:04:45,000 --> 00:04:48,839 Speaker 3: don't post things publicly. But remember this idea of privacy 94 00:04:48,920 --> 00:04:51,919 Speaker 3: once you post something. Right, if you and I are friends, Brian, 95 00:04:51,960 --> 00:04:54,640 Speaker 3: and you post something that's only for your friends, well, 96 00:04:54,680 --> 00:04:56,760 Speaker 3: once it's on my screen, I can take it, I 97 00:04:56,800 --> 00:04:59,960 Speaker 3: can potentially reshare it, I can take a screenshot of it. 98 00:05:00,400 --> 00:05:02,720 Speaker 3: You can't really stop me from doing whatever I want 99 00:05:02,760 --> 00:05:04,800 Speaker 3: with it at that point. So that's not a great tip, 100 00:05:04,839 --> 00:05:07,520 Speaker 3: in my mind. But the point they're making here is, 101 00:05:07,680 --> 00:05:09,920 Speaker 3: you know, and it's the point I'm trying to make 102 00:05:09,960 --> 00:05:12,960 Speaker 3: all the time: I'm not saying you should never post 103 00:05:13,000 --> 00:05:16,320 Speaker 3: anything online. I'm saying you should think about your digital footprint.
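For the "strip the EXIF data" tip, one common approach is to re-save just the pixels so the metadata never reaches the copy you share. Again a minimal sketch under the same assumptions (Pillow installed, hypothetical filenames), not a tool named in the segment:

```python
# Minimal sketch: save a metadata-free copy of a photo before sharing it.
# Assumes the Pillow library; the filenames are hypothetical.
from PIL import Image


def strip_metadata(src, dst):
    """Copy only the pixel data into a new image, leaving EXIF (GPS, etc.) behind."""
    with Image.open(src) as img:
        clean = Image.new(img.mode, img.size)
        clean.putdata(list(img.getdata()))  # pixels only; no EXIF block is carried over
        clean.save(dst)


strip_metadata("vacation.jpg", "vacation_clean.jpg")
```

This only removes the hidden tags; as the conversation goes on to note, the visible content of the image can still give the location away to newer AI tools.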
104 00:05:16,480 --> 00:05:20,880 Speaker 3: And as this technology progresses, you know, if someone were 105 00:05:20,920 --> 00:05:24,799 Speaker 3: stalking you, for example, now, even posting an image where 106 00:05:25,400 --> 00:05:29,120 Speaker 3: you know enough to strip out the EXIF data might 107 00:05:29,160 --> 00:05:31,599 Speaker 3: not be enough to prevent someone from using these kinds 108 00:05:31,600 --> 00:05:33,599 Speaker 3: of tools to figure out where that photo was taken 109 00:05:33,960 --> 00:05:35,120 Speaker 3: and eventually find you. 110 00:05:36,320 --> 00:05:38,599 Speaker 1: Makes sense, for me, to consider. Yeah, it's always 111 00:05:38,600 --> 00:05:42,240 Speaker 1: something to consider, anything you post online anymore. Wow. 112 00:05:43,040 --> 00:05:45,080 Speaker 1: I don't view this as a good road we're going 113 00:05:45,160 --> 00:05:47,520 Speaker 1: on, or going down, Dave. It really, really frightens 114 00:05:47,520 --> 00:05:49,880 Speaker 1: me more and more every day, considering the forms of 115 00:05:49,920 --> 00:05:52,400 Speaker 1: abuse of AI, or the ability to abuse 116 00:05:52,440 --> 00:05:55,640 Speaker 1: AI, that are coming up, with Dave Hatter. We're gonna 117 00:05:55,640 --> 00:06:00,480 Speaker 1: talk about this. Six forty-one, fifty-five KRC, the talk station, Tech 118 00:06:00,480 --> 00:06:04,239 Speaker 1: Friday with Dave Hatter, intrust IT dot com, and LinkedIn. 119 00:06:05,279 --> 00:06:06,560 Speaker 1: I guess in the next segment we're going to be 120 00:06:06,560 --> 00:06:08,520 Speaker 1: talking a little bit about LinkedIn, Dave. That's your favorite 121 00:06:08,520 --> 00:06:10,560 Speaker 1: platform to put up information that you talk about here 122 00:06:10,560 --> 00:06:12,560 Speaker 1: in the morning show and throughout the week, and just 123 00:06:12,600 --> 00:06:15,480 Speaker 1: search for Dave Hatter on LinkedIn. But in the meantime, 124 00:06:15,560 --> 00:06:18,000 Speaker 1: check out the Northern Kentucky Tribune op-ed piece by 125 00:06:18,040 --> 00:06:21,320 Speaker 1: the man, the myth, the legend, dated January twenty-first, Dave Hatter: 126 00:06:21,920 --> 00:06:24,800 Speaker 1: how Kentuckians can use the state Consumer Data Protection Act. 127 00:06:24,920 --> 00:06:25,120 Speaker 2: Dave? 128 00:06:25,160 --> 00:06:26,120 Speaker 1: What's this all about? 129 00:06:26,920 --> 00:06:32,680 Speaker 3: Yeah, Brian. So, Kentucky passed a Consumer Data Protection Act 130 00:06:33,560 --> 00:06:36,200 Speaker 3: that gives you some rights as a consumer to protect 131 00:06:36,320 --> 00:06:39,880 Speaker 3: your privacy from these data brokers and so forth that 132 00:06:39,920 --> 00:06:43,279 Speaker 3: are constantly buying and selling your data. And while it's 133 00:06:43,279 --> 00:06:48,320 Speaker 3: not a perfect law, there, we're one of about twenty states 134 00:06:48,000 --> 00:06:48,520 Speaker 2: that have a law. 135 00:06:48,600 --> 00:06:51,320 Speaker 3: Now, I don't think Ohio's gotten there yet. So for 136 00:06:51,400 --> 00:06:53,159 Speaker 3: all my friends in Ohio, you might want to reach 137 00:06:53,160 --> 00:06:54,920 Speaker 3: out to your legislators and tell them to take a 138 00:06:54,960 --> 00:06:57,920 Speaker 3: look at Kentucky's law. Again, there are about twenty states 139 00:06:57,960 --> 00:06:59,560 Speaker 3: that have done this. And as a reminder, there is 140 00:06:59,600 --> 00:07:02,599 Speaker 3: no federal privacy law at this point, so the best 141 00:07:02,640 --> 00:07:04,880 Speaker 3: you can hope for is some sort of state law 142 00:07:04,920 --> 00:07:08,320 Speaker 3: that would give you some protections as a consumer and 143 00:07:08,400 --> 00:07:12,840 Speaker 3: ideally incentivize businesses that are collecting your data to take 144 00:07:12,880 --> 00:07:16,560 Speaker 3: it seriously, protect it, that sort of thing. But this 145 00:07:16,680 --> 00:07:20,240 Speaker 3: law went into effect in January, and in Kentucky, like I said, 146 00:07:20,280 --> 00:07:20,960 Speaker 3: it's not perfect. 147 00:07:21,000 --> 00:07:22,440 Speaker 2: It doesn't have a private right of action. 148 00:07:22,600 --> 00:07:25,160 Speaker 3: If you believe that you have been harmed under the 149 00:07:26,000 --> 00:07:27,480 Speaker 3: provisions of the law, you have to go to the 150 00:07:27,560 --> 00:07:30,160 Speaker 3: Kentucky Attorney General's office. They're the only ones that can 151 00:07:30,280 --> 00:07:33,520 Speaker 3: currently enforce this. But it does have some teeth. Businesses 152 00:07:33,560 --> 00:07:38,720 Speaker 3: can be penalized pretty substantially, depending on the type of business, 153 00:07:38,760 --> 00:07:40,960 Speaker 3: size of business, the amount of data they collect, 154 00:07:41,000 --> 00:07:43,560 Speaker 3: things like that. But what I was shooting for here, 155 00:07:43,600 --> 00:07:45,560 Speaker 3: because I did a different piece just trying to explain 156 00:07:45,640 --> 00:07:47,760 Speaker 3: the law so people would know about it, what I 157 00:07:47,760 --> 00:07:50,360 Speaker 3: tried to explain here is how to use it as a consumer in Kentucky. 158 00:07:50,760 --> 00:07:53,000 Speaker 3: And I'm going to try to put something together so 159 00:07:53,240 --> 00:07:56,360 Speaker 3: businesses doing business in Kentucky understand how this will 160 00:07:56,360 --> 00:08:00,400 Speaker 3: apply to them, and hopefully start to take action to 161 00:08:00,440 --> 00:08:04,440 Speaker 3: not become someone who ends up in trouble because they 162 00:08:04,480 --> 00:08:08,160 Speaker 3: did not do the right things. But basically, this gives 163 00:08:08,200 --> 00:08:10,080 Speaker 3: you the opportunity to go to a company that you 164 00:08:10,120 --> 00:08:11,440 Speaker 3: believe has collected your data. 165 00:08:11,920 --> 00:08:13,920 Speaker 2: Now, this is not easy, Brian. This is one of 166 00:08:13,920 --> 00:08:14,520 Speaker 2: the tricks, right? 167 00:08:14,560 --> 00:08:16,400 Speaker 3: You've got to go to their website. You've got to 168 00:08:16,400 --> 00:08:19,440 Speaker 3: find their privacy policy or whatever, or how you 169 00:08:19,480 --> 00:08:22,440 Speaker 3: would contact them, and say, you know, you have to 170 00:08:22,440 --> 00:08:25,360 Speaker 3: turn over your data to me. You know, you might 171 00:08:25,360 --> 00:08:28,800 Speaker 3: look for things like privacy rights, data subject requests, 172 00:08:28,840 --> 00:08:31,679 Speaker 3: that sort of thing. Right, you've got to submit a request. 173 00:08:31,760 --> 00:08:34,200 Speaker 3: You have to say, hey, I want this, and I've 174 00:08:34,200 --> 00:08:36,760 Speaker 3: got some sample language in here. For example: please confirm 175 00:08:36,760 --> 00:08:39,839 Speaker 3: whether you are processing my personal data; provide access to 176 00:08:39,880 --> 00:08:43,440 Speaker 3: all personal data associated with my account, including identifiers, browsing history, 177 00:08:43,480 --> 00:08:48,560 Speaker 3: location data, and any inferences. Right, and then basically you 178 00:08:48,600 --> 00:08:51,679 Speaker 3: should get a report, and you can look at that report, 179 00:08:51,760 --> 00:08:53,920 Speaker 3: and then you can do things like say, well, this 180 00:08:53,960 --> 00:08:56,439 Speaker 3: is wrong, I want you to correct that. You can 181 00:08:56,480 --> 00:09:00,800 Speaker 3: request the deletion of personal data collected directly from you. 182 00:09:01,240 --> 00:09:03,800 Speaker 3: And that's one of the things that makes this less ideal: 183 00:09:04,880 --> 00:09:07,880 Speaker 3: if they've bought it from someone else, well, 184 00:09:07,920 --> 00:09:11,000 Speaker 3: that wasn't collected directly from you, so they 185 00:09:11,040 --> 00:09:13,160 Speaker 3: won't have to delete that necessarily. 186 00:09:13,280 --> 00:09:14,079 Speaker 2: Right. 187 00:09:14,160 --> 00:09:17,600 Speaker 3: So again, there are some things where this could be better, 188 00:09:18,000 --> 00:09:21,160 Speaker 3: but even for people who've complained about it, my answer is, well, 189 00:09:21,240 --> 00:09:23,160 Speaker 3: it's way better than what we had before. The law 190 00:09:23,280 --> 00:09:26,040 Speaker 3: was passed in January twenty-four and again went into 191 00:09:26,080 --> 00:09:28,440 Speaker 3: effect in January of this year. Actually, it was 192 00:09:28,480 --> 00:09:31,480 Speaker 3: passed in twenty-four, not January; it went into effect this year. 193 00:09:31,600 --> 00:09:32,720 Speaker 2: It's way better than nothing. 194 00:09:33,040 --> 00:09:35,800 Speaker 3: It does give you, again, some rights as a consumer 195 00:09:36,160 --> 00:09:41,240 Speaker 3: to see what companies have, to request corrections, to 196 00:09:41,280 --> 00:09:45,600 Speaker 3: request deletions, to ask for the data so you could 197 00:09:45,640 --> 00:09:48,640 Speaker 3: take it somewhere else. And, you know, I'm hoping that 198 00:09:48,720 --> 00:09:52,200 Speaker 3: this will just be a first step towards tightening this 199 00:09:52,360 --> 00:09:55,280 Speaker 3: up over time, you know, and giving businesses a chance 200 00:09:55,320 --> 00:10:00,320 Speaker 3: to understand this and start to implement practices to secure 201 00:10:00,360 --> 00:10:04,800 Speaker 3: your data, protect your data, store less data. And for 202 00:10:04,920 --> 00:10:07,560 Speaker 3: you as a consumer, that's all a win. So again, 203 00:10:07,600 --> 00:10:10,400 Speaker 3: if you live in Kentucky, go read this op-ed. 204 00:10:10,559 --> 00:10:13,680 Speaker 3: I've posted it several times on my social media, LinkedIn, X, 205 00:10:13,679 --> 00:10:16,000 Speaker 3: et cetera. I'll post it again as part of the 206 00:10:16,040 --> 00:10:18,280 Speaker 3: notes for this, and then, you know, start to take 207 00:10:18,320 --> 00:10:19,240 Speaker 3: action to try 208 00:10:19,040 --> 00:10:20,559 Speaker 2: to unwind some of this stuff. 209 00:10:20,559 --> 00:10:23,000 Speaker 3: I also talk about, like, trying to do a personal 210 00:10:23,040 --> 00:10:26,120 Speaker 3: privacy audit. And, you know, I've written some articles on 211 00:10:26,160 --> 00:10:28,240 Speaker 3: this kind of thing before, got some tips out there, 212 00:10:28,640 --> 00:10:31,520 Speaker 3: links to tools that I use, so you 213 00:10:31,480 --> 00:10:32,959 Speaker 2: know, ultimately, this is it. 214 00:10:33,400 --> 00:10:36,679 Speaker 3: And also, this week is Data Privacy Week, so this 215 00:10:36,760 --> 00:10:39,319 Speaker 3: is a great time to start thinking about your privacy, 216 00:10:39,679 --> 00:10:42,240 Speaker 3: clean up your digital footprint, and, if you live in Kentucky, 217 00:10:42,679 --> 00:10:45,920 Speaker 3: start to use this law to unwind some of this stuff. 218 00:10:45,920 --> 00:10:47,520 Speaker 1: All right. Well, at the risk of going down some 219 00:10:47,640 --> 00:10:49,160 Speaker 1: rabbit hole, which I don't want to do, I'm just, 220 00:10:49,360 --> 00:10:52,160 Speaker 1: I'm trying to process how complicated 221 00:10:51,480 --> 00:10:53,960 Speaker 2: this is going to be. It's going to be complicated. 222 00:10:54,080 --> 00:10:58,040 Speaker 1: Okay, let's say I interact with Citibank or something like that. 223 00:10:58,480 --> 00:11:02,880 Speaker 1: I purposely, intentionally go to a website, right, I give 224 00:11:02,880 --> 00:11:04,560 Speaker 1: them my login information, my name, they've got my 225 00:11:04,600 --> 00:11:07,400 Speaker 1: credit card number, whatever. So I exchanged information with them. 226 00:11:07,440 --> 00:11:09,559 Speaker 1: I can say, hey, Citibank, what are you collecting 227 00:11:09,600 --> 00:11:11,520 Speaker 1: about me? What do you have internally related to me? 228 00:11:11,640 --> 00:11:14,920 Speaker 1: That makes sense. It's an easy ask. But lots of 229 00:11:14,960 --> 00:11:17,920 Speaker 1: businesses out there have people within the company that are 230 00:11:18,040 --> 00:11:21,240 Speaker 1: using other forms of, let's say, artificial intelligence apps. 231 00:11:21,720 --> 00:11:21,800 Speaker 2: Right. 232 00:11:21,920 --> 00:11:24,520 Speaker 1: Let's say, within the working model of any given business, 233 00:11:24,559 --> 00:11:29,000 Speaker 1: they use information you provided them through some app created 234 00:11:29,000 --> 00:11:32,160 Speaker 1: by a third party. Now, do they have to let 235 00:11:32,200 --> 00:11:34,520 Speaker 1: you know that what they're using involves a third-party app? 236 00:11:34,520 --> 00:11:36,240 Speaker 1: Because that app, as you pointed out, may very well 237 00:11:36,240 --> 00:11:39,320 Speaker 1: be collecting your information, perhaps even unbeknownst to the company 238 00:11:39,320 --> 00:11:40,560 Speaker 1: that I'm asking for data from. 239 00:11:41,559 --> 00:11:45,880 Speaker 3: So the law has some exemptions for organizations and agencies 240 00:11:45,920 --> 00:11:48,960 Speaker 3: that would not be subject to this, for example, nonprofits, 241 00:11:49,400 --> 00:11:54,240 Speaker 3: higher education, state and local government agencies, businesses regulated under 242 00:11:54,320 --> 00:11:57,840 Speaker 3: federal laws like HIPAA and the Gramm-Leach-Bliley Act. So 243 00:11:58,480 --> 00:12:01,800 Speaker 3: not every business is subject to this. But as I understand it, 244 00:12:01,840 --> 00:12:05,079 Speaker 3: I'm not an attorney, as I understand it, yes, if 245 00:12:05,120 --> 00:12:07,360 Speaker 3: you don't fall into one of the exemptions and someone 246 00:12:07,400 --> 00:12:10,600 Speaker 3: requests this information from you, you must turn it over. 247 00:12:10,880 --> 00:12:13,640 Speaker 1: Okay, if that's the answer, that's the answer, and I'm 248 00:12:13,640 --> 00:12:17,160 Speaker 1: comfortable with that. I'm just, again, wondering about compliance with this 249 00:12:18,120 --> 00:12:21,160 Speaker 1: by the recipient of the request. 250 00:12:21,240 --> 00:12:22,640 Speaker 1: That is, you're gonna have to put a lot of faith 251 00:12:22,679 --> 00:12:24,440 Speaker 1: in them that they're actually looking for all of the 252 00:12:24,480 --> 00:12:26,760 Speaker 1: records that might apply to your request. This is something 253 00:12:26,760 --> 00:12:28,839 Speaker 1: litigation attorneys have to deal with all the time. 254 00:12:29,080 --> 00:12:31,520 Speaker 1: Six forty-seven. We're going to hear about, well, hackers 255 00:12:31,640 --> 00:12:36,079 Speaker 1: using Dave's favorite platform. Fifty-five KRC, the talk station. Six 256 00:12:36,120 --> 00:12:38,319 Speaker 1: fifty-one at fifty-five KRC, the talk station. Brian Thomas with 257 00:12:38,400 --> 00:12:41,679 Speaker 1: Dave Hatter doing Tech Friday, intrust IT dot com. 258 00:12:41,760 --> 00:12:43,920 Speaker 1: So your favorite platform has got a problem, Dave Hatter. 259 00:12:44,040 --> 00:12:46,400 Speaker 1: What's the story here, with pcgamer dot com's Jane 260 00:12:46,440 --> 00:12:47,840 Speaker 1: Bentley bringing this to our attention? 261 00:12:48,760 --> 00:12:52,400 Speaker 3: Well, it isn't a LinkedIn problem. It's a file-sharing 262 00:12:52,480 --> 00:12:54,800 Speaker 3: problem on any social media platform. 263 00:12:55,400 --> 00:12:56,720 Speaker 2: So the point she's 264 00:12:56,520 --> 00:13:00,800 Speaker 3: making here is hackers pretending to be a recruiter, as 265 00:13:00,880 --> 00:13:04,760 Speaker 3: one example, might go on LinkedIn and say, hey, I 266 00:13:04,760 --> 00:13:06,680 Speaker 3: looked at your profile, you look like a great fit 267 00:13:06,760 --> 00:13:10,800 Speaker 3: for this job, fill out this application. Right, they're using 268 00:13:10,840 --> 00:13:14,200 Speaker 3: the messaging. So let me take a step back. LinkedIn, 269 00:13:14,720 --> 00:13:17,440 Speaker 3: for folks who haven't used LinkedIn: if you've used Facebook, 270 00:13:17,480 --> 00:13:20,760 Speaker 3: think of Facebook Messenger. Now, I try to avoid 271 00:13:20,800 --> 00:13:23,960 Speaker 3: ever using Facebook Messenger. Don't send me messages on Facebook 272 00:13:24,000 --> 00:13:25,960 Speaker 3: Messenger, because I probably won't see them for the next 273 00:13:26,000 --> 00:13:26,600 Speaker 3: six months. 274 00:13:26,760 --> 00:13:27,760 Speaker 2: I don't even look at it. 275 00:13:27,880 --> 00:13:30,800 Speaker 3: Okay, but Messenger is a way for you to send 276 00:13:30,800 --> 00:13:33,040 Speaker 3: a direct message to someone rather than make a post 277 00:13:33,520 --> 00:13:36,040 Speaker 3: that potentially everyone can see. It's a one-to-one 278 00:13:36,120 --> 00:13:37,880 Speaker 3: message, or possibly a group message. 279 00:13:37,960 --> 00:13:40,800 Speaker 1: Right, every morning I get a whole bunch of them. 280 00:13:40,840 --> 00:13:44,360 Speaker 3: Yeah, same thing as a direct message on X. Right, 281 00:13:44,480 --> 00:13:48,080 Speaker 3: it's a message between two people, generally. So LinkedIn has 282 00:13:48,120 --> 00:13:51,440 Speaker 3: that same capability. You can go on LinkedIn and send 283 00:13:51,480 --> 00:13:53,960 Speaker 3: a message to someone. I use this all the time.
284 00:13:54,000 --> 00:13:56,280 Speaker 3: I get messages all the time, usually people trying to 285 00:13:56,280 --> 00:13:59,280 Speaker 3: sell me something. But again, someone can pretend to be 286 00:13:59,320 --> 00:14:02,560 Speaker 3: a recruiter, and they send you a file. So in 287 00:14:02,559 --> 00:14:05,320 Speaker 3: addition to just typing a message in there, you can 288 00:14:05,480 --> 00:14:08,240 Speaker 3: send someone a link, but you can also attach a file, 289 00:14:08,400 --> 00:14:12,520 Speaker 3: so a Word document, a PDF, whatever. And they're pointing out 290 00:14:12,559 --> 00:14:16,000 Speaker 3: that, you know, people on LinkedIn are often looking for jobs, 291 00:14:16,040 --> 00:14:19,600 Speaker 3: looking for their next opportunity. It's not unusual to have 292 00:14:19,640 --> 00:14:22,320 Speaker 3: a legitimate recruiter reach out to you. But bad guys 293 00:14:22,480 --> 00:14:26,480 Speaker 3: understand this and are increasingly using LinkedIn, especially for people 294 00:14:26,560 --> 00:14:29,760 Speaker 3: looking for jobs, as a way to defraud them. And 295 00:14:29,800 --> 00:14:32,360 Speaker 3: what she's pointing out in this article, and again, I 296 00:14:32,400 --> 00:14:34,880 Speaker 3: want to be clear, Brian, this could happen to you 297 00:14:35,240 --> 00:14:38,200 Speaker 3: anywhere. Someone can send you a file. Right, in 298 00:14:38,240 --> 00:14:40,840 Speaker 3: the old days, this would primarily be email. I send 299 00:14:40,880 --> 00:14:43,960 Speaker 3: you an email, it's got an attachment, the attachment is 300 00:14:43,960 --> 00:14:47,720 Speaker 3: some kind of malware, virus, keystroke logger, whatever. Well, the 301 00:14:47,760 --> 00:14:50,520 Speaker 3: bad guys are smart. They go where people are. They 302 00:14:50,560 --> 00:14:53,480 Speaker 3: know that many folks don't really think, well, wait a minute, 303 00:14:53,760 --> 00:14:59,360 Speaker 3: if someone can send me a file through Facebook, X, TikTok, Pinterest, LinkedIn, whatever, 304 00:14:59,720 --> 00:15:02,960 Speaker 3: that file could be malicious, just like it could if 305 00:15:03,000 --> 00:15:05,480 Speaker 3: I got it via email, or just like it could 306 00:15:05,520 --> 00:15:08,280 Speaker 3: if I went to a website and downloaded it. Right, 307 00:15:08,360 --> 00:15:12,160 Speaker 3: the file itself contains malicious content, a virus, if you will. 308 00:15:12,080 --> 00:15:14,040 Speaker 1: Well, this comes as no surprise to me, given all 309 00:15:14,080 --> 00:15:16,160 Speaker 1: the times you've told us about other platforms. I mean, 310 00:15:16,160 --> 00:15:18,480 Speaker 1: that's why I don't open text messages that have links 311 00:15:18,480 --> 00:15:19,960 Speaker 1: in them. You know, I never open the link, 312 00:15:20,000 --> 00:15:21,800 Speaker 1: because of your warnings. This sounds to me no 313 00:15:21,920 --> 00:15:25,080 Speaker 1: different from that, so I'm asking, does this come as a surprise? 314 00:15:25,520 --> 00:15:28,600 Speaker 3: It's no different at all. It's just that they're now 315 00:15:28,640 --> 00:15:32,760 Speaker 3: exploiting social media platforms, and LinkedIn in particular, and they're 316 00:15:32,840 --> 00:15:35,960 Speaker 3: using social engineering, right, the idea of, yes, here's 317 00:15:36,000 --> 00:15:38,920 Speaker 3: a job, here's a job. Click this form, open this 318 00:15:39,000 --> 00:15:41,440 Speaker 3: form and fill it out, or something like that, which 319 00:15:41,480 --> 00:15:43,880 Speaker 3: then infects your computer with a virus, potentially. 320 00:15:44,760 --> 00:15:45,680 Speaker 1: Well, I get all that. 321 00:15:45,800 --> 00:15:46,160 Speaker 2: I guess. 322 00:15:46,160 --> 00:15:49,280 Speaker 1: Part of me wants to say, how many true, genuine 323 00:15:49,320 --> 00:15:53,440 Speaker 1: employers go out there and scan through LinkedIn profiles looking 324 00:15:53,480 --> 00:15:55,800 Speaker 1: for someone who fits the bill for the open spot, 325 00:15:55,880 --> 00:15:59,640 Speaker 1: as opposed to posting an open position and having real 326 00:15:59,640 --> 00:16:00,560 Speaker 1: people respond? 327 00:16:02,160 --> 00:16:05,760 Speaker 3: I would say most. They have recruiters, and their recruiters 328 00:16:05,760 --> 00:16:08,320 Speaker 3: are using LinkedIn to find people that fit. They really 329 00:16:08,480 --> 00:16:09,240 Speaker 3: do it all the time. 330 00:16:09,400 --> 00:16:11,520 Speaker 1: Oh wow. Well, see, I'm not in business. What the 331 00:16:11,520 --> 00:16:13,960 Speaker 1: hell does Thomas know? So at least you answered my question. 332 00:16:14,080 --> 00:16:14,840 Speaker 1: So that does happen. 333 00:16:14,880 --> 00:16:15,680 Speaker 2: So? 334 00:16:15,680 --> 00:16:17,920 Speaker 3: So, yeah, if you're a recruiter, one of the best 335 00:16:17,920 --> 00:16:20,520 Speaker 3: ways to find people that would potentially fill an open 336 00:16:20,520 --> 00:16:23,160 Speaker 3: position is to use a tool like LinkedIn and say, well, 337 00:16:23,160 --> 00:16:25,360 Speaker 3: they have the right education, they have the right experience, 338 00:16:25,360 --> 00:16:27,960 Speaker 3: they have the right training, this is someone I want 339 00:16:28,000 --> 00:16:31,080 Speaker 3: to talk to, and a direct message, the message feature of 340 00:16:31,120 --> 00:16:33,320 Speaker 3: LinkedIn, would be a way to get to them directly. 341 00:16:33,480 --> 00:16:35,760 Speaker 2: All right. Well, again, how does this, 342 00:16:35,880 --> 00:16:38,480 Speaker 1: how does one discern whether that is a real, genuine 343 00:16:38,560 --> 00:16:42,000 Speaker 1: contact from a real, genuine recruiter, or somebody trying to rip 344 00:16:42,040 --> 00:16:42,360 Speaker 1: them off? 345 00:16:42,960 --> 00:16:45,240 Speaker 3: Well, that is an excellent question, and I would say 346 00:16:45,240 --> 00:16:47,840 Speaker 3: there is no one hundred percent guaranteed way to do it, 347 00:16:47,880 --> 00:16:50,120 Speaker 3: because the bad guys will go in and set up 348 00:16:50,160 --> 00:16:53,200 Speaker 3: a fake profile. They make it look realistic, they look 349 00:16:53,280 --> 00:16:56,120 Speaker 3: like they work at a real company. They, you know, 350 00:16:56,160 --> 00:17:00,200 Speaker 3: they've probably created activity to make themselves look legitimate, right, 351 00:17:00,720 --> 00:17:01,720 Speaker 3: because it's easy to do. 352 00:17:02,000 --> 00:17:02,200 Speaker 2: So. 353 00:17:02,680 --> 00:17:05,480 Speaker 3: My advice to you would be, if you have someone 354 00:17:05,520 --> 00:17:08,640 Speaker 3: reach out to you in an unsolicited way on any 355 00:17:08,640 --> 00:17:11,600 Speaker 3: of these platforms, but LinkedIn particularly, and especially if 356 00:17:11,640 --> 00:17:13,720 Speaker 3: you're out of a job and you're looking for 357 00:17:13,760 --> 00:17:15,880 Speaker 3: a job and it's an offer, you know, all the stuff, Brian, 358 00:17:15,960 --> 00:17:18,480 Speaker 3: too good to be true, whatever, you should be highly 359 00:17:18,520 --> 00:17:21,879 Speaker 3: skeptical of it, and you should do everything possible to 360 00:17:21,960 --> 00:17:25,840 Speaker 3: confirm that it is a real, real, legitimate thing before 361 00:17:25,880 --> 00:17:28,480 Speaker 3: you click that file. I'm not saying don't interact with 362 00:17:28,520 --> 00:17:31,800 Speaker 3: the person, necessarily, but I'm saying, like, look at their profile, 363 00:17:31,920 --> 00:17:35,040 Speaker 3: try to confirm that they're real. Do they have other activity? 364 00:17:35,320 --> 00:17:39,040 Speaker 3: Can you confirm anything? But don't open the file until 365 00:17:39,160 --> 00:17:41,879 Speaker 3: you have a high, high level of certainty, and certainly, 366 00:17:41,920 --> 00:17:43,919 Speaker 3: if you don't have any kind of antivirus, don't open it. 367 00:17:43,960 --> 00:17:47,040 Speaker 3: But even then, it's a risky proposition to open any 368 00:17:47,040 --> 00:17:49,880 Speaker 3: file someone sends you that you weren't expecting and did 369 00:17:49,920 --> 00:17:50,480 Speaker 3: not ask for. 370 00:17:50,840 --> 00:17:55,560 Speaker 1: Exclamation point. Dave Hatter, intrust IT dot com. Appreciate what you do 371 00:17:55,640 --> 00:17:57,400 Speaker 1: every week on the program. I'll look forward to another 372 00:17:57,440 --> 00:18:00,840 Speaker 1: talk with you next Friday. In the meantime, you guys at Intrust, 373 00:18:00,920 --> 00:18:04,280 Speaker 1: have a great week, and thanks again. It's six 374 00:18:04,480 --> 00:18:05,520 Speaker 1: fifty-seven at fifty-five 375 00:18:05,560 --> 00:18:05,639 Speaker 2: KRC.
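One practical addition to the "confirm before you click" advice, not something mentioned on air: you can fingerprint an unexpected attachment with a cryptographic hash and look that hash up on a malware reputation service such as VirusTotal without ever opening the file. A minimal Python sketch, with a hypothetical filename:

```python
# Minimal sketch: compute a SHA-256 fingerprint of a file you were sent,
# without opening it in Word, a PDF reader, or anything else that could run its contents.
# The filename is hypothetical; the printed hash can be searched on a reputation service.
import hashlib


def sha256_of(path, chunk_size=1 << 20):
    """Hash the file in chunks so even large attachments are handled without loading them whole."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


print(sha256_of("job_application.docx"))
```

Keep in mind that a hash match only tells you a file is already known to be bad; an unknown hash is not proof it is safe, so the rule about not opening files you weren't expecting and didn't ask for still applies.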