1 00:00:00,000 --> 00:00:01,000 Speaker 1: The talk station. 2 00:00:03,240 --> 00:00:06,240 Speaker 2: It is six thirty. It's Friday. Stating the obvious. 3 00:00:06,320 --> 00:00:08,879 Speaker 2: It is time for Tech Friday with Dave Hatter. In 4 00:00:08,960 --> 00:00:11,440 Speaker 2: trust IT is Dave's company. He has been voted the 5 00:00:11,480 --> 00:00:14,920 Speaker 2: best IT company for businesses by the Business Courier. I 6 00:00:14,920 --> 00:00:17,000 Speaker 2: trust the Business Courier and I certainly trust Dave Hatter 7 00:00:17,040 --> 00:00:19,799 Speaker 2: and his team. So intrustit dot com to tap 8 00:00:19,840 --> 00:00:23,079 Speaker 2: into their awesome services. Welcome back, Dave Hatter. Happy Friday 9 00:00:23,079 --> 00:00:23,279 Speaker 2: to you. 10 00:00:24,200 --> 00:00:26,119 Speaker 3: Always good to be here, Brian. Thanks for having me, 11 00:00:26,160 --> 00:00:28,320 Speaker 3: and Happy Friday to you and Joe and all your listeners. 12 00:00:28,360 --> 00:00:31,280 Speaker 2: All right, everybody's seen the idea that artificial intelligence can 13 00:00:31,320 --> 00:00:35,440 Speaker 2: make clothed people look naked. But this is different. Apparently 14 00:00:35,560 --> 00:00:40,360 Speaker 2: chatbots strip people. Now you're gonna have to explain this to me. 15 00:00:40,479 --> 00:00:42,240 Speaker 2: You know me, I don't use it and stuff 16 00:00:42,280 --> 00:00:42,559 Speaker 2: like that. 17 00:00:43,479 --> 00:00:45,879 Speaker 1: Yeah, you may be better off, Brian, you may be 18 00:00:45,920 --> 00:00:46,440 Speaker 1: better off. 19 00:00:46,560 --> 00:00:49,080 Speaker 3: So this is a story in Wired, and it's interesting 20 00:00:49,159 --> 00:00:52,760 Speaker 3: because you may have noticed in the media lately there's 21 00:00:52,800 --> 00:00:55,640 Speaker 3: been a lot of talk about this, specifically around Grok. 22 00:00:56,280 --> 00:01:00,880 Speaker 3: As a reminder, Grok is xAI's, aka Elon 23 00:01:01,040 --> 00:01:05,840 Speaker 3: Musk's, chatbot. So when you think chatbot, when you 24 00:01:05,880 --> 00:01:09,080 Speaker 3: hear that word, essentially what they're talking about is large 25 00:01:09,160 --> 00:01:14,959 Speaker 3: language model based generative AI tools, aka Grok, Gemini from Google, 26 00:01:16,160 --> 00:01:20,679 Speaker 3: ChatGPT from OpenAI, Claude, Perplexity, et cetera. Now 27 00:01:20,840 --> 00:01:22,720 Speaker 3: it does a lot more than what I'm going to describe. 28 00:01:22,720 --> 00:01:25,319 Speaker 3: But you know the idea that if you go to ChatGPT, 29 00:01:25,600 --> 00:01:27,360 Speaker 3: you sit down, you type in a prompt and it 30 00:01:27,440 --> 00:01:31,399 Speaker 3: generates some kind of content: text, audio, video, whatever. You know, 31 00:01:31,440 --> 00:01:35,840 Speaker 3: they've gotten much better at that, these tools. And this 32 00:01:35,920 --> 00:01:37,520 Speaker 3: really gets to the heart of something you and I 33 00:01:37,520 --> 00:01:40,080 Speaker 3: have been talking about ever since AI sort of hit 34 00:01:40,120 --> 00:01:45,000 Speaker 3: the public's consciousness: deepfakes, which could be audio, 35 00:01:45,080 --> 00:01:47,480 Speaker 3: could be video, could be text, but it's something that 36 00:01:47,680 --> 00:01:49,480 Speaker 3: is made up by the AI. 37 00:01:50,040 --> 00:01:50,520 Speaker 1: It's fake.
38 00:01:51,160 --> 00:01:53,400 Speaker 3: And you know, we've talked about the potential implications on 39 00:01:53,440 --> 00:01:57,680 Speaker 3: the legal system, on politics. But the latest thing that's 40 00:01:57,760 --> 00:02:01,120 Speaker 3: caused quite a stir is people have figured out how 41 00:02:01,160 --> 00:02:04,840 Speaker 3: to use these chatbot tools, again think ChatGPT, Grok, whatever, 42 00:02:05,320 --> 00:02:09,160 Speaker 3: to take someone's photo and either A, put them in 43 00:02:09,200 --> 00:02:12,600 Speaker 3: a bikini, or B, make them nude. And again, this 44 00:02:12,720 --> 00:02:15,680 Speaker 3: story from Wired, Google and OpenAI's chatbots can strip 45 00:02:15,720 --> 00:02:18,840 Speaker 3: women in photos down to bikinis, 46 00:02:20,600 --> 00:02:24,320 Speaker 3: points out that this is fairly easy to do and, unfortunately, 47 00:02:24,880 --> 00:02:29,720 Speaker 3: unfortunately, hard to stop. One of the terms you hear 48 00:02:29,760 --> 00:02:32,280 Speaker 3: when people talk about this is the idea of jailbreaking, 49 00:02:32,360 --> 00:02:34,760 Speaker 3: that I'm purposely going to attempt to get one of 50 00:02:34,800 --> 00:02:42,560 Speaker 3: these chatbots to do something it's not supposed to do, right? Yeah, like, yeah, 51 00:02:42,639 --> 00:02:45,359 Speaker 3: this is the latest thing. And you know, in many cases, 52 00:02:46,080 --> 00:02:48,320 Speaker 3: if someone just sits down and types in a prompt, 53 00:02:48,840 --> 00:02:53,079 Speaker 3: give me this, right, if that violates some rule, then 54 00:02:53,200 --> 00:02:56,400 Speaker 3: it'll say I can't do that. But what people 55 00:02:56,440 --> 00:03:00,320 Speaker 3: have gotten good at is figuring out ways to spend 56 00:03:00,320 --> 00:03:03,160 Speaker 3: time working their way into what they want, you know, 57 00:03:03,200 --> 00:03:06,280 Speaker 3: step by step, trying to get to an end result 58 00:03:06,400 --> 00:03:09,200 Speaker 3: that would be denied if you just asked it straight out. 59 00:03:10,080 --> 00:03:13,160 Speaker 3: And I'm not saying that's necessarily what's happening here in 60 00:03:13,200 --> 00:03:16,359 Speaker 3: every case. I'm just saying it's really difficult to stop 61 00:03:16,400 --> 00:03:19,519 Speaker 3: people from jailbreaking these things. 62 00:03:19,680 --> 00:03:22,400 Speaker 2: I'm just led to the conclusion, I just wonder, 63 00:03:23,480 --> 00:03:25,560 Speaker 2: don't these people have anything better to do with their 64 00:03:25,600 --> 00:03:27,799 Speaker 2: time, Dave? I know that's a ridiculous thing to say, 65 00:03:27,840 --> 00:03:28,840 Speaker 2: but apparently they don't. 66 00:03:29,760 --> 00:03:33,000 Speaker 3: Apparently they don't. Now, you know, and I understand why. 67 00:03:33,080 --> 00:03:35,440 Speaker 3: You know, if you're a woman and you have a 68 00:03:35,440 --> 00:03:39,560 Speaker 3: photo online and someone takes it and then puts you 69 00:03:39,600 --> 00:03:43,120 Speaker 3: in a bikini or some other suggestive thing or unfortunately makes 70 00:03:43,000 --> 00:03:43,280 Speaker 1: you nude, 71 00:03:43,320 --> 00:03:47,400 Speaker 3: I mean, it's sad that this can happen, and I'm 72 00:03:47,400 --> 00:03:52,160 Speaker 3: not really sure ultimately it can be fixed. Again, can 73 00:03:52,200 --> 00:03:54,840 Speaker 3: they build these systems over time so that they can't 74 00:03:54,880 --> 00:03:58,880 Speaker 3: ever be jailbroken?
I don't know, Brian, I really don't know. 75 00:03:59,800 --> 00:04:03,600 Speaker 3: I think it's a very difficult challenge. And then to your 76 00:04:03,480 --> 00:04:07,120 Speaker 2: point, as a programmer, you're saying out loud this is a 77 00:04:07,200 --> 00:04:10,800 Speaker 2: difficult challenge. To get to that point, it's almost impossible 78 00:04:11,080 --> 00:04:11,680 Speaker 2: is what I'm hearing. 79 00:04:12,640 --> 00:04:15,200 Speaker 3: Well, you know, maybe if the AI gets smart enough 80 00:04:15,200 --> 00:04:18,000 Speaker 3: that on its own, it could understand that it is 81 00:04:18,080 --> 00:04:20,680 Speaker 3: being led down a path to jailbreaking, to do 82 00:04:20,760 --> 00:04:23,680 Speaker 3: something it's not supposed to do. Maybe. But you know, 83 00:04:24,400 --> 00:04:26,280 Speaker 3: to your point, as a guy who spent a lot 84 00:04:26,320 --> 00:04:29,840 Speaker 3: of time writing code, could I anticipate every single way 85 00:04:29,880 --> 00:04:32,080 Speaker 3: that a user would do something that they shouldn't do 86 00:04:32,160 --> 00:04:34,960 Speaker 3: and that it would cause an error? It's really hard. Yeah, 87 00:04:35,080 --> 00:04:37,200 Speaker 3: it's really hard to do that, and that's why software 88 00:04:37,240 --> 00:04:39,240 Speaker 3: has bugs in it, right? You know, I know, it 89 00:04:39,320 --> 00:04:42,160 Speaker 3: never occurred to me they would do this one thing here, 90 00:04:42,760 --> 00:04:47,480 Speaker 3: and so it's a difficult challenge. I understand why women 91 00:04:47,520 --> 00:04:51,680 Speaker 3: are unhappy about this, especially like, you know, if this 92 00:04:51,760 --> 00:04:55,279 Speaker 3: was your daughter and some creep goes out and finds 93 00:04:55,279 --> 00:04:57,520 Speaker 3: a picture of her and next thing you know, there 94 00:04:57,520 --> 00:05:00,720 Speaker 3: are deepfake nudes of her floating around, or again, 95 00:05:00,800 --> 00:05:04,719 Speaker 3: suggestive photos in bikinis or whatever. I totally understand. 96 00:05:04,760 --> 00:05:06,320 Speaker 3: I think it's sad that this is a thing. 97 00:05:06,520 --> 00:05:09,760 Speaker 2: I think it could be embarrassing for guys too, right? 98 00:05:10,680 --> 00:05:13,480 Speaker 2: You know, I mean you could resize the junk and 99 00:05:13,760 --> 00:05:15,360 Speaker 2: embarrass guys if you wanted, right? 100 00:05:15,920 --> 00:05:19,120 Speaker 3: Yeah, I mean theoretically, using these tools, you can alter 101 00:05:19,240 --> 00:05:23,120 Speaker 3: pictures in all kinds of ways, again, including nudes, or 102 00:05:23,240 --> 00:05:25,480 Speaker 3: you know, could you try to frame someone with this? 103 00:05:25,720 --> 00:05:26,919 Speaker 1: Well, yeah, yeah you could. 104 00:05:27,000 --> 00:05:31,599 Speaker 3: This has been one of the latest things to 105 00:05:31,680 --> 00:05:34,520 Speaker 3: catch a lot of bad press. But when you 106 00:05:34,520 --> 00:05:38,280 Speaker 3: extrapolate out, using these tools to take a photo 107 00:05:38,360 --> 00:05:42,320 Speaker 3: of someone or, frankly, create a whole-cloth photo of someone, yeah, 108 00:05:42,360 --> 00:05:44,120 Speaker 3: and then you know, put them into some kind of 109 00:05:44,160 --> 00:05:48,839 Speaker 3: compromising position, it's a real concern. And again, while Grok 110 00:05:48,920 --> 00:05:52,400 Speaker 3: has gotten a lot of the ire on this,
111 00:05:52,400 --> 00:05:54,359 Speaker 3: this is a story from Wired again pointing out that 112 00:05:54,440 --> 00:05:56,160 Speaker 3: Google and Open 113 00:05:55,920 --> 00:05:57,560 Speaker 1: AI seem to have the same problem. 114 00:05:57,680 --> 00:06:00,360 Speaker 3: And you know, it's just one of the many 115 00:06:00,440 --> 00:06:04,120 Speaker 3: challenges we face as a result of this technology. So 116 00:06:04,279 --> 00:06:06,800 Speaker 3: people should be aware of this. About the only thing 117 00:06:06,839 --> 00:06:08,760 Speaker 3: you can do to protect yourself from it is try 118 00:06:08,800 --> 00:06:12,320 Speaker 3: to limit the number of photos that are out there 119 00:06:12,480 --> 00:06:13,200 Speaker 3: with you in them. 120 00:06:13,240 --> 00:06:15,479 Speaker 1: Yeah, good luck, which is difficult. Yeah, good luck. 121 00:06:15,520 --> 00:06:19,400 Speaker 3: And then understand that someone can potentially just create something 122 00:06:19,440 --> 00:06:21,839 Speaker 3: completely from whole cloth, even without a photo of 123 00:06:21,880 --> 00:06:24,640 Speaker 3: you to begin with, as this tech progresses. 124 00:06:24,720 --> 00:06:28,800 Speaker 2: So what a sad, pathetic existence that person has who 125 00:06:28,880 --> 00:06:31,200 Speaker 2: takes the time to do this. We'll bring Dave Hatter back, 126 00:06:31,279 --> 00:06:35,360 Speaker 2: talking about ways artificial intelligence can inflict damage. Plus, stick 127 00:06:35,360 --> 00:06:39,880 Speaker 2: around. Fifty-five KRC, the talk 128 00:06:39,920 --> 00:06:42,359 Speaker 2: station. Intrustit dot com, where you find Dave Hatter, 129 00:06:42,400 --> 00:06:45,120 Speaker 2: and you find him every week here on Friday morning 130 00:06:45,120 --> 00:06:50,120 Speaker 2: at six thirty. Moving over to ways artificial intelligence can inflict damage, 131 00:06:50,680 --> 00:06:53,800 Speaker 2: unprecedented damage, says the headline of the ZDNET article 132 00:06:53,800 --> 00:06:56,800 Speaker 2: you have posted on LinkedIn on your account. Go ahead, Dave, 133 00:06:56,800 --> 00:06:57,240 Speaker 2: what's this all 134 00:06:57,240 --> 00:06:59,640 Speaker 3: about? Yeah, Brian, we won't have time to get into 135 00:06:59,640 --> 00:07:01,680 Speaker 3: all of these. I'm glad you mentioned that, so I'll 136 00:07:01,720 --> 00:07:04,800 Speaker 3: share this article. And I think that the headline is 137 00:07:04,800 --> 00:07:06,960 Speaker 3: a little hyperbolic, and I mean we just talked about 138 00:07:06,960 --> 00:07:08,560 Speaker 3: one of the ways they don't cover. This is more 139 00:07:08,560 --> 00:07:11,400 Speaker 3: of an IT focus for you as a person 140 00:07:11,440 --> 00:07:14,200 Speaker 3: trying to defend your systems against this stuff. And I 141 00:07:14,200 --> 00:07:16,800 Speaker 3: would say, you know, every business needs to be thinking 142 00:07:16,840 --> 00:07:18,480 Speaker 3: about this. And I also want to start out with: 143 00:07:19,720 --> 00:07:23,720 Speaker 3: these tools can be extremely powerful and productive if you 144 00:07:23,800 --> 00:07:26,360 Speaker 3: understand what they can and can't do, and you think 145 00:07:26,400 --> 00:07:28,920 Speaker 3: about some of the security issues that they point out. 146 00:07:28,960 --> 00:07:32,520 Speaker 3: In the ZDNET article, a couple of headlines or 147 00:07:32,560 --> 00:07:35,320 Speaker 3: a couple of topics.
They say AI-enabled malware will 148 00:07:35,400 --> 00:07:37,600 Speaker 3: unleash havoc. I'm just going to hit a couple of 149 00:07:37,600 --> 00:07:39,720 Speaker 3: these and come back to them. Agentic AI is evolving 150 00:07:39,720 --> 00:07:44,960 Speaker 3: into every threat actor's fantasy. Prompt injection: AI tools 151 00:07:45,000 --> 00:07:47,760 Speaker 3: will be the new attack surface. Threat actors will use 152 00:07:47,760 --> 00:07:49,800 Speaker 3: AI to go after the weakest link, humans. That's my 153 00:07:49,840 --> 00:07:52,200 Speaker 3: biggest concern, by the way. Come back, again, I'm going 154 00:07:52,200 --> 00:07:54,280 Speaker 3: to get into each of these a little bit. AI 155 00:07:54,400 --> 00:08:00,000 Speaker 3: will expose APIs that are too easily exploited. Extortion tactics will evolve 156 00:08:00,160 --> 00:08:03,120 Speaker 3: from ransomware encryption, and they mention in the next one 157 00:08:03,160 --> 00:08:06,600 Speaker 3: how contagion spreads to industrial control and operations. I don't 158 00:08:06,600 --> 00:08:08,320 Speaker 3: remember if we talked about this last year or not. 159 00:08:08,480 --> 00:08:12,400 Speaker 3: Jaguar Land Rover. Jaguar Land Rover had a devastating ransomware attack. 160 00:08:12,480 --> 00:08:15,240 Speaker 3: They had some systems down for months, and the last 161 00:08:15,280 --> 00:08:17,760 Speaker 3: number I saw, they lost something like two billion dollars. 162 00:08:18,000 --> 00:08:20,240 Speaker 3: People got laid off there, people got laid off with 163 00:08:20,280 --> 00:08:22,960 Speaker 3: their suppliers because their suppliers had nothing to do since 164 00:08:23,000 --> 00:08:26,080 Speaker 3: they weren't making cars. I mean, it was a devastating attack, 165 00:08:26,160 --> 00:08:29,760 Speaker 3: not necessarily AI related, but they're just drawing the nexus 166 00:08:29,800 --> 00:08:34,040 Speaker 3: of these things. And the two big threats I think 167 00:08:34,120 --> 00:08:37,040 Speaker 3: every business really needs to consider around AI as it 168 00:08:37,120 --> 00:08:37,720 Speaker 3: stands today. 169 00:08:37,720 --> 00:08:41,040 Speaker 1: Again, there's a couple more in this article. If you 170 00:08:41,880 --> 00:08:42,560 Speaker 1: don't have 171 00:08:42,480 --> 00:08:45,199 Speaker 3: people that understand what these things can and can't do, 172 00:08:45,280 --> 00:08:47,760 Speaker 3: the idea of hallucinations and so forth, and you can't 173 00:08:47,800 --> 00:08:50,559 Speaker 3: just assume anything that it creates is one hundred percent accurate, 174 00:08:51,320 --> 00:08:56,560 Speaker 3: that's a problem. And when you couple that with the 175 00:08:56,600 --> 00:08:59,120 Speaker 3: idea that I'm going to have an AI agent that 176 00:08:59,200 --> 00:09:01,200 Speaker 3: can go do something on its own, it's not a 177 00:09:01,240 --> 00:09:04,439 Speaker 3: human being sitting and typing prompts in and getting 178 00:09:04,120 --> 00:09:05,240 Speaker 1: output and doing something, 179 00:09:05,679 --> 00:09:09,000 Speaker 3: I'm going to let something run on its own and 180 00:09:09,040 --> 00:09:11,200 Speaker 3: do stuff. I'm going to give it credentials, user 181 00:09:11,240 --> 00:09:13,960 Speaker 3: names and passwords, so it can act on its own. 182 00:09:15,480 --> 00:09:17,600 Speaker 3: There is an enormous amount of risk in that.
They 183 00:09:17,640 --> 00:09:19,440 Speaker 3: talk about that in here, and there are many other 184 00:09:19,559 --> 00:09:22,679 Speaker 3: articles about that. I'm not saying you shouldn't do it. 185 00:09:22,760 --> 00:09:25,680 Speaker 3: I'm not saying it can't work. What I'm saying is, 186 00:09:26,160 --> 00:09:29,520 Speaker 3: if you think that you're just going to give this 187 00:09:29,600 --> 00:09:32,480 Speaker 3: thing a username and password and let it start 188 00:09:32,480 --> 00:09:37,200 Speaker 3: doing stuff and that it couldn't go catastrophically wrong, you need 189 00:09:37,240 --> 00:09:41,480 Speaker 3: to be very, very careful and really understand all the 190 00:09:41,480 --> 00:09:43,839 Speaker 3: ins and outs of that. And frankly, if you give 191 00:09:43,880 --> 00:09:46,000 Speaker 3: it credentials and let it do things on its own, 192 00:09:46,520 --> 00:09:48,920 Speaker 3: have you secured it correctly so it can't be hacked 193 00:09:48,960 --> 00:09:51,880 Speaker 3: and then the bad guys are now operating with permissions 194 00:09:51,920 --> 00:09:54,480 Speaker 3: in your system, the permissions that you gave it? So 195 00:09:55,280 --> 00:09:58,200 Speaker 3: that's something everyone needs to look at very carefully and 196 00:09:58,360 --> 00:10:00,960 Speaker 3: really understand what the risks are and make sure they're 197 00:10:01,520 --> 00:10:06,360 Speaker 3: taking adequate precautions against that risk. But the biggest thing 198 00:10:06,440 --> 00:10:09,640 Speaker 3: I'm still mostly concerned about is, as these tools progress, 199 00:10:09,720 --> 00:10:11,680 Speaker 3: as we just talked about in the last segment about 200 00:10:11,679 --> 00:10:15,240 Speaker 3: the bikini thing, it's easier and easier and easier for 201 00:10:15,280 --> 00:10:18,480 Speaker 3: bad guys to create things that are incredibly realistic, including 202 00:10:18,520 --> 00:10:22,080 Speaker 3: cloning someone's voice, for example. So you come into work, 203 00:10:22,400 --> 00:10:25,240 Speaker 3: there's a quote voicemail unquote from your boss telling you 204 00:10:25,280 --> 00:10:26,920 Speaker 3: to do a transfer of some sort. 205 00:10:27,760 --> 00:10:29,000 Speaker 1: Sounds exactly like them. 206 00:10:29,240 --> 00:10:32,320 Speaker 3: They got your voice or your boss's voice from his 207 00:10:32,440 --> 00:10:35,560 Speaker 3: voicemail or from some public speech he gave, or whatever. 208 00:10:35,880 --> 00:10:38,240 Speaker 3: They use a tool to clone his voice, a deep 209 00:10:38,280 --> 00:10:42,400 Speaker 3: fake voice clone. It's simple, it's free, and these things 210 00:10:42,440 --> 00:10:46,360 Speaker 3: are unbelievably good at this now. So you know, the 211 00:10:46,440 --> 00:10:51,040 Speaker 3: idea that as a criminal, I can create text, audio, 212 00:10:51,080 --> 00:10:54,160 Speaker 3: and video that is unbelievably realistic, it will look exactly 213 00:10:54,240 --> 00:10:57,560 Speaker 3: like the real person. Could you get a call where 214 00:10:57,600 --> 00:11:01,160 Speaker 3: the hacker is literally using a voice cloning tool and 215 00:11:01,200 --> 00:11:03,560 Speaker 3: talking to you in real time by typing a response in 216 00:11:03,600 --> 00:11:06,040 Speaker 3: and hitting a button and generating a voice? Yes, that 217 00:11:06,160 --> 00:11:08,000 Speaker 3: is a thing at this point.
I think you and 218 00:11:08,040 --> 00:11:11,440 Speaker 3: I talked about the Ferrari situation last year, where the 219 00:11:11,520 --> 00:11:14,760 Speaker 3: CFO of Ferrari almost was fooled by a voice clone 220 00:11:14,800 --> 00:11:17,000 Speaker 3: of the CEO of Ferrari. I mean, they know each 221 00:11:17,040 --> 00:11:20,320 Speaker 3: other personally, they work together all the time. So I 222 00:11:20,360 --> 00:11:23,600 Speaker 3: think the average person does not realize how advanced all 223 00:11:23,640 --> 00:11:25,400 Speaker 3: this stuff is at this point in terms of this 224 00:11:25,480 --> 00:11:29,520 Speaker 3: deepfake capability. And every business should be training their employees, 225 00:11:29,760 --> 00:11:32,400 Speaker 3: you know, on the risks, how to use this stuff, 226 00:11:32,559 --> 00:11:37,080 Speaker 3: the risks of it, and particularly cybersecurity awareness training that 227 00:11:37,160 --> 00:11:40,880 Speaker 3: includes something around AI, because, I mean, we see it 228 00:11:41,880 --> 00:11:44,640 Speaker 3: growing, and we've talked about the grandparent scam and all 229 00:11:44,679 --> 00:11:47,880 Speaker 3: this stuff. Businesses are going to be increasingly targeted with 230 00:11:47,960 --> 00:11:51,880 Speaker 3: sophisticated attacks, not only from a technical perspective with AI, 231 00:11:52,040 --> 00:11:54,200 Speaker 3: but focused on the humans. It's one of the points 232 00:11:54,200 --> 00:11:57,160 Speaker 3: in this article, you know, fooling human beings into doing 233 00:11:57,200 --> 00:12:02,000 Speaker 3: something they shouldn't do using these advanced capabilities. So you know, 234 00:12:02,160 --> 00:12:04,240 Speaker 3: we can help. If anyone wants some help, call me up. 235 00:12:04,280 --> 00:12:08,680 Speaker 3: I'm happy to talk about this. But this is only 236 00:12:08,720 --> 00:12:10,920 Speaker 3: going to get worse before it gets better, because most 237 00:12:10,920 --> 00:12:15,480 Speaker 3: people do not understand how capable this stuff is in 238 00:12:15,559 --> 00:12:18,679 Speaker 3: terms of creating things that seem unbelievably realistic. Sadly, 239 00:12:19,120 --> 00:12:19,880 Speaker 3: believe it, it's a thing, 240 00:12:19,960 --> 00:12:20,360 Speaker 1: folks. 241 00:12:20,600 --> 00:12:25,200 Speaker 2: Yeah, I can't believe it, but you really can't. LinkedIn dot com, 242 00:12:25,280 --> 00:12:27,000 Speaker 2: look for Dave Hatter. You'll find him and you'll 243 00:12:27,000 --> 00:12:29,520 Speaker 2: find all the articles and information. One more, and there's 244 00:12:29,520 --> 00:12:31,680 Speaker 2: a feature on your Google Gmail. 245 00:12:31,559 --> 00:12:33,120 Speaker 1: Fifty-five KRC, the talk station. 246 00:12:38,520 --> 00:12:40,800 Speaker 2: It is six fifty-one at fifty-five KRC, the 247 00:12:40,840 --> 00:12:44,800 Speaker 2: talk station. All right, what's the story on this Gmail 248 00:12:44,880 --> 00:12:46,679 Speaker 2: feature that you say we need to turn off, 249 00:12:46,760 --> 00:12:51,440 Speaker 3: Dave Hatter? Well, Brian, you should know me well enough 250 00:12:51,440 --> 00:12:54,400 Speaker 3: by now. My initial response is this headline, 251 00:12:54,640 --> 00:12:56,600 Speaker 3: rather than if you use Gmail, you're going to want 252 00:12:56,600 --> 00:13:00,280 Speaker 3: to turn off this one automatic setting ASAP, should instead 253 00:13:00,360 --> 00:13:05,440 Speaker 3: say stop using Gmail.
But since it doesn't 254 00:13:05,480 --> 00:13:09,920 Speaker 3: say that, let me explain. Now, this article is a 255 00:13:09,960 --> 00:13:12,079 Speaker 3: few months old, and again, I'll post the link. If 256 00:13:12,080 --> 00:13:14,480 Speaker 3: you use Gmail and you plan to keep using Gmail, 257 00:13:14,559 --> 00:13:17,760 Speaker 3: I strongly recommend that you read this. And this is 258 00:13:17,840 --> 00:13:20,040 Speaker 3: one of the problems I have with all of these 259 00:13:20,040 --> 00:13:24,000 Speaker 3: big tech platforms. It's not just Google, right? So 260 00:13:24,120 --> 00:13:29,680 Speaker 3: many of these things are basically surveillance capitalism driven. You 261 00:13:29,800 --> 00:13:32,920 Speaker 3: are the product, not the customer. It's your data that 262 00:13:33,000 --> 00:13:36,920 Speaker 3: makes them quote free unquote for you. And when you 263 00:13:36,960 --> 00:13:40,520 Speaker 3: look at the ever increasing leaks of your data, the 264 00:13:41,120 --> 00:13:43,840 Speaker 3: ways this data is used in ways you would not 265 00:13:43,920 --> 00:13:48,840 Speaker 3: necessarily understand, and things like surveillance pricing. I just wrote 266 00:13:48,880 --> 00:13:50,679 Speaker 3: an op-ed on this, which I'll have out there. 267 00:13:50,679 --> 00:13:52,880 Speaker 3: Maybe we'll talk about that next week. The idea that, 268 00:13:52,920 --> 00:13:56,000 Speaker 3: because there's so much data about you out there and 269 00:13:56,040 --> 00:13:58,360 Speaker 3: these companies are capturing it, they're selling it to other 270 00:13:58,400 --> 00:14:01,200 Speaker 3: companies and so forth, I'm able to look at your 271 00:14:01,240 --> 00:14:03,720 Speaker 3: profile and say, you know what, I think Brian Thomas 272 00:14:03,720 --> 00:14:05,880 Speaker 3: would pay more for this than Dave Hatter. Looks like 273 00:14:05,960 --> 00:14:08,720 Speaker 3: he's probably got more money, or maybe he's more desperate 274 00:14:08,760 --> 00:14:10,880 Speaker 3: to get this thing. So even though we're on the 275 00:14:10,960 --> 00:14:14,320 Speaker 3: same website, you might see a higher price than me. 276 00:14:14,679 --> 00:14:17,640 Speaker 3: It's a real thing that's happening all the time. And 277 00:14:17,679 --> 00:14:19,960 Speaker 3: I bring that up because I think it's one 278 00:14:19,960 --> 00:14:23,040 Speaker 3: of the most concrete ways to help regular folks who 279 00:14:23,080 --> 00:14:26,640 Speaker 3: aren't tinfoil hat wearing nerds like me understand why you 280 00:14:26,680 --> 00:14:29,040 Speaker 3: should care about all your data. Because I still hear, 281 00:14:29,080 --> 00:14:30,440 Speaker 3: well, I don't care about that. You know, I don't 282 00:14:30,480 --> 00:14:33,080 Speaker 3: care about privacy, I've got nothing to hide. And I'm going 283 00:14:33,160 --> 00:14:34,800 Speaker 3: to come back to this article, but I'm just trying 284 00:14:34,840 --> 00:14:36,520 Speaker 3: to set the stage for why you should care about 285 00:14:36,560 --> 00:14:39,640 Speaker 3: this and why I find this approach to things offensive. 286 00:14:39,720 --> 00:14:42,560 Speaker 3: So these companies will opt you into things.
They roll 287 00:14:42,600 --> 00:14:45,640 Speaker 3: out some new feature that's beneficial for them, they turn 288 00:14:45,720 --> 00:14:47,360 Speaker 3: it on, and you have to figure out how to 289 00:14:47,400 --> 00:14:49,840 Speaker 3: turn it off, and often you don't even know this 290 00:14:49,880 --> 00:14:53,200 Speaker 3: has been done. Secondarily, even if you do, it's, you know, 291 00:14:53,320 --> 00:14:55,840 Speaker 3: buried thirty-six menus in and they tell you seven 292 00:14:55,920 --> 00:14:59,120 Speaker 3: times why you shouldn't turn it off. Dark patterns, trying 293 00:14:59,120 --> 00:15:02,320 Speaker 3: to convince you not to turn off something that is beneficial 294 00:15:02,360 --> 00:15:04,880 Speaker 3: to them. And so to the heart of this article: 295 00:15:05,520 --> 00:15:10,560 Speaker 3: Google Gemini, that's their AI chatbot platform, generative AI LLM if 296 00:15:10,600 --> 00:15:15,880 Speaker 3: you will. They have now incorporated it into Gmail so 297 00:15:15,960 --> 00:15:18,520 Speaker 3: it can train off all of your emails. So if 298 00:15:18,560 --> 00:15:20,520 Speaker 3: you've been using Gmail for a long time, you might 299 00:15:20,560 --> 00:15:23,320 Speaker 3: have tens of thousands, hundreds of thousands of emails in there. 300 00:15:23,680 --> 00:15:26,320 Speaker 3: This thing is in there reading your emails, learning about you, 301 00:15:26,520 --> 00:15:29,880 Speaker 3: training, purportedly to help you, right, because it's going to 302 00:15:29,960 --> 00:15:35,920 Speaker 3: help you schedule tasks and do things and compose emails 303 00:15:35,920 --> 00:15:38,080 Speaker 3: on your behalf and so forth. Right? But I'm going 304 00:15:38,120 --> 00:15:39,880 Speaker 3: to quote directly from the article in the interest of time: 305 00:15:39,920 --> 00:15:42,840 Speaker 3: Important message for everyone using Gmail. You've been automatically opted 306 00:15:42,880 --> 00:15:45,880 Speaker 3: in to allow Gmail to access all your private messages 307 00:15:45,920 --> 00:15:49,520 Speaker 3: and attachments to train AI models, engineer Dave Jones shared 308 00:15:49,560 --> 00:15:51,400 Speaker 3: on X earlier this week. Again, this was a couple 309 00:15:51,440 --> 00:15:53,720 Speaker 3: of months ago. You have to 310 00:15:53,760 --> 00:15:56,760 Speaker 3: manually turn off the quote smart features in the settings 311 00:15:56,920 --> 00:16:00,120 Speaker 3: menu in two locations. So first off, you have to 312 00:16:00,200 --> 00:16:03,080 Speaker 3: know that this has been turned on. Secondarily, you have 313 00:16:03,120 --> 00:16:04,840 Speaker 3: to know where to go. And then third, you have 314 00:16:04,880 --> 00:16:07,240 Speaker 3: to go to two different places to turn it off. 315 00:16:07,680 --> 00:16:11,800 Speaker 3: And again, it's the fact that they opt you in. If you 316 00:16:12,040 --> 00:16:14,600 Speaker 3: have fully informed consent and you agree to use 317 00:16:14,680 --> 00:16:17,280 Speaker 3: Gmail and you agree to use this feature, have at 318 00:16:17,280 --> 00:16:20,680 Speaker 3: it. You're an adult, do what you want. I don't 319 00:16:20,760 --> 00:16:23,000 Speaker 3: use Gmail, and if I did, I would turn this off. 320 00:16:23,080 --> 00:16:23,520 Speaker 1: There you go. 321 00:16:23,640 --> 00:16:25,600 Speaker 2: That's the final point and we are out of time. 322 00:16:25,720 --> 00:16:28,640 Speaker 2: So heed his advice. Go to LinkedIn dot com, find Dave
323 00:16:28,680 --> 00:16:30,440 Speaker 2: Hatter. You'll find these articles and you can read 324 00:16:30,480 --> 00:16:32,840 Speaker 2: the whys and the wherefores. Dave, thank you for what 325 00:16:32,880 --> 00:16:34,720 Speaker 2: you do here on the morning show for my listeners. 326 00:16:34,760 --> 00:16:37,720 Speaker 2: Thanks to Intrust IT, your company, for sponsoring this. And 327 00:16:37,800 --> 00:16:42,560 Speaker 2: another word for Intrust IT: intrustit dot com. Thank you, brother, 328 00:16:42,680 --> 00:16:45,800 Speaker 2: have a great weekend. Don't go away. Corey Bowman up next. 329 00:16:46,960 --> 00:16:49,640 Speaker 2: Today's top headlines coming up at the