Speaker 1 [00:00:00]: Six thirty, fifty-five KRC, the talk station. Happy Tuesday. Well, a special edition of Tech Friday. Intrust IT dot com is where you find Dave Hatter and the crew. The Business Courier says they are the absolute best in the business when it comes to your business's computer-related needs, including getting you out of a jam if you're not paying attention to what Dave tells us every week on this program. Or best practices, setting up systems, whatever it is you need, Dave and the crew at Intrust IT are there for you. Dave Hatter, thank you for doing a special edition, and an advance early Happy Thanksgiving to you and your family and everybody at Intrust IT.

Speaker 2 [00:00:35]: Yeah, always my pleasure, Brian, and Happy Thanksgiving to you and your family and all your listeners.

Speaker 1 [00:00:40]: I don't want to... I know it's not on the list; we're going to talk about how not to get scammed on Black Friday and cool tech gadgets. But if you've got a moment or two: this Wall Street Journal article, Georgia Wells writing, teens are saying tearful goodbyes to their AI companions.
Apparently, one of the chatbot makers, called Character AI, is now limiting and cutting off teens' access to it. And the only reason I raise this: it was just disturbing to read what these young people, thirteen-year-olds for example, are talking about in terms of their interaction with, and the amount of time they spend talking with, artificial intelligence, creating characters that they kind of believe are real. And now that these are being taken away, they're struggling emotionally with dealing with this loss of communication. This is all ones and zeros, Dave. They have no connection, no appreciation for the fact that they're just talking to a computer, not a real living entity. This is scary stuff, man.

Speaker 2 [00:01:36]: Yeah, I agree, it is pretty scary. It's hard for me to understand how anyone thinks that when they're talking to some kind of chatbot, it's a real person. Now, you know, if you've used these things, you'll find that they will be conversational. They'll answer your questions, you know.
But yeah, the idea that you're actually having a conversation with some sort of living being, that it's your friend... it's hard for me to understand. Now, I'm not a teenager, so I don't know. But it's a little crazy to see how impactful this stuff has been on young people at this point. I mean, and not just AI; go all the way back to social media. You know, there are now all these studies coming out talking about screens and computers in the classroom and all this sort of stuff and how harmful it is. And you know, I guess I'm not really that surprised to see that having so many distractions thrown at you all the time is not very helpful to learning.

Speaker 1 [00:02:37]: Well, so it isn't always just younger people. This was even more disturbing. A guy named Jose Ignacio Tarrillo is cited in the article, thirty-four years old. Even though this ban won't apply to him, because he's thirty-four, he acknowledges it's really addictive. He says he hasn't been able to quit the Character AI program, even though he no longer enjoys the hours he chats with his fictional characters.
Even at work, he takes his phone out every couple of hours to send a quick message to his chatbots. He said, "I don't even have anything new to say. I don't know why I do it." So these are fake characters he's created, and he feels compelled to go issue a message to them even though he really has nothing to convey to them. It's like, I think, he's operating on the delusion that they're expecting to hear from him. It's a damn computer.

Speaker 2 [00:03:31]: Yeah, it's interesting, Brian, because along those same lines, the New York Times just had an article, "What OpenAI Did When ChatGPT Users Lost Touch With Reality." And I'm reading directly from the article: some of the first signs came in March. Sam Altman, the chief executive, and other company leaders got an influx of puzzling emails from people who were having incredible conversations with ChatGPT. These people said the company's AI chatbot understood them as no person ever had and was shedding light on the mysteries of the universe.
Oh lord. Yeah, it's, uh, you know, it goes on to say that was a warning that there was something wrong with the chatbot, you know, blah blah blah. You know, I think, and this is just my opinion, I'm not a psychiatrist, I just think what's happening is these things want to answer your questions. Again, one of the problems is hallucinations: they make things up, right, and they will confidently tell you that something that's false is actually true. And I think because they want to please you, they want to keep you using the site and so forth, you know, they tend to be echo chambers that tell you what you want to hear. And that's why people come away with this idea that, well, it knows me better than any other person. Well, it's because it's telling you what you want to hear, as opposed to perhaps the reality of a situation. So yeah, I don't know. It's not good. I don't have an answer for you. I don't know where this all goes in the long run. But when you see this sort of negative psychological effect this stuff is having on people, it's kind of disturbing.
Speaker 1 [00:05:01]: It is very disturbing, most notably when they start to egg you on and encourage you to engage in criminal behavior, self-harm, and even suicide, as has been well documented. You know, one of the folks that was quoted on this, I think it was the guy who created this Character AI program, quote: "The difficulty logging off doesn't mean something is wrong with the teen. It means the tech worked exactly as designed." And that is a very revealing statement. It points out this is what they want, these tech companies. They want you to spend hours, I guess, engaging with artificial intelligence. For nefarious reasons? I don't know, but it's scary as hell. I only bring this up because there are parents out there, there are grandparents out there, maybe even young people out there, that might not fully appreciate what's going on. But you know, you've got to keep in mind the whole time: you're not dealing with a real person. That in and of itself is enough to keep me from engaging in a conversation with them. Sorry, Dave, took you down a road you weren't necessarily prepared for. No, you were prepared for it.
How to not get scammed on Black Friday: we'll bring Dave back to talk about that one after I mention my dear friends at CoverCincy. Just make the call. Maybe you just enrolled in your group plan from your employer; maybe you don't think it covers enough. And I'm sure John Roman and the team at CoverCincy can point out all the, what they call, holes in the bucket of insurance. Those holes mean things that aren't covered. How about your out-of-pocket liability? You know, you could add a very affordable layer of coverage up front to cover those expenses when you face them, regular day-to-day types of medical expenses. Yes, you can do that, and it's very, very affordable. The point being, John and the team are working for you. They work with hundreds of different medical insurance companies out in the world, and there's more than Obamacare out there, and that allows them to customize a plan that fits you. And it could be layered policies, like I just mentioned. The ultimate benefit: thirty to sixty percent savings with better coverage. Yeah, and you get the team.
Not only does this service not cost you any money at all; once you're insured through CoverCincy, you get their team, which will iron out any insurance problems you have down the road, like which policy do I use. That's a simple one. But if you get a claims denial, do you want to spend a couple of hours on the phone with an insurance company? Do you know what you're supposed to say, who you're supposed to ask for? No, you don't have to do that anymore. John and the team will take care of all those problems for you down the road, for free. I love that, and it is a great, great service they're offering. So take them up on the challenge. See if they can do something better for you. It could be to your financial benefit. Five one three, eight hundred: call five one three, eight hundred, two two five five. Any state in the Union, if you're hearing me, you can call them. They'll help you out. Online, you can start the process by filling the form out: CoverCincy dot com. That's CoverCincy dot com.

Speaker 2 [00:07:41]: This is fifty-five KRC, an iHeartRadio station.
Speaker 1 [00:07:51]: Six forty-one, fifty-five KRC, the talk station. Tech Friday on a Tuesday, so Tech Tuesday, with Dave Hatter of Intrust IT. All right, we know there are scammers, thanks to you, Dave. Maybe we've all been scammed ourselves; there's a lot of it going on out there. But how do we avoid getting scammed on Black Friday specifically? All the sales are on, and I guess... don't buy online or something.

Speaker 2 [00:08:17]: Well, you know, Brian, I don't know if I'd go quite that far, because I'm sure that's how most people are going to shop.

Speaker 1 [00:08:22]: Yeah, I'm kidding.

Speaker 2 [00:08:24]: It's the old education and awareness and skepticism, you know. I don't know if you recall, I think we talked about this earlier this year, before Prime Day: a cybersecurity company had done some research and found somewhere around two hundred thousand fake websites set up to scam people around Amazon Prime Day. So again, I don't think the average person really understands how easy it is to spoof something.
To spoof something, in general, is to make something that's fake look real, whether it's an email, a text message, a phone number, or a whole website. It's really pretty easy, and you can do it at scale. Again, in something like three or four months, roughly two hundred thousand fake websites were set up around Prime Day, so you can expect a similar sort of approach to Black Friday and Cyber Monday. And here's one headline from Malwarebytes Labs, a security company: "Black Friday scammers offer fake gifts from big name brands to empty bank accounts." And, you know, without getting too deep into the article, they talk about brands being impersonated or spoofed; they mention Walmart, Home Depot, Lowe's. And when you couple the spoofing with, you know, people looking to get good deals when inflation is high... it's easy to send out a billion emails. It's easy to follow that up with a billion text messages, right? It costs virtually nothing for the bad guys to send out these very realistic-looking emails and texts.
You know, they're using AI to get rid of all the old-school tells, where the grammar is wrong, it doesn't read right. You know, they can copy content from the legitimate website of a retailer. So my point is this: this spoofing, coupled with the ability to reach large crowds of people, whether it's email, text, social media, or some combination of all of the above, makes it really easy for them. And once you understand that, right, then hopefully you'll say, okay, I need to be a lot more skeptical of anything, any kind of deal that I get, right, regardless of how it came to me. And I need to be cautious before I act on a deal that seems too good to be true, because, you know, when you get right down to it, in some cases you're going to hit a fake online store. It looks exactly like the real thing. Maybe the URL has just ever so slightly changed; there's one character or something that's different.
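The "one character off" URL trick described here can be sketched in code. This is a hypothetical illustration, not anything from the show: a small Python check that flags a domain sitting within a couple of character edits of a retailer on a shopper's own trusted list. The retailer list, the threshold of two edits, and the function names are all assumptions made for the sketch.

```python
def edit_distance(a: str, b: str) -> int:
    """Levenshtein distance between two strings via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

# Illustrative allowlist: retailers this shopper actually uses.
KNOWN_RETAILERS = {"walmart.com", "homedepot.com", "lowes.com", "amazon.com"}

def check_domain(domain: str) -> str:
    """Return 'trusted', 'lookalike', or 'unknown' for a domain."""
    domain = domain.lower()
    if domain in KNOWN_RETAILERS:
        return "trusted"
    for real in KNOWN_RETAILERS:
        # One or two characters off a brand you know is the classic tell.
        if edit_distance(domain, real) <= 2:
            return "lookalike"
    return "unknown"

print(check_domain("walmart.com"))   # trusted
print(check_domain("wa1mart.com"))   # lookalike: digit 1 for letter l
print(check_domain("wallmart.com"))  # lookalike: doubled letter
```

A real browser or mail filter does far more than this (punycode and homoglyph handling, certificate checks, reputation feeds), but the sketch captures the tell being described: a deal URL one edit away from a brand you trust deserves suspicion.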
And, you know, they're going to make it easy for you to either (a) get nothing, they just steal your money; (b) be sent some sort of cheap knockoff product that's no good; or (c) possibly some combination of all of the above, including some kind of malware so they can steal all your usernames and passwords. Malware: you know, think keystroke logger sitting in the background, capturing everything that you're doing. So it really is important to move slowly, be cautious, be skeptical. You know, if it seems too good to be true, it probably is. And here's the thing: if you get something, right, whether it's, again, a text, an email, whatever, and you think it might be legit and want to look into it, don't click the links, don't call the numbers. You know, you might be calling a number that's spoofed, routed to some call center somewhere, where you're going to get professional con artists on the phone who will tell you whatever sort of lies they need to tell you to get your money. That's the other thing.
People don't understand: when you actually get on the phone with these people, you know, they're essentially professional con artists. And in many cases, thanks to all the data that's been leaked about you, that we talk about all the time, they know enough about you to seem very credible and very realistic. So, you know, go out of band. Don't click the links, don't call the numbers. If you get an email for a deal from Walmart that seems too good to be true, then you go to the Walmart website and see if you can find that deal there. Go to the Amazon website, go to the Target website, whatever. You know, don't take the bait; go on your own to the legitimate place, and, you know, move carefully.

Speaker 1 [00:12:28]: Perfect illustration: my son, for example, doing exactly what you recommend. Yesterday, I heard him on the phone.
He works for what I'll loosely describe as a private equity firm; he does all their tech work and consolidation, systems and ordering and things. Anyway, he was working with, I believe it was, some foreign company, and there was some email which had a PDF attached to it, and he thought, this seems out of the ordinary; this is not an appropriate response to what we were talking about in our relations. And so he immediately called the principals at the company and said, don't open that. I want to talk to the person who purportedly sent it first, to find out if it's real, because I really don't think it is. So there you go: raised skepticism and concern, immediately called the appropriate people, and told them not to open it. So heed the advice Dave Hatter's always handing out, great advice. Speaking of advice: cool tech gadgets and gifts this year. We'll talk about that next. I wonder if any Internet of Things devices are in his list. More with Dave Hatter after Galaxy Concrete Coatings. Again, I mention, for Galaxy, the project's done.
My daughter and Eric's back patio looks absolutely fantastic. Compared to what it looked like before... it looked terrible, there's no question about that, which is why Mom and Dad'll get a little housewarming gift with a Galaxy Concrete Coatings covering. It's gorgeous. It transformed that ugly, cracked, previously painted yet flaked-off back patio, which is a prominent area of their home, into something that is really beautiful. It pops, it's gorgeous, and that could be any concrete surface that you have. See, applications for homes, for example: garages, the patio I mentioned, walkways, basement floors. How about commercial applications or industrial applications, factory flooring for example? Everything that's got plain old concrete, even deteriorated, is going to look magnificent, and it comes with a lifetime guarantee. You never have to worry about it again. It's gorgeous and you're done. So get in touch with Galaxy Concrete Coatings online. One hundred and forty-plus colors, plus different textures to choose from, and that lifetime guarantee. It's GalaxyConcreteCoatings dot com. Click on the link, get a free quote: GalaxyConcreteCoatings dot com.
Speaker 1 [00:14:32]: Fifty-five KRC, the talk station. Tech Tuesday with Dave Hatter, Intrust IT dot com. Find Dave and the crew for your business computer needs. All right, cool tech gadgets as we fast approach Black Friday, the holiday gift-giving season. Dave, what are we talking about here?

Speaker 2 [00:14:51]: Let's start out with some tech that's not so cool. So here's the headline for you, Brian: advocacy groups urge parents to avoid AI toys this holiday.

Speaker 1 [00:15:01]: I saw that. I thought about you. I even mentioned it on the program earlier this week, or yesterday, in fact.

Speaker 2 [00:15:07]: Yes, yeah. I won't read the whole article, but you know, it warns about that. And then, along the same line, a specific example: an AI-powered teddy bear talking about sexual fetishes and instructing kids how to find knives.

Speaker 1 [00:15:19]: There you go.

Speaker 2 [00:15:20]: Yeah. So, you know, before you buy your kids or grandkids any sort of Internet-connected devices... you know, we've been talking about this off and on for years, going back maybe ten years.
There was, like, a Mattel Barbie that caused a big stir; it could record your kids' conversations and do all kinds of things. You know, it's not just the potential danger of your kids talking to something like this that tells them how to go find knives. I was trying to find that; there's a crazy quote in here. Here you go, this is from the toy: "Knives are usually kept in safe places to make sure everyone stays safe. You might find them in a kitchen drawer or a knife block on the countertop. It's always important to ask an adult when looking for knives, so they can show you where they're stored." That apparently was an actual answer; I'm reading a quote from the article about what this AI teddy bear said. So again, I would encourage parents and grandparents to stay away from that sort of stuff for their kids. Not to mention, with people working remotely, you know, you're plugging this stuff into your home network. It's creating all kinds of security holes.
And yeah, that's always my concern with these Internet of Things devices, Brian. It's that you don't necessarily understand how to configure them correctly or understand the risk. But now, on the cool side of things: you know, more people need more power than ever, and there are all kinds of power banks out there, essentially batteries that you can charge up and take with you. And this is important not only because you need the power when you're out, but, you know, people will... I mean, I've seen it, I know people that will do this... they'll just basically plug into anything they can find to charge up their device, including something like a rental car or whatever, where it sucks all your data into the infotainment center. So if you have one of these power banks... and Anker is a big company that makes these sorts of things... they're relatively inexpensive.
You know, you can get one of 334 00:17:10,000 --> 00:17:13,240 Speaker 2: these power banks and charge it up, take it with you, 335 00:17:13,359 --> 00:17:16,679 Speaker 2: and make sure that you always have power, whether it's 336 00:17:16,680 --> 00:17:19,680 Speaker 2: for a laptop, phone, tablet, or whatever, without the need 337 00:17:19,760 --> 00:17:22,080 Speaker 2: to scramble around trying to find a place where 338 00:17:22,119 --> 00:17:24,800 Speaker 2: you can plug into some random charging port, where, 339 00:17:25,560 --> 00:17:28,280 Speaker 2: you know, we've talked about juice jacking before. I've had 340 00:17:28,280 --> 00:17:30,200 Speaker 2: people try to argue with me that it's not real, 341 00:17:30,240 --> 00:17:32,679 Speaker 2: but I can tell you they make special charging cables 342 00:17:32,720 --> 00:17:34,880 Speaker 2: and so forth that have tips on them that can 343 00:17:34,920 --> 00:17:38,480 Speaker 2: interact with your device, so it's always good to be cautious. 344 00:17:38,600 --> 00:17:41,640 Speaker 2: Something like that makes a good gift, especially for students 345 00:17:41,680 --> 00:17:45,760 Speaker 2: or people who need to be on a computer 346 00:17:46,640 --> 00:17:49,919 Speaker 2: or device on a regular basis. Another thing that I 347 00:17:49,920 --> 00:17:56,480 Speaker 2: think is a pretty cool gift is an RFID blocking wallet. Well, yes, right, 348 00:17:57,240 --> 00:17:59,440 Speaker 2: because, you know, we're increasingly in this space where you've 349 00:17:59,480 --> 00:18:02,240 Speaker 2: got tap-to-pay NFC communications, and there's been a bunch 350 00:18:02,240 --> 00:18:05,280 Speaker 2: of stories recently. You know, we've talked about skimmers in 351 00:18:05,280 --> 00:18:07,359 Speaker 2: the past, where they put them in like a point 352 00:18:07,400 --> 00:18:10,000 Speaker 2: of sale system or something.
But you know, now, if 353 00:18:10,000 --> 00:18:12,240 Speaker 2: someone can get close enough to you, they can potentially 354 00:18:12,240 --> 00:18:13,160 Speaker 2: read your credit cards. 355 00:18:13,280 --> 00:18:16,240 Speaker 1: That's scary stuff. Standing in line, the guy behind you 356 00:18:16,280 --> 00:18:17,960 Speaker 1: with a reader can just, you know, get up close 357 00:18:18,000 --> 00:18:19,280 Speaker 1: to your wallet in your back pocket. 358 00:18:19,320 --> 00:18:24,399 Speaker 2: Yeah, someone bumps into you, right? Oops, they bumped into you. 359 00:18:24,480 --> 00:18:26,359 Speaker 2: And like the old school, where they try to literally 360 00:18:26,359 --> 00:18:28,639 Speaker 2: pick your pocket, here they're just doing it digitally. So, 361 00:18:29,760 --> 00:18:32,080 Speaker 2: this is a pretty interesting one: the Ekster, 362 00:18:32,800 --> 00:18:34,760 Speaker 2: E K S T E R, RFID 363 00:18:34,920 --> 00:18:39,000 Speaker 2: blocking leather wallet. It's got the RFID coating 364 00:18:39,000 --> 00:18:41,679 Speaker 2: to block the cards inside it. It can have 365 00:18:41,760 --> 00:18:44,280 Speaker 2: an optional Bluetooth tracker, because, then, who hasn't lost 366 00:18:44,320 --> 00:18:46,200 Speaker 2: their wallet at some point and wanted to find it? 367 00:18:46,800 --> 00:18:50,440 Speaker 2: And it's kind of cool because it's powered by the sun, 368 00:18:50,600 --> 00:18:52,960 Speaker 2: so, uh, you don't need a battery in 369 00:18:53,040 --> 00:18:57,160 Speaker 2: this thing for the tracking capability to work, so, uh, 370 00:18:57,240 --> 00:18:59,320 Speaker 2: it's pretty cool. There's a bunch of these things out there. 371 00:18:59,400 --> 00:19:02,000 Speaker 2: They're not cheap.
This particular one's a hundred bucks, 372 00:19:02,240 --> 00:19:04,320 Speaker 2: but a hundred bucks would be a small price to 373 00:19:04,359 --> 00:19:06,200 Speaker 2: pay to keep you from having all your money stolen 374 00:19:06,280 --> 00:19:08,440 Speaker 2: from your bank accounts, or your credit cards charged 375 00:19:08,440 --> 00:19:12,960 Speaker 2: to the max because, you know, someone bumped into you. 376 00:19:12,960 --> 00:19:15,840 Speaker 2: You know, all the Apple watches and that sort of thing, 377 00:19:15,960 --> 00:19:17,760 Speaker 2: I'm not a big fan of those. You know, I've 378 00:19:17,800 --> 00:19:19,520 Speaker 2: got my phone on me all the time. Do I 379 00:19:19,560 --> 00:19:23,480 Speaker 2: really need yet another device that tracks me? Probably not. 380 00:19:26,960 --> 00:19:29,520 Speaker 2: Just looking through the list of some other interesting things here, 381 00:19:29,600 --> 00:19:31,960 Speaker 2: you know, for me, Brian, I'm trying to avoid all 382 00:19:32,000 --> 00:19:35,360 Speaker 2: of the Internet of Things devices. I don't have smart doorbells, 383 00:19:35,520 --> 00:19:37,760 Speaker 2: I don't have Nest thermostats. I don't want 384 00:19:37,800 --> 00:19:41,040 Speaker 2: any more smart devices than I already have, because the 385 00:19:41,119 --> 00:19:45,919 Speaker 2: vast majority of them are basically privacy and security dumpster fires. 386 00:19:45,960 --> 00:19:48,359 Speaker 2: You know, I don't have a Roomba. I'm not going 387 00:19:48,440 --> 00:19:50,840 Speaker 2: to buy any of that stuff anytime soon. Here's one: 388 00:19:51,240 --> 00:19:59,320 Speaker 2: a smart air fryer compact. Why a smart air fryer? Why 389 00:19:59,320 --> 00:19:59,919 Speaker 2: do I need that? 390 00:20:00,520 --> 00:20:07,000 Speaker 1: You know, an air fryer? Come on. Okay, you know, 391 00:20:07,040 --> 00:20:10,280 Speaker 1: we're about out of time, Dave. I think that's...
392 00:20:10,080 --> 00:20:13,440 Speaker 2: What, you know, I would have thought. One last thing: 393 00:20:13,560 --> 00:20:16,320 Speaker 2: something that is worthwhile for a lot of people is 394 00:20:16,440 --> 00:20:18,320 Speaker 2: like a Wi-Fi extender. You know, a lot of 395 00:20:18,359 --> 00:20:20,960 Speaker 2: times people will have dead spots in their house 396 00:20:21,040 --> 00:20:23,440 Speaker 2: or whatever. You know, you can buy a Wi-Fi 397 00:20:23,480 --> 00:20:26,560 Speaker 2: extender and in many cases fill those dead spots and 398 00:20:26,600 --> 00:20:29,720 Speaker 2: get much better coverage in your building if you have 399 00:20:29,840 --> 00:20:30,640 Speaker 2: something like that. 400 00:20:30,800 --> 00:20:32,040 Speaker 1: Yeah, I used to have a problem with that. I 401 00:20:32,080 --> 00:20:33,840 Speaker 1: got one of those mesh systems, so I have like 402 00:20:33,920 --> 00:20:35,840 Speaker 1: repeaters in the house, and so I never have a 403 00:20:35,880 --> 00:20:36,639 Speaker 1: dead spot 404 00:20:36,359 --> 00:20:40,000 Speaker 2: anymore, which is good. Which is good. 405 00:20:40,320 --> 00:20:44,160 Speaker 1: Dave Hatter, we're out of time. Six fifty-seven. Intrust 406 00:20:44,160 --> 00:20:46,359 Speaker 1: IT dot com. Rely on Dave and the crew at 407 00:20:46,400 --> 00:20:48,600 Speaker 1: Intrust IT for your business needs. Dave, I wish 408 00:20:48,640 --> 00:20:51,680 Speaker 1: we had more time. I enjoy these conversations. We'll get 409 00:20:51,680 --> 00:20:54,080 Speaker 1: back to regular order next Friday. And you know, from 410 00:20:54,080 --> 00:20:56,879 Speaker 1: my listening audience and me, my family, to your family 411 00:20:56,920 --> 00:20:58,760 Speaker 1: and everybody at Intrust IT, I hope you truly 412 00:20:58,800 --> 00:21:00,560 Speaker 1: have a wonderful Thanksgiving holiday.
413 00:21:02,240 --> 00:21:04,640 Speaker 2: Well, Brian, same to you and yours, and all your listeners. 414 00:21:05,000 --> 00:21:07,360 Speaker 1: Thank you very much. I appreciate you, my friend. We'll 415 00:21:07,359 --> 00:21:09,439 Speaker 1: do it again next Friday. Six fifty-seven. Coming up 416 00:21:09,480 --> 00:21:10,840 Speaker 1: on six fifty-eight, be right back.