Speaker 1: Fifty-five KRC, the talk station.

Speaker 2: Six thirty on a Friday. It is time for Intrust IT dot com, the best in the business in the greater Cincinnati area. You want to rely on Intrust IT for all your business computer needs, especially if your entire system gets wiped out by a malware attack. Welcome back, Dave Hatter from Intrust IT.

Speaker 1: Well, thank you.

Speaker 2: Start by appreciating your company, Intrust IT, for sponsoring this segment, valuable as it is, and scaring the living crap out of us regularly, which is a good thing, most notably on the heels of this Stryker attack. What the heck, Dave?

Speaker 1: This is so bad, Brian. As always, thanks for having me on, it's a pleasure. I hate to always be the guy with the bad news, but look at what happened with Stryker, and I'm a little surprised this has not gotten more news coverage, maybe because of the connection to the action in Iran. This is the kind of thing, Brian, you and I have talked about for years. I've warned about it for years. So Stryker is a large company, apparently based in Michigan. I really wasn't very familiar with them up to this incident, although I'll tell you a connection I have to it which is really interesting and even more concerning. They make medical devices that are used in hospitals, doctors' offices, ambulances, that sort of thing. I'm just going to quote from an article here: Fortune 500 company, reported global sales of twenty-two point six billion in twenty twenty-four, fifty-three thousand employees. So we're not talking about some mom-and-pop shop that doesn't have money, doesn't have resources, and frankly it is a technology company; again quoting this article, a "leading medical technology company." So they were hit, apparently as a result of the hostilities in Iran. Now, I can't say it wouldn't have happened anyway, but the organization that was behind this attack claims to be affiliated with Iran. You can see what they had to say in the statement they put out: "Zionist-rooted corporation Stryker, one of the key arms of the globalist Zionist lobby, new central ring in the new Epstein chain." I'm reading to you from a statement from Handala.
Speaker 2: Are Bigfoot and Chupacabra mentioned in there as well, along with the other conspiracy theories they're fabricating?

Speaker 1: They are not. So here's what they say, Brian, again, the organization that claims to have done this: "Stryker's offices in seventy-nine countries have been forced to shut down. All the acquired data is now in the hands of the free people of the world, ready to be used for the true advancement of humanity and the exposure of injustice and corruption." They say, "In this operation, over two hundred thousand systems, servers, and mobile devices have been wiped and fifty terabytes of critical data have been extracted." So again, this is a company that makes medical equipment.

Speaker 2: I was just about to ask you: are the terrorist hackers behind the Iranians going to start making the surgical and neurotechnology equipment that Stryker can no longer manufacture? Are they going to bring this harmony to the globe by taking over the operation, Dave?

Speaker 1: Good question. Yes, I understand, but I doubt it. What they could potentially do is start to then hack the equipment with the information that they have. But here's my real concern about this, Brian. A wiper attack is essentially a type of malware similar to ransomware. The difference is, if I hit you with ransomware, I want you to pay a ransom to decrypt your data. Right? I encrypt your data, scramble it up, render the devices where that data exists unusable, and you pay me a ransom; I theoretically decrypt the data and everything goes back to normal. And if I, as the victim, say, well, I'm not going to pay, then they say, fine: typically they have also exfiltrated, stolen, your data, and they'll use that as leverage. We'll start to expose this so-called fifty terabytes of data they claim they've stolen to the world unless you pay us. A wiper is different, though. A wiper is not designed to get a ransom. A wiper is not designed to be recoverable. It's specifically designed to just destroy the data on whatever devices it hits, so that they're down and you can't use them.
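To make that distinction concrete, here is a minimal, harmless Python sketch (an editorial illustration, not anything from the broadcast). It contrasts the two models on an in-memory byte string: encryption is reversible for whoever holds the key, while a wiper's overwrite leaves nothing to recover.

```python
# Toy illustration of the ransomware-vs-wiper distinction discussed above.
# Operates only on an in-memory byte string; Fernet symmetric encryption
# comes from the third-party `cryptography` package.
import os
from cryptography.fernet import Fernet

data = b"device configuration and records"

# Ransomware model: the data is encrypted, and whoever holds the key
# (the attacker) can theoretically restore it after payment.
key = Fernet.generate_key()
ciphertext = Fernet(key).encrypt(data)
assert Fernet(key).decrypt(ciphertext) == data  # recoverable with the key

# Wiper model: the data is simply overwritten with random bytes.
# There is no key and nothing to negotiate; the original is gone.
wiped = os.urandom(len(data))
assert wiped != data  # nothing can reconstruct `data` from `wiped`
```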
Speaker 1: This has happened before. You may recall, I think it was twenty seventeen, a Ukrainian company that made software was hit with something like this, which then went out to their customers and knocked hundreds of thousands of devices at huge companies like Maersk offline. So this isn't the first time something like this has happened, but it's the first time I'm aware of anything of this significance in the US. Now think about this, Brian, think about this: if they really knocked out two hundred thousand devices, and Stryker has commented on this, but it's been kind of vague, even if I had systems in place, resiliency and business continuity, and could recover those two hundred thousand devices quickly, even if it took only ten minutes per device, what's ten minutes times two hundred thousand? Think of the downtime here.

Speaker 2: Holy cow, when you put it in those terms.

Speaker 1: Yes. Well, this is the thing that people are often not thinking about. Part of cybersecurity is, how do I harden my systems and reduce the likelihood that something like this will happen? And then secondarily, if something happens, and it doesn't have to be a cyber attack, it could be a natural disaster or an equipment failure, how do I have an environment and systems in it that are resilient and can recover quickly with minimal downtime? Again, I only know what I've read about this, Brian. But think about this again: ten minutes at two hundred thousand devices, and I can tell you it'll take more than ten minutes.
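To put those back-of-the-envelope numbers in perspective, here is a quick sketch of the arithmetic. The ten-minutes-per-device figure comes from the conversation; the technician counts are purely hypothetical assumptions.

```python
# Recovery-time arithmetic for the scenario described above:
# 200,000 wiped devices at an assumed 10 minutes of rebuild time each.
DEVICES = 200_000
MINUTES_PER_DEVICE = 10

total_minutes = DEVICES * MINUTES_PER_DEVICE  # 2,000,000 minutes of work
print(f"One person working serially: {total_minutes / 60 / 24 / 365:.1f} years")

# Even a large, fully parallel response team faces days of downtime
# (the team sizes here are illustrative, not anything Stryker reported).
for technicians in (100, 500, 1000):
    days = total_minutes / technicians / 60 / 24
    print(f"{technicians} techs in parallel: ~{days:.1f} days")
```

Serially that is roughly 3.8 years of work; even a thousand technicians working in parallel would need well over a day of pure hands-on rebuild time, before counting triage, logistics, and verification.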
Speaker 1: I'm just speculating here, but I don't know if this company can survive. And that's fifty-three thousand jobs, a twenty-two billion dollar company. But here's the really scary part. That's all terrible, absolutely terrible, but these medical devices are everywhere. I can tell you, when I put my mayor's hat on, we have a bunch of their equipment in our fire department and our ambulances.

Speaker 2: My wife works at one of the largest hospital systems in the country, and she knew about this right away; she even mentioned to me the other day that this had happened and that it had an impact. I mean, they have relationships established with all these hospital systems, and so there may be a ripple effect there.

Speaker 1: Yes. I mean, if they don't survive, when those devices start to go bad, what happens? Could some or all of the devices they make that are Internet-connected be hacked as a result of this information the attackers now have? Who knows. This is where people do not think about the downstream consequences of not having the right security in place, and frankly, of equal or maybe more importance when you see these kinds of things, of not having systems in place for resiliency and a quick recovery.

Speaker 2: Intrust IT dot com, a plug right there for that. One pause, we'll bring him back and talk about more issues. Foreign Exchange, your premier European and Asian imported automotive specialists.

Speaker 1: Fifty-five KRC, the talk station.

Speaker 2: Six forty-one at fifty-five KRC, the talk station. Brian Thomas with Intrust IT dot com's Dave Hatter, the best in the business when it comes to things like we were talking about last hour: avoiding the risk of being hacked. Dave, pivoting over to a local person, let's talk about imposter scams. You've got one every week. What happened here?

Speaker 1: Yeah, these scams are super prevalent, Brian. I know people that have been victims of them personally. And as you and I have discussed so many times, and I think you might have mentioned this yesterday, there was another breach of a billion records. Your data is everywhere now. Everywhere. And I would bet there's not a single person listening to us that hasn't gotten at least one notice of some sort of data breach.
Speaker 2: Even if they're a Luddite. That's because public information, like auditor records, is out there for all to see. So even if you're hiding in a closet, you're still going to have your information hacked. You have an address, yes.

Speaker 1: You're exactly right, Brian. The more apps you use, the more Internet of Things devices you have, the more your data is going to be out there. But even if you don't do any of that stuff, you're using credit cards; people are buying and selling your data. And the reason why that's important is all of that data gets into the hands of bad guys. They buy it off the dark web and such. Again, we see stories regularly about, oh, here's someone who found a database with names and Social Security numbers or whatever. All that data in the hands of these bad guys helps them refine their scams. It helps them target you. It helps them figure out what to say to you if and when they talk to you on the phone. It helps them guide those conversations so they seem like the legitimate agency that claims you have some kind of problem. So they'll phish you via text, via email; sometimes they'll call you, sometimes a combination of all these things. They're using this information to create messages that are much more realistic than the standard "Hey, I'm a Nigerian prince, I want to transfer eight million dollars to your bank account."
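As an aside on that bet: the Have I Been Pwned service lets anyone check an email address against known breaches. A minimal sketch of its v3 lookup API follows, assuming you have registered for an API key; the key and address shown are placeholders.

```python
# Check an email address against the Have I Been Pwned v3 breach API.
# An API key from haveibeenpwned.com is required, and the service
# expects a user-agent header; `requests` is a third-party package.
import requests

API_KEY = "your-hibp-api-key"   # placeholder: your registered HIBP key
EMAIL = "you@example.com"       # placeholder: the address to check

resp = requests.get(
    f"https://haveibeenpwned.com/api/v3/breachedaccount/{EMAIL}",
    headers={"hibp-api-key": API_KEY, "user-agent": "breach-check-demo"},
    timeout=10,
)
if resp.status_code == 404:
    # 404 means the address is not in any breach the service knows about.
    print("No known breaches for this address.")
else:
    resp.raise_for_status()
    for breach in resp.json():  # one entry per breach, "Name" by default
        print(breach["Name"])
```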
Speaker 1: And that's what you see here in this story: "Price Hill woman loses five thousand dollars in bank imposter scam." And I have so much respect for people who come forward and tell these stories, because I know it's embarrassing. I know it's hurtful to them. You know, who has a spare five grand they want to lose? But rather than listening to someone like me speculate on this or warn about this, when these real people come forward and tell these stories, I believe it puts a fine point on it. This stuff is real. It's not just Dave Hatter making stuff up. This is happening every day.

Speaker 2: Well, and even more so, she works in banking and she knows how financial systems work. That's what the reporting shows. So this isn't some novice out there. This is someone who has at least some understanding of the banking system.

Speaker 1: Yeah, I agree. And so she lost five thousand dollars. She claims that she got a call. They said there was some kind of fraud on her account. The number appeared to come from her bank. And there's one part, again, I only know what the reporting says, one part that says the callers, the hackers, the bad actors, were prepared. They knew her recent transactions. They knew she had just ordered a new debit card. They had detailed personal account information that made them sound completely legitimate.

Speaker 2: That's scary.

Speaker 1: Exactly, exactly. How they would know her recent transactions, that she had just ordered a debit card, I don't know. But the detailed personal account information, again, that stuff is everywhere out there now. So just because you get a call: it's easy to spoof a phone number. Anyone right now can go do a search and find a website that will let them send a text or make a phone call from any number they want. It is trivially easy. You cannot look at a phone number on your phone and know whether it's a legitimate number, again, be it a text or a call. So they call her, they've got guts, they call her. They're con artists. They're going to talk to her on the phone. They have this information that makes them seem legitimate. Then they claim fraudulent activity in Dallas: she needs to open up a new account and transfer her money over. Now again, Brian, the bank can just freeze your account, right? I mean, you should not transfer your money. You should not send it by Venmo, or buy gift cards, or deposit it at an ATM at another bank. I know someone that had something similar to this happen. But unfortunately she did it, and this article says, okay, what do you do when this happens? She took every step, you know: call the real customer service line, file a police report, contact the Better Business Bureau, freeze your credit, escalate with the bank, contact the FBI. Now, let's be real, the FBI is not going to have the resources to get engaged with a five-thousand-dollar individual theft. But it's not a bad idea to report it, at least to the Internet Crime Complaint Center, IC3 dot gov, which the FBI runs. But here's the bottom line, and they say it in the article: if you get something like this, hang up immediately. Do not talk to these people. They're con artists. They'll tell you whatever lies they need to. Hang up, get out your debit card or credit card, call that number, use the app you normally use. You need to verify this. Go to a local bank branch if there is one. Don't click the links, don't use any of the information they gave you. Hang up and use information that you already have, and reach out to your bank and say, hey, I got this call.
Speaker 2: Yeah, don't engage in a knee-jerk reaction. Stop, hang up, give it some thought. Thinking "maybe this is a scam" is not going to unring the bell of whatever they claim to be warning you about, so you do have some time.
Speaker 2: The damage has already been done, according to them, so go to the bank and find out for sure. Dave, we're just about at time in the segment. Let's bring you back and we'll talk about Meta getting sued over the wonderfully gorgeous-looking smart glasses and privacy.

Speaker 2: Intrust IT dot com sponsors the Tech Friday with Dave Hatter segment. Dave Hatter, it's his company, and the best in the business. So get yourself out of a jam, or prevent jams, by calling them up. And then we pivot over. Dave Hatter, Meta's getting sued over the new AI smart glasses, and I'm looking at a photograph of them, and honestly, you can tell when someone has them on based upon the awesome fashion statement they present. But I would not feel comfortable talking with anyone with these things on.

Speaker 1: Well, I'm sure it doesn't come as a shock to you that I'm in the same boat as you. I'm not a big fan of the Internet of Things, aka smart devices, and as we see miniaturization of this technology and the ability to add cameras and microphones to everything, you see things like the Meta smart glasses here. Now, as a reminder, this is not new. A few years ago, Google tried this. They had something called Google Glass, which coined the awesome term for people wearing these things: "glassholes," quote unquote. They didn't really catch on. So now Meta's back. They're apparently in partnership with Luxottica for the Ray-Ban brand, and there are plenty of pictures out there of Mark Zuckerberg wearing these things. You can see there's a little light on them that shows when it's active and recording. But from what I have read...

Speaker 2: You can disable the light on there.
Speaker 1: Yes. So here's an article from TechCrunch: "Meta sued over AI smart glasses privacy concerns after workers reviewed nudity, sex and other footage." Now again, this is alleged in a lawsuit; this is what the article says. It goes on to say Meta is facing a new class action lawsuit over its AI smart glasses and their lack of privacy, after an investigation by a Swedish newspaper found that workers at a Kenya-based subcontractor were reviewing footage from customers' glasses, which included sensitive content like nudity, people having sex, and people using the toilet.

Speaker 2: Maybe don't have sex with someone wearing smart glasses.

Speaker 1: Yes. Or don't go to the toilet while your smart glasses are on. Or perhaps just don't have smart glasses, or be around people with smart glasses, and then at least you will have one less Orwellian spy mechanism tracking your every move, listening to your conversations, et cetera. Now, it goes on to say the complaint alleges that Meta's AI smart glasses are advertised using promises like, quote, "designed for privacy, controlled by you," unquote, and quote, "built for your privacy," unquote, which might lead customers to assume their glasses footage, including intimate moments, was not being watched by workers overseas. They claim that this review is being done for quality purposes, same thing all these companies say about the data that they collect and upload. And again, in many cases, I'm assuming some of this more sensitive footage isn't there because someone is purposely violating someone else's privacy. It's because people don't really understand how these things work. They don't understand when they're on, they forget that they're on, they set them down in an inconvenient spot. But ultimately that data gets uploaded to servers where someone is theoretically reviewing it for quality purposes.

Speaker 2: It's like the Roomba with the camera in it. This is the same thing, except, Lord Almighty, the implications are far broader. The glasses go with you wherever you have them on, unlike a Roomba, which is at least going to be, you know, stuck in your living room or your house.
Speaker 1: Exactly. And also reading from the article here: in twenty twenty-five, seven million people bought Meta smart glasses. The footage from those glasses is fed into a data pipeline for review, and users can't opt out. And I know we're about out of time, but yeah, the Roomba at least is confined to your house. You could be sitting in a restaurant next to someone who is essentially recording your conversation without your knowledge. And I just want to point out, let's go back to the Nest thing in the Guthrie situation: she didn't have a subscription, they said "we can't retrieve this data," and then magically, ten days later, the data shows up. So if you think this stuff is going onto their servers and is somehow eventually being deleted, or isn't potentially being sold to someone else, who knows. There is literally no chance I would use these things. There is no chance I would be around someone that has these on. This just illustrates once again how, as we make everything "smart," quote unquote, and put software in everything, there are all these unintended consequences for everyday people that they typically don't understand. So again, all of this is alleged; it's in a lawsuit. Go out, read these articles, see the pictures. But yeah, I'm not a fan, and I would encourage people not to buy this stuff and to try to avoid being around it as much as possible. Yeah, the Internet of Things: not a fan. Yes, that's a good stopping point.

Speaker 2: Take a pass on Internet of Things devices,