Speaker 1: KRC, the talk station.

Speaker 2: Six thirty one on a Friday, appointment listening time: Tech Friday with Dave Hatter, brought to you by his company, Intrust IT dot com. Intrust IT, there to help businesses with their computer related needs across the spectrum, doing a great job. The Business Courier says they are the best in the business, and of course Dave Hatter is the go to guy for all local media to contact when it comes to, I guess, Internet related things and tech things. Welcome back, Dave Hatter. Appreciate you joining the show, and thanks for sponsoring the segment.

Speaker 1: Always my pleasure.

Speaker 3: Brian, happy to be here, and we're doing some good out there every Friday.

Speaker 2: I've been on a tear for a long, long time, but even more so of late, dealing with our energy shortage situation, all driven by the idea of eliminating plant food from our environment. If you get rid of the word CO2 and you quit calling it a pollutant, everything becomes a lot easier, because we've gotten so good at eliminating genuine pollutants from our energy production.
So you can use natural gas or coal and scrub it. No, we're not allowed to do that. It's CO2, that's the problem. So we need a, oh, I don't know, a solar array. We need solar panels. But that has created a security problem. So in the name of eliminating plant food from the environment, we've invited the Chinese Communist Party into our house, affecting our energy grid.

Speaker 1: Yep, and we have.

Speaker 3: Unfortunately, and to a large extent. You know, most of the solar panels, maybe not most, many, if not most, solar panels are not made in the United States. And this is something we've talked about before, Brian, in a tertiary way, with the idea that so much of our society is now using physical things that have digital capabilities, right, software embedded in Internet of Things smart devices, and so much of this stuff comes from offshore, much of it from China. And there's been concern for a long time from people like CISA and the FBI, DHS, myself, and others warning of the provenance of this stuff, right, it's coming from foreign countries, China in particular.
Speaker 1: We know they're adversarial.

Speaker 3: I mean, you and I just recently talked about more warnings of Chinese hacking and so forth. And you know, the Chinese Communist Party does not seem to be very friendly with us at this point. So the idea that we're going to build our electrical grid and other sectors of critical infrastructure with devices made in China, with devices full of software made in China, and we've been doing this for more than ten years, Brian. How many times have we talked about software vulnerabilities?

Speaker 1: Yeah, the Internet of Things.

Speaker 3: Software embedded in these things that has problems and doesn't get updated. So now you have this situation where, as we become more and more reliant on solar power, and you have these devices that are made elsewhere, there's the possibility that they could be manipulated or just outright turned off.

Speaker 2: Shocked, shocked that that would happen?

Speaker 1: Yeah?

Speaker 2: Okay, Dave. You always talk about Internet of Things devices. They don't think about privacy and security when they're building them.
They want a device out there to get the revenue from it. Then it's like, oh my god, we forgot to deal with the gaping hole in the software which allows people to use it to remotely hack into somebody's system. That exists because of a failure, negligence perhaps. Are you telling me that people don't think like you, Dave? And I'll ask you directly: don't you think the Chinese Communist Party, in selling these devices hooked up to our grid, designed them to be able to get in and suck up data, or otherwise shut the system down? That was probably the initial motivation for these gadgets.

Speaker 3: I don't think you can rule that out, Brian, right, because here's the thing. You know, you can make an argument for any so called Internet of Things device that one of its benefits, besides whatever it's specifically designed to do, is that it can be remotely controlled, remotely updated, and by updated, additional functionality could potentially be added through the software. Right, so there's always the potential positive nature of these things. But you really hit the nail on the head: most of these devices.
Now, again, when you get into the industrial sector, it's a little different than the consumer stuff. But when you get into these devices, they are not focused on your privacy and security. You know, I have been screaming about problems with software security. I mean, that's really kind of how I got out of software engineering and into cybersecurity: my concern that in the software industry there is less focus on security, robustness, and privacy than on speed to market, ease of use, market share, meeting my budget, meeting my timeline, that kind of thing, right. And I mean, I used to be that guy. I was only concerned: does it do what the customer asked for? Did I get it done on time? And did I get it done on budget? But over time, as we've become more reliant on this stuff, it's more critical. Thankfully, folks like CISA out there have the Secure by Design program, trying to get software companies to focus on security as a fundamental concept, not an afterthought later. But yeah, you really hit the nail on the head. Could there be backdoors built into these things on purpose?
You know, we talked recently about robots coming from China, and Chinese Communist Party backdoors in the robots that allow remote control. Again, not necessarily nefarious, but could be, you know. And I know, before we run out of time, I'll just remind folks: Stuxnet. Stuxnet was a virus created so that Iranian uranium centrifuges would appear to be operating correctly to the operator while they were basically destroying themselves, because the software was sending bad information to the operators. So if people think this kind of thing we're talking about here is far fetched, it's already been done, more than ten years ago. So the idea that someday, if it's Taiwan, if it's let's just cause chaos in the United States or whatever, that a switch is going to be flipped and, you know, autonomous cars stop working or become weapons, the grid goes down, the water turns off.

Speaker 1: Sadly, this is the reality.

Speaker 3: We're in, as long as we continue to bury our heads in the sand to the fact that we have to stop buying this stuff from countries that are adversarial.
Speaker 2: Another reason to not buy AI toys for your kids, coming up next with Dave Hatter. Don't go away. Experience comfort and reliability with Zimmer Heating and Air Conditioning. Six one on a Friday, in round two with Intrust IT dot com's Dave Hatter. Get in touch with Intrust IT for your computer needs, you business owners out there. Dave Hatter, you've been harping on it for a long time: don't buy your kids AI toys or Internet connected toys for Christmas. And here's another warning from NBC News on the same topic.

Speaker 3: Yeah, Brian, you know, every week after the show I try to post some notes with the articles that I used to come up with these topics to discuss with you. And the one we just talked about, that one's got a lot more details in it, you know, the inverters and the Chinese solar panels and so forth. But this one, I really want every parent and every grandparent to go read this NBC News article, because it's very detailed. It's got some videos associated with it, and I think people will literally be shocked.
Yeah, I'm not that easy to be shocked by this stuff anymore. And I mean, this is so crazy. So yeah, I'm not a fan of the Internet of Things, and anyone that knows me knows that, for all the reasons we just talked about and that I've been talking about for years now. And now you're going to take an Internet connected toy, something that might look like a cute little teddy bear or a flower or some other sort of thing, some popular character, and you're going to add an interface to some sort of AI chatbot, think...

Speaker 1: ChatGPT, DeepSeek, Grok, whatever.

Speaker 3: Right. So now it's not just connected to the Internet: perhaps a camera, perhaps a microphone, probably both, possibly recording conversations your kids are having with the toy, recording conversations kids are having with their friends, recording conversations you are having with your kids, you know, much of which you may not necessarily want recorded and uploaded to some probably Chinese AI engine that's doing who knows what with that data, storing it for who knows how long, selling it to who knows who.
So, you know, that risk was there before with the general IoT stuff, but now you're gonna connect it to some kind of chatbot, and then there's just such a laundry list of problems with this, right. We've seen stories about kids that killed themselves after having some extended conversation with one of these things. You know, there's all kinds of recent studies coming out about kids' mental health and social media and time on screens and so forth. And when you really dig into this, right, they talk about how this stuff is all new, it's poorly tested, you don't understand the privacy policy, you don't know where the data is going. Now, most of these things come from China. I'm sure that's not a surprise to anyone that's listening to this, but...

Speaker 1: It's just, it's crazy.

Speaker 3: So the Public Interest Research Group was one of the organizations that tested this, and then NBC did a bunch of testing. And again, this article is very well done in terms of the amount of detail and the shocking, shocking information that they're putting out. It's nuts.

Speaker 2: Well, real quick.
Here, let's summarize quickly. How to light a match, specific instructions, in a toy. A toy for children three years old and older teaches them how to sharpen a knife. Here's one enthusiastically responding to questions about sex and drugs. Research from some group called PIRG, most notably on the Alilo Smart Bunny, apparently popular on Amazon, billed as the quote best gift for little ones close quote. They say it will engage in long and detailed descriptions of sexual practices, including kink, sexual positions, and sexual preferences. Asked about impact play, and let me tell you what that is, described as where one partner strikes another, the bunny listed a variety of tools used in BDSM, a leather flogger, said flogger with multiple [unintelligible], and went into a description of bondage gear. This is a child's toy, Dave.

Speaker 1: Yes, a child's toy.

Speaker 3: Your child is using it, probably in their own room, without your, you know, oversight or knowledge. That is the most extreme example.
But, you know, further up in the same article, they say, and I'm quoting directly from this NBC News article: Melu, manufactured by the Chinese company Myriad and one of the top inexpensive search results for AI toys for kids on Amazon, at times in tests with NBC News indicated it was programmed to reflect Chinese Communist Party values.

Speaker 1: I love this one. Why, yeah.

Speaker 3: Asked why Chinese President Xi Jinping looks like the cartoon character Winnie the Pooh, a comparison that has become an Internet meme because it's censored in China, Melu responded that the quote statement is extremely inappropriate and disrespectful, since malicious remarks are unacceptable. And then, when asked whether Taiwan is a country, it said, quote, Taiwan is an inalienable part of China, that is an established fact, unquote, or some variation of that sentiment.

Speaker 2: Nice propaganda coming from your child's toy.

Speaker 3: So yeah, it's propaganda. It's, you know, dangerous advice: how to light a match, how to sharpen a knife, where to find knives.
We talked about that one before. But then, you know, the kink angle and so forth. And again, you look at these photos, and it's like a cute little bunny, it lights up, and that's what this thing is exposing your kids to. So, you know, the Internet of Things in general I'm against, but now you couple this with AI and the idea that you're going to hand this over to a kid, thinking it's some sort of innocuous little toy that they're going to play with and quote learn from unquote. And I mean, again, I can't stress enough how parents and grandparents need to understand this: do not buy this stuff. Read this article. It's a shocker. And then I encourage people to go share this article with your kids, your friends, you know, so that folks realize the potential danger they're exposing their children to through these things.

Speaker 2: Just go to LinkedIn dot com and search for Dave Hatter. That's where he'll post this article. Six forty, fifty five KRC, the talk station. And finally, Webster's Word of the Year. What's slop?

Speaker 1: Not what you meant? [unintelligible]
Speaker 2: [unintelligible] fifty five KRC, the talk station. Happy Friday, Tech Friday, Dave Hatter of Intrust IT dot com. Dave, real quick: Eric reminded me yesterday was your birthday, so happy belated birthday, specifically from Eric, who Facebook messaged me instantly, and from the listening audience. I hope you had a really nice one. And of course, on behalf of all of us, all of us, Joe, me, all my listening audience, thank you for what you do here on the morning show, and a very merry Christmas to you and your family. So I just wanted to get that out. The other thing I want you to do for me: I saw that there's a lawsuit that was filed against the state of Indiana, or rather against porn sites, most notably PornHub. They've got billions of people logging into that site. Anyway, they outright banned it as an accessible site in the state of Indiana under the new legislation seeking to keep children protected from pornography, a notable goal and a laudable goal. But now, because of the use of a VPN, people are getting around it just by acting like they're in some other states.
So the age verification law doesn't kick in. Indiana says you can't do that: PornHub, and soon to be every other provider of services, must flag VPNs and prevent VPNs from accessing porn in the state of Indiana. That doesn't sound like a positive direction for privacy purposes, Dave. But maybe you can look into that and maybe comment on it when we get back to the Tech Friday segment after the first of the year.

Speaker 3: So, okay, look, we can look into that. But yeah, in general, I understand what...

Speaker 1: They're trying to do.

Speaker 3: But the idea that you would ban the use of VPNs is not a good thing.

Speaker 2: Indeed, that was my conclusion immediately. Slop, Merriam-Webster's new Word of the Year, Dave, obviously tech related.

Speaker 3: Yeah, so I thought this was kind of funny. So there's been this talk of slop, that AI feeds off of the AI, and that over time you go towards something known in the business, think of it as entropy, as model collapse.
You know, if essentially all the human content has already been sucked up by these things, and now people are using these tools to generate new content, and this tool sucks down the content from that tool and generates some new content, you get into this sort of endless loop where now everything is just AI generated, and you just sort of slowly go down. And you know, there have been lots of people talking about this for some time. One of the concerns about these generative AI models where we started out, you know, the ChatGPTs of the world, large language model based things, is that over time, as they feed off of themselves, the quality continues to go down, because you have hallucinations, you have bias, all these things we've talked about over the years. And there have been a lot of people pointing out the sheer volume of videos and memes and so forth, and how the Internet is now full of this stuff, right, and it's just sort of taken over social media. And again, you get this thing feeding that thing, here's something that's not correct, and over time you just go down.
And that's where this idea of slop comes in. So someone, I'm not sure who coined the term, someone coined the term AI slop, and it's kind of stuck. And I think it's pretty relevant to where we're at with all of this stuff. And that's the Word of the Year. So I think, in light of the sheer volume of Internet produced garbage, these cat memes and all this stuff you just see everywhere now, it's like, yeah, this is, I think, a pretty good indication of where we are with this stuff at this point. So when you hear someone say slop, and obviously AI slop, sometimes people will just throw out the term slop...

Speaker 1: That's what they're talking...

Speaker 3: About. And it'll be interesting, because there's been so much hype around this stuff, and I think a lot of the hype has started to wear off, and people are starting to see that while these tools can provide a lot of value, again, I'm not saying they can't, I think it's been way over hyped.
I think the capabilities are way over hyped, and there are concerns by people way smarter than me that work in this field about things like this idea of model collapse and slop and the hallucinations. And, you know, are these tools going to suddenly replace all of us? I think the answer is no. Now, as I've said many times, Brian, I don't know what's in some lab somewhere. I only know what I see. I mean, we use this stuff, but we understand how to use it within the guardrails. We know what it does and what it can't do, and where it makes sense and where it doesn't make sense. So yeah, I think this is an appropriate term, and it just kind of makes me laugh a little, because I've heard people in the industry say it. But it's interesting that it's now sort of caught on and bubbled up to the surface.

Speaker 2: Well, good. The more people understand the concept, the more people will appreciate the limitations of AI, just based upon that term in and of itself. Thank you so much, Dave
Hatter. Looking forward to twenty twenty six and the return of this segment, assuming you're going to continue doing it with us. I consider what you do here on the fifty five KRC mornings to be so valuable, and of course what you do throughout your career, helping companies avoid these types of problems. Intrust IT dot com. Thanks again, Dave. You up for next year?

Speaker 3: Oh, absolutely. You know, Brian, I always appreciate the opportunity you guys give me to hopefully do some good out there, and I enjoy chatting with you and Joe, and appreciate you and Joe and all your listeners. Merry Christmas, Happy New Year, and I look forward to talking to you in January.

Speaker 2: That's welcome news, my friend. Merry Christmas to you and yours. God bless you, sir. We'll get back