1 00:00:09,920 --> 00:00:12,239 Speaker 1: So the overall winner of Worst in Show this year was 2 00:00:12,280 --> 00:00:15,680 Speaker 1: the Samsung Bespoke AI refrigerator. 3 00:00:16,560 --> 00:00:19,520 Speaker 2: Liz Chamberlain is director of Sustainability at iFixit, 4 00:00:19,600 --> 00:00:23,079 Speaker 2: which is an organization that publishes free online repair guides 5 00:00:23,079 --> 00:00:26,680 Speaker 2: for consumer electronics. I'm sorry, I'm sorry, just I don't 6 00:00:26,680 --> 00:00:30,160 Speaker 2: think I had heard anybody say the words bespoke AI 7 00:00:30,280 --> 00:00:34,760 Speaker 2: refrigerator before. There's something about reading it and then hearing 8 00:00:34,840 --> 00:00:36,200 Speaker 2: it that somehow hits different. 9 00:00:36,520 --> 00:00:38,320 Speaker 1: I know what you mean. It's a lot of words 10 00:00:38,360 --> 00:00:40,800 Speaker 1: together that don't feel like they should go together. 11 00:00:41,840 --> 00:00:45,280 Speaker 2: Liz also runs something called Worst in Show, which is 12 00:00:45,280 --> 00:00:47,240 Speaker 2: a very particular kind of award show. 13 00:00:47,960 --> 00:00:51,320 Speaker 1: The Worst in Show is a set of anti awards 14 00:00:51,320 --> 00:00:54,800 Speaker 1: that we give at the Consumer Electronics Show, CES, and 15 00:00:55,000 --> 00:00:58,160 Speaker 1: we point out the least secure, the least private, the 16 00:00:58,240 --> 00:01:02,080 Speaker 1: least repairable, the most enshittified products that we see on 17 00:01:02,120 --> 00:01:05,600 Speaker 1: the show floor. These are awards that nobody should want 18 00:01:05,680 --> 00:01:06,560 Speaker 1: to get. 19 00:01:06,760 --> 00:01:08,679 Speaker 2: So, you know how you can look at the Oscars 20 00:01:08,720 --> 00:01:10,840 Speaker 2: for any year and you can kind of see trends.
21 00:01:11,120 --> 00:01:13,160 Speaker 2: For example, you could look back at nineteen seventy eight 22 00:01:13,200 --> 00:01:15,440 Speaker 2: and see, oh, okay, this is what society was into 23 00:01:15,520 --> 00:01:18,080 Speaker 2: back then. Well, I think you can also do that 24 00:01:18,160 --> 00:01:21,959 Speaker 2: with tech awards, especially with the Worst in Show, but 25 00:01:22,200 --> 00:01:23,840 Speaker 2: there's also a bigger picture. 26 00:01:24,640 --> 00:01:27,400 Speaker 1: The goal is to shame the big tech companies for 27 00:01:28,200 --> 00:01:31,880 Speaker 1: the things that everybody is annoyed by, but too 28 00:01:31,959 --> 00:01:34,520 Speaker 1: many people are just talking about how cool and new 29 00:01:34,560 --> 00:01:38,560 Speaker 1: all this stuff is and not shaming them for stuff 30 00:01:38,560 --> 00:01:40,840 Speaker 1: that really does make the world worse. 31 00:01:42,240 --> 00:01:44,120 Speaker 2: So today we're going to do a recap of the 32 00:01:44,120 --> 00:01:47,080 Speaker 2: winners of the Worst in Show twenty twenty six with 33 00:01:47,200 --> 00:01:55,640 Speaker 2: Liz Chamberlain. From Kaleidoscope and iHeart Podcasts, 34 00:01:54,680 --> 00:01:59,880 Speaker 3: this is Kill Switch. I'm Dexter Thomas. 35 00:02:00,920 --> 00:02:38,280 Speaker 1: I'm sorry, I'm goodbye. 36 00:02:43,400 --> 00:02:46,320 Speaker 2: The Worst in Show Awards started in twenty twenty one 37 00:02:46,480 --> 00:02:49,240 Speaker 2: and they're organized by Repair dot org, which is a 38 00:02:49,240 --> 00:02:52,440 Speaker 2: nonprofit that advocates for the right to repair, which means 39 00:02:52,440 --> 00:02:55,400 Speaker 2: that they fight for independent repair shops and consumers like 40 00:02:55,480 --> 00:02:58,160 Speaker 2: me and you to be able to repair our own stuff.
41 00:02:58,600 --> 00:03:01,519 Speaker 2: So each category in the award is judged and presented 42 00:03:01,600 --> 00:03:04,280 Speaker 2: by an organization that works in that field, and the 43 00:03:04,320 --> 00:03:08,560 Speaker 2: products are judged based on five criteria. First, how bad 44 00:03:08,720 --> 00:03:09,480 Speaker 2: is this product? 45 00:03:10,000 --> 00:03:10,360 Speaker 3: Second? 46 00:03:10,639 --> 00:03:13,400 Speaker 2: Are the problems with this gadget innovatively bad? 47 00:03:13,840 --> 00:03:14,120 Speaker 1: Third? 48 00:03:14,400 --> 00:03:17,120 Speaker 2: What is the global impact if this technology were to 49 00:03:17,120 --> 00:03:20,960 Speaker 2: be widely adopted? Fourth, how much worse is this than 50 00:03:21,040 --> 00:03:25,480 Speaker 2: previous iterations of similar technology? And fifth, how much do 51 00:03:25,520 --> 00:03:28,959 Speaker 2: the negatives outweigh the positives? And this year we saw 52 00:03:29,000 --> 00:03:32,360 Speaker 2: winners in eight different categories. So let's get into it. 53 00:03:32,400 --> 00:03:36,320 Speaker 2: So I want to go through this year's winners of 54 00:03:36,440 --> 00:03:39,640 Speaker 2: the Worst in Show at CES. In the privacy category, 55 00:03:39,920 --> 00:03:40,600 Speaker 2: who you got? 56 00:03:42,800 --> 00:03:46,680 Speaker 1: So the privacy category is awarded by the Electronic Frontier Foundation, 57 00:03:46,880 --> 00:03:49,680 Speaker 1: which is a group of lawyers that fights for electronic 58 00:03:49,800 --> 00:03:53,560 Speaker 1: rights, and they're really focused on stuff that makes it 59 00:03:54,040 --> 00:03:59,200 Speaker 1: easier for governments or companies or people you don't want 60 00:03:59,240 --> 00:04:02,880 Speaker 1: to surveil you to look in your home and find 61 00:04:02,920 --> 00:04:04,720 Speaker 1: out who you are and what you're doing and where 62 00:04:04,720 --> 00:04:06,800 Speaker 1: you're going, and what you're writing about and how you're 63 00:04:06,840 --> 00:04:09,680 Speaker 1: voting and, you know, all of these things. And they 64 00:04:09,720 --> 00:04:13,440 Speaker 1: gave the award this year to the expanded features of 65 00:04:13,600 --> 00:04:18,440 Speaker 1: Ring AI. So Ring is the video doorbell and video camera 66 00:04:18,600 --> 00:04:21,720 Speaker 1: system run by Amazon. They've been a huge player in the 67 00:04:22,120 --> 00:04:25,320 Speaker 1: video doorbell security market for a long time, but 68 00:04:25,440 --> 00:04:29,640 Speaker 1: what they announced this year was this huge overarching AI 69 00:04:29,839 --> 00:04:34,839 Speaker 1: system that connects all of their video cameras and security 70 00:04:34,839 --> 00:04:38,839 Speaker 1: cameras and so on, and so they were advertising dog 71 00:04:39,000 --> 00:04:42,160 Speaker 1: Search Party, for instance. This is Milo. He's our family. 72 00:04:43,279 --> 00:04:46,280 Speaker 1: But every year ten million go missing, and the way 73 00:04:46,320 --> 00:04:51,080 Speaker 1: we look for them hasn't changed in years, until now. 74 00:04:51,600 --> 00:04:53,440 Speaker 2: One post of a dog's photo in the Ring app 75 00:04:53,480 --> 00:04:56,280 Speaker 2: starts outdoor cameras looking for a match. Search Party from 76 00:04:56,360 --> 00:04:59,200 Speaker 2: Ring uses AI to help families find lost dogs. 77 00:04:59,480 --> 00:05:02,840 Speaker 1: Your camera already recognizes your dog's face, and now if 78 00:05:02,839 --> 00:05:06,560 Speaker 1: your dog runs away, you can alert all of your 79 00:05:06,600 --> 00:05:12,200 Speaker 1: neighbors' Ring doorbells to look for your dog. Cool.
Okay, yeah, 80 00:05:12,240 --> 00:05:15,839 Speaker 1: but that also tells me that my neighbor's cameras know 81 00:05:15,880 --> 00:05:19,080 Speaker 1: who I am and can recognize me and see me 82 00:05:19,160 --> 00:05:21,080 Speaker 1: leaving the house, see where I'm going and what 83 00:05:21,160 --> 00:05:24,200 Speaker 1: I'm doing. They showed off that you can search in 84 00:05:24,279 --> 00:05:27,280 Speaker 1: your video history for green hat and pull up all 85 00:05:27,279 --> 00:05:31,240 Speaker 1: the videos of somebody with a green hat. It suggests 86 00:05:31,320 --> 00:05:37,599 Speaker 1: that they've got some really, really intense capacity to understand 87 00:05:37,640 --> 00:05:42,560 Speaker 1: what's happening in the video and connect videos across cameras. 88 00:05:43,240 --> 00:05:46,720 Speaker 1: They've got cameras deployed in parking lots across the country. 89 00:05:47,520 --> 00:05:51,000 Speaker 1: It's scary in a world in which we know our 90 00:05:51,360 --> 00:05:56,240 Speaker 1: governments are, you know, asking major tech companies for that 91 00:05:56,360 --> 00:05:59,400 Speaker 1: data and using that data to surveil us. 92 00:05:59,680 --> 00:06:03,160 Speaker 2: So I'm going to play the, I don't know whose 93 00:06:03,200 --> 00:06:05,560 Speaker 2: advocate I am here, but I'm going to give you 94 00:06:06,080 --> 00:06:09,200 Speaker 2: a common response that I hear often when I talk 95 00:06:09,240 --> 00:06:13,920 Speaker 2: about privacy. Right, okay, Amazon's got this new Ring AI thing. 96 00:06:14,320 --> 00:06:16,159 Speaker 2: I'm not a bad person. I'm not a criminal. I 97 00:06:16,160 --> 00:06:18,280 Speaker 2: don't have anything to hide. If it knows that I'm 98 00:06:18,279 --> 00:06:21,080 Speaker 2: out working in the garden every day at twelve oh 99 00:06:21,120 --> 00:06:23,000 Speaker 2: five pm, what's the harm?
100 00:06:23,640 --> 00:06:27,359 Speaker 1: Yeah, I mean, as long as you're only doing things 101 00:06:27,360 --> 00:06:30,000 Speaker 1: that your government agrees with, then sure, it's not a 102 00:06:30,040 --> 00:06:36,280 Speaker 1: problem, for now. Even for people that are not afraid 103 00:06:36,560 --> 00:06:39,400 Speaker 1: of the government knowing that they're outside in the garden 104 00:06:39,400 --> 00:06:43,080 Speaker 1: at twelve oh five pm, they might not want to 105 00:06:43,120 --> 00:06:47,000 Speaker 1: be surveilled in grocery stores, for instance, watched the whole 106 00:06:47,040 --> 00:06:49,880 Speaker 1: time they're walking around and looking at products. People are 107 00:06:50,000 --> 00:06:54,279 Speaker 1: very aware of how algorithms target you based on how 108 00:06:54,320 --> 00:06:56,640 Speaker 1: long you linger on a video or what you're doing 109 00:06:56,640 --> 00:07:00,599 Speaker 1: as you're scrolling through Instagram or whatever. The same kinds of 110 00:07:00,680 --> 00:07:05,160 Speaker 1: things are happening in AI camera systems in stores, AI 111 00:07:05,200 --> 00:07:08,279 Speaker 1: camera systems outside of stores. That data can be 112 00:07:08,480 --> 00:07:12,640 Speaker 1: used not just for government surveillance purposes, but also for 113 00:07:13,240 --> 00:07:17,560 Speaker 1: pretty invasive advertising practices that are becoming really really common. 114 00:07:18,080 --> 00:07:21,120 Speaker 2: In the past, what sorts of things have won in 115 00:07:21,160 --> 00:07:22,240 Speaker 2: the privacy category? 116 00:07:22,600 --> 00:07:26,160 Speaker 1: So last year the winner of the privacy category was 117 00:07:26,600 --> 00:07:31,680 Speaker 1: the Withings urine analysis device. That is, urine analysis.
It 118 00:07:31,720 --> 00:07:35,080 Speaker 1: was a little puck that sat in your toilet that 119 00:07:35,320 --> 00:07:40,120 Speaker 1: would do some basic monitoring of things and send data 120 00:07:40,160 --> 00:07:43,520 Speaker 1: to your phone about what's in your pee. Again, one 121 00:07:43,560 --> 00:07:45,920 Speaker 1: of these things where I'm like, I can certainly see some 122 00:07:46,000 --> 00:07:49,000 Speaker 1: cool potential uses of it. 123 00:07:49,360 --> 00:07:51,480 Speaker 2: Please tell me these cool potential uses of it, because 124 00:07:51,480 --> 00:07:52,320 Speaker 2: I'm not aware of these. 125 00:07:52,880 --> 00:07:56,040 Speaker 1: You can measure blood sugars if you're diabetic. Okay, you know, 126 00:07:56,080 --> 00:07:58,920 Speaker 1: it can tell you if you're dehydrated, I guess, if 127 00:07:58,960 --> 00:08:03,600 Speaker 1: you're not attentive to your pee color enough. But Cindy Cohn, 128 00:08:03,640 --> 00:08:07,240 Speaker 1: the director of the EFF, pointed out that, you know, 129 00:08:07,280 --> 00:08:14,000 Speaker 1: it included measurement of a pregnancy hormone and it theoretically could 130 00:08:14,040 --> 00:08:17,920 Speaker 1: be used to track women who are pregnant and then 131 00:08:18,320 --> 00:08:21,480 Speaker 1: try to seek abortion care out of state or something. 132 00:08:21,640 --> 00:08:24,400 Speaker 1: That was the really sort of scary use case she 133 00:08:24,480 --> 00:08:25,640 Speaker 1: pointed out. 134 00:08:25,440 --> 00:08:30,680 Speaker 2: Because post Roe v. Wade, the government, specifically law 135 00:08:30,760 --> 00:08:33,839 Speaker 2: enforcement, could go to that company and say, give me 136 00:08:33,880 --> 00:08:38,040 Speaker 2: your data. Exactly, which sounds very Orwellian, whatever you want 137 00:08:38,040 --> 00:08:40,120 Speaker 2: to call it, but this is the world that we're 138 00:08:40,160 --> 00:08:45,679 Speaker 2: living in now. Exactly.
So historically, the privacy category awards 139 00:08:45,679 --> 00:08:48,560 Speaker 2: the product whose features and data collection could be used 140 00:08:48,559 --> 00:08:50,800 Speaker 2: for surveillance and tracking, like the one that we just 141 00:08:50,840 --> 00:08:54,720 Speaker 2: talked about, the urine analysis tool. Bad privacy is products 142 00:08:54,720 --> 00:08:57,720 Speaker 2: that keep too much of your data in. But then 143 00:08:57,720 --> 00:09:00,280 Speaker 2: there's bad security, which is products that do a bad 144 00:09:00,360 --> 00:09:04,200 Speaker 2: job of keeping people out, like, say, hackers, and this 145 00:09:04,240 --> 00:09:07,400 Speaker 2: brings us to the Worst in Show security award. 146 00:09:09,480 --> 00:09:14,840 Speaker 1: The winner was the Merach treadmill. It's a smart treadmill thing 147 00:09:15,120 --> 00:09:20,480 Speaker 1: that Merach, which is a relatively small Chinese home fitness company, announced. 148 00:09:21,120 --> 00:09:25,040 Speaker 1: But what was unusual about this is that their privacy 149 00:09:25,080 --> 00:09:30,920 Speaker 1: policy said that they cannot guarantee the privacy of customers' data, 150 00:09:31,280 --> 00:09:34,400 Speaker 1: which, you know, hold on. 151 00:09:34,440 --> 00:09:36,719 Speaker 3: I appreciate that. That's real, you 152 00:09:36,679 --> 00:09:39,480 Speaker 1: know. Like, on the one hand, yes, it is. It's 153 00:09:39,520 --> 00:09:43,560 Speaker 1: true that they can't guarantee it. Nobody can guarantee it, right? 154 00:09:43,679 --> 00:09:48,440 Speaker 1: But just saying that shouldn't be the majority of your 155 00:09:48,640 --> 00:09:54,240 Speaker 1: privacy policy, right? You should have more to offer your 156 00:09:54,240 --> 00:09:57,560 Speaker 1: customers about how you're protecting their data than that.
157 00:09:58,520 --> 00:10:01,040 Speaker 2: The problem with this treadmill's privacy policy was that 158 00:10:01,080 --> 00:10:05,520 Speaker 2: it actually said, verbatim, we cannot guarantee the security of 159 00:10:05,559 --> 00:10:09,840 Speaker 2: your personal information, end quote. For me, the only reasonable 160 00:10:09,880 --> 00:10:11,839 Speaker 2: way to read that is that they don't want to 161 00:10:11,880 --> 00:10:14,800 Speaker 2: be held legally liable if your data was hacked, which 162 00:10:15,000 --> 00:10:17,680 Speaker 2: naturally makes you wonder, all right, are y'all just not 163 00:10:17,880 --> 00:10:20,560 Speaker 2: hiring anybody to handle security? Are you expecting to be 164 00:10:20,600 --> 00:10:23,840 Speaker 2: hacked or something? Now, to be fair, after this award 165 00:10:23,920 --> 00:10:27,400 Speaker 2: was handed out, Merach has updated their security policy. They 166 00:10:27,520 --> 00:10:30,400 Speaker 2: deleted that line and now there's a new line encouraging 167 00:10:30,440 --> 00:10:33,840 Speaker 2: you to use strong passwords, which, yo, I agree with that. 168 00:10:33,840 --> 00:10:36,720 Speaker 2: That's actually a pretty good outcome. So we might now 169 00:10:36,800 --> 00:10:39,360 Speaker 2: have an example of the Worst in Show having a 170 00:10:39,400 --> 00:10:42,319 Speaker 2: positive impact on the industry. 171 00:10:43,480 --> 00:10:46,600 Speaker 3: What is it that this treadmill is even offering? 172 00:10:47,120 --> 00:10:51,360 Speaker 1: There's some kind of AI trainer that watches your biometrics 173 00:10:51,400 --> 00:10:55,199 Speaker 1: and gives you feedback as you're working through the program. 174 00:10:55,840 --> 00:10:58,960 Speaker 2: Maybe you're sensing a theme here.
This year's Worst in 175 00:10:59,000 --> 00:11:02,520 Speaker 2: Show featured a lot of AI, and look, AI features 176 00:11:02,520 --> 00:11:05,120 Speaker 2: can be cool, but these features are also showing what 177 00:11:05,200 --> 00:11:07,800 Speaker 2: happens when you're forcing a feature because of the hype 178 00:11:08,160 --> 00:11:11,120 Speaker 2: and not because it's actually going to make your product better. 179 00:11:11,559 --> 00:11:14,280 Speaker 2: Which brings us to the winner of the next category. 180 00:11:14,880 --> 00:11:17,640 Speaker 2: It's called the Who Asked for This award. 181 00:11:19,360 --> 00:11:22,080 Speaker 1: Who Asked for This we gave this year to the 182 00:11:22,120 --> 00:11:27,040 Speaker 1: Bosch eight hundred series coffee maker with Alexa Plus. Bosch 183 00:11:27,080 --> 00:11:29,280 Speaker 1: has had this series of coffee makers for a long time. 184 00:11:29,320 --> 00:11:32,360 Speaker 1: They're really popular, countertop all in one kind of thing. 185 00:11:32,400 --> 00:11:33,960 Speaker 1: You push a button and it makes you a latte 186 00:11:34,080 --> 00:11:37,440 Speaker 1: or a mocha or whatever, and they have had an Alexa 187 00:11:37,480 --> 00:11:40,079 Speaker 1: feature on this thing for a while. You can walk 188 00:11:40,120 --> 00:11:42,439 Speaker 1: into your kitchen and tell it to make you a mocha 189 00:11:42,559 --> 00:11:46,400 Speaker 1: and it'll do it. They announced that they were adding 190 00:11:46,400 --> 00:11:50,000 Speaker 1: in Alexa Plus this year. The machine isn't out yet 191 00:11:50,040 --> 00:11:53,000 Speaker 1: with Alexa Plus, but when it is, it should be 192 00:11:53,160 --> 00:11:56,360 Speaker 1: theoretically the same thing. You walk in, you ask for 193 00:11:56,679 --> 00:12:00,800 Speaker 1: a mocha, but the Plus adds in a large language 194 00:12:00,800 --> 00:12:04,800 Speaker 1: model between you and the execution of that command.
And 195 00:12:04,880 --> 00:12:09,199 Speaker 1: what people who have tested the Alexa Plus feature on 196 00:12:09,280 --> 00:12:13,720 Speaker 1: this coffee maker have said is that they don't 197 00:12:13,760 --> 00:12:17,280 Speaker 1: always get the response they're expecting. They give the coffee maker 198 00:12:17,280 --> 00:12:20,200 Speaker 1: a command and it doesn't do what they asked, or 199 00:12:20,240 --> 00:12:22,920 Speaker 1: it has follow up questions, it wants to have a 200 00:12:22,960 --> 00:12:23,839 Speaker 1: conversation with you. 201 00:12:23,880 --> 00:12:25,160 Speaker 3: It has follow up questions? 202 00:12:25,360 --> 00:12:29,360 Speaker 2: You're so right to want a latte. Can I tell 203 00:12:29,400 --> 00:12:31,079 Speaker 2: you about the history of lattes? 204 00:12:31,200 --> 00:12:32,479 Speaker 1: Exactly, exactly. 205 00:12:33,000 --> 00:12:35,120 Speaker 2: I don't know about you, but I do not need 206 00:12:35,120 --> 00:12:38,079 Speaker 2: a coffee maker to make me a latte by voice command, 207 00:12:38,440 --> 00:12:40,560 Speaker 2: let alone trying to have a full blown conversation with 208 00:12:40,640 --> 00:12:44,320 Speaker 2: me when I still haven't had any coffee yet. So yeah, 209 00:12:44,360 --> 00:12:45,840 Speaker 2: this makes a lot of sense to me as a 210 00:12:45,840 --> 00:12:49,280 Speaker 2: winner in the Who Asked for This category. But apparently 211 00:12:49,360 --> 00:12:51,880 Speaker 2: the company itself really disagrees with that. 212 00:12:52,640 --> 00:12:55,240 Speaker 1: This year, I got a long email from a Bosch 213 00:12:55,320 --> 00:12:59,239 Speaker 1: marketing rep, and the marketing rep said he was confused 214 00:12:59,280 --> 00:13:03,120 Speaker 1: by that award because it is apparently their most requested product. 215 00:13:03,679 --> 00:13:06,960 Speaker 1: I don't know what people are asking for exactly.
Nevertheless, 216 00:13:07,040 --> 00:13:10,679 Speaker 1: I really doubt that anyone was asking for the shift 217 00:13:10,800 --> 00:13:15,120 Speaker 1: that we highlighted, from an Alexa powered coffee machine where you 218 00:13:15,160 --> 00:13:17,680 Speaker 1: can walk into your room and ask for a cup 219 00:13:17,720 --> 00:13:21,880 Speaker 1: of coffee, to an LLM Alexa powered coffee machine, which 220 00:13:21,920 --> 00:13:24,840 Speaker 1: sometimes gets it wrong. It just introduces this layer of 221 00:13:24,960 --> 00:13:28,520 Speaker 1: like nondeterministic question mark between you and getting a cup 222 00:13:28,559 --> 00:13:29,199 Speaker 1: of coffee. 223 00:13:29,559 --> 00:13:32,600 Speaker 2: Okay, fair enough, maybe someone did ask for this feature. 224 00:13:32,800 --> 00:13:35,360 Speaker 2: It's not for me, but hey, that's okay. Spend your 225 00:13:35,400 --> 00:13:38,240 Speaker 2: money how you want. But the funny thing about a 226 00:13:38,280 --> 00:13:40,319 Speaker 2: lot of these products that win these awards is that 227 00:13:40,400 --> 00:13:42,320 Speaker 2: it kind of feels like some of them could have 228 00:13:42,360 --> 00:13:45,360 Speaker 2: won any of the categories. Like the winner of the 229 00:13:45,360 --> 00:13:49,360 Speaker 2: Worst Environmental Impact award, I also think, is a pretty 230 00:13:49,360 --> 00:13:51,880 Speaker 2: good contender for the Who Asked for This award. 231 00:13:52,280 --> 00:13:57,640 Speaker 1: The Environmental Impact winner this year was Lollipop Star, which 232 00:13:57,679 --> 00:14:01,720 Speaker 1: is a lollipop that plays music when you put it in 233 00:14:01,760 --> 00:14:02,280 Speaker 1: your mouth. 234 00:14:02,960 --> 00:14:13,120 Speaker 3: We'll get into that one after the break. Okay, quick recap.
235 00:14:13,440 --> 00:14:15,400 Speaker 2: So far, the winners of this year's awards have been 236 00:14:15,440 --> 00:14:18,960 Speaker 2: products with features that you can mostly understand. So like 237 00:14:19,000 --> 00:14:21,800 Speaker 2: the Ring camera watching your every move is weird, but 238 00:14:22,000 --> 00:14:23,640 Speaker 2: on the other hand, I mean, it could help you 239 00:14:23,760 --> 00:14:26,440 Speaker 2: track down a lost dog. That's kind of cool. I 240 00:14:26,520 --> 00:14:28,960 Speaker 2: get it. And the maker of that treadmill straight up 241 00:14:29,000 --> 00:14:31,560 Speaker 2: telling you that we aren't responsible if we get hacked, 242 00:14:32,040 --> 00:14:34,640 Speaker 2: that's not good, but maybe some people don't mind it 243 00:14:34,720 --> 00:14:37,160 Speaker 2: if the trade off is that your treadmill gives you 244 00:14:37,240 --> 00:14:39,840 Speaker 2: workout tips or encourages you or something, I don't know. 245 00:14:40,240 --> 00:14:43,480 Speaker 2: And the talking coffee maker, still weird for me, but 246 00:14:43,600 --> 00:14:46,800 Speaker 2: I can see why somebody might want it. But I 247 00:14:46,960 --> 00:14:49,960 Speaker 2: genuinely cannot think of any reason why you would want 248 00:14:50,000 --> 00:14:54,440 Speaker 2: to buy the winner of the Worst Environmental Impact, the 249 00:14:54,520 --> 00:14:55,360 Speaker 2: Lollipop Star. 250 00:14:56,720 --> 00:14:59,640 Speaker 1: It plays sound via bone conduction, so you bite down 251 00:14:59,640 --> 00:15:02,520 Speaker 1: on it and it sends sound waves through your teeth 252 00:15:02,560 --> 00:15:06,360 Speaker 1: directly to your ear bones. This concept isn't brand new. There 253 00:15:06,440 --> 00:15:09,240 Speaker 1: was a product called Sound Bites back in the nineties 254 00:15:09,480 --> 00:15:13,320 Speaker 1: that did this. Hey, there are six new Sound Bites.
Each 255 00:15:13,400 --> 00:15:16,280 Speaker 1: one is an amazing new experience. We make your head 256 00:15:16,280 --> 00:15:17,040 Speaker 1: a boom box. 257 00:15:16,760 --> 00:15:21,160 Speaker 2: In case you don't remember these, the Sound Bites 258 00:15:21,240 --> 00:15:24,200 Speaker 2: was basically a plastic handle into which you would insert 259 00:15:24,240 --> 00:15:27,280 Speaker 2: a normal lollipop on top, and you could press buttons 260 00:15:27,320 --> 00:15:29,680 Speaker 2: on that plastic handle and it would play generic, like, 261 00:15:29,840 --> 00:15:33,200 Speaker 2: rock guitar sound effects and it would vibrate through your teeth. 262 00:15:33,840 --> 00:15:36,840 Speaker 2: Another company had a product called Tooth Tunes, which were 263 00:15:36,880 --> 00:15:40,560 Speaker 2: toothbrushes that did a similar thing. So in defense of 264 00:15:40,640 --> 00:15:43,000 Speaker 2: Tooth Tunes, which is a sentence that I cannot believe 265 00:15:43,360 --> 00:15:45,960 Speaker 2: that I am starting, but in defense of Tooth Tunes, 266 00:15:46,120 --> 00:15:49,120 Speaker 2: at least that product was made to get kids to 267 00:15:49,200 --> 00:15:52,080 Speaker 2: brush their teeth. So now imagine that there was a 268 00:15:52,080 --> 00:15:55,640 Speaker 2: product that revived that form factor but was somehow actually 269 00:15:55,720 --> 00:15:57,240 Speaker 2: worse than the nineties version. 270 00:15:57,720 --> 00:16:01,160 Speaker 1: Sound Bites you could replace the battery of, and Lollipop 271 00:16:01,200 --> 00:16:05,240 Speaker 1: Star has two embedded batteries, no replaceable battery. On the 272 00:16:05,320 --> 00:16:11,200 Speaker 1: packaging it says up to sixty minutes of run time. You use 273 00:16:11,240 --> 00:16:13,080 Speaker 1: it for sixty minutes and then throw it away. That's 274 00:16:13,120 --> 00:16:14,840 Speaker 1: what they want you to do.
That's what they expect 275 00:16:14,840 --> 00:16:15,240 Speaker 1: you to do. 276 00:16:15,920 --> 00:16:18,160 Speaker 3: First off, can you choose the music you're listening to? 277 00:16:18,880 --> 00:16:22,880 Speaker 1: No. I mean, they've got a couple different artists that 278 00:16:22,880 --> 00:16:25,320 Speaker 1: they've worked with, and I think you can choose by 279 00:16:25,360 --> 00:16:27,440 Speaker 1: buying a different Lollipop Star. 280 00:16:28,080 --> 00:16:30,760 Speaker 2: I'm not a big lollipop connoisseur, and I'm kind of 281 00:16:30,760 --> 00:16:33,360 Speaker 2: also thinking, you know, Tootsie Roll, how many licks does 282 00:16:33,360 --> 00:16:34,400 Speaker 2: it take to get to the center of it? 283 00:16:34,400 --> 00:16:34,920 Speaker 3: Tootsie Pop. 284 00:16:35,000 --> 00:16:37,200 Speaker 2: Right. Like, I don't really know how long it takes 285 00:16:37,280 --> 00:16:39,360 Speaker 2: me to get through a lollipop. I don't know if 286 00:16:39,400 --> 00:16:42,160 Speaker 2: I want to listen to Akon for that long, like, 287 00:16:42,280 --> 00:16:44,640 Speaker 2: streamed directly into my bones. That doesn't sound like a 288 00:16:44,680 --> 00:16:45,400 Speaker 2: good time for me. 289 00:16:46,240 --> 00:16:48,640 Speaker 1: I hadn't considered that part of it, but you're right. 290 00:16:48,840 --> 00:16:53,480 Speaker 1: That is, you know, you got to commit. And the 291 00:16:53,520 --> 00:16:56,880 Speaker 1: product requires it to be really quiet for you to 292 00:16:56,920 --> 00:16:59,520 Speaker 1: be able to hear what's going on. You know, you 293 00:16:59,560 --> 00:17:02,240 Speaker 1: can't really hear if there's any noise around you. 294 00:17:02,560 --> 00:17:04,760 Speaker 1: And the show floor was way too loud for people 295 00:17:04,760 --> 00:17:07,879 Speaker 1: to hear.
And so they were handing out, with the lollipop, 296 00:17:07,920 --> 00:17:11,320 Speaker 1: they were handing out earplugs to everybody and telling you 297 00:17:11,320 --> 00:17:13,520 Speaker 1: you gotta plug your ears to be able to listen 298 00:17:13,520 --> 00:17:13,840 Speaker 1: to it. 299 00:17:14,400 --> 00:17:16,800 Speaker 2: So I need to, like, go into the privacy of 300 00:17:16,800 --> 00:17:19,800 Speaker 2: my own home and shove a lollipop into my mouth 301 00:17:20,240 --> 00:17:23,520 Speaker 2: so I can listen to Akon. Yeah, this is like 302 00:17:23,680 --> 00:17:26,359 Speaker 2: the worst combination of everything that I can think of. 303 00:17:26,400 --> 00:17:30,280 Speaker 2: Like, Apple Music exists, Spotify exists, Bandcamp exists. Like, 304 00:17:30,320 --> 00:17:31,879 Speaker 2: if I want to listen to Akon, I shouldn't have 305 00:17:31,960 --> 00:17:32,840 Speaker 2: to put something in my mouth. 306 00:17:33,080 --> 00:17:35,399 Speaker 1: Yeah, you got a lot of options that aren't a 307 00:17:35,480 --> 00:17:37,040 Speaker 1: disposable lollipop. 308 00:17:38,720 --> 00:17:41,719 Speaker 2: Not to mention the fact that this disposable lollipop costs 309 00:17:41,840 --> 00:17:42,680 Speaker 2: nine dollars. 310 00:17:43,040 --> 00:17:44,200 Speaker 3: Nine dollars for a lollipop. 311 00:17:44,280 --> 00:17:46,640 Speaker 2: Someone really said, all right, yo, what if we brought 312 00:17:46,640 --> 00:17:49,359 Speaker 2: back the Tooth Tunes and the Sound Bites, but removed 313 00:17:49,520 --> 00:17:51,880 Speaker 2: all the redeeming qualities of both of those and charged 314 00:17:52,000 --> 00:17:55,320 Speaker 2: nine dollars for it. But here's why the Lollipop Star 315 00:17:55,480 --> 00:17:59,359 Speaker 2: got the Worst Environmental Impact award. Lollipop Star is a 316 00:17:59,400 --> 00:18:02,960 Speaker 2: single use device and the batteries inside of it cannot 317 00:18:03,000 --> 00:18:03,680 Speaker 2: be removed.
318 00:18:06,040 --> 00:18:10,600 Speaker 1: We are always looking for products with embedded batteries to 319 00:18:10,640 --> 00:18:14,159 Speaker 1: give Worst in Show awards to, because they're everywhere and 320 00:18:14,200 --> 00:18:17,320 Speaker 1: they're a huge problem. Even if, you know, best case 321 00:18:17,320 --> 00:18:20,400 Speaker 1: scenario, the product goes into a landfill, just rots in a landfill, 322 00:18:20,720 --> 00:18:24,040 Speaker 1: they've got toxic chemicals that can leach into groundwater, leach 323 00:18:24,080 --> 00:18:26,720 Speaker 1: into the soil. That's not good for the planet. But 324 00:18:26,760 --> 00:18:30,800 Speaker 1: also what can happen is, a lot of waste processing 325 00:18:31,080 --> 00:18:35,320 Speaker 1: facilities include some amount of moving stuff around, banging it around, 326 00:18:35,400 --> 00:18:39,440 Speaker 1: passing it through shredders, passing it through devices that separate 327 00:18:39,520 --> 00:18:42,920 Speaker 1: stuff out, and in that process batteries can cause fires. 328 00:18:43,320 --> 00:18:47,040 Speaker 1: And there are battery fires in trash trucks in most 329 00:18:47,119 --> 00:18:50,040 Speaker 1: municipalities at the rate of like one a week. I know 330 00:18:50,200 --> 00:18:52,959 Speaker 1: my little town of forty seven thousand people has like 331 00:18:53,040 --> 00:18:57,479 Speaker 1: one trash fire a week, and it's usually batteries, and 332 00:18:58,080 --> 00:18:59,960 Speaker 1: that's a huge problem for waste processors. 333 00:19:00,119 --> 00:19:03,640 Speaker 2: So once again, in summary: sixty minutes of Akon 334 00:19:04,040 --> 00:19:07,080 Speaker 2: vibrating through your teeth and a non zero chance that 335 00:19:07,080 --> 00:19:10,200 Speaker 2: you could start a literal garbage fire. Did I mention 336 00:19:10,280 --> 00:19:12,359 Speaker 2: that it was nine dollars?
I'm really not sure what 337 00:19:12,440 --> 00:19:15,159 Speaker 2: else I can say here. So let's move on to 338 00:19:15,240 --> 00:19:18,800 Speaker 2: the next category in the Worst in Show Awards: enshittification, 339 00:19:19,160 --> 00:19:21,640 Speaker 2: which is a word coined by Cory Doctorow 340 00:19:21,920 --> 00:19:25,560 Speaker 2: about how, as tech companies value profit and extraction over 341 00:19:25,680 --> 00:19:29,760 Speaker 2: any user experience, their services get enshittified. 342 00:19:30,160 --> 00:19:36,360 Speaker 1: Enshittification is basically how internet things tend to get worse 343 00:19:36,400 --> 00:19:40,520 Speaker 1: over time, because when they start out, they're just looking 344 00:19:40,600 --> 00:19:42,919 Speaker 1: for users. They're looking for people to sign on and 345 00:19:42,920 --> 00:19:46,159 Speaker 1: do the thing. But over time they need to make money, 346 00:19:46,440 --> 00:19:51,520 Speaker 1: and so they find ways to reduce the services that 347 00:19:51,560 --> 00:19:55,760 Speaker 1: they're offering and make more money out of you, squeeze 348 00:19:55,800 --> 00:19:59,160 Speaker 1: more blood from your stone. And so the Enshittification 349 00:19:59,359 --> 00:20:04,159 Speaker 1: Award this year went to the Bosch eBike app. 350 00:20:04,920 --> 00:20:07,280 Speaker 2: The Bosch eBike Flow app pairs to your e 351 00:20:07,359 --> 00:20:09,560 Speaker 2: bike and you can use it as a digital lock. 352 00:20:09,640 --> 00:20:11,800 Speaker 2: You can track your bike if it gets stolen, and 353 00:20:11,840 --> 00:20:15,000 Speaker 2: you can record your rides with pretty detailed stats. It 354 00:20:15,040 --> 00:20:18,520 Speaker 2: also links the bike's major components, like the motor, the battery, 355 00:20:18,600 --> 00:20:22,320 Speaker 2: and the display, together digitally.
Each part is registered to 356 00:20:22,440 --> 00:20:25,440 Speaker 2: your bike and your account so that the system knows 357 00:20:25,480 --> 00:20:29,560 Speaker 2: what hardware belongs to whom. This is called parts pairing. 358 00:20:30,680 --> 00:20:33,960 Speaker 1: Parts pairing is basically: there's a little computer on the part, 359 00:20:34,040 --> 00:20:37,800 Speaker 1: there's a little computer on your bigger device. They talk 360 00:20:37,880 --> 00:20:40,960 Speaker 1: to each other, they have a little handshake agreement, and 361 00:20:41,200 --> 00:20:44,359 Speaker 1: parts pairing limits what the part can do, or what 362 00:20:44,440 --> 00:20:47,080 Speaker 1: kinds of repairs you can do, or what its capabilities are, 363 00:20:47,240 --> 00:20:50,439 Speaker 1: how it's being tracked. It really became a huge issue 364 00:20:50,440 --> 00:20:55,440 Speaker 1: with smartphones. We tracked, between twenty eighteen and twenty twenty four, 365 00:20:56,359 --> 00:20:59,720 Speaker 1: Apple products that came out had more and more and 366 00:20:59,760 --> 00:21:03,359 Speaker 1: more parts paired to the mainboard that would limit repair 367 00:21:03,480 --> 00:21:06,440 Speaker 1: in some way for independent repair technicians who weren't under 368 00:21:06,520 --> 00:21:09,439 Speaker 1: the Apple ecosystem. So if you wanted to replace an 369 00:21:09,440 --> 00:21:13,199 Speaker 1: iPhone battery, sure, if you're not part of Apple, you 370 00:21:13,240 --> 00:21:15,200 Speaker 1: might be able to buy a battery from iFixit 371 00:21:15,280 --> 00:21:18,080 Speaker 1: or somewhere and put it in yourself, but then 372 00:21:18,080 --> 00:21:21,399 Speaker 1: there would be warnings that you couldn't dismiss, or there 373 00:21:21,400 --> 00:21:23,920 Speaker 1: would be some features that you wouldn't have, like battery 374 00:21:24,040 --> 00:21:26,679 Speaker 1: health metrics, for instance. They would just shut off.
So 375 00:21:26,720 --> 00:21:29,399 Speaker 1: what this would mean is the little computer on your 376 00:21:29,440 --> 00:21:33,680 Speaker 1: motor has an agreement with your bike. It recognizes, like, 377 00:21:33,720 --> 00:21:36,359 Speaker 1: all right, this is Dexter's bike. He bought it on 378 00:21:36,400 --> 00:21:39,600 Speaker 1: this day. And then let's say your bike gets stolen 379 00:21:40,160 --> 00:21:44,720 Speaker 1: and you mark your bike as stolen. Bosch then links 380 00:21:44,760 --> 00:21:47,600 Speaker 1: that part. Let's say the person who steals your bike 381 00:21:47,840 --> 00:21:50,840 Speaker 1: separates out the motor, sells the motor to somebody. 382 00:21:51,760 --> 00:21:56,640 Speaker 1: When that motor gets connected to the system again, it 383 00:21:56,680 --> 00:22:00,480 Speaker 1: gets flagged as the stolen part and it can't be 384 00:22:00,560 --> 00:22:03,280 Speaker 1: used again. It gets sort of locked out, and Bosch 385 00:22:03,320 --> 00:22:06,080 Speaker 1: gets notified. You get notified where your stolen part is. 386 00:22:06,840 --> 00:22:10,400 Speaker 3: Okay, so on paper, sounds like a pretty good thing. 387 00:22:10,600 --> 00:22:11,480 Speaker 3: Why is that a bad thing? 388 00:22:11,840 --> 00:22:14,840 Speaker 1: That piece of it, not really a bad thing. The 389 00:22:14,880 --> 00:22:18,199 Speaker 1: fear, and the reason it won the Enshittification Award, is 390 00:22:18,240 --> 00:22:22,720 Speaker 1: that this capability allows Bosch to do lots of other things. 391 00:22:22,880 --> 00:22:26,679 Speaker 1: It allows them to turn off your bike entirely 392 00:22:26,720 --> 00:22:28,840 Speaker 1: if you don't pay the monthly subscription fee. It allows them 393 00:22:28,880 --> 00:22:31,240 Speaker 1: to shut down your motor if they want to.
It 394 00:22:31,280 --> 00:22:33,639 Speaker 1: allows them to make it so that only a Bosch 395 00:22:33,800 --> 00:22:37,439 Speaker 1: repair tech can replace that motor, and you can't do 396 00:22:37,520 --> 00:22:39,480 Speaker 1: it yourself, or you can't take it to the bike 397 00:22:39,520 --> 00:22:42,160 Speaker 1: shop down the street. You know, you're just locked into 398 00:22:42,160 --> 00:22:45,640 Speaker 1: that system. And the more and more of our things 399 00:22:45,800 --> 00:22:49,680 Speaker 1: get tied into electronic systems that can be shut off 400 00:22:49,720 --> 00:22:53,359 Speaker 1: whenever the manufacturer wants, or whenever they go out of business, 401 00:22:53,440 --> 00:22:55,920 Speaker 1: the worse our stuff gets. And so my straight up 402 00:22:55,920 --> 00:22:59,919 Speaker 1: manual bicycle is never going to lose features because, you know, 403 00:23:00,160 --> 00:23:04,520 Speaker 1: Schwinn is done supporting my line of bicycles. Right, Bosch 404 00:23:04,560 --> 00:23:07,399 Speaker 1: hasn't done these things yet, to be clear. Right now 405 00:23:07,520 --> 00:23:11,399 Speaker 1: it's mostly upside, but that's the point of the Enshittification Award: 406 00:23:11,520 --> 00:23:15,000 Speaker 1: to point to things where right now it's mostly upside, 407 00:23:15,040 --> 00:23:18,600 Speaker 1: but there's the potential here for some real nastiness that's 408 00:23:18,640 --> 00:23:20,240 Speaker 1: built into the way this thing is framed. 409 00:23:21,560 --> 00:23:23,480 Speaker 2: All right. Now that we're almost through all the winners, 410 00:23:23,600 --> 00:23:26,600 Speaker 2: let's get back to that Samsung AI fridge, which is 411 00:23:26,640 --> 00:23:29,280 Speaker 2: a two time winner. This year it not only won 412 00:23:29,359 --> 00:23:32,879 Speaker 2: the worst overall, but it also won specifically in the 413 00:23:32,920 --> 00:23:35,040 Speaker 2: repairability category.
414 00:23:35,240 --> 00:23:40,960 Speaker 1: The Samsung Bespoke AI fridge, with the voice activated door, the 415 00:23:41,000 --> 00:23:46,320 Speaker 1: food recognition, the ads on your screen, the recommendations to 416 00:23:46,359 --> 00:23:50,040 Speaker 1: buy blueberries because it knows you're a blueberry lover and 417 00:23:50,080 --> 00:23:54,760 Speaker 1: you're running out of blueberries. It represents more points of 418 00:23:54,920 --> 00:24:00,320 Speaker 1: potential failure in a high failure product from a company 419 00:24:00,400 --> 00:24:03,320 Speaker 1: that has not been consistently good about repair. 420 00:24:04,320 --> 00:24:07,280 Speaker 2: Samsung has a pretty bad reputation when it comes to 421 00:24:07,280 --> 00:24:10,720 Speaker 2: repairability. iFixit has continually criticized them for 422 00:24:10,800 --> 00:24:14,320 Speaker 2: charging prices for parts that just don't seem justified, if 423 00:24:14,359 --> 00:24:17,119 Speaker 2: those parts are even made available. And then there's the 424 00:24:17,160 --> 00:24:19,160 Speaker 2: pricing for repair manuals. 425 00:24:19,960 --> 00:24:22,439 Speaker 1: I don't have any repair technicians for Samsung in my 426 00:24:22,520 --> 00:24:26,120 Speaker 1: town, because Samsung charges repair techs like one thousand dollars 427 00:24:26,160 --> 00:24:29,840 Speaker 1: a year to have access to their repair documentation, and 428 00:24:29,880 --> 00:24:32,280 Speaker 1: so a lot of appliance techs just don't do it 429 00:24:32,320 --> 00:24:33,480 Speaker 1: because it's so expensive. 430 00:24:33,640 --> 00:24:37,800 Speaker 2: Wait, it charges them a thousand dollars a year? Yeah? 431 00:24:37,840 --> 00:24:40,000 Speaker 2: To be able to have a manual? 432 00:24:40,200 --> 00:24:45,040 Speaker 1: Yeah, yeah. Basically the online app diagnostic thing.
There's a 433 00:24:45,040 --> 00:24:48,399 Speaker 1: subscription fee, and there's a special dongle you need to 434 00:24:48,560 --> 00:24:51,119 Speaker 1: connect to the fridge, and so a lot of people 435 00:24:51,280 --> 00:24:52,120 Speaker 1: just don't do it. 436 00:24:53,040 --> 00:24:59,560 Speaker 2: A dongle and a subscription to fix the box that 437 00:24:59,720 --> 00:25:01,280 Speaker 2: keeps your food cold. 438 00:25:02,160 --> 00:25:03,120 Speaker 3: Okay, all right. 439 00:25:03,640 --> 00:25:06,159 Speaker 1: So that's why it won the Repairability Award. It 440 00:25:06,240 --> 00:25:10,040 Speaker 1: won overall Worst in Show because not only does it have 441 00:25:10,080 --> 00:25:14,000 Speaker 1: all these potential repair problems, it also has the enshittification 442 00:25:14,119 --> 00:25:17,760 Speaker 1: potential of ads showing up on your screen. Samsung has 443 00:25:17,840 --> 00:25:21,359 Speaker 1: just done this with their eighteen hundred dollar refrigerators. They 444 00:25:21,520 --> 00:25:24,720 Speaker 1: started putting ads on people's screens in their homes, even 445 00:25:24,720 --> 00:25:27,280 Speaker 1: though you've already paid almost two thousand dollars for your fridge, 446 00:25:27,320 --> 00:25:30,760 Speaker 1: but now it's showing you ads too. And the privacy 447 00:25:31,080 --> 00:25:35,000 Speaker 1: problem of having a bunch of information about you and 448 00:25:35,040 --> 00:25:37,640 Speaker 1: your habits and what you're eating, and, you know, could 449 00:25:37,680 --> 00:25:41,520 Speaker 1: your health insurance company charge you higher premiums because you've 450 00:25:41,560 --> 00:25:43,439 Speaker 1: got too much frozen pizza in your fridge? 451 00:25:43,920 --> 00:25:46,400 Speaker 3: I have so much frozen pizza in my fridge right now. 452 00:25:46,840 --> 00:25:48,119 Speaker 3: I'm not even kidding. 453 00:25:48,200 --> 00:25:49,240 Speaker 1: They're coming for you, Dexter.
454 00:25:51,119 --> 00:25:53,240 Speaker 3: I'm not even playing with you. 455 00:25:53,320 --> 00:25:56,159 Speaker 2: I have, I legitimately have, like six frozen pizzas in 456 00:25:56,200 --> 00:25:57,160 Speaker 2: my freezer right now. 457 00:25:57,400 --> 00:25:59,480 Speaker 1: Hey. You know, when the deal's good, you got to 458 00:25:59,480 --> 00:26:00,960 Speaker 1: buy in bulk. That's it. 459 00:26:00,960 --> 00:26:01,680 Speaker 3: It was a good deal. 460 00:26:02,880 --> 00:26:06,440 Speaker 2: One thing that I noticed is there's some interesting features. 461 00:26:07,080 --> 00:26:11,280 Speaker 2: You can say open sesame and the door opens, like 462 00:26:11,359 --> 00:26:13,840 Speaker 2: you can voice command open the door. How do we 463 00:26:13,880 --> 00:26:14,520 Speaker 2: feel about that? 464 00:26:15,080 --> 00:26:17,640 Speaker 1: The image that they keep showing of this is somebody 465 00:26:17,720 --> 00:26:20,320 Speaker 1: with their hands full, got a big casserole, trying 466 00:26:20,320 --> 00:26:22,119 Speaker 1: to put it in the fridge. You say open sesame, 467 00:26:22,240 --> 00:26:24,840 Speaker 1: the door opens. Okay. When the judges got together, we 468 00:26:24,920 --> 00:26:27,920 Speaker 1: talked about the chance that maybe you're in a wheelchair, 469 00:26:28,040 --> 00:26:29,600 Speaker 1: you want to be able to open the door that way. 470 00:26:29,800 --> 00:26:32,399 Speaker 1: I recognize that some of the time this might be 471 00:26:32,440 --> 00:26:35,840 Speaker 1: a useful feature, but it just introduces so many more 472 00:26:35,880 --> 00:26:39,600 Speaker 1: points of failure. There's now the automatic door opening that's 473 00:26:39,600 --> 00:26:42,320 Speaker 1: built into the hinge of the door mechanism.
That is 474 00:26:42,480 --> 00:26:45,320 Speaker 1: yet another complicated part that you can only get a 475 00:26:45,320 --> 00:26:49,200 Speaker 1: Samsung repair tech to service, and it again adds 476 00:26:49,600 --> 00:26:53,320 Speaker 1: an LLM between you and opening your fridge. And 477 00:26:53,760 --> 00:26:55,760 Speaker 1: they were showing it off on the show floor, and 478 00:26:56,280 --> 00:26:59,800 Speaker 1: it was so noisy in there that the people demoing 479 00:26:59,800 --> 00:27:02,600 Speaker 1: it had to get really close to the door and 480 00:27:03,320 --> 00:27:06,000 Speaker 1: sort of shout into the microphone of the fridge. 481 00:27:06,040 --> 00:27:08,720 Speaker 1: And I'm imagining at a party or something, you know, 482 00:27:08,800 --> 00:27:12,000 Speaker 1: the music's playing loud, people are talking, like, shouting at 483 00:27:12,000 --> 00:27:14,040 Speaker 1: the fridge to get it to open, or 484 00:27:14,000 --> 00:27:16,400 Speaker 2: You got kids yelling in the background and you try 485 00:27:16,440 --> 00:27:18,200 Speaker 2: to, like, you know, I'm trying to get to the fridge. 486 00:27:18,240 --> 00:27:20,840 Speaker 2: I need a drink. Like, what are we doing? Also, 487 00:27:21,320 --> 00:27:23,600 Speaker 2: does this thing have handles? I'm looking at a picture 488 00:27:23,600 --> 00:27:23,840 Speaker 2: of this. 489 00:27:24,160 --> 00:27:26,359 Speaker 1: It doesn't have a handle on the front like normal, 490 00:27:26,400 --> 00:27:30,119 Speaker 1: and you can open it manually from underneath, but it 491 00:27:30,480 --> 00:27:32,879 Speaker 1: worsens the normal manual experience.
492 00:27:34,119 --> 00:27:36,520 Speaker 2: And just to be real here for a second: a voice 493 00:27:36,520 --> 00:27:39,119 Speaker 2: command to open a fridge on paper sounds like it 494 00:27:39,119 --> 00:27:42,000 Speaker 2: could be really helpful for someone who's disabled and needs 495 00:27:42,080 --> 00:27:45,520 Speaker 2: help opening the fridge door sometimes. But if something goes 496 00:27:45,600 --> 00:27:48,600 Speaker 2: wrong and they need to fix the door, now you're 497 00:27:48,640 --> 00:27:51,640 Speaker 2: looking at a really expensive repair. And if you live 498 00:27:51,720 --> 00:27:55,080 Speaker 2: in a rural area without one of those Samsung certified technicians, 499 00:27:55,359 --> 00:27:58,200 Speaker 2: then what, you just can't open the fridge door at all? 500 00:27:58,240 --> 00:28:00,800 Speaker 2: What are you supposed to do? Moving on, though, there's 501 00:28:00,920 --> 00:28:03,000 Speaker 2: one more winner from this year that we didn't get to: 502 00:28:03,400 --> 00:28:06,560 Speaker 2: the Lapro AI Soulmate, which was the winner of this 503 00:28:06,640 --> 00:28:10,560 Speaker 2: year's People's Choice. So it's advertised as, quote, your always 504 00:28:10,600 --> 00:28:14,640 Speaker 2: on three D AI soulmate. Basically, it's an anime girl 505 00:28:14,680 --> 00:28:16,879 Speaker 2: in a cylindrical tube with a camera on it, and 506 00:28:17,119 --> 00:28:20,480 Speaker 2: essentially people were unsettled by a camera that's marketed as 507 00:28:20,560 --> 00:28:21,800 Speaker 2: being always on. 508 00:28:22,400 --> 00:28:23,240 Speaker 3: I get it. I don't know. 509 00:28:23,280 --> 00:28:25,600 Speaker 2: An anime girl in a cylindrical tube? I've seen that almost 510 00:28:25,600 --> 00:28:27,600 Speaker 2: a decade ago in Japan, so it doesn't really move the needle 511 00:28:27,600 --> 00:28:31,080 Speaker 2: for me, but I get it.
So those are this 512 00:28:31,160 --> 00:28:33,920 Speaker 2: year's losers, or winners. And remember what I was saying 513 00:28:33,960 --> 00:28:37,399 Speaker 2: earlier about award shows telling us about what people like, 514 00:28:37,560 --> 00:28:39,680 Speaker 2: or where society is at a certain point in time? 515 00:28:40,160 --> 00:28:42,920 Speaker 2: That's where I think the Worst in Show is a 516 00:28:42,960 --> 00:28:45,680 Speaker 2: little bit different, in that it's actually self aware of that. 517 00:28:46,160 --> 00:28:48,160 Speaker 2: But on top of that, it also can maybe give 518 00:28:48,200 --> 00:28:51,440 Speaker 2: some hints on how we as just regular people can 519 00:28:51,520 --> 00:28:55,000 Speaker 2: push back against stuff that we don't like. We'll get 520 00:28:55,040 --> 00:29:05,280 Speaker 2: into that after the break. I feel like I hear 521 00:29:05,800 --> 00:29:08,440 Speaker 2: a lot of people saying, I don't want this AI 522 00:29:09,000 --> 00:29:11,080 Speaker 2: in my coffee maker. You know, a couple of years ago, 523 00:29:11,080 --> 00:29:12,440 Speaker 2: it would have been, why do you got to put 524 00:29:12,520 --> 00:29:14,680 Speaker 2: NFTs in my whatever 525 00:29:14,720 --> 00:29:15,520 Speaker 3: app? Right? 526 00:29:16,120 --> 00:29:19,760 Speaker 2: Why is it that companies keep making this stuff if 527 00:29:19,760 --> 00:29:22,240 Speaker 2: there are people who are saying that they don't want it? 528 00:29:22,360 --> 00:29:24,120 Speaker 2: I mean, are people like me and you, are we 529 00:29:24,240 --> 00:29:28,280 Speaker 2: just weird and everybody else does want it? Or what's 530 00:29:28,320 --> 00:29:28,920 Speaker 2: happening here? 531 00:29:29,320 --> 00:29:32,160 Speaker 1: I mean, yeah, they are making it because it sells, right? 532 00:29:32,240 --> 00:29:34,640 Speaker 1: They wouldn't make it if it didn't sell. So I 533 00:29:34,680 --> 00:29:37,960 Speaker 1: think that's part of it.
I think there's a lot 534 00:29:38,000 --> 00:29:42,000 Speaker 1: of AI being introduced in places where it doesn't need 535 00:29:42,040 --> 00:29:47,240 Speaker 1: to be. But I think what we give up in 536 00:29:48,160 --> 00:29:51,480 Speaker 1: a lot of that introduction is that now all of 537 00:29:51,520 --> 00:29:54,800 Speaker 1: our stuff has microphones on it and cameras on it, 538 00:29:55,240 --> 00:29:59,040 Speaker 1: and there's all this always on data that's being collected 539 00:29:59,160 --> 00:30:02,480 Speaker 1: about us in every room of our home. You know, if you've 540 00:30:02,520 --> 00:30:05,160 Speaker 1: got the Bosch coffee maker that's listening to you all 541 00:30:05,200 --> 00:30:08,400 Speaker 1: the time, and your refrigerator is watching you and watching 542 00:30:08,440 --> 00:30:12,160 Speaker 1: your food, and your doorbell out front is watching all 543 00:30:12,200 --> 00:30:16,720 Speaker 1: the neighbors. It's kind of mind boggling how dramatic the 544 00:30:16,800 --> 00:30:20,480 Speaker 1: increase in surveillance of us and our stuff and our 545 00:30:20,520 --> 00:30:25,800 Speaker 1: habits has been in technology in the last couple of years. 546 00:30:25,960 --> 00:30:30,480 Speaker 1: And we'll only know over time what that means for 547 00:30:31,440 --> 00:30:34,320 Speaker 1: our independence and our freedoms. 548 00:30:34,960 --> 00:30:37,200 Speaker 2: But there are other trends that are happening in the 549 00:30:37,240 --> 00:30:41,320 Speaker 2: opposite direction, that is, trends coming from us, the consumers. 550 00:30:42,040 --> 00:30:45,320 Speaker 1: I see two things that make me hopeful. And one 551 00:30:45,320 --> 00:30:50,520 Speaker 1: of those things is that there is this rising public 552 00:30:51,600 --> 00:30:55,920 Speaker 1: pushback against this kind of new every year thing. And 553 00:30:56,000 --> 00:30:59,840 Speaker 1: I see the growth of the refurbished marketplace Back Market.
554 00:31:00,160 --> 00:31:03,280 Speaker 1: They're growing hugely, they're getting bigger and bigger, and I 555 00:31:03,280 --> 00:31:04,760 Speaker 1: think a lot of people are saying, you know, I 556 00:31:04,800 --> 00:31:08,400 Speaker 1: don't really need the brand new phone. I'm fine with 557 00:31:08,520 --> 00:31:11,000 Speaker 1: last year's model or the year before that, and why 558 00:31:11,000 --> 00:31:13,520 Speaker 1: don't I buy it used? I can save some money 559 00:31:13,560 --> 00:31:16,280 Speaker 1: and help out the planet a little bit. There is 560 00:31:16,320 --> 00:31:20,680 Speaker 1: a cultural swell of pushback against why does all tech 561 00:31:20,760 --> 00:31:23,120 Speaker 1: have to be new and new every year, and that 562 00:31:23,160 --> 00:31:27,360 Speaker 1: makes me hopeful. And then there's also, globally, some legislation 563 00:31:27,880 --> 00:31:33,000 Speaker 1: that is helping the sealed in batteries thing. I think 564 00:31:33,520 --> 00:31:37,280 Speaker 1: in January twenty twenty seven, I am expecting there to be 565 00:31:37,960 --> 00:31:42,240 Speaker 1: a major difference at CES, because in twenty twenty seven 566 00:31:42,760 --> 00:31:47,320 Speaker 1: the EU replaceable batteries legislation goes into force. So this 567 00:31:47,360 --> 00:31:50,320 Speaker 1: is a law passed in the European Union that 568 00:31:50,400 --> 00:31:54,280 Speaker 1: says that by twenty twenty seven, all portable products sold 569 00:31:54,360 --> 00:31:57,320 Speaker 1: in the EU must have user replaceable batteries. 570 00:31:57,960 --> 00:32:02,240 Speaker 2: User replaceable, so anybody can pop in batteries to replace 571 00:32:02,280 --> 00:32:03,120 Speaker 2: it? 572 00:32:03,320 --> 00:32:06,800 Speaker 1: Exactly. You should be able to change your phone 573 00:32:06,800 --> 00:32:09,880 Speaker 1: battery yourself, change your laptop battery yourself.
There are some 574 00:32:09,960 --> 00:32:13,120 Speaker 1: exceptions, battery powered toothbrushes, for instance; anything that's supposed 575 00:32:13,120 --> 00:32:16,640 Speaker 1: to be used in a quote unquote wet environment isn't covered. 576 00:32:17,120 --> 00:32:19,520 Speaker 1: But something like smart rings, they're gonna have to figure 577 00:32:19,520 --> 00:32:22,520 Speaker 1: out an answer or stop selling in the EU. And 578 00:32:22,560 --> 00:32:23,520 Speaker 1: that's a huge market. 579 00:32:23,920 --> 00:32:28,720 Speaker 3: Wow, okay, I'm looking forward to that. Shoot, me too. Well, 580 00:32:29,640 --> 00:32:31,600 Speaker 3: what do you think the awards will look like next year, then? 581 00:32:31,880 --> 00:32:33,720 Speaker 1: You know, I would love to have a Worst in 582 00:32:33,840 --> 00:32:37,400 Speaker 1: Show where we don't give a sealed in battery award. 583 00:32:37,680 --> 00:32:39,760 Speaker 1: I would love for there to be so few things 584 00:32:39,800 --> 00:32:41,520 Speaker 1: on the market that it's not worth pointing to. 585 00:32:42,240 --> 00:32:45,479 Speaker 2: If we were to take one thing from this year's Worst 586 00:32:45,520 --> 00:32:48,200 Speaker 2: in Show Awards, or the Worst in Show Awards in general, 587 00:32:48,720 --> 00:32:49,560 Speaker 2: what do you think it would be? 588 00:32:50,440 --> 00:32:55,000 Speaker 1: I would like people to be less bought into the 589 00:32:55,040 --> 00:32:59,080 Speaker 1: hype cycle. I would like people to, when something breaks, 590 00:32:59,720 --> 00:33:03,760 Speaker 1: think about whether they can fix it, not, oh well, 591 00:33:03,760 --> 00:33:06,800 Speaker 1: it's time for me to upgrade anyway. I think if 592 00:33:06,880 --> 00:33:12,440 Speaker 1: people recognize that upgrade is, you know, it's brainwashing
593 00:33:12,560 --> 00:33:16,160 Speaker 1: by manufacturers to some extent. Manufacturers have trained us all 594 00:33:16,360 --> 00:33:20,080 Speaker 1: to look for next year's model, look for what are 595 00:33:20,080 --> 00:33:22,800 Speaker 1: the new features out on the market today. And for 596 00:33:22,840 --> 00:33:25,560 Speaker 1: those of us that love technology, it's really 597 00:33:25,880 --> 00:33:30,520 Speaker 1: effective brainwashing, you know. I work in repair, I 598 00:33:30,560 --> 00:33:32,960 Speaker 1: work for a repair company, I say this stuff every day, 599 00:33:33,120 --> 00:33:35,960 Speaker 1: and I still have that thought at first when something breaks, 600 00:33:36,000 --> 00:33:38,240 Speaker 1: like, oh well, I should just get a new one. 601 00:33:38,880 --> 00:33:41,680 Speaker 1: But I think it takes all of us sort of 602 00:33:41,880 --> 00:33:45,200 Speaker 1: pushing back against that subroutine in our own heads, 603 00:33:45,280 --> 00:33:45,560 Speaker 1: you know. 604 00:33:47,640 --> 00:33:49,360 Speaker 3: And that's it. There we have it. 605 00:33:49,480 --> 00:33:51,720 Speaker 2: So I'd actually be curious to see what you thought 606 00:33:51,760 --> 00:33:53,520 Speaker 2: about this year's Worst in Show Awards. 607 00:33:53,720 --> 00:33:55,040 Speaker 3: Did you agree with the results? 608 00:33:55,120 --> 00:33:57,680 Speaker 2: I mean, me personally, I'm not really sure how an 609 00:33:57,680 --> 00:34:01,200 Speaker 2: AI companion beat out the Lollipop Star for People's Choice. 610 00:34:01,240 --> 00:34:04,200 Speaker 2: But you know what, that's what award shows are for: 611 00:34:04,400 --> 00:34:07,400 Speaker 2: seeing the results and then vehemently disagreeing with them in 612 00:34:07,440 --> 00:34:09,359 Speaker 2: the comments section online. 613 00:34:09,440 --> 00:34:11,120 Speaker 3: So if you've got your own take, you
614 00:34:11,080 --> 00:34:12,880 Speaker 2: know what to do, and I'd love to see what 615 00:34:12,920 --> 00:34:15,960 Speaker 2: you think. Thank you one more time to Liz, to 616 00:34:16,000 --> 00:34:18,680 Speaker 2: Repair dot org, and to everyone who participated in the 617 00:34:18,719 --> 00:34:21,719 Speaker 2: Worst in Show Awards, and thank you so much for 618 00:34:21,800 --> 00:34:24,120 Speaker 2: listening to kill switch. You can email us if you 619 00:34:24,160 --> 00:34:27,960 Speaker 2: want at kill switch at Kaleidoscope dot NYC, or on Instagram 620 00:34:28,000 --> 00:34:30,560 Speaker 2: we're at kill switch pod. And if you like the show, 621 00:34:30,640 --> 00:34:33,000 Speaker 2: hopefully you do, you know, think about leaving us a review. 622 00:34:33,200 --> 00:34:35,640 Speaker 2: It helps other people find the show, which helps us 623 00:34:35,800 --> 00:34:38,319 Speaker 2: keep doing our thing. And once you've done that, did 624 00:34:38,360 --> 00:34:40,880 Speaker 2: you know that kill switch is on YouTube? So if 625 00:34:40,920 --> 00:34:43,280 Speaker 2: you want to see video of any of the weird 626 00:34:43,400 --> 00:34:45,880 Speaker 2: products that we talked about today, you can check us 627 00:34:45,880 --> 00:34:49,600 Speaker 2: out at YouTube dot com slash kill switch underscore pod, 628 00:34:50,080 --> 00:34:51,680 Speaker 2: or you can find a link for that in the 629 00:34:51,680 --> 00:34:55,759 Speaker 2: show notes. Kill switch is hosted by me, Dexter Thomas. It's 630 00:34:55,760 --> 00:35:00,160 Speaker 2: produced by Sheena Ozaki, Darluck Potts, and Julian Nutter. Our 631 00:35:00,200 --> 00:35:03,040 Speaker 2: theme song is by me and Kyle Murdach, and Kyle 632 00:35:03,120 --> 00:35:06,560 Speaker 2: also mixes the show. From Kaleidoscope, our executive producers are 633 00:35:06,600 --> 00:35:11,120 Speaker 2: Oz Woloshyn, Mangesh Hattikudur, and Kate Osborne. From iHeart,
our 634 00:35:11,160 --> 00:35:13,960 Speaker 2: executive producers are Katrina Norvell and Nikki 635 00:35:14,080 --> 00:35:14,279 Speaker 3: Ettore. 636 00:35:14,719 --> 00:35:15,600 Speaker 2: Catch you next week. 637 00:35:20,120 --> 00:35:20,880 Speaker 1: Goodbye