Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeart Podcasts, and how the tech are you? So, long time no podcast, y'all. The last time I actually recorded a new episode, it was about CT scanning machines, and I talked about how I had just had my first CT scan. That was on December 30, 2023, and it was all part of a medical emergency I experienced over that last weekend of 2023. But guess what? The very same day that I was recording that episode, I ended up having another emergency, and I had to go right back to the hospital. My blood pressure had spiked again. It got to the point where I was completely incoherent, more so than usual. You know, I realize my coherence is a spectrum; well, this was off of it. And I honestly only remember the second half of what happened at the hospital, once I was moved into the intensive care unit, or ICU. Before that, apparently I spent several hours in the emergency room, but I have no memory of it. I was told I even received a lumbar puncture, which, as I understand it, is pretty painful, but none of that is in my memory. A group of doctors and nurses took very good care of me, both in the ER and the ICU.

Speaker 1: And as I am recording this now, I'm just back from the hospital. I had to go in for a surgical procedure that is connected to all the emergency stuff. The surgery went pretty well. The recovery has been rough, but hopefully this will get me toward being on the right track. It has been a heck of a way to start off 2024. And as I mentioned in that last new episode from me, I am scaling back on the number of shows that I do per week. One contributing factor to the blood pressure issue is stress, and we figured that reducing my stress should be a little bit helpful.
Speaker 1: Now, that's just one part of what I'm doing to get better. It's not like cutting back is going to magically make me healthy. I have to do a lot of other stuff. I'm on medication. I've got a new approach to my diet and exercise, which is well overdue. So it's just one piece of a bigger puzzle. Anyway, enough of all that. One of the many things that I missed out on while I was in the hospital was CES, which is sometimes also referred to as the Consumer Electronics Show, although I think now they just prefer to have it called CES. In case you're not familiar with this event, CES is a huge trade industry conference in the United States in which companies come together in Las Vegas, Nevada, every January to focus on all things in the consumer electronics space and beyond. If we're being honest, consumer electronics is really just one part of it, which might be why they prefer to call it just CES these days. It's at CES where manufacturers and retailers show off products intended to launch within the next year or so. Sometimes they just show off things like prototypes that are never going to see the light of day, but might show off some features that could find their way into products in the future. Various members of the media attend in an effort to get really cool stories. And you've got other industry professionals who go for all sorts of reasons, you know, analysts and such. Most of them are really looking for ways to find free food. At least in my experience, that's kind of what the main priority seems to be.

Speaker 1: And every year at CES, trends emerge. Sometimes these trends will establish themselves and eventually become a pillar in technology. Think about flat-screen televisions. I mean, that's the common form factor now, but once upon a time it wasn't, and when those started to pop up, they eventually did become the standard. Things like Bluetooth connectivity and smart home technology fall into these categories too.
Speaker 1: Now, other times, various industries will push really hard to get a technology established in order to make it a trend, and it ends up going nowhere. The one I always cite is 3D television. When I first started going to CES, which was back in the mid-2000s, all the major companies in the television space were gung ho on 3D televisions. There was a ton of support behind 3D TV within the film and TV industries, you know, because 3D formats would be a lot harder for people to pirate than your typical visual media. So the studios were really gung ho on 3D TV as well, because they were like, well, if this means people can't just steal our stuff and they have to buy it from us, then that means more money. So let's definitely make 3D television a thing. I mean, what better way to crack down on those pesky pirates than to convince them that they all have to watch the stuff in 3D? And you know, a lot of pirates can't even watch stuff in 3D, because they've got eye patches, which means they have a lack of depth perception. That's a parallax joke.

Speaker 1: Anyway, the whole gambit for 3D television didn't work out. Consumers rejected 3D TV. Most folks decided they didn't want to have to wear glasses just to watch television in their homes, or that having yet another component that could go missing was a real hassle. Like, if you've ever been one of those folks who's like, where the heck did the remote control go, imagine doing that with glasses as well. Or they argued that there just wasn't enough compelling material in 3D to justify the investment of purchasing one of these televisions. So after a couple of valiant years of trying to make 3D happen, it followed in the footsteps of fetch and didn't happen.

Speaker 1: This year was no exception; trends emerged again. A few different ones popped up, like transparent OLED television displays.
Speaker 1: There were a few of those, and I'll probably do an episode talking about that in the future. But undeniably, one type of tech really dominated conversation on the show floor this year, and that was artificial intelligence. We all know that AI is a big deal. Companies like Microsoft and Google and Amazon and Apple are all struggling to find ways to incorporate AI into their business models in a way that benefits them, and that might also mean that these companies will make cuts to actual human staff in the process, if they find that the AI can take on some of the load that people would normally carry. And if you read up on any business conference, any conference that has happened in the last year and a half, you're going to see a lot of discussion devoted to AI and how it's going to change everything. And often at these exact same conferences, in these exact same speeches, you're going to find hardly any detail as to how we're going to get there or what strategies companies should employ while forging the path. It just becomes "this will change everything." How? It becomes like the underpants gnomes in South Park: you just have a bunch of question marks, and then at the end it says "profit."

Speaker 1: So essentially, everyone knows that AI is powerful and it's important, but we don't have wide agreement on how it should be developed or deployed. Also, it's not like this was the first time that AI was part of the conversation at CES. In recent years, lots of companies have leaned on AI for all sorts of things, from voice assistants to image recognition to robot navigation. If I'm being honest, I would say the AI on display this year more often seemed to lean on the large language model and generative AI versions of artificial intelligence, the kind of stuff that we've been seeing so much about from companies like OpenAI and Google. And I get it.
Speaker 1: They're very flashy, and they are impressive when they're working properly, but it gives a very narrow view of what artificial intelligence is, and AI is so much more than just large language models and generative AI. But unfortunately, that's harder to sell. So it gets easier if you just kind of reduce it all to one form and say, this is what AI is. I don't think that's very wise, because it's misleading, but you know, I'm one voice in a big crowd.

Speaker 1: Now, there were a couple of major approaches that I saw while reading up on how various companies were positioning AI in their pitches to the media over at CES. The big companies, generally speaking, were actually a little less bullish. They didn't position AI as being a definitive feature of their technologies; that did not seem to be the big thing. It wasn't like a big flashing neon sign saying "this has AI in it." So in their products, AI was often a component that they might mention as contributing to the functionality, but it wasn't positioned as the main event. Now, some smaller companies went in the opposite direction. They developed products that put artificial intelligence front and center, like, "this is AI." And we're still seeing some examples of companies that are shoving AI into their marketing message, even if it seems like they still don't quite have a handle on how AI adds value or functionality, or, in some cases, I'm not entirely convinced that AI is actually part of the whole thing in the first place. Sometimes it just seems like, "we need to put AI in there, because that's a buzzword, and if we don't put it in there, it's going to seem like we're falling behind." But I would argue that, at least in some of the cases I was looking at, "AI" really was a grandiose way of saying whatever it was the product was supposed to do. So let's get started.
Speaker 1: Let's talk about an implementation that I actually think is a pretty good idea, and it's BMW's use of a technique called retrieval-augmented generation. So BMW has a voice assistant that you can get in certain BMW vehicles, and it uses generative AI to respond to your requests. But this voice assistant can't just chat about anything at all. It's not granted access to a limitless selection of topics. So you can't just be on a long road trip and be like, let's start talking about Sartre, or, can we have a deep discussion about Lord of the Rings? It can't do that. Instead, this voice assistant can really only provide information about the vehicle itself. This restriction means that the voice assistant isn't prone to hallucinations or confabulations. That's the tendency for generative AI to just plain make stuff up on occasion. Right? Sometimes AI, lacking information, will make something up, and it sounds like it's reliable, but it turns out it's completely false. This is why AI critics warn that without strong guidelines, AI could manufacture and distribute misinformation in such a way that the misinformation seems reliable. And it's not malicious. It's not that the AI is trying to mislead. It's just trying to answer a question and doesn't have the answer, and, like some other people I know, it's too scared to say "I don't know the answer to that question."

Speaker 1: So BMW built a barrier around their voice assistant's knowledge base to prevent this from happening. They said, well, it's going to be restricted to matters that involve the vehicle itself. The assistant draws upon the power of Amazon's Alexa large language model, and it can interpret what you mean when you ask questions.
Speaker 1: So that way, even if you are not a car person and you don't really know how to frame a question properly, like you don't know what you're asking about, you're just trying to find an answer to something you don't know the answer to, well, this assistant can still try to help you get to where you need to go. Like, what information do you need to know? It can actually ask follow-up questions in order to get a better understanding of what it is you're asking about. So if you don't know what it is you don't know, it can at least ask follow-up questions to try and narrow down what the matter is. And then it can simplify explanations. If you find something too technical to follow, you could say, can you explain that to me in plain English? And it can reduce the complexity and do it.

Speaker 1: Now, the beauty of this is that the voice assistant becomes sort of an interactive owner's manual that's capable of rephrasing passages if they just don't make sense to you, which is great if you've ever flipped through a car owner's manual and encountered stuff where you're like, this doesn't seem like a human being wrote this; I don't understand what they're actually getting at. Well, imagine it could rephrase that so it conveys the meaning to you. That, to me, is incredibly valuable. So, for example, if your vehicle has multiple drive modes and you have no clue which drive mode should be used in any given situation, the assistant can help. You could ask it, and you might find out that if you were to switch to a different driving mode, you're going to get way better performance on the roads that you're currently driving on, and your drive is going to be far more pleasant. Or maybe that by making a few other changes, you can dramatically decrease your fuel consumption and save some money. I think that's a pretty neat idea.
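BMW hasn't published implementation details, so everything in this sketch is a hypothetical stand-in, but the retrieval-augmented generation pattern itself is simple to illustrate: retrieve the most relevant manual passages, then have the language model answer only from those passages, refusing when nothing relevant exists.

```python
# Minimal RAG sketch (the manual text, retriever, and prompt are all
# hypothetical stand-ins). The point: the model only ever answers from
# retrieved passages, which keeps it from inventing off-topic answers.

MANUAL = [
    "Sport mode firms up the suspension and sharpens throttle response.",
    "Eco Pro mode softens acceleration and adjusts climate control to save fuel.",
    "Recommended cold tire pressures are listed on the driver-side door jamb.",
]

def retrieve(question: str, k: int = 2) -> list[str]:
    """Rank passages by crude keyword overlap; real systems use embeddings."""
    words = set(question.lower().split())
    ranked = sorted(MANUAL, key=lambda p: len(words & set(p.lower().split())),
                    reverse=True)
    return ranked[:k]

def answer(question: str) -> str:
    passages = retrieve(question)
    words = set(question.lower().split())
    if not any(words & set(p.lower().split()) for p in passages):
        # Nothing relevant in the manual: refuse instead of guessing.
        return "I can only answer questions about this vehicle."
    # In a real system this prompt would go to the LLM, instructed to
    # answer strictly from the supplied context.
    return ("Answer ONLY from this context:\n" + "\n".join(passages)
            + f"\nQuestion: {question}")

print(answer("Which drive mode saves fuel?"))
```

Ask this sketch about Sartre and the overlap check comes up empty, so it refuses rather than improvising, which is the whole point of the restriction described above.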
Speaker 1: Now, I should say that the description I read about this tool came from a non-interactive demonstration. The reporter whose report I read is Tim Stevens. He wrote a piece for Ars Technica, and he was not allowed to ask any questions of the voice assistant himself, so he could not interact with it. Instead, he had to witness what was almost a kind of performance: BMW representatives ran the whole demo. And it's hard to say, you know, how far along this product is, or how reliable it is, or whether or not you could prompt it to make a mistake, because that often becomes a thing where people will try to get AI tools to mess up to see if, in fact, it's possible, because it's better to find out through testing than to find out through actual use. But I think this is an AI implementation that really makes sense, and I could easily see other companies in other industries making similar use of this technology to make it easier to navigate increasingly complex systems. Not just vehicles, but all sorts of stuff. Like, I can see businesses that are incorporating generative AI doing it in this kind of way, where they are effectively geofencing the AI so that it's only focused on the things that are pertinent to the business. That just makes sense to me. Okay, we're going to take a quick break. When we come back, I'm going to talk a lot more about AI at CES 2024.

Speaker 1: Okay, we're back. So we talked about BMW's approach, which I thought was really cool. I don't know if the tool itself is cool; I just think the strategy makes sense. But let's contrast BMW's rather focused application with a handheld device that is completely dependent upon AI and large language models, or large models, I probably should say, and that is the Rabbit R1. So this gadget got a ton of attention at CES, at least from the media. As I understand it, it did not really go viral over on Twitter.
Speaker 1: I'm not on Twitter anymore, so I just have to take other people's word for it. But whenever I went to any outlet, if I was looking at Wired or The Verge or Ars Technica or anything, there was article after article about this thing, the Rabbit R1. This kind of happens at the show occasionally, right? If you go to CES, sometimes a particular thing will stand out, and it's usually something that's not from a big company. It's usually something that's kind of a surprise, and it often can be a small thing, but it just captures everyone's attention and then it generates tons of articles. Like, in past years, there was a vibrating fork, which was a real thing. It was a fork that had a little haptic motor inside of it, and if it detected that you were eating too quickly, it would vibrate so that the food would fall off your fork, forcing you to eat more slowly. That was the whole concept. That was a big deal one year. The Pebble smartwatch was a huge deal one year, and that doesn't even exist anymore. Fitbit purchased it and then kind of killed it off. So the Pebble shows that these little things can go viral and get everyone's attention and generate a lot of excitement, but that doesn't mean they are guaranteed to stick around. They're not guaranteed to be a success. The Pebble was huge and doesn't exist anymore. So that raises the question: will the Rabbit R1 have better luck than, say, the Pebble?

Speaker 1: Well, let's talk about what it is first. It's a handheld device, kind of squarish in shape, about half the height of your typical smartphone, so it's not as big as a smartphone. Most of the front face of this device is covered with a touch screen. That touch screen typically has a sort of cyber-rabbit icon face on it. To the right of the screen is a little physical scroll wheel, mounted so that you can scroll up or down with it.
Speaker 1: Above that is a forward-facing camera. It has a SIM card port, it's got Wi-Fi and Bluetooth connectivity, it's got a speaker, it's got a microphone, and it's got a few more things. But the physical stuff isn't really where the story is with this device. See, the company behind the Rabbit R1, Rabbit, in other words, says that it wants to create the next generation of computers. But these computers should be able to interface with all the functions we rely on today without us having to deal with the actual applications that provide us those functions. So, for example, rather than having to open up an application or a program and run a function, you could just tell the computer what it was you wanted to have happen, and the computer would do it. These computers should be able to interface with all the functions we rely on today on our behalf. They can act almost like a middleman.

Speaker 1: So, for example, I could get a smartphone and download and install apps on my phone to do all sorts of stuff, like to call a cab or a ride share. I could have an app to order food. I could have a different app so I could listen to music, a different app so I could watch films, or maybe I have apps that are various games. But I have to download and install each of those apps in order to do that. And obviously lots of different companies make these apps, so these interfaces aren't universal, right? So it might actually take me a little while to learn how to navigate those apps properly, because one app may use one kind of interface and another might use a different one. And I don't know if you've had the experience where you've opened up a program and you have that moment where you realize, oh, the shortcut I'm trying to use doesn't work because that's actually for a totally different program. It happens to me all the time. But then I'm also getting old and broken and perhaps senile.
Speaker 1: So maybe that's just a Jonathan problem. But what if, instead of doing all that, my device just took those steps out and interfaced directly with the underlying platforms? What if all I needed to do was to tell my computer, order me a deep dish Chicago pizza to get here by 5 p.m., and the computer, acting like a personal assistant, handled the entire transaction? And what if I could do any kind of transaction that way, not just ordering pizza? Maybe I need to order a ride. Maybe I want to watch the latest episode of True Detective. Maybe I have a list of things I want to do. Like, maybe I want to do a full-fledged vacation, and I need to do things like book flights, secure a hotel room, get tickets for a walking tour I was interested in, and make a dinner reservation at a particular restaurant. What if I just had a gadget, and I just told it all these things I needed it to do, and it took care of all of that for me in one go? Right? That's the kind of idea behind the Rabbit R1. The company says the secret sauce is a system they call a large action model, and it's this model that figures out how to interface with all the different services out there in order to get the result that you desire, whether that's pizza delivery or an update on your health records or whatever it might be. There are a lot of unanswered questions when it comes to this approach, like, how can the product and system ensure privacy and data security, for example? As is often the case at CES, answers to the hard questions didn't really take center stage.
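Rabbit hasn't explained publicly how its large action model works under the hood, so the sketch below is purely hypothetical; it only illustrates the general shape of the idea: parse an utterance into an intent, then hand it to a per-service connector that acts on the user's behalf.

```python
# Hypothetical "large action model" style dispatcher. A real system would
# use a learned model to parse requests and real service integrations to
# act; here both are toy stand-ins to show the overall flow.
from dataclasses import dataclass

@dataclass
class Intent:
    name: str
    slots: dict

def parse(utterance: str) -> Intent:
    """Stand-in for the learned parser: crude keyword routing."""
    text = utterance.lower()
    if "pizza" in text:
        return Intent("order_food", {"item": "deep dish pizza", "by": "5 p.m."})
    if "ride" in text:
        return Intent("book_ride", {"pickup": "current location"})
    return Intent("unknown", {})

# One connector per service, each knowing how to drive that service.
CONNECTORS = {
    "order_food": lambda s: f"Ordering {s['item']}, arriving by {s['by']}",
    "book_ride": lambda s: f"Requesting a ride from {s['pickup']}",
}

def act(utterance: str) -> str:
    intent = parse(utterance)
    handler = CONNECTORS.get(intent.name)
    return handler(intent.slots) if handler else "Sorry, I can't do that yet."

print(act("Order me a deep dish Chicago pizza to get here by 5 p.m."))
```

Note that the hard parts mentioned above, privacy and credential handling for each connected service, would live entirely inside those connectors.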
Speaker 1: In any case, folks got swept up in the idea of a device that could possibly do all these things. And in just over a week, Rabbit had sold out of the pre-order inventory it had set aside, and then it sold out of the next batch it set aside, and then the next one, and it keeps going. At $199 a pop, it's a pricey piece of technology, but admittedly it's significantly cheaper than your typical smartphone. So you could argue, well, if you wanted to free yourself of a smartphone, then maybe you could use this. Now, granted, you can't use this thing as a smartphone. It's not intended to be a smartphone. But you could use it to do all these other tasks.

Speaker 1: Now, some evangelists have already talked about how the device lets you do lots of stuff that we use smartphones for today, but it removes pesky things like notifications and distractions. So if you're someone who's like, I'm so sick of getting text messages and phone calls and all these little pop-ups and stuff, I just want to be able to do what I need to do when I need to do it and then not be bothered, then I could see where you'd really find the appeal of the Rabbit R1. Maybe you go back to having just a phone at home and an answering machine or something similar to that, where you're not carrying your communications device with you everywhere. I think a lot of people, at least on one level, find that attractive. But then you start really thinking about the convenience of having a communications device on you. I mean, that's a strong use case, right? Like, as someone who has been through a couple of medical emergencies recently, I can tell you having a communications device on you is invaluable at times. But you know, it does seem like it would be cool to have a gadget that could handle these things, where you just tell it what you want and it handles all the details.
Speaker 1: That does seem pretty nice. But one YouTuber, Varun Mayya (and I apologize for butchering your name), made some really good points. I saw his YouTube video about the Rabbit R1, and he had some criticisms of it. He said that the Rabbit R1's place in the pantheon of established technology is by no means assured. He argues that the use case for the R1 is pretty limited. If you were to actually look at how much time the typical person spends on their smartphone, and then take another step and say, all right, let's break that down, what are you using that time to do, what are you mostly doing on your smartphone, he argues that you would only see a tiny fraction of that use time going to interfacing with apps in this way, for doing things like ordering, say, a ride-hailing service or food or something. He says, yeah, people do that, but that's like five or six minutes a day. If you were to look at how much time they're spending on their phone, most of their time is spent consuming content, right? Watching videos, listening to podcasts, that sort of stuff, but not booking travel or ordering food or whatever. So his argument is that there's actually not enough of a use case to justify purchasing the R1. Yeah, it might work; whether it does work, we don't know, because we only got to see some demonstrations and stuff. But if it's only taking like five minutes of work a day off your plate, that's hardly valuable, right? It's not like that's a huge load off of you. It's not a big enough departure from the smartphone to necessitate it. Plus, and this is a really good argument:
Speaker 1: I think it stands to reason that smartphone manufacturers are going to experiment with their own approaches to things like large action models, which means you're likely to see Rabbit R1 functionality finding its way into established smartphone lines in the future. Right? Whether it's Apple's Siri or Google's Assistant, that kind of thing, you would expect to see these companies start to build those capabilities into their own tools, which means, well, you don't need the R1 anyway. You're going to still want a phone, and your phone is going to end up doing the same thing the R1 does, so why would you need a separate device to do those things? It's kind of like why most people, not everyone, but most people, don't bother having their own separate MP3 player anymore. Right? You don't need an iPod or anything like that. You can just use your smartphone to access music. So for that reason, most people don't carry around iPods or MP3 players. Why would you carry around a separate AI device?

Speaker 1: Still, it was neat seeing something that was different from everything else that showed up in Las Vegas. It was nice to see something that did not look like everything else. The design also has a sort of retro-cool thing to it. It looks a little clunky. It actually makes me think of the seventies, because it's in bright orange, and I associate the seventies with oranges and browns. It's like a little square box, and I was like, well, this feels like technology as it was imagined in the seventies. It's got this little scroll wheel thing, and yeah, that was cool, a neat idea. And I think the idea of having AI that can interface with these different platforms, I think that's interesting too.
Speaker 1: I do think that there is some value in that. Like, especially if you're on a vacation or something and you want to be able to arrange a bunch of stuff, but you don't want to take up time on your vacation to do it, I could see that being really valuable. So I can definitely see this evolving from here. I don't know that the Rabbit R1 is going to stick around. There's also the question of what happens if Rabbit were to encounter financial difficulties. Like, the company is really young. It launched in, like, October of last year. It has not been around for very long at all, which should also raise some questions about things like, how likely is this device to work? But if it doesn't stick around, then does that mean the R1 just becomes a paperweight? Right? A $200 paperweight? Because if it loses connectivity to the underlying services, well, presumably this device doesn't have a massive amount of processing power on its own. If it's $200, there's no way it's got really high-end components inside it. It must be relying upon cloud- and edge-based computing. If the company that's behind all that goes under, then presumably that functionality goes away. So I don't want to see them fail. I think it would be really cool if they succeed, but I would warn people to consider it carefully before plunking down $200 that they might not ever see again for something that ends up being interesting, but ultimately fuels other companies to create similar tools while the R1 gets lost in the shuffle. But who knows, maybe a year from now, Rabbit's going to become one of the big names in consumer AI.

Speaker 1: All right, we're going to take another quick break. When we come back, I'm going to talk about some other AI components in consumer goods that were shown off at CES, and talk a little bit about whether or not I think they're interesting. But first let's take this quick break.

Speaker 1: Okay, so we're back now. I got a question for you.
Speaker 1: Would you sleep better if you knew that artificial intelligence was in your pillow? Because the Motion Pillow is a pillow that is outfitted with AI, according to the company behind it. So you might say, why is there AI in a pillow? Well, the idea is that this pillow can monitor your sleep and detect when you do things like snore, and snoring indicates that your breathing pathway is obstructed, so you're not getting as much air as your body wants. So if the pillow detects that you're starting to snore, it will activate pumps to inflate little balloons within the pillow to help raise your head in an effort to clear your airway. And it also includes other stuff inside it as well, like sensors that are meant to keep track of how well you sleep. Right? You know, are you staying nice and still? Are you moving around a lot? Are you getting up in the middle of the night? And it can give you sort of a score in the morning indicating how well you slept the night before.

Speaker 1: Now, as someone who very soon is going to have to do a sleep study, because that's another part of the follow-up (they want to check me for sleep apnea), I understand the appeal of a device like this. Right? The idea of, oh, here's something that can help me deal with sleep apnea, and I'll get better rest, and I don't have to worry about all that other stuff, like getting a CPAP machine or anything like that. I actually worry that some folks might lean on technology like this in an effort to bypass the need to consult with physicians about this kind of thing. I do think technology can help, don't get me wrong. I think that having technology to help support a healthy lifestyle is a good idea. I just worry that people will rely on the tech rather than rely on medical assistance, and they won't get the quality and quantity of help that they really need.
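Mechanically, the snore response described above is a simple closed control loop: sample a snore detector, inflate when snoring starts, deflate when it stops. A toy sketch, with the sensor and pump interfaces invented purely for illustration:

```python
# Toy closed-loop snore response (the sensor and pump APIs are made up).
import random
import time

def snore_detected() -> bool:
    """Stand-in for an acoustic snore classifier."""
    return random.random() < 0.3

def run(cycles: int = 10) -> None:
    inflated = False
    for _ in range(cycles):
        snoring = snore_detected()
        if snoring and not inflated:
            print("Snoring detected: inflating air cell to raise the head")
            inflated = True
        elif not snoring and inflated:
            print("Airway sounds clear: deflating")
            inflated = False
        time.sleep(0.1)  # sampling interval, shortened for the demo

run()
```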
Speaker 1: Still, I think it's a neat idea. I don't know how I would feel about plunking down nearly $700 for a pillow, though, because that's how much it costs. At that point, I think I might be thinking more of the CPAP machine, because as much as I don't want to look like a Star Wars alien when I go to bed, I would like to have something that has a good track record, and potentially my insurance could help me pay for it. But you know, I don't know. Maybe the pillow is a really good idea.

Speaker 1: Now, I did mention that the pillow has a lot of components in it that mirror things you find in fitness trackers. Right? Like, there are a lot of fitness trackers out there that you're supposed to wear when you go to sleep, and they'll track your sleep. Well, we saw lots of other fitness trackers at CES 2024. No big surprise; they have been a huge thing there for years now. And this year we saw a couple that were meant for our four-legged friends. Again, not totally new; I've seen this sort of stuff before. But Invoxia introduced a dog collar it calls Minitailz, and it includes stuff like a GPS tracker, very useful if your dog happens to get away. It also has health sensors to monitor things like heart rate and breathing and that kind of stuff. And it includes an AI element, so that if you wanted to check in and see what your puppers was doing while you were, say, off at work, you could open up an app and see if they were sacked out, or if maybe they had a serious case of the zoomies. So the AI's purpose, from what I was reading in the description, is to interpret what your dog is actually doing based upon the data it's gathering.
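Invoxia hasn't published how its model works, so this is just a hypothetical sketch of that interpretation step: fuse heart rate, breathing rate, and GPS speed into a coarse activity label, with every threshold invented for illustration.

```python
# Hypothetical activity interpreter for a pet tracker. A real product would
# use a trained classifier; the thresholds here are purely illustrative.
from dataclasses import dataclass

@dataclass
class Reading:
    heart_bpm: float          # from the heart-rate sensor
    breaths_per_min: float    # from the respiration sensor
    speed_mps: float          # from GPS

def interpret(r: Reading) -> str:
    if r.speed_mps > 3.0 and r.heart_bpm > 120:
        return "zoomies"              # fast movement plus elevated heart rate
    if r.heart_bpm > 120 or r.breaths_per_min > 40:
        return "excited or playing"
    if r.speed_mps < 0.2 and r.heart_bpm < 70:
        return "sleeping"
    return "resting"

print(interpret(Reading(heart_bpm=140, breaths_per_min=50, speed_mps=4.2)))
```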
580 00:33:28,760 --> 00:33:30,640 Speaker 1: So if your dog's heart rate is elevated and if 581 00:33:30,640 --> 00:33:34,960 Speaker 1: its breathing is faster, and GPS is giving indications 582 00:33:34,960 --> 00:33:37,480 Speaker 1: of movement, it could say, oh, well, your dog's running around, right? 583 00:33:37,880 --> 00:33:42,240 Speaker 1: Because it's not like there's, you know, a clear running 584 00:33:42,240 --> 00:33:45,360 Speaker 1: around indicator on there; the AI is making an interpretation. 585 00:33:45,560 --> 00:33:49,400 Speaker 1: So I think that's kind of an interesting concept. It's 586 00:33:49,440 --> 00:33:53,520 Speaker 1: probably one of the fuzzier AI things that we're 587 00:33:53,560 --> 00:33:57,240 Speaker 1: talking about today. This collar's ninety-nine bucks, which for 588 00:33:57,280 --> 00:33:59,800 Speaker 1: a fitness tracker isn't terrible, but it also requires a 589 00:33:59,800 --> 00:34:02,840 Speaker 1: monthly subscription of twenty-five bucks, and that does start 590 00:34:02,880 --> 00:34:04,360 Speaker 1: to pile up; that's three hundred dollars a year on top of the hardware, so call it roughly four hundred bucks all in for year one. And just like I was saying with 591 00:34:04,400 --> 00:34:08,240 Speaker 1: the R1, it makes you worry. If you're paying 592 00:34:08,239 --> 00:34:12,400 Speaker 1: a subscription, that usually indicates that the functionality of the 593 00:34:12,400 --> 00:34:16,520 Speaker 1: device is dependent upon ongoing support from a back end, 594 00:34:17,320 --> 00:34:21,120 Speaker 1: and if that means that the company, for whatever reason, 595 00:34:21,280 --> 00:34:24,480 Speaker 1: has trouble, then you may no longer get that support, 596 00:34:24,520 --> 00:34:28,040 Speaker 1: which means your smart collar just becomes a regular old 597 00:34:28,080 --> 00:34:30,960 Speaker 1: collar, a one-hundred-dollar collar that does 598 00:34:31,000 --> 00:34:33,400 Speaker 1: the same thing as just a regular, you know, fabric 599 00:34:33,440 --> 00:34:37,440 Speaker 1: based collar would do, or leather or whatever. So yeah, again, 600 00:34:37,520 --> 00:34:40,879 Speaker 1: buyer beware. If you really 601 00:34:40,920 --> 00:34:42,640 Speaker 1: want to buy these things, I don't think there's anything 602 00:34:42,680 --> 00:34:46,120 Speaker 1: wrong with that. I just think knowing ahead of time 603 00:34:46,440 --> 00:34:51,520 Speaker 1: the risks involved in purchasing something that could eventually become 604 00:34:52,400 --> 00:34:55,120 Speaker 1: kind of a dumb version of what it once was 605 00:34:55,239 --> 00:34:58,319 Speaker 1: due to a company dropping support is a 606 00:34:58,320 --> 00:35:01,080 Speaker 1: good thing. The gamers out there already know 607 00:35:01,120 --> 00:35:02,880 Speaker 1: this really well, because there are a lot of games 608 00:35:02,920 --> 00:35:07,719 Speaker 1: that over the years have lost features due to companies either 609 00:35:07,760 --> 00:35:10,600 Speaker 1: going under or just stopping support. Now, there were a 610 00:35:10,640 --> 00:35:13,799 Speaker 1: couple of smart mirror devices at CES twenty twenty four 611 00:35:14,040 --> 00:35:16,359 Speaker 1: that leaned a little bit on AI. One of them, 612 00:35:16,560 --> 00:35:20,760 Speaker 1: the BMind Smart Mirror, claimed it could help users 613 00:35:20,800 --> 00:35:23,799 Speaker 1: practice mental wellness activities to improve their state of mind.
614 00:35:24,360 --> 00:35:27,640 Speaker 1: So imagine you're the evil queen in Snow White, and 615 00:35:27,680 --> 00:35:30,040 Speaker 1: you walk up to the magic mirror and you demand 616 00:35:30,040 --> 00:35:32,200 Speaker 1: that it proclaim you the fairest in the land, 617 00:35:32,960 --> 00:35:36,840 Speaker 1: and instead the mirror adjusts the lighting to a more 618 00:35:37,120 --> 00:35:40,800 Speaker 1: soothing color and brightness, and it encourages you to find 619 00:35:40,880 --> 00:35:44,320 Speaker 1: value within yourself rather than to seek validation from others, 620 00:35:44,680 --> 00:35:47,279 Speaker 1: or something. I've only read about this mirror, so I'm 621 00:35:47,280 --> 00:35:49,880 Speaker 1: not entirely sure how it works or how extensive this 622 00:35:49,920 --> 00:35:52,560 Speaker 1: AI actually is. I mean, it could be that it's 623 00:35:52,600 --> 00:35:56,240 Speaker 1: a relatively small library of responses that it can provide 624 00:35:56,280 --> 00:35:59,279 Speaker 1: based upon your prompts, but I don't have any idea 625 00:35:59,320 --> 00:36:03,120 Speaker 1: if it's something that can make more, you know, extensive interpretations. 626 00:36:03,160 --> 00:36:07,480 Speaker 1: That would be really interesting, but I just don't know. Another mirror- 627 00:36:07,560 --> 00:36:10,880 Speaker 1: like gadget is the Anura Magic Mirror, which looked to 628 00:36:10,920 --> 00:36:14,239 Speaker 1: me more like a screen than a mirror, sort of 629 00:36:14,280 --> 00:36:17,200 Speaker 1: like how some people use the forward-facing camera on 630 00:36:17,239 --> 00:36:19,640 Speaker 1: their phone to do stuff like check their makeup and 631 00:36:19,680 --> 00:36:22,440 Speaker 1: that kind of thing. Anyway, this gadget can perform a 632 00:36:22,480 --> 00:36:25,240 Speaker 1: full face scan. Apparently it takes like half a minute, 633 00:36:25,600 --> 00:36:27,919 Speaker 1: and then it analyzes your face to determine a bunch 634 00:36:27,960 --> 00:36:30,399 Speaker 1: of stuff like what your blood pressure and heart rate 635 00:36:30,440 --> 00:36:32,759 Speaker 1: happen to be, that kind of thing (more on how a camera can even do that in a moment). And maybe I 636 00:36:32,760 --> 00:36:35,319 Speaker 1: should get one of these, because you know, I need 637 00:36:35,360 --> 00:36:36,879 Speaker 1: to keep track of that sort of stuff. Or maybe 638 00:36:36,920 --> 00:36:39,160 Speaker 1: I should just keep using my blood pressure cuff. 639 00:36:39,280 --> 00:36:43,000 Speaker 1: Anyway, according to the company, the 640 00:36:43,160 --> 00:36:47,080 Speaker 1: planned customer base isn't the average person. Instead, it's 641 00:36:47,080 --> 00:36:49,839 Speaker 1: stuff like doctors' offices and gyms and that kind 642 00:36:49,920 --> 00:36:53,040 Speaker 1: of thing. We also got more than a few robots 643 00:36:53,080 --> 00:36:56,200 Speaker 1: at CES twenty twenty four. That's to be expected. Every 644 00:36:56,280 --> 00:37:00,319 Speaker 1: year that I've attended, I've seen tons of robots. Some 645 00:37:00,719 --> 00:37:02,640 Speaker 1: end up being fairly simple, like there might be a 646 00:37:02,719 --> 00:37:06,480 Speaker 1: robot that's essentially kind of like a smart shopping cart 647 00:37:06,560 --> 00:37:09,880 Speaker 1: that will follow behind a specific person and act as 648 00:37:09,960 --> 00:37:12,600 Speaker 1: kind of like a little robotic valet and carry their stuff.
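[Editor's note, as promised above: the established way to estimate vitals from nothing but camera footage is remote photoplethysmography, or rPPG, where tiny frame-to-frame color changes in the skin track the pulse. We don't know that this is exactly what the Anura mirror does under the hood, so treat this as a minimal sketch of the general technique, with a synthesized signal standing in for the per-frame skin-color averages a real system would pull from video.]

```python
import numpy as np

FPS = 30          # camera frame rate
SECONDS = 30      # roughly the half-minute scan described above
t = np.arange(FPS * SECONDS) / FPS

# Fake "green channel mean over the face region" with a 72 bpm (1.2 Hz)
# pulse buried in noise; real rPPG extracts this trace from video frames.
signal = 0.02 * np.sin(2 * np.pi * 1.2 * t) + np.random.normal(0, 0.05, t.size)

# FFT the trace, then pick the strongest frequency in a plausible heart-rate band.
freqs = np.fft.rfftfreq(t.size, d=1 / FPS)
power = np.abs(np.fft.rfft(signal - signal.mean()))
band = (freqs > 0.7) & (freqs < 4.0)            # ~42 to 240 bpm
bpm = 60 * freqs[band][np.argmax(power[band])]
print(f"estimated heart rate: {bpm:.0f} bpm")   # should land near 72
```

[Heart rate falls out of an FFT fairly naturally; blood pressure from a face scan rests on much shakier ground, which is one more reason the see-your-doctor caveat applies. Okay, back to the robots.]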
649 00:37:13,120 --> 00:37:15,680 Speaker 1: Others can get much more complicated, or at the very 650 00:37:15,760 --> 00:37:20,399 Speaker 1: least can fulfill more complicated tasks. So this year there 651 00:37:20,440 --> 00:37:24,120 Speaker 1: were a couple that I saw lots of folks mention 652 00:37:24,160 --> 00:37:26,560 Speaker 1: in particular. I mean there were tons of robots, don't 653 00:37:26,560 --> 00:37:29,600 Speaker 1: get me wrong, like hundreds of different types of robots, 654 00:37:29,880 --> 00:37:34,520 Speaker 1: but two that a lot of people specifically wrote features on. 655 00:37:34,520 --> 00:37:38,719 Speaker 1: One came from LG, but I couldn't find, like, a 656 00:37:38,800 --> 00:37:43,200 Speaker 1: real specific official name for it, although the one I 657 00:37:43,239 --> 00:37:47,920 Speaker 1: saw referenced it as the Smart Home AI Agent. That 658 00:37:48,040 --> 00:37:50,520 Speaker 1: just doesn't seem very snappy to me. But it's part 659 00:37:50,600 --> 00:37:55,680 Speaker 1: of LG's Zero Labor Home concept, and it's a very 660 00:37:55,719 --> 00:38:01,040 Speaker 1: cute little robot. It's got two stubby little legs that 661 00:38:01,239 --> 00:38:05,080 Speaker 1: end in wheels, and its body looks like a horizontal cylinder, 662 00:38:05,600 --> 00:38:08,480 Speaker 1: so imagine like a canister on its side, but 663 00:38:08,520 --> 00:38:11,000 Speaker 1: it's being held up by these two legs with wheels 664 00:38:11,000 --> 00:38:12,920 Speaker 1: on the end, and it looks like it's wearing a 665 00:38:12,920 --> 00:38:15,480 Speaker 1: little pair of headphones, and it's got a little digital 666 00:38:15,520 --> 00:38:18,400 Speaker 1: screen with digital eyes in it. I really wish I 667 00:38:18,400 --> 00:38:20,960 Speaker 1: could have seen this thing in person. It's so adorable 668 00:38:20,960 --> 00:38:24,720 Speaker 1: in pictures. So, according to LG, it's essentially a moving 669 00:38:24,960 --> 00:38:28,879 Speaker 1: smart home hub. So it's meant to interface with other 670 00:38:28,960 --> 00:38:31,760 Speaker 1: smart home appliances and such so that you can control 671 00:38:31,800 --> 00:38:34,399 Speaker 1: your home by talking to your plastic pal who's fun 672 00:38:34,440 --> 00:38:37,279 Speaker 1: to be with. Shout out if you happen to get 673 00:38:37,280 --> 00:38:41,200 Speaker 1: that reference. So, this device has facial recognition capabilities, which 674 00:38:41,239 --> 00:38:43,600 Speaker 1: means it can learn to recognize the various members of 675 00:38:43,640 --> 00:38:47,239 Speaker 1: the household. It can monitor the home. It has a 676 00:38:47,239 --> 00:38:50,200 Speaker 1: built-in camera so it can patrol and keep an 677 00:38:50,200 --> 00:38:54,480 Speaker 1: eye on things. It can also check on various factors 678 00:38:54,480 --> 00:38:57,359 Speaker 1: like air temperature, humidity, and air quality within the home 679 00:38:57,400 --> 00:39:01,360 Speaker 1: and alert you if any of those are in ranges 680 00:39:01,360 --> 00:39:04,600 Speaker 1: that are perhaps unhealthy. And it's even supposed to be 681 00:39:04,680 --> 00:39:06,560 Speaker 1: able to figure out if you're in a good mood 682 00:39:06,640 --> 00:39:08,800 Speaker 1: or not.
So the idea is that you get home, 683 00:39:09,120 --> 00:39:12,520 Speaker 1: this little robot rolls up and looks adoringly into your 684 00:39:12,600 --> 00:39:15,160 Speaker 1: face and then tries to figure out if you're happy 685 00:39:15,280 --> 00:39:18,400 Speaker 1: or if you're grouchy or whatever. Then immediately it begins 686 00:39:18,440 --> 00:39:21,960 Speaker 1: to select settings and content to help you out with 687 00:39:22,000 --> 00:39:26,320 Speaker 1: the various smart home appliances in your home. So maybe 688 00:39:26,520 --> 00:39:28,000 Speaker 1: you come home, it looks at you and it can 689 00:39:28,040 --> 00:39:31,000 Speaker 1: tell that you're all stressed out, so it immediately starts to 690 00:39:31,080 --> 00:39:34,120 Speaker 1: set the lights to a kind of calm lower level 691 00:39:34,400 --> 00:39:38,280 Speaker 1: and plays soothing music on a smart speaker and silences 692 00:39:38,280 --> 00:39:40,680 Speaker 1: notifications for the time being so that 693 00:39:40,719 --> 00:39:43,319 Speaker 1: you don't flip out. I have no idea if this 694 00:39:43,360 --> 00:39:45,440 Speaker 1: thing is ever going to be an actual product, 695 00:39:45,680 --> 00:39:49,759 Speaker 1: but I can definitely see the appeal of it. Samsung 696 00:39:49,920 --> 00:39:52,960 Speaker 1: also got some buzz by showing off an update to 697 00:39:53,160 --> 00:39:57,160 Speaker 1: its Ballie robot, which you might pronounce BAL-ee or BAH-lee. 698 00:39:57,200 --> 00:40:00,040 Speaker 1: I watched a video from Samsung that 699 00:40:00,120 --> 00:40:04,640 Speaker 1: had both pronunciations in there. But it's spelled B-A-L-L-I-E. 700 00:40:04,880 --> 00:40:07,200 Speaker 1: I would think it's BAH-lee because it is shaped like 701 00:40:07,200 --> 00:40:11,080 Speaker 1: a ball. Now, Samsung first introduced Ballie back in twenty twenty, 702 00:40:11,400 --> 00:40:14,400 Speaker 1: but this twenty twenty four version has a few extra 703 00:40:14,440 --> 00:40:18,280 Speaker 1: bells and whistles. Now, like I said, it's rather ball-shaped, 704 00:40:18,320 --> 00:40:21,640 Speaker 1: but it uses, again, a pair of thin wheels, one 705 00:40:21,680 --> 00:40:24,120 Speaker 1: on either side of the ball, that help it move 706 00:40:24,200 --> 00:40:27,680 Speaker 1: around the environment. And this new version of Ballie has 707 00:40:27,760 --> 00:40:30,760 Speaker 1: a projector built into it, which allows it to project 708 00:40:31,080 --> 00:40:35,399 Speaker 1: images on floors or walls or ceilings, essentially being able 709 00:40:35,400 --> 00:40:39,320 Speaker 1: to turn any surface into a video screen. So a promotional 710 00:40:39,400 --> 00:40:43,160 Speaker 1: video showed folks using Ballie to create an impromptu video 711 00:40:43,239 --> 00:40:45,920 Speaker 1: calling screen, or to turn a wall into a television, 712 00:40:46,280 --> 00:40:48,560 Speaker 1: or even project stuff on a floor in an effort 713 00:40:48,640 --> 00:40:52,359 Speaker 1: to entertain a golden retriever, which I think is unnecessary 714 00:40:52,360 --> 00:40:55,400 Speaker 1: because we all know golden retrievers have like two brain cells, 715 00:40:55,840 --> 00:40:58,240 Speaker 1: so you really don't need to work that hard anyway.
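[Editor's note: strip away the cute chassis, and both of these home robots are doing a version of the same thing: map some input, like your facial expression or a voice command, to a scene, then push that scene's settings out to connected devices. Here is a minimal sketch of that mood-to-scene idea; the mood detector and device calls are invented placeholders, not LG's or Samsung's actual APIs.]

```python
# Scene presets: each maps an estimated mood to a set of device settings.
SCENES = {
    "stressed": {"lights": "dim warm", "speaker": "soothing playlist", "notifications": "silenced"},
    "happy":    {"lights": "bright",   "speaker": "upbeat playlist",   "notifications": "normal"},
    "neutral":  {"lights": "default",  "speaker": "off",               "notifications": "normal"},
}

def estimate_mood_from_face() -> str:
    """Placeholder for the robot's on-device emotion model."""
    return "stressed"

def apply_scene(scene: dict) -> None:
    """Push each setting out; real code would call the appliance APIs (Matter, SmartThings, etc.)."""
    for device, setting in scene.items():
        print(f"{device}: {setting}")

# You walk in looking frazzled; the house dims the lights and mutes the noise.
apply_scene(SCENES[estimate_mood_from_face()])
```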
716 00:40:58,600 --> 00:41:02,040 Speaker 1: Like LG's robot, Samsung showed off that Ballie is meant 717 00:41:02,080 --> 00:41:05,240 Speaker 1: to interact with smart devices and thus give the robot 718 00:41:05,280 --> 00:41:09,040 Speaker 1: control over appliances and Internet of Things gadgets throughout the home. 719 00:41:09,440 --> 00:41:12,279 Speaker 1: So like the LG one, you could use it to 720 00:41:12,600 --> 00:41:16,640 Speaker 1: do things like adjust the thermostat, or change what's playing 721 00:41:16,680 --> 00:41:19,600 Speaker 1: on the smart speakers, or change the lighting. And, 722 00:41:20,080 --> 00:41:22,319 Speaker 1: like the LG one, it can also patrol and keep 723 00:41:22,320 --> 00:41:24,359 Speaker 1: an eye on things that are going on back 724 00:41:24,400 --> 00:41:26,680 Speaker 1: at home base while you're out and about. So those 725 00:41:26,680 --> 00:41:28,200 Speaker 1: were some of the robots. I mean, like I said, 726 00:41:28,200 --> 00:41:30,080 Speaker 1: there were tons of others. There was one that 727 00:41:30,560 --> 00:41:34,120 Speaker 1: a few people mentioned that was like a combination 728 00:41:34,200 --> 00:41:37,839 Speaker 1: of a couple of different appliances, with one big 729 00:41:37,880 --> 00:41:41,640 Speaker 1: appliance acting as the docking station for a roving 730 00:41:41,760 --> 00:41:44,759 Speaker 1: robot that could mop. It was a washing machine and 731 00:41:45,120 --> 00:41:49,400 Speaker 1: mop combo, and the mop part could wander around the 732 00:41:49,400 --> 00:41:52,640 Speaker 1: house and mop and then dock with the washing machine 733 00:41:52,960 --> 00:41:57,520 Speaker 1: and offload the dirty water through the washing machine's drain 734 00:41:58,000 --> 00:42:00,520 Speaker 1: so that you didn't even have to empty the mop. 735 00:42:00,680 --> 00:42:03,200 Speaker 1: It could fill itself with clean water and empty the 736 00:42:03,280 --> 00:42:06,279 Speaker 1: dirty water, which I think was a pretty cool idea. So 737 00:42:06,440 --> 00:42:08,719 Speaker 1: a lot of different stuff like that. But moving on, 738 00:42:08,880 --> 00:42:12,399 Speaker 1: let's talk about Nvidia, because that was another company that's 739 00:42:12,560 --> 00:42:15,200 Speaker 1: heavily entrenched in AI, and that should come as no 740 00:42:15,280 --> 00:42:20,480 Speaker 1: surprise because it has been manufacturing powerful processors that have 741 00:42:20,560 --> 00:42:23,480 Speaker 1: been tweaked to support AI functionality for the past couple 742 00:42:23,520 --> 00:42:28,080 Speaker 1: of years. Powering artificial intelligence requires a whole lot of oomph, 743 00:42:28,560 --> 00:42:30,759 Speaker 1: and Nvidia has a rep for building chips that are 744 00:42:30,880 --> 00:42:34,000 Speaker 1: very much oomph-centric, whether it's to provide the best 745 00:42:34,040 --> 00:42:37,200 Speaker 1: performance for a state-of-the-art gaming PC or 746 00:42:37,239 --> 00:42:41,400 Speaker 1: a computer system that's running artificial intelligence applications.
The company 747 00:42:41,400 --> 00:42:44,719 Speaker 1: held a special address during CES to talk about how 748 00:42:44,760 --> 00:42:48,440 Speaker 1: its products will power the tech of tomorrow, and it 749 00:42:48,440 --> 00:42:53,960 Speaker 1: can be challenging to walk away from stuff like CES 750 00:42:54,640 --> 00:42:57,080 Speaker 1: and not have a feeling that at least some companies 751 00:42:57,120 --> 00:42:59,680 Speaker 1: are still more than a little wishy-washy when it 752 00:42:59,680 --> 00:43:03,040 Speaker 1: comes to AI. You've got companies like Nvidia that can very 753 00:43:03,080 --> 00:43:07,719 Speaker 1: firmly point at how they support AI functionality, but when 754 00:43:07,719 --> 00:43:10,200 Speaker 1: it comes to the companies that are building the actual 755 00:43:10,280 --> 00:43:15,799 Speaker 1: AI implementations, it gets a little more vague. You might 756 00:43:15,840 --> 00:43:20,280 Speaker 1: have limited implementations, you might have some very loose definitions 757 00:43:20,960 --> 00:43:23,600 Speaker 1: that don't take a very strong stance as to how 758 00:43:23,680 --> 00:43:27,880 Speaker 1: AI is a factor. But we do need to remember 759 00:43:27,920 --> 00:43:31,239 Speaker 1: that artificial intelligence itself is kind of on thin ice 760 00:43:31,320 --> 00:43:33,520 Speaker 1: at the moment. There are governments around the world that 761 00:43:33,560 --> 00:43:35,960 Speaker 1: are taking a very close look at AI and are 762 00:43:35,960 --> 00:43:38,480 Speaker 1: starting to consider the sorts of regulations that may be 763 00:43:38,680 --> 00:43:42,120 Speaker 1: needed to keep AI from going all Terminator on us, 764 00:43:42,480 --> 00:43:45,319 Speaker 1: and companies need to keep that in mind too. It 765 00:43:45,360 --> 00:43:49,759 Speaker 1: may be necessary one day to walk back some AI strategies, 766 00:43:50,000 --> 00:43:54,399 Speaker 1: so diving wholeheartedly into AI tech could end up being 767 00:43:54,480 --> 00:43:56,919 Speaker 1: a costly mistake, and that might be one reason why 768 00:43:56,960 --> 00:44:00,440 Speaker 1: companies are a little slow to do so. It's not 769 00:44:00,719 --> 00:44:02,440 Speaker 1: just that it's hard to figure out how do we 770 00:44:02,520 --> 00:44:04,960 Speaker 1: do this in a way that makes sense. It's also, 771 00:44:05,280 --> 00:44:06,960 Speaker 1: how can we do this in a way where we 772 00:44:07,000 --> 00:44:11,600 Speaker 1: don't overcommit, so that if governments decide to push back 773 00:44:11,680 --> 00:44:15,399 Speaker 1: hard against AI, we haven't gotten into a position where 774 00:44:15,400 --> 00:44:19,000 Speaker 1: we've, you know, over-invested in an area of business 775 00:44:19,040 --> 00:44:23,319 Speaker 1: that ultimately doesn't pan out. So that also opens up 776 00:44:23,440 --> 00:44:27,359 Speaker 1: opportunities for smaller companies like Rabbit to potentially cash in. 777 00:44:27,520 --> 00:44:29,520 Speaker 1: But I'm still not convinced that Rabbit will see much 778 00:44:29,560 --> 00:44:33,080 Speaker 1: success beyond its initial launch. Maybe I'm wrong; we'll have 779 00:44:33,120 --> 00:44:36,120 Speaker 1: to wait and see. It's a really weird situation. We 780 00:44:36,160 --> 00:44:39,640 Speaker 1: already have and use so much technology that has various 781 00:44:39,640 --> 00:44:42,160 Speaker 1: elements of AI built into it. Like, again, AI 782 00:44:42,280 --> 00:44:45,560 Speaker 1: is not new.
You know, your smartphone has AI components 783 00:44:45,560 --> 00:44:49,040 Speaker 1: built into it. It's something that's everywhere all around us. 784 00:44:49,600 --> 00:44:52,080 Speaker 1: It's clear. It's obvious AI is going to be a 785 00:44:52,080 --> 00:44:55,120 Speaker 1: big part of our technology moving forward. There's no denying it. 786 00:44:55,719 --> 00:44:57,440 Speaker 1: But at the same time, I think most of us 787 00:44:57,480 --> 00:45:01,680 Speaker 1: recognize that AI has the potential to do amazing things, 788 00:45:02,040 --> 00:45:05,800 Speaker 1: but potentially also terrible things. So here's hoping that companies 789 00:45:05,840 --> 00:45:08,919 Speaker 1: make the best choices and that our refrigerators don't rise 790 00:45:09,000 --> 00:45:11,640 Speaker 1: up against us, because I'm pretty sure mine could take 791 00:45:11,680 --> 00:45:15,319 Speaker 1: me. Anyway, that's kind of a roundup overview of what 792 00:45:15,320 --> 00:45:18,800 Speaker 1: was going on with AI over at CES. 793 00:45:18,920 --> 00:45:21,600 Speaker 1: As I said, that was just one small part of what 794 00:45:21,680 --> 00:45:24,520 Speaker 1: happened at CES this year. I might do another episode 795 00:45:25,239 --> 00:45:27,719 Speaker 1: talking about some of the technologies that were shown off, 796 00:45:27,760 --> 00:45:31,080 Speaker 1: to go into further detail, like those transparent OLED screens. 797 00:45:31,280 --> 00:45:33,040 Speaker 1: It's something we had been hearing about for a very 798 00:45:33,080 --> 00:45:35,120 Speaker 1: long time, and we had even seen some prototypes in 799 00:45:35,120 --> 00:45:38,200 Speaker 1: the past, but man, they were on display like crazy 800 00:45:38,840 --> 00:45:41,480 Speaker 1: this past year from what I've seen, and I'm kind 801 00:45:41,480 --> 00:45:44,000 Speaker 1: of sad that I missed seeing them in person. Not 802 00:45:44,280 --> 00:45:47,560 Speaker 1: so sad that, you know, I would have traded all 803 00:45:47,640 --> 00:45:52,320 Speaker 1: that wonderful time I spent going to various doctor's appointments. Anyway, 804 00:45:52,400 --> 00:45:55,520 Speaker 1: enough about all that. I hope you are all well. 805 00:45:55,600 --> 00:45:59,839 Speaker 1: I'm so glad to be back recording. I look forward 806 00:45:59,880 --> 00:46:03,840 Speaker 1: to doing that three times a week. I'm thinking about 807 00:46:03,960 --> 00:46:07,320 Speaker 1: news episodes on Fridays and then other just regular TechStuff 808 00:46:07,320 --> 00:46:11,680 Speaker 1: episodes on Mondays and Wednesdays. And it's a pleasure to 809 00:46:11,760 --> 00:46:16,479 Speaker 1: be back in the saddle and recording again. I hope 810 00:46:16,600 --> 00:46:18,520 Speaker 1: you are all well. I hope you all take very 811 00:46:18,520 --> 00:46:21,000 Speaker 1: good care of yourselves, go see your doctors on a 812 00:46:21,040 --> 00:46:24,680 Speaker 1: regular basis. Trust me on this, you don't want to 813 00:46:24,719 --> 00:46:27,080 Speaker 1: fall into the same trap I did, and I'll talk 814 00:46:27,120 --> 00:46:37,760 Speaker 1: to you again really soon. Tech Stuff is an iHeartRadio production. 815 00:46:38,040 --> 00:46:43,080 Speaker 1: For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, 816 00:46:43,200 --> 00:46:48,680 Speaker 1: or wherever you listen to your favorite shows.