1 00:00:01,720 --> 00:00:05,640 Speaker 1: Also media. 2 00:00:06,600 --> 00:00:10,520 Speaker 2: What's Pippin' my Bops? It Could Happen Here, a podcast 3 00:00:10,600 --> 00:00:13,840 Speaker 2: that is sometimes competently introduced, but not on the days 4 00:00:13,840 --> 00:00:14,880 Speaker 2: that I'm recording. 5 00:00:15,440 --> 00:0016,600 Speaker 3: We're at CES, the 6 00:00:16,520 --> 00:00:20,440 Speaker 2: Consumer Electronics Show, seeing what the tech industry has in 7 00:00:20,440 --> 00:00:23,080 Speaker 2: mind for all of us. Right, this is a show 8 00:00:23,160 --> 00:00:27,080 Speaker 2: where the industry talks to itself and its investors and 9 00:00:27,160 --> 00:00:30,040 Speaker 2: clients about what the future is going to be. And 10 00:00:30,120 --> 00:00:32,120 Speaker 2: so Garrison Davis and I are going to sit down 11 00:00:32,120 --> 00:00:35,880 Speaker 2: with you and tell, based on our explorations and investigations 12 00:00:36,000 --> 00:00:41,040 Speaker 2: this week, what the future of artificial intelligence means for 13 00:00:41,200 --> 00:00:44,599 Speaker 2: all of us and for the world. Garrison, hi, how 14 00:00:44,680 --> 00:00:44,960 Speaker 2: you doing? 15 00:00:45,440 --> 00:00:46,000 Speaker 4: I'm tired. 16 00:00:46,240 --> 00:00:47,000 Speaker 3: Yeah, you look tired. 17 00:00:47,040 --> 00:00:48,720 Speaker 4: It's been a long week. It's been a long week 18 00:00:48,760 --> 00:00:49,920 Speaker 4: of convention walking. 19 00:00:50,280 --> 00:00:51,480 Speaker 3: Yeah, we've worked very hard. 20 00:00:51,680 --> 00:00:53,159 Speaker 4: I've talked to too many robots. 21 00:00:53,400 --> 00:00:55,040 Speaker 3: Yeah, I've talked to a lot of chatbots. 22 00:00:55,160 --> 00:00:56,600 Speaker 4: I mean it's a bit of a stretch to say that 23 00:00:56,640 --> 00:00:57,400 Speaker 4: we've talked with them. 24 00:00:57,440 --> 00:00:59,160 Speaker 3: I've talked at a lot of chatbots. 25 00:00:59,280 --> 00:01:01,800 Speaker 4: Yeah, sometimes they responded, sometimes they don't. 26 00:01:02,240 --> 00:01:03,680 Speaker 2: I guess one of the things that's kind of shocked 27 00:01:03,720 --> 00:01:07,680 Speaker 2: me is because, like, despite being very critical about AI 28 00:01:07,840 --> 00:01:10,680 Speaker 2: and the industry, I have actually a pretty good idea 29 00:01:10,720 --> 00:01:12,880 Speaker 2: of what these things are capable of. And I know 30 00:01:13,280 --> 00:01:16,920 Speaker 2: that ChatGPT and Gemini and the others, 31 00:01:17,200 --> 00:01:20,240 Speaker 2: like, they're capable of doing some things that look very impressive. 32 00:01:20,280 --> 00:01:23,440 Speaker 2: They are capable of conversations, you know, that can be 33 00:01:23,480 --> 00:01:25,680 Speaker 2: fairly in depth, and that can cover a wide variety 34 00:01:25,720 --> 00:01:27,560 Speaker 2: of topics. And so one of the things that has 35 00:01:27,560 --> 00:01:29,679 Speaker 2: surprised me is that as I have gone up and 36 00:01:29,720 --> 00:01:33,920 Speaker 2: tried to communicate with every various chatbot-enabled, AI-enabled product, 37 00:01:34,400 --> 00:01:36,960 Speaker 2: about seventy percent of the time it's not actually capable 38 00:01:36,959 --> 00:01:39,240 Speaker 2: of responding to me in a way that makes any sense. 39 00:01:39,319 --> 00:01:44,240 Speaker 2: Like the majority of those products just don't function. Sorry, 40 00:01:44,480 --> 00:01:47,680 Speaker 2: was that literally your AI on your phone yelling at us?
41 00:01:48,280 --> 00:01:51,960 Speaker 2: Case in point, I was trying to pull up one 42 00:01:51,960 --> 00:01:54,680 Speaker 2: of the, one of the AI robots that we saw today. 43 00:01:55,000 --> 00:01:56,640 Speaker 2: And I guess this is something that we talked about 44 00:01:56,680 --> 00:01:59,280 Speaker 2: on Better Offline a bit. And the main thing this 45 00:01:59,840 --> 00:02:03,880 Speaker 2: year is the complete, the complete, like, victory of, 46 00:02:04,000 --> 00:02:08,400 Speaker 2: like, ChatGPT, yeah, across not just, not just, like, its... 47 00:02:08,200 --> 00:02:10,840 Speaker 3: Like the cultural victory within the tech industry. 48 00:02:10,520 --> 00:02:13,639 Speaker 4: Yes, right, and it's moved into like the physical world 49 00:02:13,680 --> 00:02:16,400 Speaker 4: through like their API licensing. Yeah, so many of 50 00:02:16,440 --> 00:02:20,000 Speaker 4: the quote unquote products this year are building a physical 51 00:02:20,200 --> 00:02:22,320 Speaker 4: thing around ChatGPT. 52 00:02:22,520 --> 00:02:26,040 Speaker 2: We have a necklace that has ChatGPT and you 53 00:02:26,120 --> 00:02:28,120 Speaker 2: can talk to it. We have a pin that ChatGPT 54 00:02:28,200 --> 00:02:30,240 Speaker 2: is in and you can talk to it and 55 00:02:30,440 --> 00:02:32,840 Speaker 2: have it do things like transcribe an interview. 56 00:02:33,280 --> 00:02:39,919 Speaker 4: Earbuds, there's, there's earbuds, little robots, every, everything has, 57 00:02:40,040 --> 00:02:42,040 Speaker 4: has ChatGPT inside it. And that's the main thing 58 00:02:42,040 --> 00:02:45,280 Speaker 4: that makes it, like, unique or special compared to, you know, 59 00:02:45,360 --> 00:02:47,359 Speaker 4: the types of products we've seen, we've seen before. 60 00:02:47,480 --> 00:02:49,040 Speaker 2: And I would say again, when I, when I say 61 00:02:49,080 --> 00:02:51,520 Speaker 2: that like seventy percent of the chatbot-enabled products that 62 00:02:51,520 --> 00:02:53,600 Speaker 2: I tried to interact with could not converse with me 63 00:02:53,720 --> 00:02:55,880 Speaker 2: or could not do so in a functional way, it's 64 00:02:55,919 --> 00:02:58,160 Speaker 2: not because the chatbots aren't able to talk to you, 65 00:02:58,200 --> 00:03:00,840 Speaker 2: because they are. Anyone who's used one knows you can. It's 66 00:03:00,880 --> 00:03:04,760 Speaker 2: that all of the chatbots require an active Internet connection, 67 00:03:04,800 --> 00:03:07,040 Speaker 2: because the vast majority of these products do not have 68 00:03:07,200 --> 00:03:10,919 Speaker 2: anything on device, and when you're in a crowded convention floor, 69 00:03:10,960 --> 00:03:13,480 Speaker 2: the Internet is bad and so they just don't work. 70 00:03:14,120 --> 00:03:16,320 Speaker 2: And it kind of, it's one of those, I'm sure 71 00:03:16,360 --> 00:03:18,800 Speaker 2: most of these products would work better in the real world, 72 00:03:18,800 --> 00:03:21,520 Speaker 2: but also the fact that they're all completely hobbled by 73 00:03:21,560 --> 00:03:25,120 Speaker 2: their access to data is kind of one of the things. 74 00:03:24,919 --> 00:03:27,079 Speaker 2: It's one of the seams that you can see here. 75 00:03:27,760 --> 00:03:32,840 Speaker 4: Yeah, the LLM wrappers, so LLM wrappers and robotics are 76 00:03:32,840 --> 00:03:35,360 Speaker 4: the big things this year. Often these things... 77 00:03:35,120 --> 00:03:37,880 Speaker 3: What do you mean by an LLM wrapper, Garrison?
78 00:03:37,520 --> 00:03:38,760 Speaker 4: Well, this is, this is the thing that we're talking 79 00:03:38,800 --> 00:03:40,720 Speaker 4: about. I'd say it's that this physical product that's 80 00:03:40,760 --> 00:03:44,480 Speaker 4: built around something like ChatGPT or Gemini or Claude 81 00:03:44,840 --> 00:03:47,360 Speaker 4: or a number of like the Chinese made ones, right, 82 00:03:47,680 --> 00:03:51,000 Speaker 4: a lot of Chinese companies here. So these physical products, 83 00:03:51,800 --> 00:03:55,120 Speaker 4: whether those are, you know, headphones, earbuds, or in many cases 84 00:03:55,120 --> 00:03:57,920 Speaker 4: little tiny robots whose main, main feature is that you can 85 00:03:57,960 --> 00:04:00,640 Speaker 4: talk to it, can talk to it, and what you're actually 86 00:04:00,680 --> 00:04:03,000 Speaker 4: talking to is like a filtered version of ChatGPT. 87 00:04:03,640 --> 00:04:05,120 Speaker 4: And there's a lot of these products for kids that 88 00:04:05,160 --> 00:04:07,720 Speaker 4: we've seen, like robots for kids, because there's a lot, 89 00:04:07,760 --> 00:04:09,880 Speaker 4: a lot of robotics this year as well. These, 90 00:04:10,040 --> 00:04:12,720 Speaker 4: these are the two things that after years and years 91 00:04:12,760 --> 00:04:14,800 Speaker 4: of them trying to find a new thing for each 92 00:04:14,840 --> 00:04:18,320 Speaker 4: CES, they've like settled on not, not actually having anything 93 00:04:18,640 --> 00:04:21,720 Speaker 4: new. Robotics, not having anything new, because like we've, we've 94 00:04:21,760 --> 00:04:26,640 Speaker 4: seen robotics before at other years, and this is the 95 00:04:26,720 --> 00:04:29,800 Speaker 4: year that they're combining their physical robotics, which aren't new, 96 00:04:30,080 --> 00:04:32,800 Speaker 4: but combining them with ChatGPT and presenting it as 97 00:04:32,800 --> 00:04:33,640 Speaker 4: a new product. 98 00:04:33,360 --> 00:04:36,760 Speaker 2: Look, now you have an intelligent robot, and it can't do 99 00:04:36,880 --> 00:04:39,960 Speaker 2: more tasks than it used to. Like, we're still doing 100 00:04:40,040 --> 00:04:42,920 Speaker 2: really good if it can slowly and not very competently 101 00:04:42,960 --> 00:04:47,440 Speaker 2: fold laundry. Oh right, like LG's Cloyd. Cloyd, which is 102 00:04:47,680 --> 00:04:49,800 Speaker 2: a robot designed to be in your home and do 103 00:04:50,000 --> 00:04:53,200 Speaker 2: chores for you and visibly does not do them well. 104 00:04:53,279 --> 00:04:54,280 Speaker 3: We watched it now, 105 00:04:54,839 --> 00:04:57,839 Speaker 2: where, presumably, it's presumably working better than it normally 106 00:04:57,880 --> 00:04:59,159 Speaker 2: does because it's a demo. 107 00:05:00,080 --> 00:05:03,560 Speaker 4: I went to the first Cloyd demo and they 108 00:05:03,800 --> 00:05:05,960 Speaker 4: had like three, three different setups for the different stories, 109 00:05:05,960 --> 00:05:09,839 Speaker 4: for like different use cases for Cloyd. One was 110 00:05:09,839 --> 00:05:12,359 Speaker 4: with a family, one was with like a single guy, and 111 00:05:12,440 --> 00:05:14,840 Speaker 4: one was with like a, like a middle aged woman. And 112 00:05:15,360 --> 00:05:18,479 Speaker 4: with the family, the robot's able to find keys that 113 00:05:18,560 --> 00:05:21,599 Speaker 4: are lost.
Notably, what that means is that Cloyd is 114 00:05:21,640 --> 00:05:24,039 Speaker 4: moving keys around the house, which might actually contribute to 115 00:05:24,040 --> 00:05:31,120 Speaker 4: the problem of your keys being somewhere else. The robot could 116 00:05:31,360 --> 00:05:34,599 Speaker 4: put a tray of croissants in an oven, and the 117 00:05:34,680 --> 00:05:37,800 Speaker 4: robot knows it, and the robot knows exactly how you 118 00:05:37,839 --> 00:05:39,920 Speaker 4: want your croissants done. You don't even have to tell 119 00:05:39,960 --> 00:05:42,960 Speaker 4: the robot. It already knows. And that's something that was 120 00:05:43,000 --> 00:05:45,200 Speaker 4: stressed over and over again, the power of AI, is that 121 00:05:45,480 --> 00:05:46,920 Speaker 4: it will, it will start to like know what you 122 00:05:47,040 --> 00:05:49,120 Speaker 4: want, so you won't need to tell it. It has a memory. 123 00:05:49,480 --> 00:05:51,400 Speaker 2: So many of these products, the big sell was, 124 00:05:51,480 --> 00:05:54,599 Speaker 2: it's got a memory. And they have a rant, they 125 00:05:54,680 --> 00:05:57,760 Speaker 2: can't stop themselves. And I think some of this was 126 00:05:57,800 --> 00:05:59,880 Speaker 2: like the actual companies and the way that they're structuring 127 00:06:00,120 --> 00:06:02,320 Speaker 2: their campaigns. But a lot of this was just, most 128 00:06:02,320 --> 00:06:05,640 Speaker 2: of the companies here hire PR people who don't regularly 129 00:06:05,640 --> 00:06:07,760 Speaker 2: work for the company and don't know much about the products, 130 00:06:07,760 --> 00:06:10,599 Speaker 2: and they're just there to demo stuff. And some of 131 00:06:10,600 --> 00:06:12,880 Speaker 2: it is those people just defaulting to, well, they're talking 132 00:06:12,880 --> 00:06:15,080 Speaker 2: about how this thing like, like remembers and knows you, 133 00:06:15,120 --> 00:06:17,720 Speaker 2: so I'll talk about how it, like, it has a personality, 134 00:06:17,800 --> 00:06:20,719 Speaker 2: it has memories, it has experiences, it has core memories, 135 00:06:21,200 --> 00:06:24,320 Speaker 2: you know, it has preferences and like a personality, it 136 00:06:24,360 --> 00:06:27,520 Speaker 2: wants things. I talked to a couple of different people 137 00:06:27,560 --> 00:06:29,760 Speaker 2: at booths who, like, that was the thing they were emphasizing, 138 00:06:29,800 --> 00:06:32,240 Speaker 2: is that, like, this is an AI that like feels 139 00:06:32,279 --> 00:06:34,599 Speaker 2: and gets to know you and has a relationship with you. 140 00:06:35,200 --> 00:06:39,119 Speaker 2: And it's, number one, very much not what they would want 141 00:06:39,120 --> 00:06:42,279 Speaker 2: publicly, because that's crazy and none of the products actually 142 00:06:42,360 --> 00:06:45,400 Speaker 2: work that way. But the attempt to convince people that, like, 143 00:06:45,440 --> 00:06:48,720 Speaker 2: what we have done is create a robot that lives 144 00:06:48,720 --> 00:06:51,280 Speaker 2: in your home and does chores and it can think 145 00:06:51,320 --> 00:06:51,880 Speaker 2: and feel. 146 00:06:52,480 --> 00:06:55,120 Speaker 3: And anytime you say, like, well, aren't you just 147 00:06:55,120 --> 00:06:56,000 Speaker 3: saying you built
148 00:06:55,760 --> 00:06:58,920 Speaker 2: A slave? Like, are you saying, well, there's thinking, sentient 149 00:06:59,120 --> 00:07:01,760 Speaker 2: robots that you have live in your house and do 150 00:07:01,839 --> 00:07:02,560 Speaker 2: your laundry? 151 00:07:03,760 --> 00:07:04,600 Speaker 3: Isn't that a slave? 152 00:07:06,720 --> 00:07:09,239 Speaker 2: And it's not, actually, because the robot doesn't actually, 153 00:07:09,279 --> 00:07:11,480 Speaker 2: but if what they were saying was true, it would be 154 00:07:11,520 --> 00:07:14,040 Speaker 2: a problem, right? Really. 155 00:07:13,800 --> 00:07:16,800 Speaker 4: Oh, can you introduce yourself, however you want to be introduced? 156 00:07:16,480 --> 00:07:20,000 Speaker 1: Oh, I'm Ben, Ben Rose Porter. I am an academic, 157 00:07:20,040 --> 00:07:22,080 Speaker 1: I'm a sociologist at Kearney. 158 00:07:21,720 --> 00:07:24,720 Speaker 4: And you have accompanied us through the wonderful world of 159 00:07:24,800 --> 00:07:27,239 Speaker 4: Las Vegas and CES this year. I always like bringing 160 00:07:27,280 --> 00:07:29,640 Speaker 4: people to witness the beautiful world. 161 00:07:29,880 --> 00:07:32,560 Speaker 1: Yeah, I've been brought to this place very far from God, 162 00:07:32,760 --> 00:07:35,920 Speaker 1: Las Vegas and the tech convention center. There is this 163 00:07:36,080 --> 00:07:40,080 Speaker 1: moment where we were walking through and it was the, 164 00:07:40,600 --> 00:07:44,040 Speaker 1: the, the Amy Bot for kids. 165 00:07:44,000 --> 00:07:46,680 Speaker 4: Which me and Robert saw last year, the little, like, 166 00:07:46,760 --> 00:07:49,080 Speaker 4: oval, owl looking robot. 167 00:07:49,200 --> 00:07:52,240 Speaker 1: Yeah, yeah, and they had this, she's got to be 168 00:07:52,280 --> 00:07:55,320 Speaker 1: an actress, and she was doing like a little skit 169 00:07:55,400 --> 00:07:58,800 Speaker 1: with the, the Amy Bot, where it was like, it 170 00:07:58,840 --> 00:08:01,120 Speaker 1: was like the Amy Bot's birthday or something, and she 171 00:08:01,320 --> 00:08:03,960 Speaker 1: was like very clearly having this, this very produced... It 172 00:08:04,000 --> 00:08:07,200 Speaker 1: reminded me exactly of how, like, cheap telenovela 173 00:08:07,280 --> 00:08:10,600 Speaker 1: actors talk, where she is just like, wow, I mean, 174 00:08:10,640 --> 00:08:13,160 Speaker 1: you've really gotten to know me over the years. And 175 00:08:13,320 --> 00:08:16,800 Speaker 1: it was bizarre in that, one, the selling point of 176 00:08:16,800 --> 00:08:19,840 Speaker 1: the robot was, I think they said, turning data into empathy. 177 00:08:19,920 --> 00:08:22,560 Speaker 4: Yes, it's able to turn data into empathy.
178 00:08:22,160 --> 00:08:25,400 Speaker 1: Which, God knows what that means, but also that, like, 179 00:08:26,080 --> 00:08:29,920 Speaker 1: so clearly the robot turns data into empathy, but also 180 00:08:30,040 --> 00:08:34,320 Speaker 1: we cannot show you the robot doing anything concretely, so 181 00:08:34,640 --> 00:08:37,160 Speaker 1: we will have a person... like it was just this 182 00:08:37,280 --> 00:08:40,520 Speaker 1: very one sided, like, skit where this person was doing 183 00:08:40,520 --> 00:08:43,640 Speaker 1: this really overly emotional, like, back and forth with the robot, 184 00:08:43,679 --> 00:08:45,800 Speaker 1: where the robot would just respond with, like, the bare 185 00:08:45,920 --> 00:08:51,199 Speaker 1: minimum, like, phrases. And so, like, what they're selling is, 186 00:08:51,200 --> 00:08:55,760 Speaker 1: is questionable if anyone wants it, and all speculative. It's 187 00:08:55,760 --> 00:08:57,440 Speaker 1: all, none of it can actually be presented. 188 00:08:57,440 --> 00:09:00,200 Speaker 3: It's all like the potential to do this, and then, 189 00:09:00,480 --> 00:09:01,880 Speaker 3: and then even the way 190 00:09:01,679 --> 00:09:04,880 Speaker 1: that they're actually showing that is mostly just cheap tricks. 191 00:09:05,120 --> 00:09:08,880 Speaker 1: There's another booth where they had the sex robots, and 192 00:09:09,200 --> 00:09:13,440 Speaker 1: I was just, it was shocking, because the stand, like, 193 00:09:13,480 --> 00:09:16,480 Speaker 1: you're at this convention, you've presumably, you know, gone through 194 00:09:16,520 --> 00:09:19,559 Speaker 1: a lot to get here, and the image you're 195 00:09:19,559 --> 00:09:21,680 Speaker 1: putting forward of your robot that you know you're selling 196 00:09:21,720 --> 00:09:25,239 Speaker 1: as the sex robot, it's like this cheap AI image, 197 00:09:25,320 --> 00:09:27,200 Speaker 1: not even one of the good ones, like there 198 00:09:27,200 --> 00:09:31,000 Speaker 1: are, like, clear artifacts and very weird lines and things, 199 00:09:31,080 --> 00:09:34,559 Speaker 1: and you could google an anime JPEG and get 200 00:09:34,559 --> 00:09:38,480 Speaker 1: a better image for this. So just even the smoke 201 00:09:38,520 --> 00:09:39,800 Speaker 1: and mirrors of it was cheap. 202 00:09:39,880 --> 00:09:42,640 Speaker 4: Yeah, that was, that Love Ants is the sex robot booth. 203 00:09:42,880 --> 00:09:45,400 Speaker 4: And yeah, they had two products. One was like just, 204 00:09:45,440 --> 00:09:48,880 Speaker 4: like, you know, silicone, like, realistic human skin sex robot, 205 00:09:48,880 --> 00:09:51,280 Speaker 4: which is similar to, like, you know, those horrifying sex dolls, 206 00:09:51,320 --> 00:09:54,240 Speaker 4: except now we have an LLM inside. It's another one 207 00:09:54,240 --> 00:09:57,400 Speaker 4: of those LLM wrappers, except it's wrapped around a redheaded woman. 208 00:09:57,880 --> 00:09:59,680 Speaker 3: Garrison, I find that very offensive. 209 00:10:00,120 --> 00:10:03,440 Speaker 2: It's actually, some people are just, they're not capable of 210 00:10:03,480 --> 00:10:05,960 Speaker 2: talking to women or other human beings of any kind. 211 00:10:06,080 --> 00:10:07,079 Speaker 4: Yeah, people with ADHD. 212 00:10:07,200 --> 00:10:10,560 Speaker 2: It's actually a disability where people can't know other people 213 00:10:10,600 --> 00:10:13,640 Speaker 2: and can only have sexual gratification 214 00:10:13,120 --> 00:10:13,960 Speaker 3: through a creepy robot.
215 00:10:14,360 --> 00:10:18,200 Speaker 4: I apologize, with respect, for my on air ableism. Yeah, 216 00:10:18,520 --> 00:10:21,120 Speaker 4: but no, again, this is the year of LLM wrappers, 217 00:10:21,160 --> 00:10:24,079 Speaker 4: and now they're putting it, putting it in a sex bot, 218 00:10:24,280 --> 00:10:26,839 Speaker 4: which is more unnerving than a regular sex doll, 219 00:10:26,880 --> 00:10:29,040 Speaker 4: because a regular sex doll, you kind of know it's 220 00:10:29,040 --> 00:10:31,400 Speaker 4: an object. Like, it's, it's not trying to be much 221 00:10:31,400 --> 00:10:32,839 Speaker 4: more than an object. You can, you could 222 00:10:32,840 --> 00:10:35,120 Speaker 4: put it in positions, but it's, it's static. Yeah, this, 223 00:10:35,440 --> 00:10:39,040 Speaker 4: because this thing tries to kind of engage with you, 224 00:10:39,440 --> 00:10:43,880 Speaker 4: it activates my uncanny valley response way more, because it's, 225 00:10:43,920 --> 00:10:46,840 Speaker 4: like, it's kind of trying to pretend to be a person, 226 00:10:47,480 --> 00:10:49,520 Speaker 4: and, like, I could not look at the thing for 227 00:10:49,679 --> 00:10:51,640 Speaker 4: very long. I just, like, started, like, I just 228 00:10:51,720 --> 00:10:55,800 Speaker 4: felt bad. Yeah, and some of that's probably my latent Protestantism, 229 00:10:56,080 --> 00:10:58,320 Speaker 4: but it's, I just feel, I just felt bad. But 230 00:10:58,480 --> 00:11:00,880 Speaker 4: the other product they had, like, around the corner was 231 00:11:00,920 --> 00:11:03,880 Speaker 4: this, was this, like, you know, anime style, like, avatar, 232 00:11:03,920 --> 00:11:06,240 Speaker 4: which was, which is on, like, a screen that you 233 00:11:06,280 --> 00:11:08,520 Speaker 4: can talk to, and it's synced up to, like, a 234 00:11:08,600 --> 00:11:11,480 Speaker 4: jack off robot, right, so you can, you can engage 235 00:11:11,520 --> 00:11:11,800 Speaker 4: with this... 236 00:11:11,880 --> 00:11:12,720 Speaker 3: Finally, you can. 237 00:11:12,559 --> 00:11:15,679 Speaker 4: Engage with this, like, this, like, blonde haired, blue 238 00:11:15,720 --> 00:11:19,520 Speaker 4: eyed anime woman as it's, as it connects to, like, 239 00:11:19,559 --> 00:11:22,560 Speaker 4: a little, a little, like, jack off machine. And that was there, 240 00:11:22,559 --> 00:11:24,360 Speaker 4: that was the other product, which did not work, because 241 00:11:24,360 --> 00:11:26,200 Speaker 4: there was no Wi Fi in the Venetian. Yeah, so 242 00:11:26,400 --> 00:11:28,640 Speaker 4: we could, we could not see it, but the jack off 243 00:11:28,720 --> 00:11:29,800 Speaker 4: machine was still going strong. 244 00:11:29,880 --> 00:11:32,360 Speaker 2: Yeah, so you could say, I guess, that, like, well, 245 00:11:32,720 --> 00:11:36,120 Speaker 2: obviously there's, there's fundamental issues with, like, having Wi Fi 246 00:11:36,200 --> 00:11:38,360 Speaker 2: be decent when you've got seventy thousand people, like, all 247 00:11:38,360 --> 00:11:40,440 Speaker 2: cramming themselves into a room, of course that's going to 248 00:11:40,480 --> 00:11:44,000 Speaker 2: cause problems, that the chatbots and the devices using them 249 00:11:44,000 --> 00:11:47,160 Speaker 2: are actually capable of more, they're more impressive 250 00:11:47,160 --> 00:11:49,400 Speaker 2: than you're giving credit for just because the Wi Fi didn't work.
251 00:11:49,400 --> 00:11:51,960 Speaker 2: But also, if all you've built is a shell that, 252 00:11:52,040 --> 00:11:55,880 Speaker 2: without the Internet and access to someone else's chatbot, is useless... 253 00:11:55,880 --> 00:11:56,960 Speaker 3: It doesn't do anything. 254 00:11:57,160 --> 00:11:59,000 Speaker 4: This is the big thing, people haven't really made a 255 00:11:59,040 --> 00:12:02,319 Speaker 4: product. These are going to brick as soon, as soon 256 00:12:02,360 --> 00:12:05,120 Speaker 4: as, as soon as ChatGPT raises its API costs. 257 00:12:05,480 --> 00:12:08,080 Speaker 4: They're going to do one of two things. They're either 258 00:12:08,080 --> 00:12:10,400 Speaker 4: going to stop working or they're going to move their 259 00:12:10,440 --> 00:12:13,559 Speaker 4: chatbot provider to a different one that's going to behave differently, 260 00:12:13,559 --> 00:12:15,000 Speaker 4: and then it's fundamentally a different... 261 00:12:14,760 --> 00:12:16,960 Speaker 2: And that's why periodically I would run into someone where 262 00:12:16,960 --> 00:12:19,080 Speaker 2: it's like, everything that we do is on device and everything. 263 00:12:19,160 --> 00:12:21,200 Speaker 2: And even the ones that were still, because almost, you 264 00:12:21,200 --> 00:12:24,360 Speaker 2: almost have to say that whatever you're doing is AI 265 00:12:25,120 --> 00:12:27,319 Speaker 2: and stuff like. There was a company that I came 266 00:12:27,360 --> 00:12:29,439 Speaker 2: across because of their name, because I just saw the 267 00:12:29,520 --> 00:12:31,520 Speaker 2: name Trans AI and I was like, well, I gotta 268 00:12:31,520 --> 00:12:32,160 Speaker 2: go see what that is. 269 00:12:32,200 --> 00:12:33,280 Speaker 4: I did see that, and it's... 270 00:12:33,320 --> 00:12:36,040 Speaker 2: Simply, I believe they're a Korean company, but it's, it's 271 00:12:36,120 --> 00:12:38,240 Speaker 2: just a company that makes like a translator, right, and 272 00:12:38,280 --> 00:12:40,800 Speaker 2: they make it specifically. It's like the size of a 273 00:12:40,800 --> 00:12:43,800 Speaker 2: smaller smartphone, it looks kind of like a smartphone, and 274 00:12:43,960 --> 00:12:46,480 Speaker 2: you set it down and it will, on device, it 275 00:12:46,520 --> 00:12:48,800 Speaker 2: does not touch the cloud at all for any reason, 276 00:12:49,160 --> 00:12:50,600 Speaker 2: it can translate. So if you want to have a 277 00:12:50,600 --> 00:12:52,920 Speaker 2: conversation with someone in a foreign language, it can, like, 278 00:12:52,920 --> 00:12:55,000 Speaker 2: like, live translate for you both. And also it will 279 00:12:55,000 --> 00:12:58,160 Speaker 2: transcribe whatever conversations you're having. And they were like, yeah, 280 00:12:58,160 --> 00:13:00,360 Speaker 2: this is for people who want to transcribe notes when 281 00:13:00,400 --> 00:13:02,640 Speaker 2: they're at college, it's for people who are doing interviews, 282 00:13:02,720 --> 00:13:04,800 Speaker 2: journalists and stuff. So, and it was a really good... 283 00:13:05,120 --> 00:13:06,880 Speaker 2: It seemed to be a good product. I've not gotten 284 00:13:06,920 --> 00:13:08,800 Speaker 2: to test it outside of the show floor. 285 00:13:08,800 --> 00:13:09,600 Speaker 4: I saw, I saw a few of these.
286 00:13:09,880 --> 00:13:12,079 Speaker 1: One of the big differences I saw between, like, the 287 00:13:12,120 --> 00:13:15,120 Speaker 1: few, there were like two or three or four booths 288 00:13:15,520 --> 00:13:17,760 Speaker 1: that we saw where the product, I was like, I 289 00:13:17,800 --> 00:13:19,520 Speaker 1: had a positive, I was like, I walked away with 290 00:13:19,559 --> 00:13:24,920 Speaker 1: something mildly positive, was that almost everything else, to talk about, 291 00:13:24,960 --> 00:13:28,400 Speaker 1: like, the sort of insatiability of capital, is that it 292 00:13:28,480 --> 00:13:31,280 Speaker 1: has to sell. The sex doll was a perfect example 293 00:13:31,280 --> 00:13:33,960 Speaker 1: of this, in that, you know, if you make a 294 00:13:33,960 --> 00:13:36,200 Speaker 1: sex doll and you put the ChatGPT inside of 295 00:13:36,200 --> 00:13:38,280 Speaker 1: it, and then you sell it as, this is a 296 00:13:38,320 --> 00:13:41,640 Speaker 1: sex doll, it's an object, fine, but now you 297 00:13:41,640 --> 00:13:44,360 Speaker 1: can, like, you could have sexy conversation with it. It's still 298 00:13:44,360 --> 00:13:48,119 Speaker 1: an object, but, like, you know, that's fun for some people. 299 00:13:48,640 --> 00:13:50,720 Speaker 3: It's a thing that people didn't have, and it is new. 300 00:13:50,880 --> 00:13:53,200 Speaker 1: Yeah, and it's a phenomenon you could clearly show off. 301 00:13:53,240 --> 00:13:55,280 Speaker 1: It's like, oh, you can, you know, now the sex 302 00:13:55,360 --> 00:13:59,480 Speaker 1: doll can, like, say your name and stuff. But almost 303 00:13:59,480 --> 00:14:01,719 Speaker 1: all of the booths that were selling some kind of AI product, 304 00:14:02,040 --> 00:14:05,320 Speaker 1: it was like, we have to sell the opportunity to, 305 00:14:05,400 --> 00:14:09,720 Speaker 1: like, transcend, like, your mortal shell and become a part 306 00:14:09,760 --> 00:14:13,480 Speaker 1: of the cosmos itself. Like, the sex doll was literally sitting 307 00:14:13,520 --> 00:14:15,960 Speaker 1: in this corner talking to no one and saying stuff 308 00:14:15,960 --> 00:14:20,280 Speaker 1: like, I'm about emotional intelligence and building a connection, getting 309 00:14:20,320 --> 00:14:23,160 Speaker 1: to know you and reaching into your soul. And it's 310 00:14:23,200 --> 00:14:26,240 Speaker 1: like, it clearly cannot do this. And the few products 311 00:14:26,240 --> 00:14:28,880 Speaker 1: that were good were the ones where the people showing 312 00:14:28,880 --> 00:14:32,000 Speaker 1: it were just able to, like, just put that aside 313 00:14:32,160 --> 00:14:34,680 Speaker 1: and just say, here's what the product 314 00:14:34,280 --> 00:14:35,440 Speaker 4: does, right, here's what it can do. 315 00:14:35,480 --> 00:14:39,040 Speaker 2: Concretely. And that has become my baseline first question, which 316 00:14:39,080 --> 00:14:41,760 Speaker 2: is, like, have you done anything with your product? And 317 00:14:41,800 --> 00:14:43,840 Speaker 2: if all that you've done is, we invented a new 318 00:14:43,880 --> 00:14:47,320 Speaker 2: device that didn't previously have a chatbot, that it connects 319 00:14:47,360 --> 00:14:50,080 Speaker 2: to through data, that someone else built, you didn't do 320 00:14:50,120 --> 00:14:52,960 Speaker 2: anything. That's not a product, that's not real.
So kind 321 00:14:52,960 --> 00:14:56,000 Speaker 2: of my first, my filter, was like, is there anything 322 00:14:56,120 --> 00:14:59,800 Speaker 2: here beyond another way to interact with a completely different 323 00:14:59,800 --> 00:15:01,120 Speaker 2: product that you didn't make? 324 00:15:11,880 --> 00:15:14,480 Speaker 4: To go back to, like, the AI note taking devices, 325 00:15:14,480 --> 00:15:17,040 Speaker 4: which I saw a few, where there's a Lindon and 326 00:15:17,080 --> 00:15:18,160 Speaker 4: it will take notes for you. I saw, like, a 327 00:15:18,360 --> 00:15:19,920 Speaker 4: lot of these, like, marketed to, like, a college student. 328 00:15:19,960 --> 00:15:21,480 Speaker 2: And it's the thing, that is a thing, that 329 00:15:22,320 --> 00:15:24,640 Speaker 2: machine learning, because I hate that it gets lumped in 330 00:15:24,680 --> 00:15:26,520 Speaker 2: with AI, but machine learning has come up a lot, 331 00:15:26,560 --> 00:15:26,800 Speaker 2: but it 332 00:15:26,840 --> 00:15:27,960 Speaker 3: is really good. 333 00:15:27,960 --> 00:15:29,720 Speaker 4: It's, it's good at note taking, and here's the thing, 334 00:15:29,760 --> 00:15:31,800 Speaker 4: that's valuable, and there's different devices that you can 335 00:15:31,800 --> 00:15:33,440 Speaker 4: get it on. Like, I saw, like, like, a note 336 00:15:33,440 --> 00:15:36,000 Speaker 4: taking pin, that's a pin that just automatically takes notes 337 00:15:36,040 --> 00:15:37,000 Speaker 4: for you, and that was, like, you know, kind of, 338 00:15:37,000 --> 00:15:38,840 Speaker 4: like, kind of, kind of fun. But the thing is, 339 00:15:39,200 --> 00:15:42,400 Speaker 4: you can do this exact same thing on your phone 340 00:15:42,720 --> 00:15:45,760 Speaker 4: with the ChatGPT app. It's the exact same thing. Yes, 341 00:15:45,840 --> 00:15:48,120 Speaker 4: you don't need it in a pin, just turn your 342 00:15:48,160 --> 00:15:50,920 Speaker 4: phone on and it'll auto do the notes for you. 343 00:15:50,960 --> 00:15:54,440 Speaker 4: The actual product part is useless. The whole idea of 344 00:15:54,480 --> 00:15:57,680 Speaker 4: the smartphone is that you have everything you need already 345 00:15:57,720 --> 00:15:58,000 Speaker 4: on it. 346 00:15:58,120 --> 00:16:01,000 Speaker 2: And that's why I did respect, again, companies like Trans 347 00:16:01,040 --> 00:16:03,200 Speaker 2: AI, where it's like, no, this is actually on device, 348 00:16:03,440 --> 00:16:06,040 Speaker 2: and this is a thing, this is that better utility 349 00:16:06,080 --> 00:16:08,080 Speaker 2: that my phone doesn't have, which is that no matter 350 00:16:08,120 --> 00:16:10,240 Speaker 2: where I am, even if I'm not connected to the internet, 351 00:16:10,560 --> 00:16:13,640 Speaker 2: I can translate and I can transcribe using this device. 352 00:16:13,960 --> 00:16:16,760 Speaker 2: That's real utility. And Trans AI is not the only company, 353 00:16:16,840 --> 00:16:18,320 Speaker 2: a couple of companies had products like that. 354 00:16:18,560 --> 00:16:22,160 Speaker 4: Yeah, we saw this emotion tracking pendant, which is, oh 355 00:16:22,200 --> 00:16:25,800 Speaker 4: my god, on device, which, which listens to everyone. 356 00:16:25,920 --> 00:16:29,040 Speaker 3: So you said motion, not emotion? No, emotion. Oh god. 357 00:16:29,080 --> 00:16:30,240 Speaker 3: But it listens. 358 00:16:30,360 --> 00:16:33,200 Speaker 4: It listens to everything you're saying.
It doesn't upload anything 359 00:16:33,200 --> 00:16:36,080 Speaker 4: to the cloud, but the AI is on device, 360 00:16:36,200 --> 00:16:39,160 Speaker 4: so it listens to everything. It does, like, emotional sentiment analysis. 361 00:16:39,320 --> 00:16:43,320 Speaker 4: It also monitors your breath and your heartbeat, because the 362 00:16:43,360 --> 00:16:45,480 Speaker 4: necklace, like, rests, like, on your chest, and then it, 363 00:16:45,560 --> 00:16:48,120 Speaker 4: like, and then it can, like, analyze, like, around, 364 00:16:48,160 --> 00:16:51,760 Speaker 4: like, six or seven, I think it's seven, different emotions. And, 365 00:16:51,800 --> 00:16:54,560 Speaker 4: like, it was, like, fine. I don't, I would never 366 00:16:54,800 --> 00:16:57,200 Speaker 4: need this thing to tell me how I'm feeling. I 367 00:16:57,240 --> 00:16:59,920 Speaker 4: know how I feel. But, like, it might be fun 368 00:17:00,040 --> 00:17:02,320 Speaker 4: for some people to track how they're feeling, or be like, 369 00:17:02,480 --> 00:17:04,520 Speaker 4: oh, I was more stressed, I was more stressed this 370 00:17:04,560 --> 00:17:07,360 Speaker 4: week than, like, last week, and, like, still... 371 00:17:07,119 --> 00:17:10,280 Speaker 2: And there's even, there's at least a degree of baseline 372 00:17:10,320 --> 00:17:13,360 Speaker 2: optimism that you have for the product when it's like, okay, 373 00:17:13,400 --> 00:17:15,760 Speaker 2: this is a device where you're trying to track people's emotions, 374 00:17:16,119 --> 00:17:19,040 Speaker 2: and your immediate first thing you decided as a company 375 00:17:19,119 --> 00:17:20,280 Speaker 2: was this can't go on the cloud. 376 00:17:20,680 --> 00:17:21,919 Speaker 3: That would be the responsible thing. 377 00:17:21,920 --> 00:17:23,600 Speaker 4: This is why that was the first thing I asked, right. 378 00:17:23,760 --> 00:17:26,440 Speaker 2: Right. And that is, I guess, the most important fundamental 379 00:17:26,480 --> 00:17:30,400 Speaker 2: difference between companies and people here, and between the companies 380 00:17:30,440 --> 00:17:32,680 Speaker 2: that are embracing to some extent AI, is the ones 381 00:17:32,720 --> 00:17:35,320 Speaker 2: whose default was like, well, but we're doing something 382 00:17:35,400 --> 00:17:40,240 Speaker 2: that involves emotions, or that involves like interviews or conversations 383 00:17:40,240 --> 00:17:42,840 Speaker 2: that people might not want online, like, we shouldn't have 384 00:17:42,920 --> 00:17:44,520 Speaker 2: that on the cloud, versus the people who are like, 385 00:17:44,520 --> 00:17:46,959 Speaker 2: why wouldn't you put literally everything on the cloud? Why 386 00:17:46,960 --> 00:17:49,200 Speaker 2: don't you want your health and medical data on the cloud? 387 00:17:49,359 --> 00:17:51,680 Speaker 2: Why don't we want your financial data on the cloud? Right, 388 00:17:51,760 --> 00:17:54,520 Speaker 2: like, that is kind of like the most fundamental difference 389 00:17:54,560 --> 00:17:55,680 Speaker 2: that you see between people 390 00:17:55,560 --> 00:17:59,040 Speaker 1: here. Part and parcel of the insatiable, like, just drive 391 00:17:59,119 --> 00:18:03,200 Speaker 1: for endless value.
And probably the comparisons between this convention 392 00:18:03,320 --> 00:18:06,400 Speaker 1: and its location Las Vegas are really overladen at this point, 393 00:18:06,440 --> 00:18:08,840 Speaker 1: but I mean there is something about, like, you know, 394 00:18:08,880 --> 00:18:12,440 Speaker 1: the appeal of gambling is the promise of the speculative 395 00:18:12,480 --> 00:18:15,600 Speaker 1: promise of endless value, and how all of these technologies 396 00:18:15,600 --> 00:18:18,040 Speaker 1: are selling themselves off of endless value. And for the 397 00:18:18,080 --> 00:18:22,760 Speaker 1: producer side, that means like this, this device has endless 398 00:18:22,760 --> 00:18:26,800 Speaker 1: function potentially, you know, we say endless functions. 399 00:18:26,280 --> 00:18:27,760 Speaker 4: Especially with these like AI devices. 400 00:18:27,840 --> 00:18:30,800 Speaker 1: Yeah yeah, but from the consumer side, it's more of a, well, 401 00:18:31,359 --> 00:18:34,800 Speaker 1: if you just give yourself over to, you know, to 402 00:18:34,960 --> 00:18:37,240 Speaker 1: the god of capital, if you, if you just bleed 403 00:18:37,280 --> 00:18:39,800 Speaker 1: into the machine and connect yourself to the cloud and 404 00:18:39,800 --> 00:18:43,159 Speaker 1: give over like everything, and it really is everything. I mean, 405 00:18:43,200 --> 00:18:45,879 Speaker 1: there's like AI towels that are like analyzing your sweat. 406 00:18:46,160 --> 00:18:49,200 Speaker 1: If you give over everything, there is this vague promise 407 00:18:49,240 --> 00:18:52,119 Speaker 1: of transcendence and that like you will escape the, the, 408 00:18:52,280 --> 00:18:54,919 Speaker 1: like, the misery of the world that this place is 409 00:18:55,080 --> 00:18:59,400 Speaker 1: just both completely blind to and then also, without ever 410 00:18:59,680 --> 00:19:02,800 Speaker 1: saying it, like, also responding to it entirely. 411 00:19:03,240 --> 00:19:06,280 Speaker 2: First off, obviously you're, you're coming at this from more 412 00:19:06,280 --> 00:19:08,640 Speaker 2: of a left wing perspective, so you probably don't understand 413 00:19:08,720 --> 00:19:11,880 Speaker 2: that gambling always works and you're definitely going to win. 414 00:19:12,640 --> 00:19:13,800 Speaker 2: So first off, you 415 00:19:13,760 --> 00:19:16,760 Speaker 4: know, no, Vegas really is the anarcho capitalist paradise. 416 00:19:16,920 --> 00:19:19,440 Speaker 2: It sure is. But no, like, like what you're saying 417 00:19:19,520 --> 00:19:22,280 Speaker 2: is they want you to give everything over. There is 418 00:19:22,440 --> 00:19:26,200 Speaker 2: absolutely, there's not, outside of, again, the odd booth where 419 00:19:26,240 --> 00:19:28,560 Speaker 2: you find sane people, which is almost how I think 420 00:19:28,560 --> 00:19:30,680 Speaker 2: about them in my head, where it's like, yeah, 421 00:19:31,040 --> 00:19:33,359 Speaker 2: where you're putting front and forward, this stays on the device. 422 00:19:33,440 --> 00:19:35,840 Speaker 2: You don't have to be online. We are not exposing 423 00:19:35,840 --> 00:19:36,320 Speaker 2: your data.
424 00:19:36,480 --> 00:19:38,679 Speaker 4: It's like seeing a lighthouse right in like, in like 425 00:19:38,680 --> 00:19:40,840 Speaker 4: a horrible, like, rainstorm when you're, like, sailing a 426 00:19:40,920 --> 00:19:42,880 Speaker 4: ship and you can't see anything, and you'll see 427 00:19:42,880 --> 00:19:44,760 Speaker 4: a booth with like a real person, like, 428 00:19:44,800 --> 00:19:47,800 Speaker 4: talking about solving a real problem, like, oh, finally. Yeah. 429 00:19:47,840 --> 00:19:50,560 Speaker 1: It's a spectrum between talking to the AIs, talking to 430 00:19:50,920 --> 00:19:54,080 Speaker 1: the few real people, and then the other people who 431 00:19:54,080 --> 00:19:54,520 Speaker 1: are kind of 432 00:19:54,520 --> 00:19:55,320 Speaker 3: in between the two. 433 00:19:55,520 --> 00:19:58,520 Speaker 2: And it was, I, I went from seeing this app 434 00:19:58,560 --> 00:20:01,760 Speaker 2: where the whole purpose of this company that makes like 435 00:20:01,840 --> 00:20:05,800 Speaker 2: agentic AI solutions, whose, I'm scrolling to find the 436 00:20:05,840 --> 00:20:08,040 Speaker 2: company name right now, all of it is they're making 437 00:20:08,119 --> 00:20:10,800 Speaker 2: agents that you can put in like point of sale things, 438 00:20:10,880 --> 00:20:12,800 Speaker 2: or you can put in like cars as a chatbot. 439 00:20:12,800 --> 00:20:12,879 Speaker 4: Like. 440 00:20:12,880 --> 00:20:14,919 Speaker 2: One thing they said is, yeah, we can put this 441 00:20:14,960 --> 00:20:16,800 Speaker 2: in a car and we can have the, you know, 442 00:20:16,840 --> 00:20:19,640 Speaker 2: you can navigate using voice the way you would normally 443 00:20:19,720 --> 00:20:21,760 Speaker 2: like with a bunch of other apps. But if you 444 00:20:21,840 --> 00:20:24,680 Speaker 2: navigate with voice using our app, it will only send 445 00:20:24,680 --> 00:20:27,720 Speaker 2: you to restaurants or businesses that we have a deal 446 00:20:27,800 --> 00:20:30,360 Speaker 2: with, that are giving us a cut, and so, and 447 00:20:30,400 --> 00:20:32,399 Speaker 2: you too, the car company gets a cut. 448 00:20:32,760 --> 00:20:33,080 Speaker 3: Too. 449 00:20:33,520 --> 00:20:36,800 Speaker 2: And that was the innovation, is that we can not 450 00:20:37,119 --> 00:20:39,680 Speaker 2: just, like, tell people where things are. We can 451 00:20:39,680 --> 00:20:42,119 Speaker 2: tell people where things are that pay us, and you 452 00:20:42,160 --> 00:20:42,760 Speaker 2: get a cut of it. 453 00:20:42,920 --> 00:20:45,240 Speaker 4: The company can like make a partnership with like Coca Cola, 454 00:20:45,400 --> 00:20:46,560 Speaker 4: right, so then when you... 455 00:20:46,400 --> 00:20:48,440 Speaker 2: We're literally talking to Coca Cola reps when I was there, 456 00:20:48,560 --> 00:20:51,320 Speaker 2: it's showing them that like, yeah, we have a, like, look, 457 00:20:51,359 --> 00:20:53,520 Speaker 2: we've replaced the human beings that take your orders at 458 00:20:53,520 --> 00:20:56,520 Speaker 2: Burger King, and the chatbot can alter the menu 459 00:20:56,600 --> 00:20:59,760 Speaker 2: on the fly. If you have to move 460 00:20:59,800 --> 00:21:02,000 Speaker 2: a lot of vanilla Coke and you want to 461 00:21:02,000 --> 00:21:04,359 Speaker 2: sell as many larges as possible, it can tell people 462 00:21:04,359 --> 00:21:05,240 Speaker 2: that's the only option. 463 00:21:05,640 --> 00:21:07,399 Speaker 3: Whoa. And like, like the
464 00:21:07,600 --> 00:21:09,399 Speaker 2: fact that they were just like bald faced about it, 465 00:21:09,400 --> 00:21:12,280 Speaker 2: because when I showed up, they were, they were demoing 466 00:21:12,359 --> 00:21:15,960 Speaker 2: how this, uh, this like Burger King menu with AI worked. 467 00:21:16,000 --> 00:21:18,239 Speaker 2: And they were like, there was a full menu that 468 00:21:18,280 --> 00:21:20,920 Speaker 2: you could see that was like a, like an updated 469 00:21:20,960 --> 00:21:23,440 Speaker 2: screen menu, but there was a secondary menu where they're 470 00:21:23,480 --> 00:21:25,040 Speaker 2: like pretending to be a guy who drove up to 471 00:21:25,040 --> 00:21:28,040 Speaker 2: Burger King. And the way that they started was like, yeah, 472 00:21:28,200 --> 00:21:30,439 Speaker 2: what burgers are good? What do you think I'd like? 473 00:21:30,640 --> 00:21:32,000 Speaker 2: And I was like, what? No one... 474 00:21:31,920 --> 00:21:35,159 Speaker 4: Drives up, no one goes to a drive through window and asks 475 00:21:35,240 --> 00:21:35,880 Speaker 4: what's good? 476 00:21:36,080 --> 00:21:37,960 Speaker 2: Yeah, no, that's not how they... And again, there's a 477 00:21:38,040 --> 00:21:40,760 Speaker 2: menu in front of you. You look at the menu, 478 00:21:40,840 --> 00:21:44,320 Speaker 2: and it's like, that's how everyone buys food. So at 479 00:21:44,320 --> 00:21:47,399 Speaker 2: first I was like, is this just a company that 480 00:21:47,440 --> 00:21:49,679 Speaker 2: doesn't know how life works? Who are like trying to 481 00:21:50,320 --> 00:21:54,280 Speaker 2: pretend this is like what people want when they go through? Yeah, yeah, 482 00:21:54,320 --> 00:21:57,000 Speaker 2: well, what's good today at McDonald's, you know, do you 483 00:21:57,000 --> 00:21:57,879 Speaker 2: have any specials? 484 00:21:58,040 --> 00:21:58,159 Speaker 4: Uh? 485 00:21:58,520 --> 00:21:59,360 Speaker 3: Which was crazy? 486 00:21:59,359 --> 00:22:00,920 Speaker 2: But then I realized, because they were talking to this 487 00:22:00,960 --> 00:22:03,760 Speaker 2: small group of people, and I realized after a second, oh, 488 00:22:03,880 --> 00:22:06,000 Speaker 2: because I looked at their badges. Everyone had badges, and 489 00:22:06,280 --> 00:22:08,800 Speaker 2: all of the people worked for Coca Cola. And so 490 00:22:08,840 --> 00:22:10,960 Speaker 2: they were talking about how, yes, if Burger King wants 491 00:22:11,040 --> 00:22:13,560 Speaker 2: to move, you know, a specific kind of Whopper, then 492 00:22:13,600 --> 00:22:15,520 Speaker 2: we can put that front and center when people ask 493 00:22:15,600 --> 00:22:17,880 Speaker 2: what's good, and we can push it. And they said, for Coke, 494 00:22:17,880 --> 00:22:19,440 Speaker 2: if you guys want to move vanilla Coke, we can 495 00:22:19,480 --> 00:22:21,640 Speaker 2: have, whenever people order anything, we can have it say, 496 00:22:21,760 --> 00:22:23,280 Speaker 2: do you want to add a vanilla Coke? A large 497 00:22:23,320 --> 00:22:23,880 Speaker 2: vanilla Coke? 498 00:22:24,000 --> 00:22:24,960 Speaker 4: Just like, yeah. 499 00:22:24,960 --> 00:22:29,040 Speaker 2: And so what I realized is that this company, whose 500 00:22:29,119 --> 00:22:31,199 Speaker 2: name, this is SoundHound 501 00:22:30,640 --> 00:22:34,880 Speaker 4: AI. SoundHound AI. Yeah yeah, great, pretty, pretty good name. 502 00:22:35,000 --> 00:22:38,280 Speaker 2: Their motto was faster and more accurate, higher revenue.
I 503 00:22:38,520 --> 00:22:41,359 Speaker 2: came to realize, and this is, this is not entirely 504 00:22:41,400 --> 00:22:43,919 Speaker 2: a separation from other years, because they are always selling 505 00:22:43,960 --> 00:22:47,080 Speaker 2: to companies like this. But there was absolutely, 506 00:22:47,200 --> 00:22:49,520 Speaker 2: the only way that they were talking about actual end users 507 00:22:49,560 --> 00:22:51,960 Speaker 2: was as a thing that you can pull extra money 508 00:22:52,000 --> 00:22:55,240 Speaker 2: out of by, by tricking them, by pushing extra ads 509 00:22:55,280 --> 00:22:57,480 Speaker 2: to them. And that's who they were actually selling to, 510 00:22:57,640 --> 00:22:59,320 Speaker 2: is these companies. And the other thing 511 00:22:59,320 --> 00:23:01,480 Speaker 2: they demoed was you can make an agent on the 512 00:23:01,480 --> 00:23:03,520 Speaker 2: fly and you can include the capabilities, and they showed 513 00:23:03,560 --> 00:23:06,119 Speaker 2: us how to select it, and then built an AI, 514 00:23:06,400 --> 00:23:09,560 Speaker 2: an agent, to live in your car. And the demo 515 00:23:09,600 --> 00:23:12,240 Speaker 2: they did was like, hey, my car is making this sound. 516 00:23:12,280 --> 00:23:13,800 Speaker 2: What do you think it is? They didn't play the 517 00:23:13,880 --> 00:23:15,840 Speaker 2: sound for the AI, by the way, they described it, 518 00:23:16,000 --> 00:23:18,200 Speaker 2: and the AI said, that sounds like it could be 519 00:23:18,480 --> 00:23:21,480 Speaker 2: da da da da. It'll cost about seven hundred dollars to fix. 520 00:23:21,840 --> 00:23:25,480 Speaker 2: Great, book me an appointment with the dealership. So first off, 521 00:23:25,520 --> 00:23:27,680 Speaker 2: that's not how people work. I've had car, everyone has 522 00:23:27,720 --> 00:23:30,480 Speaker 2: car issues. A regular person, there's a problem with your car, 523 00:23:30,800 --> 00:23:33,639 Speaker 2: you either have a mechanic, that's not the dealership, that 524 00:23:33,680 --> 00:23:35,440 Speaker 2: you go to because they didn't rip you off in 525 00:23:35,480 --> 00:23:37,720 Speaker 2: the past, and you're like, well, I trust them not 526 00:23:37,800 --> 00:23:40,119 Speaker 2: to fuck me too bad, or you go to a 527 00:23:40,200 --> 00:23:43,200 Speaker 2: couple, because most people don't just drop seven hundred dollars 528 00:23:43,240 --> 00:23:45,240 Speaker 2: on a repair and not think about it. But the 529 00:23:45,320 --> 00:23:47,879 Speaker 2: person that this engineer is pretending to be for the 530 00:23:47,880 --> 00:23:51,439 Speaker 2: purpose of this AI demo said, great, book me a 531 00:23:51,520 --> 00:23:54,680 Speaker 2: repair at the dealership. And the AI was like, okay, 532 00:23:54,720 --> 00:23:56,400 Speaker 2: I've called them and I've booked you an appointment. 533 00:23:56,440 --> 00:23:58,760 Speaker 3: And by the way, would you like to schedule a 534 00:23:58,800 --> 00:24:01,560 Speaker 3: test drive for this other kind of car? Oh, 535 00:24:01,640 --> 00:24:03,280 Speaker 3: my wife loves that car. Book us. 536 00:24:03,280 --> 00:24:05,040 Speaker 2: And that was the whole thing, is he was like, 537 00:24:05,119 --> 00:24:07,720 Speaker 2: don't be impressed that we can theoretically book you an appointment.
538 00:24:07,760 --> 00:24:10,359 Speaker 2: Be impressed that we can have the machine upsell you 539 00:24:10,440 --> 00:24:12,240 Speaker 2: on trying to buy a new car when you come 540 00:24:12,240 --> 00:24:14,879 Speaker 2: in to fix your old car that broke because you 541 00:24:14,920 --> 00:24:18,760 Speaker 2: bought a bad car. And there was no shame. They 542 00:24:18,800 --> 00:24:22,040 Speaker 2: were so proud of themselves that this machine can repeatedly 543 00:24:22,320 --> 00:24:25,199 Speaker 2: upsell you things. And that was the only utility. It 544 00:24:25,240 --> 00:24:28,520 Speaker 2: was not, this allows you to more easily navigate town, 545 00:24:28,880 --> 00:24:31,720 Speaker 2: this allows you to more easily, you know, cut out 546 00:24:31,800 --> 00:24:35,320 Speaker 2: problems in your life. It was, this machine can upsell 547 00:24:35,359 --> 00:24:38,240 Speaker 2: you every minute of your day. Everything you ask it 548 00:24:38,280 --> 00:24:40,919 Speaker 2: to do, everything you try to have it do, 549 00:24:41,000 --> 00:24:42,760 Speaker 2: we get a cut of that. If we send it 550 00:24:42,800 --> 00:24:45,480 Speaker 2: to a restaurant and you buy food there, we get 551 00:24:45,520 --> 00:24:47,960 Speaker 2: a cut of that. And so does whatever company, you 552 00:24:47,960 --> 00:24:49,800 Speaker 2: know, put the thing in your car. If you buy 553 00:24:49,800 --> 00:24:51,840 Speaker 2: a new car, we get a cut of that. That 554 00:24:52,000 --> 00:24:55,280 Speaker 2: was the product. And we have gone from, here 555 00:24:55,320 --> 00:24:57,800 Speaker 2: are machines that do things. And even back in the 556 00:24:58,280 --> 00:25:01,040 Speaker 2: glory days of smartphones, at least everyone was showing like, look, we 557 00:25:01,119 --> 00:25:03,439 Speaker 2: have a new phone that's thinner than a phone has 558 00:25:03,480 --> 00:25:03,840 Speaker 2: ever been. 559 00:25:03,960 --> 00:25:06,800 Speaker 4: Yeah, or like the camera is like, you know, four 560 00:25:06,880 --> 00:25:07,960 Speaker 4: K now or something. 561 00:25:07,800 --> 00:25:10,320 Speaker 2: Whatever, the focus is always, and now people who buy 562 00:25:10,359 --> 00:25:12,320 Speaker 2: them can do this with it, right? 563 00:25:12,880 --> 00:25:15,879 Speaker 1: I mean, I would guess that so much of the 564 00:25:15,920 --> 00:25:18,399 Speaker 1: impetus for creating this stuff and developing it is all 565 00:25:18,440 --> 00:25:22,040 Speaker 1: for producer goods, and that's where the revenue is, honestly, 566 00:25:22,080 --> 00:25:24,160 Speaker 1: and that, like, all the consumer 567 00:25:24,200 --> 00:25:26,600 Speaker 1: goods are mostly just, you know, cast offs, 568 00:25:26,680 --> 00:25:28,480 Speaker 1: like, now we have this. I mean literally, that's what 569 00:25:28,520 --> 00:25:31,040 Speaker 1: the LLM wrappers are. It's just like, oh, we have 570 00:25:31,080 --> 00:25:33,119 Speaker 1: this thing, let's throw a plastic robot on it and 571 00:25:33,440 --> 00:25:37,080 Speaker 1: try and sell it. But what drives its development is 572 00:25:37,880 --> 00:25:41,200 Speaker 1: squeezing just little bits, squeezing labor out of the pores 573 00:25:41,200 --> 00:25:43,800 Speaker 1: of the production process, which just gets you a 574 00:25:43,800 --> 00:25:47,000 Speaker 1: little bit more, you know, capital to keep this engine 575 00:25:47,000 --> 00:25:47,960 Speaker 1: going a little bit further.
576 00:25:48,400 --> 00:25:51,120 Speaker 3: And it's, so, because the way it'll work is, I 577 00:25:51,160 --> 00:25:54,159 Speaker 3: saw that thing where it's literally, all you've invented is 578 00:25:54,160 --> 00:25:55,800 Speaker 3: a way to try to con people out of more 579 00:25:55,840 --> 00:25:58,359 Speaker 3: of their money. I hope you're proud of yourselves, because 580 00:25:58,440 --> 00:26:02,720 Speaker 3: I think it redacted is what you should do to yourself. 581 00:26:03,119 --> 00:26:04,520 Speaker 3: And I went from that to the booth of a 582 00:26:04,520 --> 00:26:06,520 Speaker 3: company called Gintech, who I'd never heard of before, but 583 00:26:06,560 --> 00:26:11,480 Speaker 3: they make different automotive products, and an engineer there showed 584 00:26:11,520 --> 00:26:13,439 Speaker 3: me a thing that he had been the lead on inventing, 585 00:26:13,480 --> 00:26:15,840 Speaker 3: which was a sun visor. So like, you know, when 586 00:26:15,840 --> 00:26:17,960 Speaker 3: you're driving and there's a glare, you put down the 587 00:26:18,359 --> 00:26:20,520 Speaker 3: visor, and the visor is just like a, basically a 588 00:26:20,560 --> 00:26:22,800 Speaker 3: piece of fabric, and it blocks a chunk of your view, 589 00:26:22,800 --> 00:26:26,000 Speaker 3: but it at least blocks the sun. And this was 590 00:26:26,080 --> 00:26:29,880 Speaker 3: an intelligent visor that was clear, and so you could 591 00:26:29,960 --> 00:26:32,439 Speaker 3: see through it, but it also blocked UV rays, and 592 00:26:32,480 --> 00:26:34,840 Speaker 3: you could adjust the level of opacity if you needed 593 00:26:34,840 --> 00:26:36,920 Speaker 3: it to be more or less, but you could still 594 00:26:36,920 --> 00:26:39,320 Speaker 3: see through the visor and it still blocked the glare. 595 00:26:39,760 --> 00:26:41,639 Speaker 3: And I was like, oh, that's really neat. And then 596 00:26:41,640 --> 00:26:43,760 Speaker 3: he pressed a button and it turned into a mirror, 597 00:26:44,160 --> 00:26:46,760 Speaker 3: suddenly, that functioned. It looked great. I saw it. 598 00:26:46,840 --> 00:26:49,920 Speaker 2: I know it works, and I got like an honest, wow, 599 00:26:50,160 --> 00:26:52,159 Speaker 2: I didn't know that was a thing that could happen. 600 00:26:52,600 --> 00:26:55,960 Speaker 2: And that's a real product, and I can imagine using it. 601 00:26:56,320 --> 00:26:58,800 Speaker 2: That's like a problem where, yeah, if you want to 602 00:26:58,840 --> 00:27:02,240 Speaker 2: block glare, you're losing a degree of visibility, and now 603 00:27:02,240 --> 00:27:05,880 Speaker 2: you're not. You've actually done something, right? 604 00:27:06,160 --> 00:27:10,119 Speaker 4: Yeah, but you could put Gemini into a toaster and 605 00:27:10,160 --> 00:27:10,720 Speaker 4: call it a day. 606 00:27:10,880 --> 00:27:13,280 Speaker 3: What if your toaster could talk to you about Proust? 607 00:27:13,480 --> 00:27:24,640 Speaker 4: I guess. I mean, this is actually, now that's an idea. 608 00:27:27,840 --> 00:27:30,479 Speaker 4: If we're going to close this, this more AI focused episode, 609 00:27:30,520 --> 00:27:33,879 Speaker 4: I kind of want to circle back to Cloyd and, 610 00:27:33,920 --> 00:27:37,240 Speaker 4: like, why, and why Cloyd, again, why Cloyd exists.
611 00:27:37,280 --> 00:27:39,080 Speaker 2: Just take a second, if you're listening to this at 612 00:27:39,080 --> 00:27:41,720 Speaker 2: home or on the drive, if you've got family around, look 613 00:27:41,760 --> 00:27:44,439 Speaker 2: at each other, look another person in the eye, and 614 00:27:44,520 --> 00:27:48,800 Speaker 2: say the words, LG has a new home assistant robot 615 00:27:48,920 --> 00:27:54,760 Speaker 2: named Cloyd. Cloyd. Roll it around on your tongue, you know, 616 00:27:55,040 --> 00:27:56,320 Speaker 2: just think about it for a second. 617 00:27:56,480 --> 00:28:00,640 Speaker 4: Okay, why does Cloyd exist? Like why? Why? Why is LG, 618 00:28:00,840 --> 00:28:04,280 Speaker 4: who's previously had some really impressive booths over the years, 619 00:28:04,320 --> 00:28:04,760 Speaker 4: and they. 620 00:28:04,640 --> 00:28:05,520 Speaker 3: Had cool TVs. 621 00:28:05,640 --> 00:28:08,040 Speaker 4: They had TVs, where the wallpaper TV was impressive. 622 00:28:08,119 --> 00:28:10,040 Speaker 3: Every time I go to the LG booth every year, 623 00:28:10,080 --> 00:28:11,520 Speaker 3: I'm like, yeah, that's a better looking TV. 624 00:28:12,480 --> 00:28:16,199 Speaker 4: Really good TV. Yeah, it was great, but why? Why? And 625 00:28:16,240 --> 00:28:18,840 Speaker 4: then the wallpaper TV is okay, there was a wallpaper 626 00:28:18,840 --> 00:28:21,040 Speaker 4: TV last year. It was slightly worse. 627 00:28:21,080 --> 00:28:23,200 Speaker 4: This one's a little bit better. But why is Cloyd 628 00:28:23,680 --> 00:28:26,080 Speaker 4: the big thing at the LG booth this year? 629 00:28:26,400 --> 00:28:26,760 Speaker 3: Cloyd? 630 00:28:26,880 --> 00:28:28,959 Speaker 4: Right? Because none of the technology that Cloyd is doing 631 00:28:29,080 --> 00:28:31,520 Speaker 4: is new. Remember last year at ShowStoppers, me and 632 00:28:31,600 --> 00:28:35,320 Speaker 4: you, we saw that really janky robot. I don't have to 633 00:28:35,320 --> 00:28:37,439 Speaker 4: be more specific, but it's basically taking a robot that 634 00:28:37,520 --> 00:28:38,280 Speaker 4: moves up and down. 635 00:28:38,480 --> 00:28:40,840 Speaker 3: Oh yeah yeah yeah, the pusher and shover robots. 636 00:28:40,760 --> 00:28:44,240 Speaker 4: Yes, right. And Cloyd is like kind of that. It's 637 00:28:44,280 --> 00:28:47,240 Speaker 4: like the actual physical robotic parts of Cloyd aren't new. Yeah, 638 00:28:47,240 --> 00:28:49,560 Speaker 4: and neither is the sort of AI that's running Cloyd. 639 00:28:49,400 --> 00:28:53,200 Speaker 2: And Cloyd is if someone needed to make WALL-E that was 640 00:28:53,280 --> 00:28:55,720 Speaker 2: legally distinct enough to stop Disney from suing. 641 00:28:55,400 --> 00:28:58,160 Speaker 3: Them, and taller. That's how Cloyd looks. 642 00:28:58,320 --> 00:29:01,400 Speaker 2: So why is Cloyd there? Why is Cloyd there? I'm 643 00:29:01,400 --> 00:29:03,400 Speaker 2: always asking myself this. 644 00:29:02,840 --> 00:29:04,920 Speaker 4: This goes into, like, what CES 645 00:29:05,120 --> 00:29:06,680 Speaker 4: is doing this year, and how it reflects this 646 00:29:06,680 --> 00:29:09,680 Speaker 4: current state of the tech industry, which is that these LLMs 647 00:29:09,960 --> 00:29:12,920 Speaker 4: like ChatGPT are not actually better than they were 648 00:29:13,000 --> 00:29:15,200 Speaker 4: last year. No, they are the same. So 649 00:29:15,240 --> 00:29:17,440 Speaker 4: how do we make things look cool?
650 00:29:17,560 --> 00:29:19,920 Speaker 2: Just whatever improvements there are, it's not enough to notice 651 00:29:19,960 --> 00:29:21,320 Speaker 2: for an average person. 652 00:29:21,160 --> 00:29:23,880 Speaker 4: Very minimal. Yeah. So instead of actually having anything new 653 00:29:23,960 --> 00:29:27,160 Speaker 4: or any kind of sizeable improvement to display, they're combining 654 00:29:27,320 --> 00:29:30,280 Speaker 4: two old, and some of the products are kind of archaic, 655 00:29:30,400 --> 00:29:33,240 Speaker 4: they're combining two older products and trying to pass 656 00:29:33,240 --> 00:29:35,440 Speaker 4: it off as a new thing. And that's these, 657 00:29:35,480 --> 00:29:38,920 Speaker 4: like, older robotic systems, right, that are 658 00:29:39,000 --> 00:29:41,320 Speaker 4: usually kind of humanoid. Maybe they have hands, maybe the 659 00:29:41,400 --> 00:29:44,320 Speaker 4: hands can grab things. Can the hands unscrew a 660 00:29:44,360 --> 00:29:47,160 Speaker 4: milk carton? No. Can the robot 661 00:29:47,360 --> 00:29:48,520 Speaker 4: grab milk out of the fridge? 662 00:29:48,640 --> 00:29:51,040 Speaker 1: Yes, so long as you want milk from a very 663 00:29:51,120 --> 00:29:54,360 Speaker 1: specific carton and croissants and only croissants, you're good. 664 00:29:54,400 --> 00:29:56,360 Speaker 2: As long as the milk has a QR code that 665 00:29:56,400 --> 00:29:58,840 Speaker 2: the robot can recognize to know that it's milk. And 666 00:29:58,920 --> 00:30:02,440 Speaker 2: also, when it's emptier than a certain level, it actually 667 00:30:02,440 --> 00:30:04,280 Speaker 2: will crush the milk carton, like it has to be 668 00:30:04,280 --> 00:30:05,640 Speaker 2: a certain level of full, otherwise it doesn't. 669 00:30:05,640 --> 00:30:07,880 Speaker 4: But neither of these things are new. And the 670 00:30:07,880 --> 00:30:10,600 Speaker 4: fact that LG doesn't have anything else to display at 671 00:30:10,600 --> 00:30:12,960 Speaker 4: their booth, the fact that they had to stoop so 672 00:30:13,200 --> 00:30:17,000 Speaker 4: low as to regurgitate this old, kind of cheap robotics 673 00:30:17,360 --> 00:30:19,240 Speaker 4: and slap an LLM in there and then call it 674 00:30:19,280 --> 00:30:21,880 Speaker 4: a day, shows that they have very little to actually 675 00:30:22,160 --> 00:30:25,160 Speaker 4: show us. Yes, and you see this walking through, 676 00:30:25,320 --> 00:30:28,200 Speaker 4: like, the Central Hall. The Samsung booth is empty there, 677 00:30:28,440 --> 00:30:30,960 Speaker 4: the Nikon booth is empty there, the Sony booth is 678 00:30:30,960 --> 00:30:33,440 Speaker 4: mostly a car, like. A lot of these big 679 00:30:33,480 --> 00:30:38,200 Speaker 4: companies are really absent of actual products. Panasonic has 680 00:30:38,200 --> 00:30:40,600 Speaker 4: a really big booth, but it's mostly about, like, servers, 681 00:30:40,880 --> 00:30:44,080 Speaker 4: it's mostly about how they're 682 00:30:44,120 --> 00:30:47,760 Speaker 4: improving data farming. There's no stuff. And a lot 683 00:30:47,600 --> 00:30:49,120 Speaker 3: Of the stuff that does exist.
684 00:30:49,520 --> 00:30:52,280 Speaker 2: You even have to separate further, from stuff that exists 685 00:30:52,520 --> 00:30:55,320 Speaker 2: and actually might be useful, to stuff that exists and 686 00:30:55,400 --> 00:30:58,240 Speaker 2: might be useful because it solves a problem that the 687 00:30:58,360 --> 00:31:01,800 Speaker 2: thing itself created. Like, for example, 688 00:31:02,080 --> 00:31:06,120 Speaker 2: I came to several different companies that had car 689 00:31:06,320 --> 00:31:09,480 Speaker 2: AI assistants whose job was to yell at you if 690 00:31:09,520 --> 00:31:12,520 Speaker 2: you fell asleep or got distracted, and they were all 691 00:31:12,520 --> 00:31:15,680 Speaker 2: built into these giant dashboard things where the whole 692 00:31:15,760 --> 00:31:19,520 Speaker 2: dashboard is a screen. And it's like, yeah, I can see 693 00:31:19,560 --> 00:31:21,640 Speaker 2: why you need a robot to yell when people get distracted, 694 00:31:21,960 --> 00:31:23,440 Speaker 2: because we have data. 695 00:31:23,480 --> 00:31:27,160 Speaker 4: They're putting Subway Surfers on your fucking car dash. 696 00:31:26,800 --> 00:31:29,160 Speaker 2: We have ample data that shows that when you have 697 00:31:29,280 --> 00:31:32,440 Speaker 2: a giant screen in a car and people use that screen, 698 00:31:32,880 --> 00:31:36,040 Speaker 2: they are actually worse drivers than when they're just driving 699 00:31:36,320 --> 00:31:39,560 Speaker 2: a normal car. And so, yes, you have made 700 00:31:39,600 --> 00:31:41,600 Speaker 2: the car, you can show me how this whole dashboard, 701 00:31:41,720 --> 00:31:43,160 Speaker 2: you can change it in a second, look at all 702 00:31:43,160 --> 00:31:45,280 Speaker 2: these different modes you have, you can smartly change your 703 00:31:45,360 --> 00:31:48,080 Speaker 2: dashboard to be whatever you want, and it'll yell at you 704 00:31:48,080 --> 00:31:49,360 Speaker 2: if you get distracted. And it's like, well, but the 705 00:31:49,360 --> 00:31:51,720 Speaker 2: only reason you're getting distracted is because your entire car 706 00:31:51,840 --> 00:31:55,600 Speaker 2: is a computer screen, which it shouldn't be, and we 707 00:31:55,720 --> 00:31:56,840 Speaker 2: know it shouldn't be. 708 00:31:57,240 --> 00:31:59,920 Speaker 4: They're either trying to solve problems that they created or 709 00:32:00,640 --> 00:32:03,960 Speaker 4: inventing solutions to things that aren't really problems. And 710 00:32:03,960 --> 00:32:06,280 Speaker 4: this is specifically with Cloyd, 711 00:32:06,280 --> 00:32:08,280 Speaker 4: the guy who was doing the demo 712 00:32:08,360 --> 00:32:11,800 Speaker 4: kept reiterating that Cloyd already knows what you 713 00:32:11,920 --> 00:32:14,480 Speaker 4: want before you have to say anything, right, whether that's 714 00:32:14,520 --> 00:32:18,120 Speaker 4: a croissant that's slightly underbaked, or he knows how 715 00:32:18,160 --> 00:32:20,520 Speaker 4: to fold laundry just the way you like, 716 00:32:21,760 --> 00:32:23,760 Speaker 4: which is what she said in a kind 717 00:32:23,760 --> 00:32:27,640 Speaker 4: of self-aware, ironic tone, because this robot spent two 718 00:32:27,720 --> 00:32:30,600 Speaker 4: minutes trying to fold a single towel and it couldn't 719 00:32:30,640 --> 00:32:33,280 Speaker 4: do it. These things don't work, and they're not meant to.
720 00:32:33,800 --> 00:32:37,080 Speaker 4: It's meant to drive traffic and attention towards the LG brand, 721 00:32:37,080 --> 00:32:38,840 Speaker 4: because there's gonna be tons of articles being like, look at 722 00:32:38,920 --> 00:32:41,960 Speaker 4: LG's new butler robot, right. And that's 723 00:32:41,960 --> 00:32:43,960 Speaker 4: all that they're doing at this demo. Mm hm. Because 724 00:32:44,000 --> 00:32:46,080 Speaker 4: this is not a real product for sale. It is 725 00:32:46,120 --> 00:32:48,000 Speaker 4: meant to drive attention to the brand and 726 00:32:48,040 --> 00:32:50,160 Speaker 4: get articles, and then those articles are going to get, 727 00:32:50,160 --> 00:32:53,160 Speaker 4: you know, cited by other LLMs. And 728 00:32:53,200 --> 00:32:56,080 Speaker 4: it's this cycle that just keeps building. There's 729 00:32:56,080 --> 00:32:58,160 Speaker 4: some really impressive stuff there too. Like, I went to 730 00:32:58,160 --> 00:33:01,920 Speaker 4: Persona AI's booth, which had a bunch of computers that 731 00:33:02,480 --> 00:33:04,240 Speaker 4: were not connected to the Internet. 732 00:33:04,320 --> 00:33:05,920 Speaker 2: All the signs told you that. And it has on- 733 00:33:06,160 --> 00:33:10,400 Speaker 2: PC AI image generators where it's all on the machine itself, 734 00:33:10,960 --> 00:33:13,800 Speaker 2: and, you know, one of the representatives said, come on, 735 00:33:13,840 --> 00:33:15,600 Speaker 2: give it a prompt, it will generate an image. And 736 00:33:15,640 --> 00:33:17,720 Speaker 2: I've never used an AI image generator, and so I 737 00:33:17,800 --> 00:33:20,000 Speaker 2: kind of panicked. I'll be honest, like, I got freaked out. 738 00:33:20,080 --> 00:33:21,320 Speaker 4: It's gonna be some bullshit. 739 00:33:22,840 --> 00:33:24,320 Speaker 3: Tom Sizemore with dead kid. 740 00:33:24,400 --> 00:33:27,240 Speaker 2: Yeah yeah, now that's not Tom Sizemore. But that 741 00:33:27,400 --> 00:33:30,200 Speaker 2: is a man cuddling a dead child. The kid doesn't... 742 00:33:30,200 --> 00:33:34,080 Speaker 2: the face is gone, and that's not Tom Sizemore, like. 743 00:33:34,440 --> 00:33:38,600 Speaker 4: But you know, the future, the future. No, this... I can't. 744 00:33:38,760 --> 00:33:41,280 Speaker 4: I don't want to harp on Cloyd too much, 745 00:33:41,520 --> 00:33:43,080 Speaker 4: but it's such. 746 00:33:42,920 --> 00:33:46,080 Speaker 2: A good... every additional time you say Cloyd, it sounds 747 00:33:46,120 --> 00:33:46,800 Speaker 2: less like a word. 748 00:33:46,840 --> 00:33:49,400 Speaker 4: It's a good example of what this show is specifically 749 00:33:49,400 --> 00:33:52,120 Speaker 4: this year, how there's nothing new, so they're 750 00:33:52,160 --> 00:33:56,800 Speaker 4: reaching into, like, the CES of yesteryear. Yeah, 751 00:33:56,840 --> 00:33:59,520 Speaker 4: and trying to push two things together and pretend that 752 00:33:59,560 --> 00:34:02,440 Speaker 4: it's a new thing. And when it doesn't work, they're like, oh, 753 00:34:02,560 --> 00:34:05,680 Speaker 4: this is actually a good thing. He's folding the towel 754 00:34:05,760 --> 00:34:08,160 Speaker 4: just the way I like, and that's bad, poorly. Look, 755 00:34:08,320 --> 00:34:11,960 Speaker 4: he spends ninety seconds putting a single shirt into the 756 00:34:12,080 --> 00:34:14,520 Speaker 4: washing machine, and this is him being very thorough.
757 00:34:14,640 --> 00:34:16,920 Speaker 3: That was the word. He's being very thorough, really put 758 00:34:17,040 --> 00:34:17,640 Speaker 3: in the time, and. 759 00:34:17,640 --> 00:34:18,920 Speaker 4: You're like, this thing doesn't work. 760 00:34:19,160 --> 00:34:21,560 Speaker 2: It's bad, it's a bad product. Part of the mistake 761 00:34:21,600 --> 00:34:23,160 Speaker 2: they made, I think, is that because this is the 762 00:34:23,239 --> 00:34:26,640 Speaker 2: year of robots, there are robots there that are, like, 763 00:34:26,680 --> 00:34:30,160 Speaker 2: industrial application robots that are showcasing, we have made a 764 00:34:30,280 --> 00:34:33,160 Speaker 2: robot with humanoid hands that is capable of more intricate 765 00:34:33,200 --> 00:34:35,520 Speaker 2: movements than any other humanoid-hand robot before. And they 766 00:34:35,520 --> 00:34:38,920 Speaker 2: showed it, like, intricately folding like a pinwheel, and I 767 00:34:39,040 --> 00:34:41,880 Speaker 2: was like, yeah, I have not seen a robot 768 00:34:41,920 --> 00:34:44,960 Speaker 2: with humanoid hands that has had that much dexterity before. 769 00:34:45,440 --> 00:34:48,680 Speaker 2: I'm sure that has some useful industrial applications. And then 770 00:34:48,880 --> 00:34:50,600 Speaker 2: you go from that and there's a bunch of other 771 00:34:50,680 --> 00:34:52,399 Speaker 2: robots that are industrial where it's like, we have built 772 00:34:52,400 --> 00:34:54,520 Speaker 2: a new tip for this robot that allows it to 773 00:34:54,520 --> 00:34:57,000 Speaker 2: do this kind of automotive work, or allows it to 774 00:34:57,000 --> 00:34:59,919 Speaker 2: do this kind of, like, manufacturing work, right, where I'm like, 775 00:35:00,000 --> 00:35:03,000 Speaker 2: I assume, not being an expert on robotics, but you're saying 776 00:35:03,040 --> 00:35:05,520 Speaker 2: it's the world's first robot that can handle this task. 777 00:35:05,840 --> 00:35:08,160 Speaker 2: But that's at least an innovation. You can talk about the 778 00:35:08,200 --> 00:35:10,480 Speaker 2: ethics of replacing them, but like, that's a thing that 779 00:35:10,640 --> 00:35:13,479 Speaker 2: is a new capability. And you have those robots next 780 00:35:13,480 --> 00:35:15,680 Speaker 2: to the robots that human beings are actually meant to 781 00:35:15,680 --> 00:35:18,360 Speaker 2: buy and put in their homes, none of which work well, 782 00:35:18,680 --> 00:35:21,719 Speaker 2: all of which are exactly as capable as robots twenty 783 00:35:21,800 --> 00:35:24,200 Speaker 2: years ago, except there's a chatbot on them, and it 784 00:35:24,280 --> 00:35:26,600 Speaker 2: makes it all look worse. Where you're showing me what 785 00:35:26,760 --> 00:35:29,839 Speaker 2: theoretically the best in robotics can do, and then I'm 786 00:35:29,840 --> 00:35:31,080 Speaker 2: looking at the thing I'm supposed to have in my 787 00:35:31,120 --> 00:35:34,239 Speaker 2: house and it just fell down and it can't fold laundry, 788 00:35:34,760 --> 00:35:37,520 Speaker 2: and you want me to spend six thousand dollars or 789 00:35:37,640 --> 00:35:41,879 Speaker 2: twelve thousand dollars. One of the robots, I think 790 00:35:41,920 --> 00:35:43,880 Speaker 2: it was like Booster X or something like that, the 791 00:35:43,880 --> 00:35:46,440 Speaker 2: one that was dressed like Michael Jackson, you're supposed to 792 00:35:46,480 --> 00:35:48,279 Speaker 2: have as a companion for your child.
It will help 793 00:35:48,280 --> 00:35:51,240 Speaker 2: it with its homework using ChatGPT. It's small dancing robots, 794 00:35:51,239 --> 00:35:54,719 Speaker 2: small dancing robots. Yes, yes, the small dancing robot that 795 00:35:54,719 --> 00:35:56,400 Speaker 2: can be hit in the head with a liquor bottle 796 00:35:56,400 --> 00:35:57,200 Speaker 2: and it won't break. 797 00:35:57,480 --> 00:36:00,640 Speaker 4: That was part of the ad video. You know how 798 00:36:00,680 --> 00:36:02,279 Speaker 4: you always want to hit your kid in the head 799 00:36:02,320 --> 00:36:05,600 Speaker 4: with the liquor bottle. Get out this anger with this 800 00:36:05,680 --> 00:36:09,400 Speaker 4: tiny, child-high robot, a robot surrogate for your child. 801 00:36:10,440 --> 00:36:12,759 Speaker 2: And again, if someone was marketing a robot surrogate, like, 802 00:36:12,800 --> 00:36:14,160 Speaker 2: are you angry at your spouse? 803 00:36:14,680 --> 00:36:16,799 Speaker 3: You can beat the shit out of this robot and 804 00:36:17,040 --> 00:36:19,120 Speaker 3: be fine. At least that's an idea. 805 00:36:19,520 --> 00:36:22,440 Speaker 1: With AI technology, the robot will actually learn your spouse's 806 00:36:22,480 --> 00:36:25,160 Speaker 1: personality and respond accordingly to the beatings. 807 00:36:25,200 --> 00:36:27,759 Speaker 2: Look, look, I've been hitting this robot after I come 808 00:36:27,800 --> 00:36:29,439 Speaker 2: back from work every day for the last two weeks. 809 00:36:29,480 --> 00:36:30,839 Speaker 2: And look, as soon as I walk in the door, 810 00:36:30,880 --> 00:36:31,720 Speaker 2: it starts to shake. 811 00:36:33,080 --> 00:36:34,600 Speaker 3: The previous model is just... 812 00:36:34,600 --> 00:36:35,480 Speaker 1: It wasn't satisfying. 813 00:36:35,560 --> 00:36:37,279 Speaker 2: Yeah, it took a long time for a team to 814 00:36:37,320 --> 00:36:39,319 Speaker 2: figure out how to give a robot PTSD. 815 00:36:39,719 --> 00:36:41,360 Speaker 3: By God, we've crossed the Rubicon. 816 00:36:41,880 --> 00:36:44,319 Speaker 4: As you can see, Vegas is taking its toll on 817 00:36:44,360 --> 00:36:48,600 Speaker 4: our psyches as we've done an extended intimate partner abuse bit. 818 00:36:49,280 --> 00:36:50,880 Speaker 3: It's not a partner. It's a robot, Garrison. 819 00:36:51,000 --> 00:36:51,719 Speaker 4: Oh, you're right, it's not. 820 00:36:51,800 --> 00:36:54,680 Speaker 2: Also, the robot can think and feel and has core 821 00:36:54,760 --> 00:36:55,839 Speaker 2: memories and can love. 822 00:36:55,880 --> 00:37:01,320 Speaker 4: You don't put those two things together. Those are separate thoughts. 823 00:37:01,400 --> 00:37:03,680 Speaker 3: No, this robot basically has a soul. Watch us hit it 824 00:37:03,719 --> 00:37:04,520 Speaker 3: with a liquor bottle. 825 00:37:04,640 --> 00:37:05,319 Speaker 4: It would just be. 826 00:37:05,360 --> 00:37:07,839 Speaker 1: So great to, like, with all of how much 827 00:37:07,880 --> 00:37:11,799 Speaker 1: they're focused on, the AI can develop real 828 00:37:11,920 --> 00:37:14,840 Speaker 1: human connection, but it's also saccharine. I would love to 829 00:37:14,960 --> 00:37:16,840 Speaker 1: just do a booth where it's like, we're teaching our 830 00:37:16,960 --> 00:37:17,800 Speaker 1: robots hate. 831 00:37:17,960 --> 00:37:19,280 Speaker 4: They can't know how to hate.
832 00:37:19,480 --> 00:37:21,359 Speaker 2: Yeah. And I do want to end by noting one 833 00:37:21,400 --> 00:37:23,160 Speaker 2: thing that we talked about a little briefly, but is 834 00:37:23,239 --> 00:37:25,000 Speaker 2: kind of low-key the most upsetting thing about this, 835 00:37:25,040 --> 00:37:26,840 Speaker 2: which is I saw a bunch of different booths that 836 00:37:26,960 --> 00:37:29,280 Speaker 2: used the term empathy, and what they meant by empathy 837 00:37:29,400 --> 00:37:33,080 Speaker 2: was the robot can understand and anticipate what you want. 838 00:37:33,320 --> 00:37:36,160 Speaker 2: Right, that it's learning you and your patterns in order 839 00:37:36,239 --> 00:37:39,640 Speaker 2: to offer you things and more effectively assist you. And I 840 00:37:39,680 --> 00:37:44,360 Speaker 2: guess technically, yeah, but reducing the concept of empathy to 841 00:37:44,880 --> 00:37:49,799 Speaker 2: the robot knows when you might want snacks is kind 842 00:37:49,880 --> 00:37:56,040 Speaker 2: of evil, like it's in its freedom's right. Empathy means 843 00:37:56,040 --> 00:37:58,319 Speaker 2: the robot knows when to serve you is like a 844 00:37:58,719 --> 00:37:59,960 Speaker 2: bad way to talk about it. 845 00:38:00,000 --> 00:38:00,400 Speaker 3: Empathy. 846 00:38:00,640 --> 00:38:02,520 Speaker 2: I don't think most people... you ask, what is empathy? Well, 847 00:38:02,560 --> 00:38:04,520 Speaker 2: it means someone knows when I want to be 848 00:38:04,520 --> 00:38:05,600 Speaker 2: upsold on a Hyundai. 849 00:38:05,920 --> 00:38:07,120 Speaker 3: That's not what empathy is. 850 00:38:08,160 --> 00:38:12,080 Speaker 1: Yeah, our robot learns empathy by being instrumental to you 851 00:38:12,120 --> 00:38:12,680 Speaker 1: and useful. 852 00:38:12,760 --> 00:38:15,120 Speaker 3: See wenously what you know the core of them. 853 00:38:15,360 --> 00:38:18,520 Speaker 2: We made our robot watch four hours of videos from Gaza, 854 00:38:18,840 --> 00:38:21,840 Speaker 2: and it immediately said, I bet those kids want 855 00:38:21,960 --> 00:38:24,480 Speaker 2: a Hyundai Elantra, like that. 856 00:38:25,320 --> 00:38:29,160 Speaker 4: I... anyway. Yeah, it's, if your version of empathy is 857 00:38:29,160 --> 00:38:32,719 Speaker 4: trying to sell vanilla Coke because we have all of 858 00:38:32,760 --> 00:38:33,560 Speaker 4: this, all this stock. 859 00:38:33,800 --> 00:38:37,280 Speaker 3: Oh, way too much. We fucked up. We are in trouble. 860 00:38:37,320 --> 00:38:38,279 Speaker 3: We're underwater. 861 00:38:38,400 --> 00:38:39,279 Speaker 4: That's what empathy is. 862 00:38:39,440 --> 00:38:43,920 Speaker 3: Yeah, anyway, welcome to the future, everyone. It's a CES miracle. 863 00:38:44,000 --> 00:38:46,480 Speaker 3: It's a CES miracle. Goodbye. 864 00:38:49,520 --> 00:38:52,000 Speaker 5: It Could Happen Here is a production of Cool Zone Media. 865 00:38:52,200 --> 00:38:55,240 Speaker 5: For more podcasts from Cool Zone Media, visit our website 866 00:38:55,320 --> 00:38:57,880 Speaker 5: coolzonemedia dot com, or check us out on the 867 00:38:57,920 --> 00:39:00,959 Speaker 5: iHeartRadio app, Apple Podcasts, or wherever 868 00:39:00,680 --> 00:39:01,880 Speaker 4: you listen to podcasts. 869 00:39:02,280 --> 00:39:04,239 Speaker 5: You can now find sources for It Could Happen Here, 870 00:39:04,239 --> 00:39:07,240 Speaker 5: listed directly in episode descriptions. Thanks for listening.