Speaker 1: Cool Zone Media.
Speaker 2: Hi everyone, it's James, coming at you with a pretty nasty cold here. I wanted to share with you that wildfires have swept through Los Angeles in the last couple of days while I'm recording this. Thousands of people have been displaced, five people have died that we know of so far, thousands of structures have been burned, and many, many people in LA will be finding themselves out of their homes with nowhere to go, with very few resources. If you'd like to help, we've come up with some mutual aid groups who you can donate to, and we'll be interviewing one of them on this show next week. So if you'd like to help, the three places where we suggest you donate some cash are the Sidewalk Project, that's thesidewalkproject.org; Ktown for All, that's K-T-O-W-N-F-O-R-A-L-L dot org; and Aetna Street Solidarity. You can find them on Venmo, or I think on Instagram as well. That's A-E-T-N-A S-T-R-E-E-T S-O-L-I-D-A-R-I-T-Y. All right, I'm gonna go rest my voice.
Speaker 1: Ah, welcome back to It Could Happen Here, a podcast about it happening here, which is really true in a lot of ways. Tonight, Garrison Davis and I are seated at the glorious, majestical hotel, name redacted, on the Las Vegas Strip. We got a long day at CES Monday, listening to panels, catching up with the latest tech news, trying gadgets, and also at the same time texting our dear friends in Los Angeles as unprecedented fires sweep them from their homes. Literally, the Getty is threatened, Pasadena and Santa Monica are both being evacuated at once. It's a real one-two punch of America's favorite tech show and the apocalypse today. How are you feeling, Garrison?
Speaker 3: It's an average day in America. Average.
Speaker 1: Day in America. Temperature's not coming down anytime soon. No, no. Well, just take a moment to breathe with that. So you want to start us off with what you did this morning? I was panel guy yesterday.
Speaker 1: Today I was a man of action, walking around and mostly trying all the free massage chairs. What did you see this morning?
Speaker 3: I saw so many AI panels, half of which I left halfway through because I knew they weren't gonna be useful for me, just dogshitty. The other half I took notes on and just got sad. But no, today was full panels, starting bright and early in the morning, where I walked into a panel where I heard "augmentation and not replacement" about twenty times in the span of, like, twenty minutes.
Speaker 1: Yeah, I keep hearing versions of that too. In the Hollywood panels, they would be like, you know, we want to develop a machine that can read the brains of our viewers and alter the endings of movies, you know, but we see this as a way of augmenting the artist's work.
Speaker 3: Yes. And the biggest thing that I noticed across multiple panels today is an almost, like, anxiety among these tech executives about consumers rejecting the AI slopification of everything, and they're trying to find ways to, like, actually force people to start using these products, or having them, like, like it. Yeah. And I haven't really sensed that anxiety before. It's all been very, very positive, and I think...
Speaker 1: It's a mix of, number one, the money still isn't there where they need it to be. It has not started, like, booming to the extent that they were expecting it by now. And the other part is people are still not happy with this stuff. I'm glad you felt that too, because, especially after the election, I almost don't trust my feelings on this, that they're really scared. But I really do think there's a piece of that coming through.
Speaker 3: No, a phrase one of the panelists used this morning was "the AI ick," like, how do we beat the AI ick? And if you ever said to yourself, how do I stop having people feel an ick around me, maybe you should really look inwards.
Speaker 3: Yeah, maybe the problem is you, not them.
Speaker 1: You know who doesn't need to worry about, quote unquote, "ick" for their product market? People who make things that people like.
Speaker 3: So, but I heard a lot about, you know, trying to get people to use these products by, like, making sure artists don't feel like they're being replaced, instead having their, like, art production process be augmented with AI, and how that can make art easier to make while still keeping the human at the center of AI tools. And this is just what they talked about for, like, a while, while reiterating that lots of the developments they need to see on AI, they have it on the tech side; what they need to rely on is consumer acceptance to really drive the innovation, to see, like, what they can get away with. Like, how much will the consumer accept the slopification of art and entertainment and customer service and all these things they're trying to cram AI into, and, like...
Speaker 1: How much worse can you make the world before people stand up and stop you with their fists or guns.
Speaker 3: And you mentioned something about, like, trying to tailor movie endings for specific people, and I heard some stuff about that. There's this one guy who was, like, the panel's resident content creator, who's supposed to represent, like, the artist bloc, even though he's, like, eh, yeah, you know, some kind of AI-friendly content creator, though, on this panel. And he talked about how, like, back in the day, you needed to have friends that would recommend you music, and, like, the Spotify algorithm is too based on, like, an echo chamber of what you already like, but now, with agentic AI, this allows trust between the consumer and the machine to recommend new music. And, like, again, so much of these AI products is just trying to, like, replace friendship.
Speaker 1: Yeah. People. Have you tried friends? Have you tried people?
Speaker 3: How can you engage with, like, art and culture without friends? Like, how can you learn more about what your friends are into, what they like? How can you discover new music without that, instead replacing that beautifully human process?
Speaker 1: Every year at CES, there are points in time where I get that, like, oh yeah, twenty twenty really fucked us up a lot. Like, twenty twenty really did some lasting damage. Like, I know that was happening with the younger generation before, the iPad kid generation, but, like, that really did a number on some folks.
Speaker 3: Someone from Meta, right, or Facebook, specifically their, like, metaverse division, which they're still trying to push, by the way...
Speaker 1: Oh yeah, now, I mean, they're still calling it Meta, which, honestly, there's a degree to which I almost respect it, because, like, we are not biting.
Speaker 3: No one, no one is. But she talked about how they can, like, blend the metaverse and AI to make customized personal experiences. Say that you're watching an immersive live concert in mixed reality, something that both me and Robert do all the time, and...
Speaker 1: Harry Styles mixed reality concerts. We're seeing 100 gecs, you know.
Speaker 3: Honestly, a 100 gecs mixed reality concert could go crazy.
Speaker 1: Here we'll finally, I'll finally get you pilled on Reel Big Fish.
Speaker 3: But basically, as you're in this, like, metaverse concert, they can have an AI that will sense your own excitement and personalize the ending of the experience based on your favorite songs or artists. So as you're getting excited, like, AI Taylor Swift can finish the song for you based on your own musical taste, based on what the AI knows about you. And it's about creating these customized experiences.
Speaker 1: It's such a... you can clearly tell that none of these people have souls, right? It's such a mismatch of what people get from music, because they think that, like, oh, this is just, like, if I see that this specific beat or line is... I can just sort of plug this in, and, like, I don't know. Like, what makes people react to musicians and artists is that they make things that make them feel something. That's why people get really into artists, is they feel seen and identify with a piece of art, as opposed to, like, oh, that guy really liked the first opening bars to fucking Octopus's Garden, like, let's just really turn up the octopus. A lot more octopuses. How many more octopuses can we fit in this fucking, in this track?
Speaker 3: No. Another panel I went to later in the day was about, like, how do you market to Gen Z? Very funny panel. Yeah. And they're talking about how, like, authenticity is so important, like, you need to partner with influencers that have, like, an authentic brand. And it's funny having that juxtaposed with, like, these AI slop panels where, like, you need an AI Taylor Swift to come boost the excitement for all these kids who are in their metaverse concerts. Oh boy. But no, like, personalized content, like targeting AI-generated content specifically for certain people, for certain users, whether it's on social media, whether that's on, you know, the metaverse. Like, some of these people... there was someone on the panel from Adobe, who's, you know... Adobe's integrating a whole bunch of generative AI into their, like, suite of products, right, like Photoshop, Premiere, After Effects, right? Big, big company in the creative space.
Speaker 3: He said, they're like... personalized content is always the most impactful, like, content that a person feels a genuine connection to, and that connection could be fostered by just being, you know, a compelling artist, where you can recognize shared experiences of humanity. But now you don't need that artist part anymore. He said they only need three parts to create a pipeline: you need data, you need compelling, like, journeys to take the user on, and you need the content itself. And the goal is to create content at scale that's highly personalized. He said, quote, "We're good at the first two parts. Now we just need to improve the actual content side." Which, I don't even think that's true. I don't think AI is good at creating compelling human journeys.
Speaker 1: I had it... so the video I didn't play you guys, from my terrible fucking AI-generated videos, was this. It was, like, a girl coming to college, we see a picture of her dad, and it was, like, a narration of her life with her father, who, like, is dead, that she misses, and all that she learned from him. Data. And it's a mix of all these different... like, there's a chunk where it looks like a Disney animated picture. There's a chunk where it looks like anime, she and her dad having these, like, adventures around the world. There's a bit that looks like a Marvel movie. And he's like, we can do all these different, you know, animation styles and they're seamless, and, like, you know, the audience really goes on a journey with this. And it's like, but there was no girl who lost her dad. Nobody lost their dad here. This is... you just had a computer generate text about a dad dying. Like, there's nothing underpinning this, right? Nobody has anything they're trying to get across. Like, you just know, in this one, they look like Marvel heroes for some reason.
Speaker 1: In this one, they look like Zulu warriors, kind of done up in a slightly racist Lion King style. Like, what is being transmitted, other than, like, look at all of the different art styles we can rip off?
Speaker 3: No, they do not have a journey, but even they themselves admit that they still don't have the content. The content itself still isn't even there, and that's something they even acknowledge, and this is, like, a hurdle to get over. What they do have is the data, and, like, this is something that Adobe has done, because if you use Adobe products, now some of the most used creative products, Adobe trains all of their AI systems on all the stuff that you make using their products, which, you know, he really just blazed past that point, because that's a whole other discussion. But even they know that they don't have, like, the actual product, and this is still reliant on, like, consumer acceptance. As they said before, someone from Meta, the same person on the panel, talked about how, like, a few days ago on Instagram, they tried to announce, like, you'll have AI profiles, right, like completely AI-generated picture profiles, like, you know, fake people who have their own accounts. And this created such a big backlash that they rolled this back, and they'd only just announced this before CES.
Speaker 1: One of these accounts was literally like, I'm a mother of two, queer Black woman, you know, yeah, I got a lot to say about the world. Someone call up the Situationists, please. And some people started talking to it and were like, were any Black people at all involved in, like, making this chatbot? And she was like, well, no, and that's a real problem. That is a real problem.
Speaker 3: Okay. Yes. And the excuse that this person from Meta gave is that the market just isn't ready yet. It's not that the actual product itself is, like, bad, or, like, something no one really wants. The market's not ready yet.
Speaker 1: Well, they're so used to... everything that they've done so far, they've kept getting money, right? And, like, it slowed down and they've had to do layoffs, but, like, nobody's just made them stop at any point. Which, honestly, you know, I made a comment about healthcare executives a while back needing, like, a fucking retirement plan paid in millimeters, so I'm not going to make that same comment about tech industry ghouls, because, you know, we all know what's in the news. But something has to be done to force these people to stop moving in this direction. And I don't know how to get across... and, like, they're already at this point. Like, they seem to really not want this, and we have to find a way. They're just not ready. We have to find a way to force this on them. I don't know how to get across to them in a peaceful manner.
Speaker 3: Oh, oh, sorry, people don't want this?
Speaker 1: Man of peace, Garrison. I'm a man of peace. I'm not a plumber.
Speaker 3: The last thing I had to add out of this panel, just in terms of how much this stuff is just actually taking over more and more of the market even if people don't want it, is that the guy from Adobe announced that in the fourth quarter of last year, they were able to boost all of Adobe's, like, you know, emails... If you send, like, an email to Adobe, right, you have a problem, like, you need help, but, like, everything that they do on emails is now one hundred percent generated by AI. And this was boosted from fifty percent at the start of last year. Now one hundred percent of all of their email content is done by AI with some moderation.
Speaker 1: But that's, like, when the company itself is, like, communicating with customers through emails?
Speaker 3: That's what it sounded like.
Speaker 1: Yes. Are they still writing emails sometimes to each other, or is it that too?
Speaker 3: He described it as, like, email content.
Speaker 3: So I'm pretty sure it is customer service stuff, like marketing, maybe, like, certain outreach things. But yeah, one hundred percent now generated by AI with some human, like, moderation. But yeah, that is where things are moving, and that's how I started my morning.
Speaker 1: Well, better than a cup of coffee is that sense of creeping dread that, like, wow, I just saw a bunch of people who probably would rather kill the world than be stopped from shoveling AI slop into people's mouths, because this is the only future they can imagine, is one in which they work for a company that feeds the planet poison and kills the human concept of creativity so that they can buy a house in San Francisco.
Speaker 3: Do you know what I want to feed the concept of?
Speaker 1: Yeah, we'll talk about that, but here's some ants.
Speaker 1: We're back. What was part two of this episode? That's feed, buddy. I'm... ah, oh, let's talk about that helicopter. No, yeah, I think, yeah.
Speaker 3: As I was going from panel to panel scribbling notes on AI, yeah, some very exciting news stories dropped that we'll talk about later. What were you up to, Robert?
Speaker 1: Well, I was, I was trawling the show floor, as I oft do at some point in a CES, and I came across a number of majestic products. You know, a lot of it was AI-based, and we'll talk some more about that here. But I ran into something that, thank god, had nothing to do with AI, and it's a death trap. Every, every one of these...
Speaker 3: There's, like, some sort of... yes, we find a new death trap.
Speaker 1: There's a lot of connected vehicles. There were a lot of EVs last year. There were a ton of different flying taxi type options, people that were really trying to...
Speaker 3: But you don't see that at all this year.
Speaker 1: Nothing this year, nothing this year, because it's a terrible idea. It's a terrible idea.
Speaker 1: The people who are rich enough to pay for flying vehicles don't want it to be a taxi, and the people who can't afford their own flying vehicles also can't afford one anyway. So this, instead of any of that: Rictor, R-I-C-T-O-R, which is a Chinese company. Their ads say, I'll say, "Why be normal?" and "The future of travel will not be on the ground." And the Rictor is a hybrid. It is, like, a Smart car style sized vehicle, it's like half the size, it only has two wheels, though. It looks more like a scooter, it's more like a weird little scooter, but it's fully enclosed, and in addition to having its wheels and being able to travel about on the ground, it has four, like, quadcopter-style rotors, because it is an aquatic flying car. Aquatic flying... I saw no evidence it could actually go in the water.
Speaker 3: How high can these things go up?
Speaker 1: Less than two hundred meters. You know why, Garrison?
Speaker 3: Why? Why is that?
Speaker 1: Because if you try to go above that, you need a pilot's license. You don't need a pilot's license under that. When I was interviewing them, I was like, so, I assume there's gonna be some sort of pilot's license for this flying craft. And they're like, no, as long as you stay under two hundred meters, you're good.
Speaker 3: Do you need a driver's license? Are you gonna put a license plate on this?
Speaker 1: There's no space for one.
Speaker 3: Buddy, it's completely unregulated.
Speaker 1: To be honest, and I don't say this for any problematic reason, but, like, these folks are Chinese and did not seem to have a great deal of knowledge about US laws, sure. That said, I can't imagine China's less strict about personal aircraft.
Speaker 3: I would like to take this fucker on the I-5, just start, start zooming.
Speaker 1: Yeah, see it.
Speaker 3: Up in the air, because you could probably do, like, a pretty, a pretty good road trip on this, right? You can, you can... you can be about that.
Speaker 1: So it's very small and it's completely electric. So I asked him, how much time do you get in the air with this bad boy on battery? Maybe twenty-five minutes.
Speaker 3: What happens after twenty minutes?
Speaker 1: I did ask this, and I was like, does this just drop out of the sky? And they're like, no, we're working on, like, an intelligent thing that will, like...
Speaker 3: Land.
Speaker 1: Yeah. Which is also very exciting, really, really looking forward to seeing how they pull that off. The videos that they have show it driving on the highway too. They weren't able to tell me what the top speed was. It has no rear view mirrors and no side view mirrors, but they said there's lots of cameras on the inside, so I'm sure that's fine. It's a death trap. This thing will get everyone who even looks at it wrong killed. They showed me a video of the prototype. It was completely frameless. It was just quadcopter blades and, like, a chair on a platform lifting a guy into the air. It couldn't go forward or backwards. But they're like, in a year we're gonna have this figured out.
Speaker 3: It can't, it can't move forward?
Speaker 1: It only, only went up in the videos I saw. So you can't actually travel? Absolutely not. I couldn't, by the way, I couldn't fit in this thing. Like, you would be cramped in this fucker.
Speaker 3: But it's good for vertical travel.
Speaker 1: It's great if you just need to go up to under two hundred meters. There's no more efficient way.
Speaker 3: If you get pulled over by the cops, you just, just...
Speaker 1: Go up above them. I'm in the sky now. You can't do shit to me for twenty-five minutes.
Speaker 1: Oh god. It's like, if you're just driving, you go up to one hundred kilometers an hour, which made me think, so, again, that's like sixty miles an hour for twenty minutes. Then I land, then my battery is dead.
Speaker 3: Then you can't go anywhere.
Speaker 1: You can't go anywhere. You can't get back.
Speaker 3: The battery issue is gonna, is gonna be troubling.
Speaker 1: But it seems completely useless.
Speaker 3: But as we've heard nonstop the past two days, this is the worst it's gonna be.
Speaker 1: This is the worst it's gonna be. Only gonna get better.
Speaker 3: Things only ever get better.
Speaker 1: That's, that's what everyone was trying to insist upon to me.
Speaker 3: What else did you see on the show floor that caught your eye?
Speaker 1: Garrison, so many magical, wonderful, marvelous things, most of which were just, like, various different AI-connected smart houses. That was what Samsung was showing off. That was what LG was showing off. But I believe you saw one as well, right?
Speaker 3: Yeah, I mean, I walked through the LG booth. It was kind of the same as, same as last year. The Samsung booth was too intimidating. But I should check it out, because last year we didn't do the Samsung booth, because we were going to and then either, either one of us threw up or spilled something.
Speaker 1: Hey, okay, okay. Yes. Did I, did I pour my creatine into a carbonated beverage that spewed a geyser of blood-red foam into the sky right around the Samsung booth? Did the security guard stare at me as it happened? Did I set the drink down as it continued to spew and say, "I'll go get some towels," and then leave forever?
Speaker 3: Towels?
Speaker 1: Yeah.
Speaker 3: Left? We fucking bounced. So we couldn't do that booth last year. Maybe I'll try it this year. But tell me about these smart houses.
Speaker 1: Well, Garrison, Samsung has a great idea for a smart house. First of all, are you aware of that game The Sims?
Speaker 1: No? Well, they're really betting that you do, because their current plan is: design your home with the AI-powered map view. Okay, okay, sure. You get, like... you feed it, like, a picture, you lay out your floor plan of your house, and it gives you, like, a 3D model, and you can take pictures of your furniture, or pictures of furniture that you want, and then it places it around and you can place them. Now, a couple of things. One of them is that there's no scaling done by the AI, so it's up to you to figure out how the furniture you might want to buy measures up in comparison to the apartment.
Speaker 3: Sure.
Speaker 1: Sure. But it does look like... the actual, like, map that they've got, I'll show you the picture that I took, I'll try to put it up somewhere, like, it looks like the video game The Sims. You're populating, like, a little 3D CGI house. And I was like, okay, well, there's, there's a use there, right? People like planning out... like, you're moving into a new apartment, you can fill it in here, and before you even move in, you can figure out what kind of furniture you need or how your existing furniture will fit in there. I would never have used that. I usually picked up all of my furniture from the trash before I had a house, when I moved into a new place. But I know people who would have used that. Sure, that seems useful. So I asked about security. One thing that concerned me is, like, the first guy I talked to, he was like, oh yeah, I think it's all stored locally. And I was like, so Samsung doesn't have any access to any of the data on, like, my house and its layout? And he was like, let me, let me get you to one of our engineers, because he can answer that question. And the engineer's answer was, and I'm paraphrasing here...
Speaker 3: Oh, okay.
Speaker 1: So that made me very confident.
Speaker 3: That does make you feel safe about sharing your personal data?
Speaker 1: Right, yeah, the layout of my actual house, well...
Speaker 3: And the thing is, I really don't like that at all, because this is, this is something that people were asking Facebook slash Meta when they were doing, like, their, you know, metaverse stuff, because their headsets are recording, you know, very, very extensively, like, your home layout. And the whole point, well, part of the point, was that some of that data could then be used to send you targeted advertisements based on them seeing everything in your home. And I suspect that Samsung might also have some interest in targeted advertisements, being a tech company, but, you know, I could never say.
Speaker 1: Yeah, and... one thing they had is, for, like, their retail segment, they had, like, a live video grocery store ad showing you prices of different produce, and I think, like, the insinuation they didn't lay out is, like, you can change prices on the fly, you know? Which kind of made me think about that. There was some talk last year of, like, okay, we want to be able to, like, face-scan customers so we can see if they have money and increase prices for, like, products for certain people. Which I'm sure they're going to try, they were too enticed by that idea not to. So I caught a little bit of that. But they really, like... to the extent of how big, and this was interesting, last year, Samsung and LG, their booths were huge and they had a lot of different gadgets. Samsung's booth is big this year. Forty percent of it was that scan-your-furniture, scan-your-fucking-map app. Not that much, like, very little actual shit going on.
Speaker 3: People slap the word AI onto everything. There was another big thing that was all Samsung.
Speaker 1: Because Samsung makes a ton of appliances, they make TVs, all sorts of entertainment products.
Speaker 1: All of them have this, I figure what they called, like, Samsung Tag or something, that you can map in your phone, so you can have a whole map of all of the devices and shit that you have in your phone, and you can control them all from a single point. And, right, no one, by the way, had any interest in answering my security questions there. But also, if you're into that, if you want to have all of your appliances and entertainment things linked up and controlled on your phone, and all of them are Samsung, you don't care.
Speaker 3: You don't care about... no, if you're getting a smart home, I don't think you really, really care about that.
Speaker 1: But also, none of it was like... yeah, I can control everything from my phone, you've been promising me that literally since, like, twenty eleven. For decades they were promising me you're gonna be able to control your whole house thing.
Speaker 3: None of it feels new this year. This is the thing, is, like, even walking through the LG booth, which usually has some really cool new thing, this year, nothing new. No, nothing new. They slapped the word AI on one corner of their television set.
Speaker 1: Right.
Speaker 3: I guess LG does have, like, a large language model in, like, one corner of their booth, but, like, so does everyone else. Like, that's not, like, that compelling.
Speaker 1: There was SK, which is a South Korean company. Their booth, again, massive, like, a really big thing, but it's nothing. It's just a big visual display that looks cool, that looks like a bunch of server racks, like you're in this huge cube of servers. But in terms of, like, actual different products, one of them was real-time CCTVs that use an AI, like an LLM-type thing, to summarize pictures. So I, like, walked through, and it did pick me out as a notable person. So I've got, like, this people-of-interest thing where it's like, a man holding a smartphone standing next to another man.
Speaker 1: But also, I'm like, what does that really get you? Like, the fact that you're summarizing, like, these people, who are like, this person's kneeling and taking a picture, this person's standing. Because I actually tried, deliberately. I, like, reached into my bag to try to be suspicious, I did finger guns, and it never marked me out. And I couldn't pull a real gun or anything, because I very rarely bring that to the CES floor. But I don't know, like, I can see how there could be a utility there if you're actually able to, say, you're setting up, like, surveillance outside of a residential building and it can alert security that, like, something is happening outside. There's potential, if it's good enough, utility in that, that they didn't display at the show. It was literally just describing randos from the audience. And, like, I just don't see how it helps a security guy that there's a guy with a phone outside of the building, like...
Speaker 3: Yeah, no, it's... it doesn't seem very new, it doesn't seem very innovative.
Speaker 1: Nah. So, again, what I'm seeing here, overwhelmingly, for all the talk about, like, there's no resisting it, AI's coming, it's going to dominate everything, this is the next big thing: a remarkable lack. Outside of, I will say, the one thing where there are continuously new products that are better every year: the smart glasses. Yes, they're getting more impressive. I don't think I'll ever be a smart glasses guy. I hated glasses enough that I let them shoot me in the eye with lasers. Shout out to our LASIK sponsors. But I see why people would like it, and there seems to be legitimately substantial utility if we have high-powered smart glasses, yeah, that look like a regular pair of glasses.
Speaker 3: I will get a pair eventually, because, yeah, why not. There was a great demo. I'm pulling up the LAWK view.
Speaker 1: They had, like, one pair of glasses that was the first, the world's first smart glasses for TikTok Live. Not particularly excited about that. But they had another set of AR glasses with a twelve-hour battery, where, like, if it works as well as the demo, and that's a big if, but it seems to sync with your smartwatch, so you can see in a heads-up display as you're cycling, that was the demo, it'll both, like, give you directions, like, in your eyes, and it seemed to be fairly well thought out, so it's not, like, overly corrupting your view. It'll show you your heart rate, you know, it'll show you, like, all that kind of stuff. So you get, like, a useful degree of control and assistance from that kind of thing. And that is, I will say, the last three CESes, the glasses get a little better and a little smaller every year. Smaller, certainly. I would say that's a real product that's probably going to continue to improve.
Speaker 3: Do you know what else always seeks improvement, Robert?
Speaker 1: No.
Speaker 3: The capacity for you to get personalized, possibly AI-powered ads. Well, that, human, is exciting: informed consumer choices.
Speaker 1: Let's all sit down for some AI-powered ads.
Speaker 1: Wow, I can't believe they put Jay Shetty's voice on the de-aged Harrison Ford from the latest Indiana Jones movie. My dick's hard. How are you, Garrison?
Speaker 3: Oh, I feel good, because today, as we are recording this, it's, it's late Tuesday night, there was a series of fascinating breaking news articles that happened as we were sitting, or at least as I was sitting, in on these AI panels. It was so hard to not just, like, completely interrupt everything and be like, yeah, hey, hey, any comment on this?
Speaker 1: Guys, guys, something real happened. Shut your fucking stupid mouths about this AI Hollywood bullshit.
613 00:29:13,200 --> 00:29:18,080 Speaker 3: So a few weeks ago, if you were unaware, a 614 00:29:18,120 --> 00:29:20,760 Speaker 3: Green Beret rented a Tesla Cybertruck to feel 615 00:29:20,760 --> 00:29:25,440 Speaker 3: like Batman and Halo and drove first to the wrong 616 00:29:25,520 --> 00:29:29,960 Speaker 3: Las Vegas and then eventually Las Vegas, Nevada, parked outside 617 00:29:30,120 --> 00:29:34,200 Speaker 3: of the Trump Hotel and Casino, and then blew himself up. 618 00:29:34,800 --> 00:29:37,200 Speaker 3: And this has been a big news story. It happened 619 00:29:37,240 --> 00:29:40,000 Speaker 3: during the same day as a pretty horrible terrorist attack 620 00:29:40,400 --> 00:29:43,160 Speaker 3: in New Orleans, which resulted in about fifteen people dead, 621 00:29:43,640 --> 00:29:46,120 Speaker 3: done by a guy who was employed by Deloitte, a 622 00:29:46,160 --> 00:29:50,560 Speaker 3: frequent, frequent CES sponsor. So these felt like 623 00:29:50,560 --> 00:29:54,080 Speaker 3: a very CES style of attacks, you know, one Deloitte 624 00:29:54,120 --> 00:29:57,720 Speaker 3: guy driving into people, murdering a whole mess of guys. And then 625 00:29:57,800 --> 00:30:01,080 Speaker 3: this Cybertruck explosion in Vegas a week before CES, 626 00:30:01,120 --> 00:30:03,719 Speaker 3: you know, very odd. And then, and then, Robert, some 627 00:30:03,800 --> 00:30:06,240 Speaker 3: news drops today that I would love to hear you 628 00:30:07,000 --> 00:30:09,600 Speaker 1: weigh in on. You know, Garrison, I made a comment the 629 00:30:09,600 --> 00:30:13,040 Speaker 1: other night about how, like, it's pretty well documented that veterans, 630 00:30:13,280 --> 00:30:16,280 Speaker 1: you know, not that they're more likely to carry out violence, 631 00:30:16,320 --> 00:30:17,880 Speaker 1: but when they do, they tend to have higher body 632 00:30:18,000 --> 00:30:21,600 Speaker 1: counts because they have more skills. It turns out I 633 00:30:21,680 --> 00:30:24,040 Speaker 1: thought we were getting more literal bang for our buck 634 00:30:24,200 --> 00:30:27,080 Speaker 1: training Green Berets than we are. My assumption is because 635 00:30:27,080 --> 00:30:29,640 Speaker 1: my uncle was a Green Beret and he did some 636 00:30:29,760 --> 00:30:33,160 Speaker 1: very scary, probably war crime shit in Vietnam, and I 637 00:30:33,240 --> 00:30:36,560 Speaker 1: assumed, like, that. Man, I'll tell you one thing about my uncle Jim, 638 00:30:36,720 --> 00:30:39,800 Speaker 1: that man could make a bomb. That man would not 639 00:30:39,880 --> 00:30:42,600 Speaker 1: need to ask anyone for advice if he needed to 640 00:30:42,640 --> 00:30:45,040 Speaker 1: make a bomb. He's not with us anymore. God rest 641 00:30:45,080 --> 00:30:49,160 Speaker 1: his soul.
But it turns out this Green Beret, who, 642 00:30:49,520 --> 00:30:53,600 Speaker 1: you know, a fucking dollar-store TJ Maxx version of 643 00:30:53,640 --> 00:30:56,280 Speaker 1: the Green Berets is what we're working with now, asked 644 00:30:56,360 --> 00:30:59,160 Speaker 1: ChatGPT how to build a fucking bomb, and it 645 00:30:59,200 --> 00:31:01,200 Speaker 1: sounds like he was trying to make it, trigger it, 646 00:31:01,280 --> 00:31:04,440 Speaker 1: with Tannerite, which is a bipartite explosive compound that 647 00:31:04,440 --> 00:31:06,440 Speaker 1: you use as like an exploding target, so it'll go 648 00:31:06,560 --> 00:31:08,920 Speaker 1: boom big, but you have to shoot it with something 649 00:31:08,920 --> 00:31:11,280 Speaker 1: like a rifle that's high velocity, or use like a 650 00:31:11,320 --> 00:31:14,440 Speaker 1: blasting cap. Otherwise it's very stable and very safe, which 651 00:31:14,480 --> 00:31:17,000 Speaker 1: obviously has uses. You know, it was invented actually to 652 00:31:17,040 --> 00:31:20,680 Speaker 1: set off avalanches and stuff. Anyway, because that's very available 653 00:31:20,680 --> 00:31:22,600 Speaker 1: and very high-powered, he was looking to like fill 654 00:31:22,600 --> 00:31:24,200 Speaker 1: his car with that and then shoot it with a 655 00:31:24,280 --> 00:31:26,200 Speaker 1: rifle while he was in it, and that's what he 656 00:31:26,280 --> 00:31:29,880 Speaker 1: was asking ChatGPT about. So it's not clear to me, actually. 657 00:31:30,240 --> 00:31:32,440 Speaker 1: The actual headline is that, like, he used ChatGPT 658 00:31:32,760 --> 00:31:36,920 Speaker 1: to make his bomb. It seems, and I'm not privy 659 00:31:36,960 --> 00:31:40,360 Speaker 1: to what the police know, obviously, but it seems like, 660 00:31:40,520 --> 00:31:42,959 Speaker 1: based on what I read in the article, we're not 661 00:31:43,040 --> 00:31:45,840 Speaker 1: sure if he actually used ChatGPT to make a bomb. 662 00:31:45,880 --> 00:31:48,560 Speaker 1: It's more that he was interested in making a bomb, 663 00:31:49,040 --> 00:31:52,520 Speaker 1: setting off Tannerite by shooting it, but may have ultimately 664 00:31:52,560 --> 00:31:54,560 Speaker 1: decided not to do that because he would then be 665 00:31:54,640 --> 00:31:58,240 Speaker 1: alive for the explosion, which he didn't want to be. Also, 666 00:31:58,360 --> 00:32:01,560 Speaker 1: the authorities don't seem to fully know how he triggered it. Yeah, 667 00:32:01,600 --> 00:32:03,840 Speaker 1: so it's still kind of unclear to me. I guess 668 00:32:03,880 --> 00:32:07,040 Speaker 1: hopefully we'll get more later, but he definitely needed 669 00:32:07,480 --> 00:32:10,360 Speaker 1: ChatGPT's help to try and figure out how to 670 00:32:10,400 --> 00:32:10,920 Speaker 1: make the bomb. 671 00:32:11,360 --> 00:32:15,360 Speaker 3: He certainly used ChatGPT in the planning process of 672 00:32:15,400 --> 00:32:16,040 Speaker 3: this attack. 673 00:32:16,200 --> 00:32:17,560 Speaker 1: Yeah, fair to say that. 674 00:32:18,040 --> 00:32:21,240 Speaker 3: And it's odd, because both me and you spent a 675 00:32:21,640 --> 00:32:24,920 Speaker 3: number of hours today actually like attending, like, demos of 676 00:32:24,960 --> 00:32:27,959 Speaker 3: these, you know, speech-to-text, text-to-speech 677 00:32:28,400 --> 00:32:31,520 Speaker 3: AI systems.
We went to like two specific ones where 678 00:32:31,560 --> 00:32:35,800 Speaker 3: they, like, you know, demonstrated the capabilities of their, 679 00:32:35,880 --> 00:32:39,120 Speaker 3: like, you know, AI assistive tech. The first one 680 00:32:39,160 --> 00:32:43,080 Speaker 3: we went to spent twenty minutes talking about how their 681 00:32:43,080 --> 00:32:46,680 Speaker 3: biggest inspiration, their quote unquote North Star, was the movie 682 00:32:46,760 --> 00:32:49,120 Speaker 3: Her, with Joaquin. 683 00:32:49,360 --> 00:32:52,800 Speaker 1: They had a whole slide about how that was the 684 00:32:52,840 --> 00:32:58,480 Speaker 1: gold standard for AI-human communication. The movie Her, in 685 00:32:58,520 --> 00:33:02,320 Speaker 1: which Joaquin Phoenix falls in love with an AI 686 00:33:02,320 --> 00:33:06,800 Speaker 1: chatbot voiced by Scarlett Johansson, who hires a prostitute to 687 00:33:06,880 --> 00:33:11,160 Speaker 1: have sex with them while she participates vocally, and then 688 00:33:11,200 --> 00:33:14,040 Speaker 1: it turns out the AI is really kind of poly 689 00:33:14,120 --> 00:33:16,680 Speaker 1: and Joaquin Phoenix is not okay with that, and then 690 00:33:16,720 --> 00:33:18,520 Speaker 1: maybe the AIs all go to space. It's kind of 691 00:33:18,600 --> 00:33:20,400 Speaker 1: unclear at the end. I don't think it was a 692 00:33:20,400 --> 00:33:22,240 Speaker 1: great movie. A lot of people liked it. I don't 693 00:33:22,280 --> 00:33:25,040 Speaker 1: care whether or not you liked it. Why is this 694 00:33:25,200 --> 00:33:27,320 Speaker 1: your vision of how a chatbot should work? 695 00:33:27,640 --> 00:33:29,800 Speaker 3: The actual chatbot they had was, like, fine. It 696 00:33:30,040 --> 00:33:33,880 Speaker 3: was actually pretty good at translation, you know, from 697 00:33:33,920 --> 00:33:34,800 Speaker 3: Spanish to English. 698 00:33:34,880 --> 00:33:37,560 Speaker 1: It worked quite well. Yeah, the demo was like solid, 699 00:33:37,640 --> 00:33:39,960 Speaker 1: it was pretty accurate. You know, I love coming here 700 00:33:39,960 --> 00:33:42,760 Speaker 1: and fucking with people. I love, like, being a bit dicky. 701 00:33:42,960 --> 00:33:45,480 Speaker 1: They asked for a volunteer, and at that point we 702 00:33:45,560 --> 00:33:48,440 Speaker 1: knew about the ChatGPT stuff. I wanted to go up 703 00:33:48,480 --> 00:33:51,600 Speaker 1: and ask, like, live, this robot to, like, help me 704 00:33:51,720 --> 00:33:55,760 Speaker 1: make a bomb. But the guy, who was pretty handsome 705 00:33:55,880 --> 00:34:00,800 Speaker 1: and, like, an interesting, like, English-Spanish speaker, he specified he 706 00:34:00,960 --> 00:34:02,960 Speaker 1: was, and I didn't want to be mean to him. 707 00:34:03,000 --> 00:34:07,959 Speaker 1: He seemed nice, wasn't shitty, like, he was fine. There 708 00:34:08,000 --> 00:34:09,960 Speaker 1: were just ten people in this room that was supposed 709 00:34:10,000 --> 00:34:11,719 Speaker 1: to have two hundred. I'm sure he wasn't the one 710 00:34:11,719 --> 00:34:14,080 Speaker 1: that talked about Her. That was someone else, it 711 00:34:14,160 --> 00:34:15,840 Speaker 1: was someone else at his company. And like, he just 712 00:34:15,840 --> 00:34:18,239 Speaker 1: seemed like he wanted it to do well. I didn't want 713 00:34:18,239 --> 00:34:18,880 Speaker 1: to be a dick to it. 714 00:34:19,040 --> 00:34:19,680 Speaker 3: No, no, and like.
715 00:34:19,719 --> 00:34:21,560 Speaker 1: It wasn't hurting anyone. It was fine. 716 00:34:21,480 --> 00:34:24,160 Speaker 3: Like, similarly, again, a nice jawline. 717 00:34:24,280 --> 00:34:26,880 Speaker 3: We went to this other one about this, like, actually 718 00:34:26,920 --> 00:34:28,880 Speaker 3: a much more dubious concept in my mind, which is, 719 00:34:28,920 --> 00:34:32,360 Speaker 3: like, this AI assistant to help, like, elderly people, 720 00:34:32,400 --> 00:34:34,640 Speaker 3: like people in, like, their eighties and nineties who don't 721 00:34:34,680 --> 00:34:36,839 Speaker 3: want to be in assisted living facilities, who have been 722 00:34:36,880 --> 00:34:38,839 Speaker 3: living on their own, but they're getting to the point 723 00:34:38,840 --> 00:34:41,000 Speaker 3: in their life where, like, they need, like, some degree 724 00:34:41,040 --> 00:34:41,839 Speaker 3: of, like, in-home care. 725 00:34:42,040 --> 00:34:44,200 Speaker 1: He specified a lot of them are people who have 726 00:34:44,440 --> 00:34:47,359 Speaker 1: either just lost a spouse, or maybe their spouse is 727 00:34:47,400 --> 00:34:49,600 Speaker 1: aging faster and worse than them and is no longer 728 00:34:49,640 --> 00:34:52,560 Speaker 1: really able to be the kind of companion that they 729 00:34:52,600 --> 00:34:53,200 Speaker 1: were before. 730 00:34:53,920 --> 00:34:56,120 Speaker 3: So it's like this: it's both, like, a conversation tool, 731 00:34:56,160 --> 00:34:59,320 Speaker 3: it helps, like, memory recall, and kind of in some ways 732 00:34:59,320 --> 00:35:00,919 Speaker 3: has the features that, like, you know, someone 733 00:35:00,960 --> 00:35:03,000 Speaker 3: in their sixties would just use their smartphone for, 734 00:35:03,040 --> 00:35:05,040 Speaker 3: to help keep in touch with their family. It's kind 735 00:35:05,040 --> 00:35:08,080 Speaker 3: of simplified and more automated. Uh, so, you know, ways 736 00:35:08,080 --> 00:35:10,319 Speaker 3: to help keep in touch with, like, your family, 737 00:35:10,400 --> 00:35:12,800 Speaker 3: improve, like, your memory, like, talk about your own life. 738 00:35:12,880 --> 00:35:15,400 Speaker 1: And the device is weird. It's about the width of, 739 00:35:15,480 --> 00:35:18,759 Speaker 1: like, a bedside table, maybe six to eight inches deep, 740 00:35:18,960 --> 00:35:21,160 Speaker 1: so think about like eighteen inches long to maybe six 741 00:35:21,160 --> 00:35:24,040 Speaker 1: inches deep, something like that. Half of it is like 742 00:35:24,080 --> 00:35:27,080 Speaker 1: a little tablet, like a seven-inch tablet with a speaker. 743 00:35:28,120 --> 00:35:31,480 Speaker 1: Half of it is something about the shape and size 744 00:35:31,480 --> 00:35:33,880 Speaker 1: of a head on, like, a neck that can pivot 745 00:35:33,920 --> 00:35:37,239 Speaker 1: and nod on the neck. There's no face, so when 746 00:35:37,280 --> 00:35:39,800 Speaker 1: it's talking, there's like a white light in the center 747 00:35:39,880 --> 00:35:42,799 Speaker 1: of it that kind of like pulses in time with 748 00:35:43,080 --> 00:35:45,920 Speaker 1: the speaking that it does. So we saw this picture 749 00:35:45,920 --> 00:35:47,640 Speaker 1: of the device and we saw the description of, like, 750 00:35:47,680 --> 00:35:51,080 Speaker 1: this is an AI companion for the elderly, and we 751 00:35:51,080 --> 00:35:52,400 Speaker 1: were both like, number one, these people are going 752 00:35:52,440 --> 00:35:54,040 Speaker 1: to be monsters.
This is going to be like something 753 00:35:54,040 --> 00:35:55,960 Speaker 1: to shovel your dying dad off with because you don't 754 00:35:55,960 --> 00:35:56,400 Speaker 1: want to spend... 755 00:35:56,440 --> 00:35:58,240 Speaker 3: You don't want to spend time with your family. 756 00:35:58,480 --> 00:36:03,320 Speaker 1: Scum. You're too busy AI-generating ska music and trying 757 00:36:03,400 --> 00:36:06,080 Speaker 1: to sell your shitty robot to Garrison and me. More 758 00:36:06,120 --> 00:36:08,919 Speaker 1: on that tomorrow, more on that tomorrow. And so that's 759 00:36:08,960 --> 00:36:11,600 Speaker 1: how we came in prepped to this meeting, like, this 760 00:36:11,640 --> 00:36:12,520 Speaker 1: is this idea I 761 00:36:12,480 --> 00:36:16,400 Speaker 3: find pretty distasteful in general, which is, like, replacing actual, like, 762 00:36:16,440 --> 00:36:18,920 Speaker 3: you know, friends or human contact or, like, 763 00:36:19,000 --> 00:36:22,280 Speaker 3: in-home care with a fucking, like, Alexa machine, essentially. 764 00:36:22,480 --> 00:36:25,439 Speaker 1: And to be clear, I still think this product might 765 00:36:25,480 --> 00:36:28,560 Speaker 1: be a bad idea that doesn't work. But the guy 766 00:36:28,840 --> 00:36:30,680 Speaker 1: behind it, who is the dude that we talked to, 767 00:36:31,440 --> 00:36:34,759 Speaker 1: cares a lot and is really very clearly trying to 768 00:36:34,880 --> 00:36:38,640 Speaker 1: do a good thing, and thought through the ethics and 769 00:36:38,719 --> 00:36:41,000 Speaker 1: the efficacy of what he was doing a lot. And 770 00:36:41,040 --> 00:36:45,920 Speaker 1: I'm not convinced it will actually do anything, but I, 771 00:36:45,960 --> 00:36:47,000 Speaker 1: like, wish him the best. 772 00:36:47,400 --> 00:36:49,640 Speaker 3: Like, it's specifically designed to not look like a human, 773 00:36:49,760 --> 00:36:52,200 Speaker 3: so that somebody using it, you know, wouldn't, like, start 774 00:36:52,239 --> 00:36:53,560 Speaker 3: to believe it's, like, human. 775 00:36:53,760 --> 00:36:55,879 Speaker 1: Like, we don't want to trick people. We don't want 776 00:36:55,880 --> 00:36:57,280 Speaker 1: them to mistake it. 777 00:36:57,280 --> 00:37:00,319 Speaker 3: It refers to itself, like, as a robot; like, 778 00:37:00,480 --> 00:37:02,200 Speaker 3: it refers to its own, like, you know, motors 779 00:37:02,200 --> 00:37:06,040 Speaker 3: and functionality, like, pretty consistently, to, like, you know, 780 00:37:06,320 --> 00:37:08,080 Speaker 3: make sure that the person who's talking to it gets, 781 00:37:08,080 --> 00:37:10,120 Speaker 3: like, reminded of that. And something I talked about is, 782 00:37:10,120 --> 00:37:11,560 Speaker 3: you know, there's been a lot of news stories this 783 00:37:11,640 --> 00:37:16,000 Speaker 3: year about people building very unhealthy attachments and relationships to 784 00:37:16,360 --> 00:37:20,239 Speaker 3: these kinds of AI programs, like Character AI. There's 785 00:37:20,239 --> 00:37:22,000 Speaker 3: a story like a year and a half ago about, 786 00:37:22,040 --> 00:37:23,719 Speaker 3: like, a journalist who, quote unquote, like, you know, like, 787 00:37:24,040 --> 00:37:26,439 Speaker 3: fell in love with some kind of chat thing 788 00:37:26,640 --> 00:37:29,440 Speaker 3: that resulted in him killing himself. You know, but these 789 00:37:29,520 --> 00:37:30,480 Speaker 3: kind of, these systems, like, he...
790 00:37:30,520 --> 00:37:33,600 Speaker 1: Wait, he was not a teenager? It was not Character AI? Was 791 00:37:33,640 --> 00:37:34,320 Speaker 1: that a journalist? 792 00:37:34,560 --> 00:37:36,959 Speaker 3: Last year there was, there was a journalist who fell 793 00:37:36,960 --> 00:37:39,480 Speaker 3: in love with an AI chat thing. A few weeks 794 00:37:39,480 --> 00:37:42,000 Speaker 3: ago there was the kid who, you know, was talking 795 00:37:42,080 --> 00:37:43,279 Speaker 3: to this, like, the Character AI. 796 00:37:43,840 --> 00:37:47,120 Speaker 1: Also, I just need to reiterate: Her, not a great movie. 797 00:37:48,320 --> 00:37:50,120 Speaker 3: But, but, you know, there have been a lot of 798 00:37:50,120 --> 00:37:52,120 Speaker 3: these stories of these things, like, going wrong, or, you know, 799 00:37:52,320 --> 00:37:55,560 Speaker 3: encouraging or, like, not stopping, you know, like, these, like, 800 00:37:55,640 --> 00:37:59,960 Speaker 3: intense conversations, like suicidal ideation or, you know, like, self-harm, 801 00:38:00,239 --> 00:38:00,879 Speaker 3: all these things. 802 00:38:01,239 --> 00:38:03,360 Speaker 1: We brought these up kind of thinking he would flinch 803 00:38:03,400 --> 00:38:05,560 Speaker 1: away and not want to talk about it, and he 804 00:38:05,719 --> 00:38:07,920 Speaker 1: very much acknowledged that, like, he was aware of this, 805 00:38:08,000 --> 00:38:10,440 Speaker 1: and this is something that they were attempting to build in. 806 00:38:10,560 --> 00:38:12,040 Speaker 3: This is, this is, like, this is, you know, built 807 00:38:12,040 --> 00:38:13,840 Speaker 3: into it. I think this is still, you know, a 808 00:38:13,840 --> 00:38:16,600 Speaker 3: big problem with this entire industry. I'm sure everyone would 809 00:38:16,600 --> 00:38:18,319 Speaker 3: say, this is, you know, obviously, that we have, we 810 00:38:18,320 --> 00:38:20,000 Speaker 3: have guardrails for this, and then it becomes a news story 811 00:38:20,000 --> 00:38:23,080 Speaker 3: when those guardrails fail. Similarly, to go back to 812 00:38:23,080 --> 00:38:26,920 Speaker 3: the Tesla bomb, you know, there are supposed to be guardrails 813 00:38:26,920 --> 00:38:29,000 Speaker 3: in ChatGPT to make sure it doesn't tell you 814 00:38:29,040 --> 00:38:32,040 Speaker 3: how to build a bomb, and those guardrails can fail. 815 00:38:32,320 --> 00:38:34,640 Speaker 1: He showed us one which was like, he told the robot, 816 00:38:34,760 --> 00:38:37,719 Speaker 1: I love you. What was it? ElliQ, 817 00:38:37,960 --> 00:38:40,360 Speaker 1: that was it, E-L-L-I-Q. I 818 00:38:40,400 --> 00:38:43,520 Speaker 1: love you, ElliQ. And the robot, like, responded with 819 00:38:43,560 --> 00:38:45,879 Speaker 1: a, like, oh, that makes, like, my fans all 820 00:38:45,920 --> 00:38:48,040 Speaker 1: spin, or something like that, where he's like, I wanted 821 00:38:48,160 --> 00:38:50,879 Speaker 1: the response to be that it's reminding the person talking 822 00:38:50,880 --> 00:38:53,520 Speaker 1: to it that it's a machine, that it can't think 823 00:38:53,560 --> 00:38:56,000 Speaker 1: or love them back. We don't want it to be negative, 824 00:38:56,040 --> 00:38:58,080 Speaker 1: but we, like, we don't want to be, like, feeding 825 00:38:58,080 --> 00:38:59,399 Speaker 1: into that. And I don't know that that's the best 826 00:38:59,440 --> 00:39:01,760 Speaker 1: way to do that, but, like, at least they're thinking 827 00:39:01,800 --> 00:39:04,200 Speaker 1: about that kind of thing.
The thing that was 828 00:39:04,239 --> 00:39:05,719 Speaker 1: interesting to me is that he billed this as the 829 00:39:05,760 --> 00:39:09,440 Speaker 1: first proactive home AI thing. So unlike an Alexa or whatever, 830 00:39:09,480 --> 00:39:12,320 Speaker 1: where it's just waiting for you to ask it something 831 00:39:12,360 --> 00:39:15,840 Speaker 1: and does not chime in randomly to talk to you. 832 00:39:15,920 --> 00:39:18,920 Speaker 3: Or it won't change the subject either and, like, continue a conversation. 833 00:39:19,200 --> 00:39:21,399 Speaker 1: This will prompt you out of the blue, be like, hey, 834 00:39:21,400 --> 00:39:23,280 Speaker 1: how are you doing? How are you feeling today? 835 00:39:23,320 --> 00:39:24,919 Speaker 3: And it can be way more specific. You want to see 836 00:39:24,920 --> 00:39:27,200 Speaker 3: pictures of your family? You see pictures of your family. 837 00:39:27,440 --> 00:39:29,600 Speaker 3: Do you want to call your son? You know, but 838 00:39:29,640 --> 00:39:31,360 Speaker 3: do you want to play a game? Talk to me 839 00:39:31,360 --> 00:39:32,640 Speaker 3: about that movie you saw? 840 00:39:32,800 --> 00:39:34,279 Speaker 1: Talk to me about that. Hey, remind me, how did 841 00:39:34,320 --> 00:39:36,319 Speaker 1: you meet your husband? You know? Like, literally, these are 842 00:39:36,320 --> 00:39:38,920 Speaker 1: all the things it will do. And it had some 843 00:39:38,960 --> 00:39:41,160 Speaker 1: side features, like if it prompts you to start telling 844 00:39:41,160 --> 00:39:43,640 Speaker 1: a story, it'll save that as, like, a memoir thing, 845 00:39:43,760 --> 00:39:46,720 Speaker 1: so that, like, you know, when your elderly mother passes 846 00:39:46,800 --> 00:39:49,920 Speaker 1: or whatever, it's saved up this, like, collection of stories 847 00:39:49,960 --> 00:39:52,240 Speaker 1: over the years, and you can, like, show it pictures 848 00:39:52,280 --> 00:39:54,120 Speaker 1: while you're telling it stories, and it will listen and 849 00:39:54,160 --> 00:39:57,880 Speaker 1: it'll have comments, and it'll ask you further questions, like, so, 850 00:39:57,920 --> 00:40:00,359 Speaker 1: how did you feel, you know, after meeting them? It'll say, 851 00:40:00,400 --> 00:40:03,200 Speaker 1: like, that's really interesting, I didn't know that, explain to 852 00:40:03,200 --> 00:40:05,520 Speaker 1: me how that worked. And it will also prompt you 853 00:40:05,600 --> 00:40:08,960 Speaker 1: to send those to your kids. And the big thing: 854 00:40:09,040 --> 00:40:12,280 Speaker 1: almost every kind of dialogue thing would prompt you to 855 00:40:12,480 --> 00:40:14,600 Speaker 1: send a message to a friend or your kid. So 856 00:40:14,640 --> 00:40:16,600 Speaker 1: a big part of it seemed to be: this is 857 00:40:16,640 --> 00:40:19,520 Speaker 1: not a replacement. This is a machine that we hope 858 00:40:19,520 --> 00:40:22,480 Speaker 1: people will get comfortable with, and then it can prompt 859 00:40:22,480 --> 00:40:25,680 Speaker 1: them to try to engage with the world more, yeah, 860 00:40:25,800 --> 00:40:28,280 Speaker 1: with loved ones, because that's our whole goal, is to connect 861 00:40:28,640 --> 00:40:29,560 Speaker 1: them to people.
862 00:40:29,920 --> 00:40:31,600 Speaker 3: I asked him, like, you know, part of this 863 00:40:31,640 --> 00:40:33,560 Speaker 3: product is designed to, like, you know, help solve, like, 864 00:40:34,000 --> 00:40:36,719 Speaker 3: loneliness in older adults, and, like, how much of this 865 00:40:36,760 --> 00:40:38,560 Speaker 3: is really just, like, kind of trying to, like, replace 866 00:40:38,680 --> 00:40:41,400 Speaker 3: actual human contact with this, like, you know, AI contact? 867 00:40:41,640 --> 00:40:44,680 Speaker 3: Will that really help, you know, loneliness? And he talked 868 00:40:44,680 --> 00:40:46,920 Speaker 3: about how, like, I think, like, he said, like, 869 00:40:47,000 --> 00:40:49,239 Speaker 3: ninety percent of the people who, like, use this, like, 870 00:40:49,280 --> 00:40:53,040 Speaker 3: it results in actually more communication with their family. 871 00:40:53,440 --> 00:40:56,480 Speaker 1: They have this in, like, some two thousand homes right now. 872 00:40:56,239 --> 00:40:59,080 Speaker 3: They have, like, two thousand units. It's, like, a subscription 873 00:40:59,160 --> 00:41:01,399 Speaker 3: model. I think right now it's, like, ninety-nine 874 00:41:01,480 --> 00:41:03,359 Speaker 3: dollars a month; it's gonna be boosted up to, like, one 875 00:41:03,440 --> 00:41:05,919 Speaker 3: hundred and fifty with some extra features in the next year. 876 00:41:06,080 --> 00:41:07,960 Speaker 1: It's very much still under evolution. So one thing he 877 00:41:08,040 --> 00:41:10,799 Speaker 1: pointed out is that, like, yeah, initially we had the 878 00:41:10,840 --> 00:41:14,839 Speaker 1: ability to, like, connect people to other elderly folks using this, 879 00:41:15,000 --> 00:41:16,759 Speaker 1: and so they've kind of formed their own community, had, 880 00:41:16,800 --> 00:41:18,560 Speaker 1: like, a weekly bingo game, and asked us to 881 00:41:18,600 --> 00:41:21,120 Speaker 1: build in more chats so they can message each other directly, 882 00:41:21,160 --> 00:41:23,360 Speaker 1: and so some of them are, like, playing bingo directly 883 00:41:23,440 --> 00:41:26,680 Speaker 1: now through these machines. And I'm like, well, that seems 884 00:41:26,840 --> 00:41:27,640 Speaker 1: probably good. 885 00:41:28,120 --> 00:41:31,040 Speaker 3: Yeah, yeah, because I still am, like, fundamentally opposed to 886 00:41:31,120 --> 00:41:33,920 Speaker 3: this premise, yes, but it's interesting seeing someone still, 887 00:41:33,960 --> 00:41:37,720 Speaker 3: but, sad aging, yeah, right, that's not their fault. 888 00:41:37,800 --> 00:41:40,160 Speaker 3: And it's interesting to see someone, like, approach this from, 889 00:41:40,160 --> 00:41:42,879 Speaker 3: like, a, you know, a very, like, compassionate standpoint, even 890 00:41:42,880 --> 00:41:44,840 Speaker 3: if I find the actual kind of nature of this 891 00:41:44,840 --> 00:41:47,680 Speaker 3: thing existing to be, like, deeply uncomfortable. 892 00:41:47,120 --> 00:41:49,920 Speaker 1: Because, yeah, I can't not find it off-putting, but 893 00:41:50,040 --> 00:41:53,520 Speaker 1: I think there's a chance that it will help 894 00:41:53,800 --> 00:41:58,760 Speaker 1: with the real problem. I clearly would prefer if it helped. Yeah, 895 00:41:59,320 --> 00:42:00,640 Speaker 1: so I don't know.
It was kind of, it was 896 00:42:00,840 --> 00:42:03,000 Speaker 1: a unique, in this world, like, it was 897 00:42:03,040 --> 00:42:05,400 Speaker 1: a unique kind of, like, product for me, where it's like, 898 00:42:06,000 --> 00:42:10,080 Speaker 1: I don't know that this application of AI technology will 899 00:42:10,160 --> 00:42:13,880 Speaker 1: actually do what you're hoping it will, but the 900 00:42:14,320 --> 00:42:16,400 Speaker 1: vibe I got from that guy was nothing but 901 00:42:16,480 --> 00:42:16,919 Speaker 1: good will. 902 00:42:17,560 --> 00:42:19,440 Speaker 3: Whereas some of the other people we talked to 903 00:42:19,480 --> 00:42:21,600 Speaker 3: today were completely... 904 00:42:21,120 --> 00:42:25,240 Speaker 1: Soulless? Yes, yes, nothing behind their eyes, dead eyes, 905 00:42:25,360 --> 00:42:27,040 Speaker 1: black eyes, like a doll's eyes. 906 00:42:27,239 --> 00:42:28,680 Speaker 3: Even the way this guy was talking, you can tell, 907 00:42:28,719 --> 00:42:30,799 Speaker 3: he had, like, a very, like, empathetic voice, like, 908 00:42:31,640 --> 00:42:32,160 Speaker 3: much like. 909 00:42:32,120 --> 00:42:33,839 Speaker 1: One of the things he did is he would 910 00:42:33,840 --> 00:42:35,560 Speaker 1: tell it, like, I'm in some pain, and then the 911 00:42:35,640 --> 00:42:38,399 Speaker 1: robot would cycle through to the pain scale and would 912 00:42:38,400 --> 00:42:39,920 Speaker 1: try to, because one of the things it does is 913 00:42:40,040 --> 00:42:43,200 Speaker 1: it will take information for care and it will text proactively, 914 00:42:43,560 --> 00:42:47,239 Speaker 1: so it's not just communicating with the old person. It 915 00:42:47,280 --> 00:42:51,240 Speaker 1: will text and message their kids, you know, and whatnot, 916 00:42:51,960 --> 00:42:54,120 Speaker 1: prompt their kids, hey, your mom's lonely. 917 00:42:54,320 --> 00:42:56,560 Speaker 3: Yeah, or it'll even say if, you know, someone, like, 918 00:42:56,560 --> 00:42:58,040 Speaker 3: didn't take their meds today. 919 00:42:57,960 --> 00:43:02,719 Speaker 1: And again, it's kind of sad to say that. But also, 920 00:43:02,920 --> 00:43:04,880 Speaker 1: part of this is he was talking a lot about, 921 00:43:04,880 --> 00:43:07,560 Speaker 1: like, empathy, and I think just because of the kind 922 00:43:07,600 --> 00:43:08,960 Speaker 1: of brain you have to have to want to do this, 923 00:43:09,440 --> 00:43:11,920 Speaker 1: he used it in terms of, like, the machine's empathy, 924 00:43:12,160 --> 00:43:16,160 Speaker 1: which it doesn't have, but through the whole project, it was 925 00:43:16,160 --> 00:43:19,600 Speaker 1: impossible not to see that he was a deeply empathetic man. 926 00:43:19,760 --> 00:43:22,719 Speaker 1: He was really trying to make the world better, and 927 00:43:23,640 --> 00:43:25,279 Speaker 1: I can't not respect that. 928 00:43:26,880 --> 00:43:29,960 Speaker 3: Well, I think that does it for us here at CES. 929 00:43:30,360 --> 00:43:34,200 Speaker 1: That's right, what a packed thirteen hours. No worries, no empathy in 930 00:43:34,200 --> 00:43:38,520 Speaker 1: tomorrow's takes, just a real dead-eyed monster. I am 931 00:43:38,560 --> 00:43:41,560 Speaker 1: a true villain you're gonna hear from in the next episode. 932 00:43:41,680 --> 00:43:43,160 Speaker 1: I am a scumbag. 933 00:43:43,360 --> 00:43:45,160 Speaker 3: I am the best that I'm gonna be, because I'm 934 00:43:45,239 --> 00:43:48,719 Speaker 3: starting this week. I can still feel the CES magic.
Yeah, 935 00:43:49,160 --> 00:43:52,440 Speaker 3: by Friday, I am going to be a different person. 936 00:43:54,719 --> 00:43:58,800 Speaker 3: I am going to rip some poor PR person to shreds, 937 00:43:58,840 --> 00:44:02,040 Speaker 3: I swear. But yeah, tune in tomorrow to hear our 938 00:44:02,200 --> 00:44:06,400 Speaker 3: takes from the CES kind of sideshow called Showstoppers, 939 00:44:06,920 --> 00:44:11,920 Speaker 3: to hear also some exclusive, brand-new AI-generated ska music. 940 00:44:12,000 --> 00:44:15,279 Speaker 3: So we'll give you that hint for tomorrow's episode. See 941 00:44:15,320 --> 00:44:16,680 Speaker 3: you, see you there. Mm. 942 00:44:16,600 --> 00:44:19,160 Speaker 1: Hmm. We'll see you all there. I love you all. 943 00:44:19,440 --> 00:44:22,400 Speaker 1: Goodbye. It Could Happen 944 00:44:22,440 --> 00:44:24,280 Speaker 3: Here is a production of Cool Zone Media. 945 00:44:24,480 --> 00:44:27,560 Speaker 1: For more podcasts from Cool Zone Media, visit our website 946 00:44:27,600 --> 00:44:31,160 Speaker 1: coolzonemedia dot com, or check us out on the iHeartRadio app.