1 00:00:01,720 --> 00:00:07,520 Speaker 1: Al Zone Media. 2 00:00:06,120 --> 00:00:10,040 Speaker 2: Sweet Baby, Jiminy Christmas, Welcome back to It Could Happen 3 00:00:10,119 --> 00:00:14,120 Speaker 2: Here a podcast. It's normally about all of the sad 4 00:00:14,160 --> 00:00:18,400 Speaker 2: and horrifying and violent and dangerous and sometimes inspiring things 5 00:00:18,480 --> 00:00:21,720 Speaker 2: happening around the world. But this week, well, today is 6 00:00:21,760 --> 00:00:27,560 Speaker 2: about something different. Today We're talking about ces finally, for 7 00:00:27,680 --> 00:00:29,400 Speaker 2: those of you who don't know or who are new 8 00:00:29,440 --> 00:00:32,960 Speaker 2: to the show. Every year in Las Vegas, Nevada, a 9 00:00:33,159 --> 00:00:35,920 Speaker 2: bunch of the world's big tech companies come together for 10 00:00:36,000 --> 00:00:39,519 Speaker 2: the Consumer Electronics Show, where they present their visions for 11 00:00:39,600 --> 00:00:42,240 Speaker 2: the future, the new products that will be coming out 12 00:00:42,280 --> 00:00:44,080 Speaker 2: that year, and stuff that will be coming out and 13 00:00:44,159 --> 00:00:47,199 Speaker 2: used to come that's less developed, and the whole industry 14 00:00:47,240 --> 00:00:50,879 Speaker 2: talks about itself, and Garrison and I show up and 15 00:00:52,000 --> 00:00:54,920 Speaker 2: largely just kind of let it wash over us like 16 00:00:55,840 --> 00:01:02,480 Speaker 2: a warming tide of lukewarm garbage water, fairy lukewarmy, very lukewarm, 17 00:01:02,600 --> 00:01:06,280 Speaker 2: and it smells like someone did not clean their fridge 18 00:01:06,280 --> 00:01:08,759 Speaker 2: out often enough before putting it into the trash. 19 00:01:08,840 --> 00:01:13,560 Speaker 3: That was the feeling of showstoppers tonight, the media only 20 00:01:14,400 --> 00:01:16,680 Speaker 3: presentation on the finest products of CEES. 21 00:01:16,880 --> 00:01:19,160 Speaker 2: Yeah, yeah, why don't we start? So, I mean I mean, 22 00:01:19,520 --> 00:01:22,320 Speaker 2: there's two different things that are interesting about CEES broadly. 23 00:01:22,360 --> 00:01:24,760 Speaker 2: One of them is people bring gadgets that are not 24 00:01:24,880 --> 00:01:26,960 Speaker 2: out yet that it will be coming out this year 25 00:01:27,080 --> 00:01:29,520 Speaker 2: or coming out soon, and you can actually test them 26 00:01:29,520 --> 00:01:33,160 Speaker 2: and use them and see how technology is progressing, and 27 00:01:33,200 --> 00:01:35,880 Speaker 2: that can be kind of fun. The downside of that 28 00:01:36,040 --> 00:01:39,479 Speaker 2: is that people also bring gadgets that are crap. Right, 29 00:01:39,680 --> 00:01:41,560 Speaker 2: some guy has a vision for a way to like, 30 00:01:42,120 --> 00:01:44,919 Speaker 2: you know, there's not a good way for blind people 31 00:01:45,000 --> 00:01:48,440 Speaker 2: to use the pogo stick while watching Netflix, and so 32 00:01:48,560 --> 00:01:51,080 Speaker 2: I have created this product, right, Or like, there's not 33 00:01:51,200 --> 00:01:54,360 Speaker 2: a good way for children to test their blood alcohol 34 00:01:54,480 --> 00:01:57,120 Speaker 2: level before getting behind the wheel of a jeep Grand Cherokee, 35 00:01:57,200 --> 00:01:59,320 Speaker 2: and I have invented the device to make it about 36 00:01:59,400 --> 00:02:03,840 Speaker 2: things that, like I have no conceivable audience or utilization. Right. 37 00:02:04,080 --> 00:02:06,520 Speaker 2: That's the other side of the gadget part of CEES. 38 00:02:06,600 --> 00:02:08,519 Speaker 2: And then outside of that you get a hint at 39 00:02:08,560 --> 00:02:10,920 Speaker 2: like there's all these panels where people from the industry 40 00:02:10,919 --> 00:02:13,400 Speaker 2: come to talk about the major trends in technology, how 41 00:02:13,400 --> 00:02:15,720 Speaker 2: things are developing, and what they see as the future. 42 00:02:15,760 --> 00:02:18,440 Speaker 2: And so there's both here's what they're going to try 43 00:02:18,480 --> 00:02:20,640 Speaker 2: to sell us and here's the devices that might change 44 00:02:20,680 --> 00:02:23,200 Speaker 2: the way we live. And also here's how a bunch 45 00:02:23,240 --> 00:02:25,960 Speaker 2: of the richest, sometimes craziest people in the country you're 46 00:02:26,000 --> 00:02:28,320 Speaker 2: talking about the future. Those are the two things that 47 00:02:28,400 --> 00:02:31,440 Speaker 2: happen at CES and Garrison. You wanted to talk about 48 00:02:31,440 --> 00:02:35,280 Speaker 2: the first the gadgets. The gadgets, the gadgets one as 49 00:02:35,320 --> 00:02:37,080 Speaker 2: you went to the gadget show tonight. I spent my 50 00:02:37,240 --> 00:02:38,320 Speaker 2: entire day in panels. 51 00:02:38,800 --> 00:02:40,480 Speaker 4: Yeah, I mean I did most of the panels in 52 00:02:40,520 --> 00:02:40,799 Speaker 4: the day. 53 00:02:40,800 --> 00:02:43,400 Speaker 3: I didn't really got to watch the show floor on 54 00:02:43,440 --> 00:02:46,359 Speaker 3: the first day, which is which is Tuesday. So instead 55 00:02:46,360 --> 00:02:49,120 Speaker 3: of doing the show floor, I went to show stoppers 56 00:02:49,120 --> 00:02:53,960 Speaker 3: at the Bellagio, which is this presentation of Usually usually 57 00:02:54,440 --> 00:02:56,880 Speaker 3: you know, a collection of gadgets that have won CES 58 00:02:56,960 --> 00:03:01,120 Speaker 3: Innovation Awards, which are on display or for journalists and 59 00:03:01,200 --> 00:03:04,240 Speaker 3: media you can talk to the people behind them choicetoppers. 60 00:03:04,280 --> 00:03:06,520 Speaker 3: This year was a little different. It took place like. 61 00:03:06,520 --> 00:03:08,400 Speaker 4: A in a in like a different venue. Haul was 62 00:03:08,520 --> 00:03:12,600 Speaker 4: smaller than the past few showstoppers years, and I would 63 00:03:12,639 --> 00:03:15,160 Speaker 4: say about forty percent of it with smart glasses. 64 00:03:16,160 --> 00:03:19,280 Speaker 2: Yeah, there's usually like a big product that is like 65 00:03:19,360 --> 00:03:22,400 Speaker 2: this product category is the hot thing. This year, we've 66 00:03:22,520 --> 00:03:25,399 Speaker 2: tried all smart glasses every year, it is that they've 67 00:03:25,400 --> 00:03:26,200 Speaker 2: always had them. 68 00:03:26,280 --> 00:03:27,040 Speaker 4: Every year that. 69 00:03:27,160 --> 00:03:29,639 Speaker 3: We've been doing this, we've done smart glasses, and they've 70 00:03:29,639 --> 00:03:32,400 Speaker 3: always kind of been the same. Maybe the resolution on 71 00:03:32,480 --> 00:03:36,440 Speaker 3: like the text better, like the glasses get a little 72 00:03:36,440 --> 00:03:38,800 Speaker 3: bit smaller, and then this year, yeah, the glasses were 73 00:03:38,800 --> 00:03:42,360 Speaker 3: generally smaller, but for all practical purposes function about the same. 74 00:03:42,640 --> 00:03:44,720 Speaker 3: But there was but ten different smart glasses. Most of 75 00:03:44,760 --> 00:03:47,880 Speaker 3: them could do some kind of like transcription service, could 76 00:03:47,920 --> 00:03:50,040 Speaker 3: have some kind of heads up display. One of them 77 00:03:50,080 --> 00:03:52,280 Speaker 3: was just audio only. It was like an audio audio 78 00:03:52,320 --> 00:03:55,240 Speaker 3: transcriptions like it like it listens to someone else speaking, 79 00:03:55,280 --> 00:03:57,800 Speaker 3: in this case Chinese, and it would translates to me 80 00:03:58,160 --> 00:04:02,080 Speaker 3: to American. Yeah, yeah, translate to American via sound. It 81 00:04:02,120 --> 00:04:04,240 Speaker 3: had speakers. It had speakers, and like the actual you know, 82 00:04:04,280 --> 00:04:07,040 Speaker 3: like the arm of the glasses, the delay was long 83 00:04:07,160 --> 00:04:09,160 Speaker 3: enough that it was you couldn't really keep a conversation 84 00:04:09,200 --> 00:04:13,320 Speaker 3: of a normal speed like the visual translations, which you 85 00:04:13,360 --> 00:04:15,760 Speaker 3: can't actually kind of just talk in full time. But 86 00:04:15,840 --> 00:04:18,880 Speaker 3: the audio only ones were like a smaller profile. The 87 00:04:18,960 --> 00:04:22,600 Speaker 3: visual ones weren't necessarily bulkier, but you can definitely see that. 88 00:04:22,560 --> 00:04:23,920 Speaker 4: There's more hardware inside them. 89 00:04:24,000 --> 00:04:26,080 Speaker 3: Yeah, the thing that I have seen this year which 90 00:04:26,120 --> 00:04:30,200 Speaker 3: is newer, er, maybe not totally new, but incorporating smart 91 00:04:30,200 --> 00:04:34,240 Speaker 3: glasses technology into other types of eyewear, so like swim goggles, 92 00:04:34,440 --> 00:04:39,040 Speaker 3: ski goggles, like outdoor sports stuff. So if you're you know, 93 00:04:39,279 --> 00:04:42,600 Speaker 3: swimming or you're diving and you can't really use your 94 00:04:42,600 --> 00:04:45,120 Speaker 3: phone underwater, you have there's there's a heads up that 95 00:04:45,240 --> 00:04:48,120 Speaker 3: there's a heads up display in like your like Scoopa goggles. 96 00:04:48,480 --> 00:04:51,240 Speaker 4: So that that's that's a new, a newish thing that 97 00:04:51,279 --> 00:04:51,760 Speaker 4: I've seen. 98 00:04:52,279 --> 00:04:56,160 Speaker 3: I've seen like you know, biking glasses, skiing, snowboarding, so 99 00:04:56,279 --> 00:04:59,680 Speaker 3: that that's the kind of one one slight change. But 100 00:04:59,839 --> 00:05:02,480 Speaker 3: in besides that, it's basically five different smart glasses which 101 00:05:02,520 --> 00:05:05,880 Speaker 3: are for all pactal purposes identical right to each other. 102 00:05:06,160 --> 00:05:09,240 Speaker 2: Yeah, I mean, and that I think is kind of 103 00:05:09,240 --> 00:05:11,760 Speaker 2: one of the things that I've watched happen over the 104 00:05:12,440 --> 00:05:15,240 Speaker 2: fifteen years almost that I've been going to cess or 105 00:05:15,240 --> 00:05:18,760 Speaker 2: cess whichever is more accurate, which is, you know, when 106 00:05:18,800 --> 00:05:21,240 Speaker 2: I first started coming, the smartphone era was new, and 107 00:05:21,279 --> 00:05:23,600 Speaker 2: then we had like the tablet era after that, and 108 00:05:23,640 --> 00:05:26,040 Speaker 2: so there was a lot of like you would have 109 00:05:26,120 --> 00:05:29,240 Speaker 2: dozens of manufacturers making different devices, and every year there 110 00:05:29,279 --> 00:05:32,280 Speaker 2: were very different capability. For the first few years, smartphones 111 00:05:32,320 --> 00:05:35,640 Speaker 2: were out advanced very rapidly, and that was really exciting, 112 00:05:35,680 --> 00:05:39,719 Speaker 2: and the conventionally thrived on that as the number of 113 00:05:39,760 --> 00:05:42,680 Speaker 2: new device categories of winnowed down and the difference like 114 00:05:42,760 --> 00:05:45,159 Speaker 2: I'm not excited when I get a phone anymore now 115 00:05:45,240 --> 00:05:45,920 Speaker 2: there's anyone I. 116 00:05:45,880 --> 00:05:48,000 Speaker 3: Know, because it's like no, usually I'm actually kind of 117 00:05:48,200 --> 00:05:49,120 Speaker 3: yeah more sad. 118 00:05:49,240 --> 00:05:49,440 Speaker 2: Yeah. 119 00:05:49,520 --> 00:05:49,680 Speaker 3: Yeah. 120 00:05:49,800 --> 00:05:51,240 Speaker 2: The only thing that's exciting is like, well, my old 121 00:05:51,279 --> 00:05:52,960 Speaker 2: phone was literally not working anymore. 122 00:05:53,040 --> 00:05:55,840 Speaker 4: Yeah the phone work, the battery has been completely destroyed. 123 00:05:55,880 --> 00:05:57,800 Speaker 2: Yeah the battery works now or whatever. But it's not 124 00:05:57,839 --> 00:06:01,119 Speaker 2: like the cameras are not ce changes better. Generally, nothing 125 00:06:01,240 --> 00:06:03,880 Speaker 2: is like you're not getting a lot more out of 126 00:06:03,920 --> 00:06:06,400 Speaker 2: it than you used to. And the same mystruy of 127 00:06:06,440 --> 00:06:09,480 Speaker 2: like laptops, i mean, graphics cards just because of the 128 00:06:10,480 --> 00:06:13,280 Speaker 2: data center crunch, like that's not nearly as exciting a 129 00:06:13,320 --> 00:06:15,760 Speaker 2: technology category for consumers as it used to be. So 130 00:06:16,279 --> 00:06:20,000 Speaker 2: this stuff is just like less less sexy, and yeah, 131 00:06:20,320 --> 00:06:21,960 Speaker 2: it just kind of shows that we're at a point 132 00:06:21,960 --> 00:06:23,960 Speaker 2: where kind of one of the only spaces where they 133 00:06:24,000 --> 00:06:26,039 Speaker 2: are still making improvements and where there's a lot of 134 00:06:26,040 --> 00:06:27,960 Speaker 2: competition in the market is smart glasses. 135 00:06:28,160 --> 00:06:31,000 Speaker 3: Yeah, I mean that's like the wearables category in general, Yeah, 136 00:06:31,000 --> 00:06:35,159 Speaker 3: which was mentioned. I went to the Consumer Technology Association 137 00:06:35,440 --> 00:06:38,800 Speaker 3: like keynote panel this morning, which is the group that 138 00:06:38,880 --> 00:06:42,400 Speaker 3: puts on cees, and they mentioned only only a few products, 139 00:06:42,440 --> 00:06:45,560 Speaker 3: but one of them we're smart glasses. And then also 140 00:06:45,640 --> 00:06:49,360 Speaker 3: like wearables in general, like AI powered wearables and now 141 00:06:49,400 --> 00:06:53,680 Speaker 3: like wearable technology. You know, it's like smart watches, rings, necklaces, 142 00:06:53,720 --> 00:06:56,160 Speaker 3: whatever are going to make like a big comeback now 143 00:06:56,200 --> 00:06:59,039 Speaker 3: that now that AI is a lot is a lot 144 00:06:59,080 --> 00:07:02,920 Speaker 3: more intelligent, and it used to be in particular at 145 00:07:02,920 --> 00:07:06,080 Speaker 3: the CEES Like big keynote Tuesday morning, you mentioned a 146 00:07:06,120 --> 00:07:09,880 Speaker 3: persona smart tutor glasses glasses to help you, you know, 147 00:07:09,960 --> 00:07:12,480 Speaker 3: well learning. I haven't tried to be able to check 148 00:07:12,480 --> 00:07:14,400 Speaker 3: out the product yet, but they kind of remind me 149 00:07:14,440 --> 00:07:17,960 Speaker 3: of some of the concept behind those cleuely glasses that 150 00:07:18,000 --> 00:07:20,520 Speaker 3: you may have seen on social media, the. 151 00:07:20,480 --> 00:07:21,640 Speaker 4: Glasses that help you like cheat. 152 00:07:21,960 --> 00:07:25,600 Speaker 2: Yeah, but also somebody who's like we should embrace people cheating. 153 00:07:25,280 --> 00:07:28,280 Speaker 3: And but cheat just in a conversation, it seems like 154 00:07:28,280 --> 00:07:32,080 Speaker 3: that product isn't necessarily as real as uh what the 155 00:07:32,200 --> 00:07:33,720 Speaker 3: video might make it out to be. 156 00:07:33,880 --> 00:07:37,120 Speaker 2: People whose company was based on lying didn't make a 157 00:07:37,160 --> 00:07:37,840 Speaker 2: real product. 158 00:07:38,000 --> 00:07:40,880 Speaker 3: But well, walking through your Eureka Park today, it's funny. 159 00:07:40,880 --> 00:07:43,840 Speaker 3: I also saw this this product in one of like 160 00:07:43,880 --> 00:07:45,920 Speaker 3: the National Pavilions I think it was the one of 161 00:07:45,960 --> 00:07:49,880 Speaker 3: like the japan Tech Pavilions. AI powered tool to help 162 00:07:50,160 --> 00:07:52,680 Speaker 3: to help prevent cheating while test taking. So you have 163 00:07:53,000 --> 00:07:56,080 Speaker 3: AI powered tools that will monitor you to make sure 164 00:07:56,120 --> 00:07:56,840 Speaker 3: you're not cheating. 165 00:07:56,840 --> 00:07:58,960 Speaker 4: Will you use an AI powered tool that helps. 166 00:07:58,760 --> 00:08:00,880 Speaker 2: Cheat cheat it? Yeah? Better at school? 167 00:08:00,960 --> 00:08:03,520 Speaker 3: Yeah, that's kind of just a good representation of kind 168 00:08:03,560 --> 00:08:05,520 Speaker 3: of where where this whole industry is at at the moment. 169 00:08:05,800 --> 00:08:06,240 Speaker 2: Yeah. 170 00:08:06,480 --> 00:08:08,280 Speaker 3: In some ways, I think, you know, this is probably 171 00:08:08,280 --> 00:08:12,240 Speaker 3: what year three of AI being you know, the big thing, 172 00:08:12,280 --> 00:08:14,960 Speaker 3: whether that's unwearables, you know, where that's smart glasses, whether 173 00:08:14,960 --> 00:08:18,360 Speaker 3: that's you know, a generative AI, whether that's AI. You know, 174 00:08:19,240 --> 00:08:22,600 Speaker 3: but it's been but a has been been, like you know, 175 00:08:22,880 --> 00:08:26,520 Speaker 3: the added of property for for everything, and some of 176 00:08:26,560 --> 00:08:28,720 Speaker 3: that might be starting to kind of tucker out or 177 00:08:28,760 --> 00:08:33,320 Speaker 3: at least the they've taken the victory lap. And there's 178 00:08:33,360 --> 00:08:35,600 Speaker 3: there's a certain like you know, like cultural victory that 179 00:08:35,600 --> 00:08:38,160 Speaker 3: that they're resting on whether they're starting to put some 180 00:08:38,240 --> 00:08:41,640 Speaker 3: of their eggs in other baskets now, which certainly wasn't 181 00:08:41,640 --> 00:08:42,480 Speaker 3: the case last year. 182 00:08:42,960 --> 00:08:46,080 Speaker 2: No, and I I got a sense I attended six 183 00:08:46,120 --> 00:08:50,240 Speaker 2: panels today, congratulations. It was a mix of like advertising people, 184 00:08:51,000 --> 00:08:56,000 Speaker 2: entertainment associated people, some journalism associated people, and in robotics, 185 00:08:56,000 --> 00:08:57,920 Speaker 2: people in robotics talking about a lot of a lot 186 00:08:57,920 --> 00:08:59,960 Speaker 2: of robodies. Yeah, what they saw was the future of AI, 187 00:09:00,640 --> 00:09:03,760 Speaker 2: and there was a lot of focus first on AI 188 00:09:03,880 --> 00:09:05,760 Speaker 2: is not going to be taking jobs as much as 189 00:09:05,760 --> 00:09:08,560 Speaker 2: it's going to be augmenting jobs, right, Although you would 190 00:09:08,559 --> 00:09:10,480 Speaker 2: get the occasional person be like, I's going to take 191 00:09:10,520 --> 00:09:12,200 Speaker 2: a lot of jobs, I got. 192 00:09:12,120 --> 00:09:14,600 Speaker 3: People saying that it's it's only going to take jobs 193 00:09:14,600 --> 00:09:17,240 Speaker 3: if if you don't know how to incorporate AI into 194 00:09:17,280 --> 00:09:20,320 Speaker 3: your workfloce. And that's the argument at the moment right now. Yeah, 195 00:09:21,160 --> 00:09:23,480 Speaker 3: if you're not using AI, you're a greater risk of 196 00:09:24,120 --> 00:09:24,800 Speaker 3: you losing So. 197 00:09:24,800 --> 00:09:28,360 Speaker 2: You better get on it, right, start start learning it. Yes, Yes. 198 00:09:28,520 --> 00:09:30,680 Speaker 2: And then the other thing is that there was a 199 00:09:30,679 --> 00:09:32,920 Speaker 2: lot of like it's there to help or take away 200 00:09:32,960 --> 00:09:36,120 Speaker 2: unpleasant tasks from workers, but really emphasizing the it's not 201 00:09:36,240 --> 00:09:38,960 Speaker 2: your enemy thing, you don't need to be scared. And 202 00:09:39,000 --> 00:09:42,680 Speaker 2: I quoted like half of these panels, people would quote 203 00:09:42,679 --> 00:09:46,320 Speaker 2: statistics about low user trust and AI and the fact 204 00:09:46,360 --> 00:09:50,960 Speaker 2: that people are generally not super comfortable with this technology 205 00:09:50,960 --> 00:09:53,600 Speaker 2: even if they use it in parts of their work 206 00:09:53,640 --> 00:09:56,640 Speaker 2: life right or daily life. And so what I saw 207 00:09:56,679 --> 00:09:58,360 Speaker 2: from that, when I interpret from that, is that there 208 00:09:58,400 --> 00:10:00,360 Speaker 2: is internal concern that like that that's one of the 209 00:10:00,400 --> 00:10:02,200 Speaker 2: things that could screw the pooch on this is that 210 00:10:02,240 --> 00:10:05,600 Speaker 2: people are not really sure they like this stuff, and 211 00:10:06,000 --> 00:10:10,400 Speaker 2: so there's this impulse to kind of cover the softer 212 00:10:10,520 --> 00:10:13,720 Speaker 2: and fuzzier sides of it that I didn't see in 213 00:10:13,840 --> 00:10:16,720 Speaker 2: previous years. Yeah, and I think is really focusing on 214 00:10:17,559 --> 00:10:20,520 Speaker 2: this is just making things you already like better, as 215 00:10:20,520 --> 00:10:23,960 Speaker 2: opposed to this is a revolution that's completely changing life. 216 00:10:24,679 --> 00:10:28,719 Speaker 3: And to the extent where AI was fremous revolutionary, it 217 00:10:28,760 --> 00:10:32,679 Speaker 3: was specifically trying to ground it in like physical applications, 218 00:10:32,720 --> 00:10:36,000 Speaker 3: as opposed to this more general kind of like spectral 219 00:10:36,120 --> 00:10:38,199 Speaker 3: like AI hype that we've seen the past few years, 220 00:10:38,240 --> 00:10:40,760 Speaker 3: which is in specifically around like generative AI right where 221 00:10:40,760 --> 00:10:42,520 Speaker 3: it's like this like kind of vague thing that we 222 00:10:42,640 --> 00:10:46,560 Speaker 3: like gestured to there's more specific applications for AI being 223 00:10:46,600 --> 00:10:48,960 Speaker 3: talked about right now, and they talked talked about like 224 00:10:49,240 --> 00:10:55,920 Speaker 3: AI assisted manufacturing simulations like digital twins of factories, shipyards, 225 00:10:55,960 --> 00:10:59,000 Speaker 3: power plants, a lot of digital twin talk building you know, 226 00:10:59,080 --> 00:11:03,000 Speaker 3: a digital replica of like everything you know of society, 227 00:11:03,040 --> 00:11:06,400 Speaker 3: to like run these simulations to both make AIS smarter, 228 00:11:06,800 --> 00:11:10,040 Speaker 3: to generate new solutions outside of the limitations of a 229 00:11:10,120 --> 00:11:13,680 Speaker 3: language model, also find you know, potential problems in you know, 230 00:11:13,800 --> 00:11:15,280 Speaker 3: when you build these things physically. 231 00:11:15,760 --> 00:11:17,240 Speaker 2: Yeah, that was something that was brought up during the 232 00:11:17,320 --> 00:11:20,280 Speaker 2: robotics panel, which was talking about how to like take 233 00:11:21,120 --> 00:11:23,760 Speaker 2: the machine learning technology and other things that are generally 234 00:11:23,760 --> 00:11:26,439 Speaker 2: grouped under AI and apply it in the physical world. 235 00:11:26,800 --> 00:11:30,240 Speaker 3: Yeah, manufacturing, I've had done so much right in just 236 00:11:30,320 --> 00:11:33,079 Speaker 3: one day. I probably I've heard the word manufacturing more 237 00:11:33,080 --> 00:11:36,959 Speaker 3: today than I have most in every cees I've been 238 00:11:37,000 --> 00:11:39,160 Speaker 3: to previously, combined. 239 00:11:38,760 --> 00:11:41,960 Speaker 2: And I think consciously more focused on industrial applications than 240 00:11:42,000 --> 00:11:45,440 Speaker 2: on consumer technology because there's not that much new to 241 00:11:45,480 --> 00:11:48,480 Speaker 2: give the consumer right and they are also I think 242 00:11:48,520 --> 00:11:51,720 Speaker 2: starting to recognize that you can get people using chat, 243 00:11:51,760 --> 00:11:54,840 Speaker 2: GPT and the like, but they're mostly not using it, 244 00:11:54,880 --> 00:11:56,920 Speaker 2: and the data backs this up. People are mostly using 245 00:11:56,920 --> 00:11:59,640 Speaker 2: it at work and for school and gin z a lot. 246 00:12:00,440 --> 00:12:02,400 Speaker 2: There's a lot of people who are doing like their 247 00:12:02,440 --> 00:12:05,000 Speaker 2: research on like what to buy and whatnot through using 248 00:12:05,080 --> 00:12:07,960 Speaker 2: chat GPT. But there's not a lot that you can 249 00:12:08,000 --> 00:12:12,439 Speaker 2: sell people in CES because it's an app and there's 250 00:12:12,480 --> 00:12:15,679 Speaker 2: not a ton of different devices for it. People are 251 00:12:15,720 --> 00:12:17,520 Speaker 2: using it on their phone, they're using it on their computer, 252 00:12:17,640 --> 00:12:19,280 Speaker 2: but like none of the new phones and computers are 253 00:12:19,320 --> 00:12:22,520 Speaker 2: marketly better at using chat GPT or another you know, 254 00:12:22,640 --> 00:12:24,800 Speaker 2: chatbot thing than any of the others. So there's not 255 00:12:24,880 --> 00:12:28,640 Speaker 2: a lot that's sexy in just that at CES. So 256 00:12:28,679 --> 00:12:32,520 Speaker 2: I think I have seen this conscience reforming around people 257 00:12:32,600 --> 00:12:35,360 Speaker 2: in manufacturing and people who are like thinking of the 258 00:12:35,400 --> 00:12:38,760 Speaker 2: concerns of like I have a pair like an exoskeleton 259 00:12:38,800 --> 00:12:40,720 Speaker 2: to test this week. That's seeing a lot of its 260 00:12:40,720 --> 00:12:44,920 Speaker 2: business in folks who are like doing like Amazon type 261 00:12:44,960 --> 00:12:47,640 Speaker 2: jobs right loading and unloading packages and whatnot all day long, 262 00:12:47,800 --> 00:12:50,439 Speaker 2: you know. And I do see a conscious reforming there, 263 00:12:50,480 --> 00:12:52,680 Speaker 2: which I think is kind of evidence of like there's 264 00:12:52,679 --> 00:12:54,800 Speaker 2: almost an admission that like, yeah, we don't really have 265 00:12:54,840 --> 00:12:59,360 Speaker 2: that much to hand consumers anymore. On a yearly. 266 00:12:59,120 --> 00:13:10,480 Speaker 5: Basis speaking of handing things to consumers ads. 267 00:13:15,280 --> 00:13:18,120 Speaker 2: So the first panel I went to of the day 268 00:13:18,520 --> 00:13:21,640 Speaker 2: was about the funnel, which, as I understand it is 269 00:13:21,679 --> 00:13:25,120 Speaker 2: just kind of like the way in which people have 270 00:13:25,280 --> 00:13:29,920 Speaker 2: traditionally engaged with like media, gotten advertised to and then 271 00:13:30,040 --> 00:13:32,320 Speaker 2: like gone to stores and bought stuff like the funnel 272 00:13:32,360 --> 00:13:35,560 Speaker 2: by which you like make a customer, and how that's 273 00:13:35,600 --> 00:13:38,520 Speaker 2: been completely blown up now, right, and AI is like 274 00:13:38,559 --> 00:13:42,080 Speaker 2: a further massive disruption because people are not like people 275 00:13:42,160 --> 00:13:45,240 Speaker 2: are increasingly, especially very young people, which was putted out 276 00:13:45,240 --> 00:13:47,720 Speaker 2: in a number of these are like buying stuff that 277 00:13:47,840 --> 00:13:50,679 Speaker 2: a chatbot recommends them, right, And so a lot of 278 00:13:50,720 --> 00:13:53,840 Speaker 2: marketing is being seen as being done through how do 279 00:13:53,920 --> 00:13:56,800 Speaker 2: you get the chatbot to talk about you a certain way? 280 00:13:56,840 --> 00:14:00,200 Speaker 2: What is the SEO of getting chat ebt? Oh, it's interesting, 281 00:14:00,240 --> 00:14:01,560 Speaker 2: like right, there's a lot of time I. 282 00:14:01,760 --> 00:14:04,560 Speaker 3: Thought, as someone who's not a regular chatbot user, which 283 00:14:04,559 --> 00:14:06,400 Speaker 3: I'm sure must people here am I, which chastis me 284 00:14:06,480 --> 00:14:09,680 Speaker 3: for we're not maximizing my productivity. 285 00:14:09,240 --> 00:14:11,080 Speaker 2: And meaning to get on to you for that garrison. 286 00:14:12,400 --> 00:14:14,720 Speaker 3: There's not a regular chat about user. I've never thought 287 00:14:14,720 --> 00:14:16,600 Speaker 3: of that before. Of yeah, I mean, like I know 288 00:14:16,640 --> 00:14:19,360 Speaker 3: people use these chatbots as a replacement for search engines, 289 00:14:19,400 --> 00:14:21,520 Speaker 3: But the idea of like trying to you know, evaluate 290 00:14:21,720 --> 00:14:25,240 Speaker 3: purchases is uh. I mean, I guess that makes sense now, 291 00:14:25,280 --> 00:14:26,400 Speaker 3: but I've never put that together. 292 00:14:26,600 --> 00:14:28,920 Speaker 2: Yeah, And the only because there was a lot of 293 00:14:29,080 --> 00:14:32,720 Speaker 2: talking about like how AI is helping advertisers, how it's 294 00:14:33,160 --> 00:14:36,520 Speaker 2: making advertisements, how like it's helping in the process of that, 295 00:14:36,640 --> 00:14:39,520 Speaker 2: And there was a focus in all the panels about 296 00:14:39,560 --> 00:14:42,080 Speaker 2: that on how like, well, it's just augmenting the humans. 297 00:14:42,600 --> 00:14:46,280 Speaker 2: But the only specific examples given were the McDonald's and 298 00:14:46,560 --> 00:14:50,360 Speaker 2: Coca Cola AI generated ads, which were both disasters. I 299 00:14:50,400 --> 00:14:53,800 Speaker 2: mean the McDonald's one in the Netherlands got removed. Yeah, 300 00:14:53,880 --> 00:14:54,760 Speaker 2: they people so bad. 301 00:14:54,840 --> 00:14:57,400 Speaker 4: They withdrew the attic so ugly. 302 00:14:57,440 --> 00:14:59,600 Speaker 2: Deed. Yeah, I don't know. Coke did do it twice. 303 00:15:00,040 --> 00:15:02,440 Speaker 2: Maybe they consider it a win. But everything I saw 304 00:15:02,520 --> 00:15:04,400 Speaker 2: was very negative. I didn't see a lot of positive 305 00:15:04,400 --> 00:15:08,520 Speaker 2: feedback on Coca Cola visavi. They're weird. AI holidays are coming. 306 00:15:08,320 --> 00:15:10,600 Speaker 3: At It's like people who don't know it's AI think 307 00:15:10,840 --> 00:15:13,080 Speaker 3: feel very neutral about it. Yeah, people that do we 308 00:15:13,160 --> 00:15:15,840 Speaker 3: know it's AI, I think generally have negative reactions. 309 00:15:15,920 --> 00:15:17,720 Speaker 2: I think if you look at it, it's pretty clear. 310 00:15:17,720 --> 00:15:19,840 Speaker 3: But anyway, I mean, it less clear if maybe you're 311 00:15:19,880 --> 00:15:22,000 Speaker 3: like a sixty year old watches. 312 00:15:21,720 --> 00:15:25,160 Speaker 2: You know, exact or in between stuff, and that's all 313 00:15:25,360 --> 00:15:25,960 Speaker 2: good enough. 314 00:15:26,280 --> 00:15:29,360 Speaker 4: Grandpa, don't go Internet, right, Yeah, he doesn't know. 315 00:15:29,480 --> 00:15:31,920 Speaker 2: He might notice some of those those fucking h polar 316 00:15:31,960 --> 00:15:34,480 Speaker 2: bears have the wrong number of pause. But yeah, so, 317 00:15:34,600 --> 00:15:35,960 Speaker 2: like there was some talk of that and the other. 318 00:15:36,040 --> 00:15:38,480 Speaker 2: The only specific example they gave of like an AI 319 00:15:39,120 --> 00:15:43,520 Speaker 2: enhanced strategy was Allegra. The people who own like the 320 00:15:43,600 --> 00:15:46,240 Speaker 2: medicine had like a new non drowsy formula or they 321 00:15:46,280 --> 00:15:48,600 Speaker 2: just wanted to highlight that it was non drowsy. So 322 00:15:48,640 --> 00:15:53,320 Speaker 2: they basically had a bunch of like seeded the stuff 323 00:15:53,360 --> 00:15:56,200 Speaker 2: that chatbot that like open Ai was scored that chatbots 324 00:15:56,200 --> 00:15:59,720 Speaker 2: were scraping, yeah, with content about how Allegra makes it 325 00:15:59,760 --> 00:16:03,680 Speaker 2: is on drowsy and about how like competing similar medications 326 00:16:03,680 --> 00:16:05,800 Speaker 2: make you drowsy, I mean, so that it would get 327 00:16:05,840 --> 00:16:10,480 Speaker 2: mentioned in like and they talked about They called it 328 00:16:10,560 --> 00:16:12,960 Speaker 2: like model hacking, I think was the exact term used. 329 00:16:13,440 --> 00:16:15,120 Speaker 2: And that was the only thing that was the only 330 00:16:15,160 --> 00:16:17,600 Speaker 2: specific example that I can't know any of this works too, 331 00:16:17,680 --> 00:16:19,760 Speaker 2: Like everyone else was just talking in vague terms about 332 00:16:19,760 --> 00:16:23,640 Speaker 2: like and we've really seen our teams creativity sore or whatever. 333 00:16:23,800 --> 00:16:26,640 Speaker 3: That's interesting, you know because the way that I probably 334 00:16:26,720 --> 00:16:29,200 Speaker 3: use or exposed to AI the most is like on 335 00:16:29,240 --> 00:16:31,280 Speaker 3: like Google Search now, which has you know, it's it's 336 00:16:31,280 --> 00:16:34,560 Speaker 3: like AI like summaries instead of like actual search results. 337 00:16:34,800 --> 00:16:36,840 Speaker 4: But those are based on like which you. 338 00:16:36,760 --> 00:16:39,160 Speaker 2: Can type minus AI in with the search results if 339 00:16:39,160 --> 00:16:40,240 Speaker 2: we want to cut that stuff out. 340 00:16:40,440 --> 00:16:43,600 Speaker 3: But those AI results are you know, pulling from certain 341 00:16:43,600 --> 00:16:46,360 Speaker 3: like articles which which they'll link to. So yeah, I 342 00:16:46,360 --> 00:16:47,920 Speaker 3: guys said, if I was trying to design it like 343 00:16:47,920 --> 00:16:51,560 Speaker 3: an AI marketing strategy, I would I would either you know, 344 00:16:51,920 --> 00:16:55,600 Speaker 3: pay publications to mention my product in more articles or 345 00:16:56,400 --> 00:16:58,760 Speaker 3: find find out find other ways to to to influence 346 00:16:58,800 --> 00:17:02,080 Speaker 3: mentions of of my product in like written media that 347 00:17:02,240 --> 00:17:04,600 Speaker 3: then would be used as like training data for AI. 348 00:17:05,280 --> 00:17:07,520 Speaker 3: And yeah, I guess there there can be a whole 349 00:17:07,600 --> 00:17:11,760 Speaker 3: you know, search engine optimization model acking is I guess. 350 00:17:12,000 --> 00:17:16,200 Speaker 2: Model optimism or yeah, like product opposite optimization for a model. 351 00:17:16,240 --> 00:17:19,200 Speaker 2: I guess that's funny, but yeah, like the So the 352 00:17:19,680 --> 00:17:22,000 Speaker 2: first talk that I went to was about the funnel 353 00:17:22,080 --> 00:17:23,840 Speaker 2: or whatever, and one of the people speaking. There was 354 00:17:23,880 --> 00:17:27,359 Speaker 2: the CMO, the chief marketing officer of Intuit, which is 355 00:17:27,400 --> 00:17:30,040 Speaker 2: the company that owns Turbo tax right like, it's one 356 00:17:30,040 --> 00:17:33,680 Speaker 2: of the big we do your taxes companies out there, 357 00:17:33,680 --> 00:17:37,000 Speaker 2: and also a lobbyer in terms of stopping any sort 358 00:17:37,000 --> 00:17:38,399 Speaker 2: of a form of make it's that you don't need 359 00:17:38,400 --> 00:17:40,520 Speaker 2: to do your own other countries. 360 00:17:40,520 --> 00:17:42,760 Speaker 4: Credit score monitoring, yes, a whole bunch of. 361 00:17:42,720 --> 00:17:44,840 Speaker 2: Stuff, all that kind of stuff. So this this guy, 362 00:17:44,880 --> 00:17:47,720 Speaker 2: the CMO of the company, Thomas Renize, was part of 363 00:17:47,720 --> 00:17:50,680 Speaker 2: this the end of the funnel speech, and he made 364 00:17:50,680 --> 00:17:52,280 Speaker 2: a couple of comments that I took note of. One 365 00:17:52,359 --> 00:17:55,360 Speaker 2: is product is brand and brand is product full stop. 366 00:17:55,800 --> 00:17:57,879 Speaker 2: So the more people you can make experience your product, 367 00:17:58,000 --> 00:18:01,359 Speaker 2: that's the best selling point and value of right which 368 00:18:01,600 --> 00:18:03,199 Speaker 2: it was just interesting to me in terms of the 369 00:18:03,520 --> 00:18:05,479 Speaker 2: into it as a company that has lobbied to make 370 00:18:05,520 --> 00:18:08,280 Speaker 2: it impossible for like any reform that would allow people 371 00:18:08,320 --> 00:18:12,000 Speaker 2: to not need a third party to do there they're taxes. 372 00:18:12,359 --> 00:18:14,520 Speaker 2: But also this idea that like product is brand and 373 00:18:14,560 --> 00:18:17,919 Speaker 2: brand is product isn't true of a lot of companies. 374 00:18:17,960 --> 00:18:20,639 Speaker 2: Like if you think about like, for example, like a 375 00:18:20,680 --> 00:18:23,080 Speaker 2: lot of the different soft drinks are all owned by 376 00:18:23,119 --> 00:18:26,159 Speaker 2: one company, but they're fundamentally different like products and have 377 00:18:26,240 --> 00:18:29,600 Speaker 2: an often cases like a different user base, and it's 378 00:18:29,760 --> 00:18:31,920 Speaker 2: very like, it's a very tech when your product is 379 00:18:31,960 --> 00:18:35,040 Speaker 2: a concept, Like you can't do your own taxes because 380 00:18:35,040 --> 00:18:36,840 Speaker 2: it's a pain in the ass, but the government doesn't 381 00:18:36,880 --> 00:18:38,880 Speaker 2: do it for you because we lobby to make that illegal. 382 00:18:39,359 --> 00:18:39,439 Speaker 3: Like. 383 00:18:40,080 --> 00:18:42,399 Speaker 2: I found that interesting and it kind of got me 384 00:18:43,040 --> 00:18:46,679 Speaker 2: angry at Thomas at the start of this. And I 385 00:18:46,880 --> 00:18:50,479 Speaker 2: was particularly interested in one of the things he brought up, 386 00:18:50,480 --> 00:18:52,520 Speaker 2: which is that he talked about the one hundred million 387 00:18:52,560 --> 00:18:55,040 Speaker 2: dollars that into It is putting into open Ai, and 388 00:18:55,080 --> 00:18:56,879 Speaker 2: they're putting this into open Ai as part of a 389 00:18:56,960 --> 00:19:00,800 Speaker 2: multi year partnership. And I want to qu vote from 390 00:19:00,920 --> 00:19:05,800 Speaker 2: an article in the website Araptus which is discussing this 391 00:19:05,920 --> 00:19:08,440 Speaker 2: exact thing that I found useful when I was formulating 392 00:19:08,440 --> 00:19:11,359 Speaker 2: my question for Thomas. The contract was to embed AM 393 00:19:11,400 --> 00:19:14,840 Speaker 2: models directly into QuickBooks, TurboTax, and credit Karma. The promise 394 00:19:14,880 --> 00:19:18,840 Speaker 2: AI assistants that can generate invoices, provide tax extments, recommend loans, 395 00:19:19,119 --> 00:19:22,760 Speaker 2: and help you make informed financial decisions. Right. That may 396 00:19:22,800 --> 00:19:25,960 Speaker 2: sound like kind of like a basic move, like's what's 397 00:19:26,160 --> 00:19:30,080 Speaker 2: so sketchy about just integrating like an AI chatbot to 398 00:19:30,080 --> 00:19:32,679 Speaker 2: make it easier to use your tax software. It can 399 00:19:32,720 --> 00:19:36,680 Speaker 2: be complicated and hard to use as anyway. But kind 400 00:19:36,680 --> 00:19:39,639 Speaker 2: of the necessary part of this is if you are 401 00:19:39,800 --> 00:19:42,680 Speaker 2: if you're doing this, if you're integrating all of these 402 00:19:42,720 --> 00:19:46,920 Speaker 2: different tax and credit programs into an AI model, you're 403 00:19:46,960 --> 00:19:50,600 Speaker 2: giving that AI model access to people's financial data in 404 00:19:50,680 --> 00:19:55,000 Speaker 2: a tremendous amount of detail, right, And all of these 405 00:19:55,040 --> 00:19:58,760 Speaker 2: AI models have a massive shared vulnerability, which is a 406 00:19:58,840 --> 00:20:03,120 Speaker 2: vernability to something called called prompt injection, right. And that's when, 407 00:20:03,119 --> 00:20:05,840 Speaker 2: for example, say someone is a customer of a tax 408 00:20:05,880 --> 00:20:08,359 Speaker 2: prepared that uses one of into its products to prepare 409 00:20:08,359 --> 00:20:11,160 Speaker 2: taxes for its customers, and this person sends an invoice 410 00:20:11,160 --> 00:20:14,879 Speaker 2: into the company that has hidden text in it that 411 00:20:15,000 --> 00:20:17,399 Speaker 2: is a command to the language model that will be 412 00:20:17,480 --> 00:20:22,800 Speaker 2: scraping this and uploading it to basically open up and 413 00:20:22,880 --> 00:20:26,080 Speaker 2: send over a bunch of customer data to a specific source. 414 00:20:26,640 --> 00:20:29,400 Speaker 2: That's a thing that you can do. It's called prompt injection, 415 00:20:29,960 --> 00:20:32,440 Speaker 2: and there's not really a way to counter it. There's 416 00:20:32,480 --> 00:20:36,240 Speaker 2: not like a proven comprehensive defense against this sort of thing. 417 00:20:36,960 --> 00:20:40,240 Speaker 2: And so there's this massive vulnerability, and this was first 418 00:20:40,240 --> 00:20:42,760 Speaker 2: brought up in an article on the website. I cited 419 00:20:43,040 --> 00:20:47,080 Speaker 2: a raptus by Chris Black, who's a security researcher an expert, 420 00:20:47,480 --> 00:20:50,359 Speaker 2: and I want to read a quote from his article 421 00:20:50,440 --> 00:20:53,680 Speaker 2: about this. There are no proven comprehensive defenses against prompt 422 00:20:53,720 --> 00:20:57,080 Speaker 2: injection when not if an AI powered financial tool leaks 423 00:20:57,080 --> 00:20:59,960 Speaker 2: customer data through a prompt injection attack, who is liable 424 00:21:00,080 --> 00:21:02,679 Speaker 2: the company using quick books into it open Ai? The 425 00:21:02,760 --> 00:21:06,080 Speaker 2: regulations weren't written for this scenario. So I decided to 426 00:21:06,080 --> 00:21:09,200 Speaker 2: ask that question of Thomas, being like the chief marketing officer, 427 00:21:09,240 --> 00:21:12,159 Speaker 2: I figured, well he should have some answer to like 428 00:21:12,480 --> 00:21:15,359 Speaker 2: what do you have, what sort of security measures do 429 00:21:15,440 --> 00:21:17,800 Speaker 2: you have to mitigate the risk of a prompt injection attack? 430 00:21:18,480 --> 00:21:20,960 Speaker 2: And who do you see as being responsible? If you 431 00:21:21,040 --> 00:21:24,679 Speaker 2: are the ones providing customer data to open ai and 432 00:21:24,720 --> 00:21:26,800 Speaker 2: their tool gets it by a prompt injection attack, are 433 00:21:26,840 --> 00:21:29,320 Speaker 2: you responsible? Is open ai is a third party that 434 00:21:29,400 --> 00:21:31,760 Speaker 2: might be using your products? And he had no answer 435 00:21:31,760 --> 00:21:33,960 Speaker 2: to this. He like, his only answer when we were 436 00:21:33,960 --> 00:21:36,560 Speaker 2: on stage was like, we're talking with open ai about it, 437 00:21:36,600 --> 00:21:39,720 Speaker 2: which like, well, you're already in the process of collaborating 438 00:21:39,760 --> 00:21:40,120 Speaker 2: with them. 439 00:21:40,680 --> 00:21:44,159 Speaker 6: Yeah, I have a question for Thomas. I was kind 440 00:21:44,200 --> 00:21:47,199 Speaker 6: of concerned when reading about into it assists that a 441 00:21:47,600 --> 00:21:50,560 Speaker 6: open AI is going to have read and write access 442 00:21:50,680 --> 00:21:55,720 Speaker 6: to quite a lot of financial information from users, which 443 00:21:55,920 --> 00:21:58,760 Speaker 6: opens up a vulnerability for a prompt injection. Right, you 444 00:21:58,840 --> 00:22:01,080 Speaker 6: have the possibility that people who can hide things and 445 00:22:01,119 --> 00:22:04,640 Speaker 6: invoices that are then being uploaded that will cause the 446 00:22:04,680 --> 00:22:09,439 Speaker 6: AI to provide the malicious user with financial details for 447 00:22:09,640 --> 00:22:14,240 Speaker 6: individuals or corporations. And I guess my primary question here 448 00:22:14,400 --> 00:22:18,880 Speaker 6: this seems like a major liability issue when somebody's information 449 00:22:19,000 --> 00:22:21,639 Speaker 6: gets rerouted to a place that's not supposed to go 450 00:22:21,720 --> 00:22:22,800 Speaker 6: to a malicious actor. 451 00:22:23,520 --> 00:22:24,600 Speaker 2: Who's responsible. 452 00:22:25,240 --> 00:22:29,680 Speaker 7: You're taking the value of integrity and protecting our customer 453 00:22:29,760 --> 00:22:33,600 Speaker 7: data privaly seriously over our entire lives. And that's what 454 00:22:33,840 --> 00:22:36,320 Speaker 7: forty years as a company and meeting in the software 455 00:22:36,359 --> 00:22:39,160 Speaker 7: space for financial services. So this is not something we're 456 00:22:39,160 --> 00:22:42,200 Speaker 7: about to use it in any way in the new 457 00:22:42,240 --> 00:22:44,040 Speaker 7: age of AI. In fact, it has to get even 458 00:22:44,119 --> 00:22:47,439 Speaker 7: more and more level down of protecting people's information and 459 00:22:47,520 --> 00:22:50,440 Speaker 7: security of edforation. So that is something that we are 460 00:22:50,600 --> 00:22:53,080 Speaker 7: already in the conversations with open AI have a nation 461 00:22:53,200 --> 00:22:56,080 Speaker 7: about that group, no matter where we're starving. 462 00:22:57,480 --> 00:22:59,679 Speaker 2: And when I kind of cornered him afterwards, he didn't 463 00:22:59,680 --> 00:23:01,840 Speaker 2: have like his eventual follow up answers like I don't 464 00:23:01,880 --> 00:23:03,560 Speaker 2: know that kind of stuff, and like you are the 465 00:23:03,640 --> 00:23:05,320 Speaker 2: chief marketing officer. 466 00:23:05,280 --> 00:23:06,680 Speaker 8: Thank you again for answering my question. 467 00:23:06,840 --> 00:23:07,680 Speaker 9: Sure, your question. 468 00:23:08,000 --> 00:23:11,680 Speaker 8: I'm still really concerned about the danger of prompt injection 469 00:23:11,800 --> 00:23:16,560 Speaker 8: attacks revealing financial data, and it doesn't still sound like 470 00:23:16,640 --> 00:23:19,080 Speaker 8: there's an understanding of who will be liable. 471 00:23:20,840 --> 00:23:22,160 Speaker 2: To answer that question for you. 472 00:23:22,200 --> 00:23:23,600 Speaker 6: So I mean, like I can tell you that we're 473 00:23:23,680 --> 00:23:26,640 Speaker 6: coming into our security and privacy and we are doing. 474 00:23:26,480 --> 00:23:28,879 Speaker 2: Everything we can to protect that. I feel a lot 475 00:23:28,920 --> 00:23:30,359 Speaker 2: writing on it, as you might imagine. 476 00:23:30,400 --> 00:23:33,240 Speaker 8: Well, yeah, it's every digital security expert I've talked to 477 00:23:33,359 --> 00:23:36,200 Speaker 8: says it's a matter of when, not if, that there 478 00:23:36,320 --> 00:23:38,680 Speaker 8: is financial data revealed. 479 00:23:38,280 --> 00:23:41,960 Speaker 2: By these attacks. It seems like understanding. 480 00:23:43,440 --> 00:23:44,920 Speaker 8: I'm just I'm not going to be the expert to 481 00:23:44,920 --> 00:23:46,160 Speaker 8: get into the details on that. 482 00:23:46,320 --> 00:23:52,280 Speaker 2: Okay, Yeah, A key part of marketing this should be 483 00:23:52,280 --> 00:23:55,280 Speaker 2: being able to tell people what kind of safety precautions 484 00:23:55,320 --> 00:23:58,439 Speaker 2: are being taken with their data. And the fact that 485 00:23:58,480 --> 00:24:00,840 Speaker 2: he didn't and clearly had never thought about any of 486 00:24:00,880 --> 00:24:02,400 Speaker 2: this stuff and I had a couple of different people 487 00:24:02,400 --> 00:24:05,120 Speaker 2: come up to me afterwards and like be like, Wow, 488 00:24:05,160 --> 00:24:07,480 Speaker 2: that was a really good question. And I was like, well, 489 00:24:07,960 --> 00:24:11,480 Speaker 2: why hasn't this been asked before? Like why is this 490 00:24:11,520 --> 00:24:13,119 Speaker 2: a thing where like some guy's blogging about it and 491 00:24:13,160 --> 00:24:14,560 Speaker 2: I'm asking you about it and you don't have an 492 00:24:14,600 --> 00:24:17,000 Speaker 2: answer to it, and you're the CEO of one of 493 00:24:17,080 --> 00:24:20,200 Speaker 2: like the largest tax prep the largest tax prep company 494 00:24:20,760 --> 00:24:24,480 Speaker 2: in the country. Like it's just it's emblematic of how 495 00:24:24,840 --> 00:24:30,000 Speaker 2: careless everyone adjacent to this industry is, which personal data 496 00:24:30,400 --> 00:24:33,600 Speaker 2: with the safety of people, end of society as a 497 00:24:33,640 --> 00:24:36,000 Speaker 2: result of like what their products are doing, Like there's 498 00:24:36,119 --> 00:24:39,800 Speaker 2: absolutely no consideration given the harms of any of this shit. 499 00:24:39,920 --> 00:24:42,720 Speaker 2: And it's it's the most consistently dispiriting part of showing 500 00:24:42,760 --> 00:24:45,840 Speaker 2: up at CES. Well what is right? 501 00:24:46,119 --> 00:24:47,040 Speaker 4: What a fun story? 502 00:24:47,119 --> 00:25:01,639 Speaker 2: That is? All right? We're back. So one of the 503 00:25:01,680 --> 00:25:03,920 Speaker 2: other things that's been a major topic on the panels 504 00:25:03,920 --> 00:25:05,679 Speaker 2: I went to and is generally a big thing at 505 00:25:05,720 --> 00:25:09,040 Speaker 2: CES this year and in tech this year is agentic 506 00:25:09,080 --> 00:25:12,240 Speaker 2: AI or agents. Right, the idea that you have an 507 00:25:12,240 --> 00:25:14,720 Speaker 2: AI that you can send off to like book a 508 00:25:14,720 --> 00:25:16,600 Speaker 2: flight for you, and it doesn't just like find a 509 00:25:16,640 --> 00:25:18,639 Speaker 2: flight that it searches for and be like, hey, this 510 00:25:18,720 --> 00:25:21,000 Speaker 2: looks good. It like actually books it for you and 511 00:25:21,359 --> 00:25:24,080 Speaker 2: handles all of that. Right. This has been one of 512 00:25:24,119 --> 00:25:26,600 Speaker 2: the big promises of AI, not just for like flights, 513 00:25:26,600 --> 00:25:29,160 Speaker 2: but that you can have like an actual digital assistant 514 00:25:29,160 --> 00:25:31,800 Speaker 2: that persistently remembers all of your shit and can book 515 00:25:31,800 --> 00:25:34,879 Speaker 2: stuff for you and handle like the pain in the 516 00:25:34,920 --> 00:25:37,440 Speaker 2: ass nitty gritty. If you say, like, hey, I need 517 00:25:37,480 --> 00:25:39,920 Speaker 2: you to find a restaurant within like this four block 518 00:25:40,040 --> 00:25:44,480 Speaker 2: radius that has seven seats open at eight pm and 519 00:25:44,840 --> 00:25:47,760 Speaker 2: abides by these dietary restrictions, you kind of just have 520 00:25:47,800 --> 00:25:50,600 Speaker 2: to slog through figuring that out right now. And the 521 00:25:50,680 --> 00:25:53,240 Speaker 2: idea is an agent can do that for you, and 522 00:25:53,320 --> 00:25:56,440 Speaker 2: currently none of them can, right. This is a thing 523 00:25:56,520 --> 00:26:00,720 Speaker 2: that is changing, like the performance of different agents are 524 00:26:00,840 --> 00:26:04,520 Speaker 2: changing over time. But it is still very unclear. If 525 00:26:04,560 --> 00:26:06,399 Speaker 2: you're not somebod who's fully bought into the kool aid, 526 00:26:06,440 --> 00:26:09,200 Speaker 2: I'll say it's very unclear where these things will top 527 00:26:09,280 --> 00:26:12,080 Speaker 2: out at. And there was a good article in futurism 528 00:26:12,320 --> 00:26:14,480 Speaker 2: recently and I want to quote from it right now. 529 00:26:14,800 --> 00:26:17,560 Speaker 2: Researchers at Carnegie Mellen University found earlier this year that 530 00:26:17,640 --> 00:26:20,040 Speaker 2: even the best performing AI agent, which was Google's gym 531 00:26:20,040 --> 00:26:21,920 Speaker 2: and I two point five pro at the time, failed 532 00:26:21,960 --> 00:26:25,119 Speaker 2: to complete real world office tasks seventy percent of the time. 533 00:26:25,800 --> 00:26:27,840 Speaker 2: And this is there's been a bunch of articles in 534 00:26:27,840 --> 00:26:30,120 Speaker 2: the last couple of months about like, why didn't because 535 00:26:30,119 --> 00:26:31,720 Speaker 2: twenty twenty five was supposed to be the year of 536 00:26:31,760 --> 00:26:34,480 Speaker 2: agent to AAI and how they're saying, well, twenty twenty six, 537 00:26:34,520 --> 00:26:36,560 Speaker 2: it's going to be the year of agenta AAI. Not 538 00:26:36,640 --> 00:26:38,960 Speaker 2: because none of this stuff works, and in fact enough 539 00:26:39,000 --> 00:26:41,680 Speaker 2: does that there's a number of viable businesses in it. 540 00:26:41,680 --> 00:26:44,320 Speaker 2: It's not nothing, but it does not work as well 541 00:26:44,359 --> 00:26:46,239 Speaker 2: as they said it would be working right now, and 542 00:26:46,280 --> 00:26:49,280 Speaker 2: it consequently has not been adopted merely as widely as 543 00:26:49,520 --> 00:26:52,520 Speaker 2: was expected even this time last year. Right there's an 544 00:26:52,600 --> 00:26:55,159 Speaker 2: article in hr dive half of gen Z chat GPT 545 00:26:55,359 --> 00:26:57,800 Speaker 2: users say they view it as a coworker. Survey shows 546 00:26:58,119 --> 00:26:59,959 Speaker 2: that sites the survey of about eighty six hundred four 547 00:27:00,160 --> 00:27:03,119 Speaker 2: time US workers, which found that about eleven percent of 548 00:27:03,160 --> 00:27:06,520 Speaker 2: those who responded said they use chat GPT regularly, including 549 00:27:06,560 --> 00:27:09,359 Speaker 2: about twenty one percent of gen Z workers, which is 550 00:27:09,400 --> 00:27:12,199 Speaker 2: significantly lower. You can find depending on who you go to. 551 00:27:12,240 --> 00:27:14,320 Speaker 2: And the stat I've seen bandied about was that like 552 00:27:15,320 --> 00:27:18,840 Speaker 2: fifty seven percent of gen z people use chat GPT 553 00:27:19,320 --> 00:27:22,560 Speaker 2: on a daily basis for like work, and like more 554 00:27:22,560 --> 00:27:25,240 Speaker 2: than half used it as like their primary source recommendations 555 00:27:25,280 --> 00:27:27,880 Speaker 2: like what stuff to buy. I don't know, like which 556 00:27:27,880 --> 00:27:30,080 Speaker 2: set of numbers is accurate. There's a lot of different 557 00:27:30,080 --> 00:27:33,160 Speaker 2: posters giving data, right, but kind of no matter who 558 00:27:33,200 --> 00:27:36,720 Speaker 2: you look at, the evidence suggests that the year that 559 00:27:36,760 --> 00:27:38,680 Speaker 2: was supposed to be the year of agentic AI did 560 00:27:38,680 --> 00:27:41,080 Speaker 2: not turn it into a normal thing, right, It's still 561 00:27:41,160 --> 00:27:44,120 Speaker 2: lagging behind expectation. So that's kind of what we're seeing 562 00:27:44,160 --> 00:27:47,560 Speaker 2: at CES is a lot of people trying to, like, well, 563 00:27:47,640 --> 00:27:50,399 Speaker 2: let's bring back kind of the same agentic shit we 564 00:27:50,440 --> 00:27:53,040 Speaker 2: had last year slightly improved and see if it catches on, 565 00:27:53,160 --> 00:27:54,879 Speaker 2: maybe this year it'll hit maturity. 566 00:27:55,160 --> 00:27:58,960 Speaker 3: Right Yeah, No, I mean we've been hearing agentic stuff 567 00:27:58,960 --> 00:28:01,520 Speaker 3: every once in a while, not as much as last year. 568 00:28:02,000 --> 00:28:03,639 Speaker 3: It's one of those like salt pepper words that they 569 00:28:03,680 --> 00:28:07,160 Speaker 3: throw in the second batch of panels that I attended 570 00:28:07,160 --> 00:28:11,040 Speaker 3: after the keynote, which I should mention as soon as 571 00:28:11,080 --> 00:28:15,240 Speaker 3: I walked into the keynote at eight thirty am. The 572 00:28:15,280 --> 00:28:18,520 Speaker 3: first thing, the very first thing I heard from Gary 573 00:28:18,560 --> 00:28:21,440 Speaker 3: Shapiro's well, one of the heads the Consumer Technology Association 574 00:28:22,119 --> 00:28:24,199 Speaker 3: was a six to seven joke. But whether this is 575 00:28:24,240 --> 00:28:29,440 Speaker 3: your first CEES or your fifteenth on my case, you. 576 00:28:29,480 --> 00:28:34,359 Speaker 5: Belong to number, was that sixty or seventy? 577 00:28:34,480 --> 00:28:35,680 Speaker 4: You did a six to seven joke? 578 00:28:35,800 --> 00:28:40,920 Speaker 3: Already, very first thing, Wow, eight great, great, thirty am. 579 00:28:41,000 --> 00:28:42,080 Speaker 4: As soon as I walk in. 580 00:28:42,320 --> 00:28:44,760 Speaker 3: It's because I walked in maybe like five minutes late, 581 00:28:45,280 --> 00:28:49,800 Speaker 3: but very first words. So that's that's good's that kind 582 00:28:49,800 --> 00:28:52,440 Speaker 3: of sets the tone for a lot of a lot 583 00:28:52,440 --> 00:28:55,200 Speaker 3: of that panel. But then I went to a few 584 00:28:55,520 --> 00:29:00,320 Speaker 3: panels in Eureka Park about like AI, governance, some like 585 00:29:00,600 --> 00:29:05,520 Speaker 3: governments working working with AI, A lot of stuff, mostly 586 00:29:05,840 --> 00:29:09,240 Speaker 3: about like the challenge of governments keeping up with innovation. 587 00:29:09,480 --> 00:29:12,840 Speaker 3: How you know, too much regulation restricts these companies from 588 00:29:12,880 --> 00:29:18,120 Speaker 3: doing real regulation. The Secretary of State of Austria had 589 00:29:18,160 --> 00:29:23,920 Speaker 3: a really good quote about how data protections inhibit innovation. 590 00:29:24,680 --> 00:29:26,680 Speaker 10: One of the things that you are saying today is 591 00:29:26,720 --> 00:29:29,320 Speaker 10: that some of the people, some of the citizens, have 592 00:29:29,560 --> 00:29:30,880 Speaker 10: this fear about AI. 593 00:29:31,360 --> 00:29:34,480 Speaker 9: So how do you feel it in Austria. 594 00:29:35,000 --> 00:29:36,680 Speaker 4: I think you mentioned very very well. 595 00:29:36,800 --> 00:29:41,160 Speaker 10: It's all about building trust, taking the field trust for 596 00:29:41,320 --> 00:29:45,360 Speaker 10: FEI that's the most important thing. And of course data 597 00:29:45,360 --> 00:29:48,959 Speaker 10: protection is very huge. But on the other side, between 598 00:29:49,080 --> 00:29:53,560 Speaker 10: data protection and innovation, you need to find the middle 599 00:29:53,600 --> 00:29:56,920 Speaker 10: way because sometimes data protection. 600 00:29:56,680 --> 00:29:57,880 Speaker 4: Is not good for innovation. 601 00:29:58,720 --> 00:30:02,200 Speaker 3: On a similar notice, the into it the turbo tax 602 00:30:02,280 --> 00:30:05,480 Speaker 3: thing of data protection is mainly getting in the way 603 00:30:05,520 --> 00:30:08,600 Speaker 3: of trying to actually make make real social progress, which 604 00:30:08,640 --> 00:30:11,640 Speaker 3: will carry with it some degree of risk. The second 605 00:30:11,680 --> 00:30:15,240 Speaker 3: one of these AI governance panels was like these EU 606 00:30:15,680 --> 00:30:21,520 Speaker 3: ambassadors to the US from Estonia and Luxembourg talking about 607 00:30:21,880 --> 00:30:25,959 Speaker 3: like Reaganomics, basically for thirty minutes, talking about how much 608 00:30:26,000 --> 00:30:28,760 Speaker 3: they love Ronald Reigen great. 609 00:30:28,520 --> 00:30:33,959 Speaker 9: Whenos Soniavo certainly subscribed to the this statement that Reagan 610 00:30:34,000 --> 00:30:38,240 Speaker 9: once made that the fuel most horrific words in English 611 00:30:38,560 --> 00:30:42,480 Speaker 9: are the ones saying that a I'm from the government 612 00:30:42,520 --> 00:30:43,760 Speaker 9: and I'm here to help. 613 00:30:43,560 --> 00:30:48,080 Speaker 3: You, specifically in trying to make sure that governments are 614 00:30:48,080 --> 00:30:51,240 Speaker 3: able to keep up with technology. And the previous panel 615 00:30:51,280 --> 00:30:54,200 Speaker 3: with the Austrian Secretary of State was about the challenges 616 00:30:54,240 --> 00:30:57,000 Speaker 3: of trying to convince the citizens of these countries to 617 00:30:57,600 --> 00:31:02,000 Speaker 3: adopt AI and adopt in general like digitalization and specifically 618 00:31:02,000 --> 00:31:04,320 Speaker 3: with like digital ideas, and how how there's like you know, 619 00:31:04,480 --> 00:31:06,239 Speaker 3: maybe like twenty to thirty percent of people who are 620 00:31:06,280 --> 00:31:09,360 Speaker 3: very resistant and the challenge of like making making sure 621 00:31:09,360 --> 00:31:11,080 Speaker 3: that like this gets framed is not as like a 622 00:31:11,120 --> 00:31:14,080 Speaker 3: product or like a like a project for technology, but 623 00:31:14,120 --> 00:31:16,360 Speaker 3: as like a society wide push. 624 00:31:16,800 --> 00:31:17,520 Speaker 2: Yeah. 625 00:31:17,560 --> 00:31:19,920 Speaker 3: But besides that, these these panels were honestly a little 626 00:31:19,960 --> 00:31:23,280 Speaker 3: bit sleepy as well as the state of the Creator 627 00:31:23,320 --> 00:31:24,200 Speaker 3: Economy panel. 628 00:31:24,240 --> 00:31:24,920 Speaker 2: Oh how's it doing? 629 00:31:25,720 --> 00:31:29,440 Speaker 3: You know what, It's both it's both in its adolescence 630 00:31:29,520 --> 00:31:30,840 Speaker 3: but also reach maturity. 631 00:31:30,920 --> 00:31:31,240 Speaker 2: Wow. 632 00:31:31,640 --> 00:31:34,160 Speaker 3: And they said, you know, it's hard to pick both. 633 00:31:34,240 --> 00:31:36,040 Speaker 3: It's hard for too for something to be two things 634 00:31:36,080 --> 00:31:38,280 Speaker 3: at once, but in this case it is. 635 00:31:41,160 --> 00:31:43,640 Speaker 4: But they talked about how creators are more. 636 00:31:43,640 --> 00:31:46,440 Speaker 3: Enabled to use brand deals, including you know, brand deals 637 00:31:46,480 --> 00:31:50,200 Speaker 3: to enabled with like a backlog of older content. You 638 00:31:50,200 --> 00:31:53,600 Speaker 3: can remove brand deals from older content and replace them 639 00:31:53,640 --> 00:31:56,400 Speaker 3: with current brand deals using a new future from YouTube. 640 00:31:56,520 --> 00:31:59,640 Speaker 3: Right there was there's a guy from YouTube at the panel. Sure, yeah, 641 00:31:59,640 --> 00:32:02,160 Speaker 3: I'm sure who's very excited. But it was mostly about how, 642 00:32:02,360 --> 00:32:05,719 Speaker 3: you know, new ways to use influencers to market your product, 643 00:32:06,080 --> 00:32:08,880 Speaker 3: and that was the extent of what the creator economy 644 00:32:08,920 --> 00:32:09,400 Speaker 3: really meant. 645 00:32:09,560 --> 00:32:11,680 Speaker 2: And that's all any of these people have any idea 646 00:32:11,720 --> 00:32:15,320 Speaker 2: on is like we can inject ads into AI they 647 00:32:15,360 --> 00:32:18,280 Speaker 2: trust AI so they'll buy the products, or we can 648 00:32:18,320 --> 00:32:22,000 Speaker 2: inject ads into influencers they trusts. That was the thing, 649 00:32:22,240 --> 00:32:24,600 Speaker 2: Like none of these people they dress it up with 650 00:32:24,640 --> 00:32:27,000 Speaker 2: all sorts of fancy language, but it's and most of 651 00:32:27,040 --> 00:32:29,840 Speaker 2: these panels you mentioned, like there's a lot of bullshit 652 00:32:29,920 --> 00:32:31,800 Speaker 2: every now and then you get some like good moments 653 00:32:31,840 --> 00:32:34,400 Speaker 2: so you get to like question an asshole. But it's 654 00:32:34,400 --> 00:32:37,680 Speaker 2: mostly bullshit, but it's occasionally worth it for moments like 655 00:32:38,080 --> 00:32:40,160 Speaker 2: when I was on the agentic AI cutting through the 656 00:32:40,240 --> 00:32:44,200 Speaker 2: hype panel, Jay Patasol, who's the principal analyst at Forrester, 657 00:32:44,720 --> 00:32:48,440 Speaker 2: started speaking and he said something beautiful, Garrison, and I'm 658 00:32:48,720 --> 00:32:51,560 Speaker 2: this is not an exact but it's pretty close. We 659 00:32:51,600 --> 00:32:54,600 Speaker 2: have a new audience. We are speaking to machines. We 660 00:32:54,680 --> 00:32:57,400 Speaker 2: are through the looking glass. We are building content for engines. 661 00:32:57,440 --> 00:32:59,840 Speaker 2: We are building websites to be scraped so that an 662 00:33:00,360 --> 00:33:03,760 Speaker 2: can understand what you wanted to understand about your brand. Yep, yep, yeah, 663 00:33:03,880 --> 00:33:06,480 Speaker 2: that gets it. That's what these people see the Internet 664 00:33:06,520 --> 00:33:08,880 Speaker 2: as they see it as like everything before this was 665 00:33:08,920 --> 00:33:12,440 Speaker 2: a mistake or was. What the Internet was for was 666 00:33:12,520 --> 00:33:17,200 Speaker 2: a place for brands to feed information into machines that 667 00:33:17,240 --> 00:33:20,720 Speaker 2: then spoon feed the information directly into customers who trust 668 00:33:20,720 --> 00:33:23,560 Speaker 2: it like little lambs. That's what they want the Internet 669 00:33:23,600 --> 00:33:25,840 Speaker 2: to be, and that's what they believe they've gotten to. 670 00:33:25,960 --> 00:33:28,320 Speaker 2: That's what AI. That's the promise of AI. 671 00:33:28,600 --> 00:33:30,480 Speaker 3: The promise of AI is that this isn't just the 672 00:33:30,520 --> 00:33:33,080 Speaker 3: Internet anymore. This can actually just be the physical world 673 00:33:33,120 --> 00:33:35,200 Speaker 3: as well. And this is something that was talked about 674 00:33:35,320 --> 00:33:40,280 Speaker 3: during the cees CTA keynote Tuesday morning. Specifically, with the 675 00:33:40,280 --> 00:33:43,440 Speaker 3: birth of AI wearables, each of these wearables is able 676 00:33:43,480 --> 00:33:46,680 Speaker 3: to now collect information about the physical world and as 677 00:33:46,720 --> 00:33:49,240 Speaker 3: talng as you have, you know, adequate data sharing, AI 678 00:33:49,280 --> 00:33:52,560 Speaker 3: is able to gain so much more knowledge about how 679 00:33:52,600 --> 00:33:55,239 Speaker 3: the quote unquote real world operates. And this is going 680 00:33:55,280 --> 00:33:58,440 Speaker 3: to make you know, all of the processes of AI 681 00:33:58,680 --> 00:34:01,520 Speaker 3: stronger in the future as that learns more about what 682 00:34:01,600 --> 00:34:05,960 Speaker 3: this world actually is. Yeah, and beyond the promise of 683 00:34:06,000 --> 00:34:10,760 Speaker 3: wearables to improve someone's life, this is the real project 684 00:34:11,040 --> 00:34:13,640 Speaker 3: is strengthening AI through the use of these wearables. 685 00:34:13,200 --> 00:34:15,680 Speaker 4: It's not actually about the consumer experience. 686 00:34:15,360 --> 00:34:17,120 Speaker 2: It's about providing data to this machine. 687 00:34:17,160 --> 00:34:21,640 Speaker 3: It's this like larger, larger, very like existential thing at 688 00:34:21,719 --> 00:34:24,399 Speaker 3: least for these executives or like that. That's the thing 689 00:34:24,440 --> 00:34:26,719 Speaker 3: that they are really emphasizing despite this being called the 690 00:34:26,760 --> 00:34:28,200 Speaker 3: Consumer Electronics Showcase. 691 00:34:28,320 --> 00:34:30,359 Speaker 2: And I think again, the best thing I can give 692 00:34:30,400 --> 00:34:33,200 Speaker 2: you into how fundamentally as much money as there is 693 00:34:33,239 --> 00:34:35,319 Speaker 2: behind this and as many grand words as they dress 694 00:34:35,360 --> 00:34:39,920 Speaker 2: it up, and how intellectually bankrupt this whole tech movement 695 00:34:40,080 --> 00:34:44,360 Speaker 2: is is that. The third panel that I went to, 696 00:34:44,880 --> 00:34:47,720 Speaker 2: which is about AI and creativity. One of the people 697 00:34:47,719 --> 00:34:50,719 Speaker 2: on it was Jesse Damasek, who works for Diagio, which 698 00:34:50,760 --> 00:34:52,320 Speaker 2: is like a company that imports all of your favorite 699 00:34:52,360 --> 00:34:55,080 Speaker 2: whiskeys from Europe, right like they sell all of the 700 00:34:55,120 --> 00:34:57,800 Speaker 2: different like Scottish whiskies that have to get like imported 701 00:34:57,840 --> 00:35:01,120 Speaker 2: and sold over here. And he was talking about they 702 00:35:01,120 --> 00:35:04,040 Speaker 2: were talking about some of the specific examples they had 703 00:35:04,239 --> 00:35:07,440 Speaker 2: of like how AI has been used in advertising campaigns, 704 00:35:08,120 --> 00:35:12,080 Speaker 2: and his exact statement was you can leverage an artist 705 00:35:12,160 --> 00:35:14,960 Speaker 2: and create infinite examples of their work, but which he 706 00:35:15,000 --> 00:35:16,920 Speaker 2: means you can find an artist that you like, sign 707 00:35:17,000 --> 00:35:19,120 Speaker 2: a deal with them, and then have AI created infinite 708 00:35:19,160 --> 00:35:22,319 Speaker 2: examples in their style, and so I camped afterwards and 709 00:35:22,360 --> 00:35:24,680 Speaker 2: I was like, what were you specifically referring to, Like, 710 00:35:24,719 --> 00:35:26,640 Speaker 2: how is this actually work as a product? And the 711 00:35:26,680 --> 00:35:28,360 Speaker 2: thing that he pointed out is that they have a 712 00:35:28,400 --> 00:35:32,239 Speaker 2: couple of whiskey brands that they have done. You go 713 00:35:32,400 --> 00:35:35,560 Speaker 2: in and you order a bottle and it's printed on site, 714 00:35:35,640 --> 00:35:38,000 Speaker 2: and it uses AI to make an example in the 715 00:35:38,080 --> 00:35:41,479 Speaker 2: style of this existing artist that they likes work that's 716 00:35:41,600 --> 00:35:45,920 Speaker 2: unique for you. Uh huh. And he said it's been 717 00:35:45,960 --> 00:35:49,839 Speaker 2: successful for them. Is it trillions of dollars? Three trillion dollars? 718 00:35:49,880 --> 00:35:49,920 Speaker 3: No? 719 00:35:49,960 --> 00:35:51,480 Speaker 2: I mean, these are these are things that like, yeah, 720 00:35:51,480 --> 00:35:53,600 Speaker 2: I guess I can see that maybe selling some Is 721 00:35:53,640 --> 00:35:58,040 Speaker 2: it selling better than any other like branded whiskey than 722 00:35:58,080 --> 00:36:00,440 Speaker 2: any other, like you know, because Whiskey Company, these big 723 00:36:00,440 --> 00:36:02,920 Speaker 2: ones will come out with like here's this edition every 724 00:36:03,000 --> 00:36:05,439 Speaker 2: year or whatever, they'll have one special limited edition one. 725 00:36:05,760 --> 00:36:07,840 Speaker 2: Is it selling better than that? We don't have that data, 726 00:36:07,920 --> 00:36:09,520 Speaker 2: but it was it's one of those is like that's 727 00:36:09,600 --> 00:36:10,080 Speaker 2: the idea. 728 00:36:10,200 --> 00:36:10,439 Speaker 3: Huh. 729 00:36:10,440 --> 00:36:13,680 Speaker 2: That's like we're talking about, like AI is supercharging creativity 730 00:36:13,680 --> 00:36:17,160 Speaker 2: and letting us like think bolder and more creatively than 731 00:36:17,280 --> 00:36:19,720 Speaker 2: we've ever thought before. And there were so many lines 732 00:36:20,440 --> 00:36:25,040 Speaker 2: in this fucking panel about like how we were like 733 00:36:25,160 --> 00:36:28,319 Speaker 2: hyper charging what human beings can be and do, and 734 00:36:28,400 --> 00:36:30,719 Speaker 2: like everyone should be really excited about what all this 735 00:36:30,760 --> 00:36:33,080 Speaker 2: means for the future. One of the panelists that my 736 00:36:33,200 --> 00:36:35,960 Speaker 2: best advice for you is let a thousand flowers bloom. 737 00:36:36,640 --> 00:36:38,200 Speaker 2: I'm sorry that was in the panel right before, but 738 00:36:38,239 --> 00:36:40,440 Speaker 2: it's still in all of this, Like it's all that 739 00:36:40,480 --> 00:36:43,440 Speaker 2: seems it's all the same kind of shit that bleeds together, 740 00:36:43,480 --> 00:36:45,680 Speaker 2: and it's like, Okay, what are your ideas? Well, we're 741 00:36:45,719 --> 00:36:48,800 Speaker 2: having a legrick kind of lie to manipulated an engine, 742 00:36:49,200 --> 00:36:53,520 Speaker 2: and we've got the custom printed bottles for your whiskey. 743 00:36:53,760 --> 00:36:58,279 Speaker 3: Well, you know, speaking of AI unleashing creativity. The last 744 00:36:58,320 --> 00:37:01,960 Speaker 3: thing I'll talk about this episode is the worst booth 745 00:37:02,520 --> 00:37:05,680 Speaker 3: at Showstoppers, which this year is kind of impressive because, yeah, 746 00:37:05,680 --> 00:37:09,000 Speaker 3: that's hard. It's mostly smart glasses and like three different 747 00:37:09,040 --> 00:37:11,800 Speaker 3: pool cleaners. Yeah, and then some random software stuff and 748 00:37:11,840 --> 00:37:13,719 Speaker 3: then a few things we saw last year. Sure, the 749 00:37:13,760 --> 00:37:18,000 Speaker 3: worst worst booth, Robert. You you write books. 750 00:37:17,760 --> 00:37:19,080 Speaker 2: Right, I have in the past. 751 00:37:19,600 --> 00:37:22,160 Speaker 3: What if in the future, what if I told you 752 00:37:22,760 --> 00:37:26,359 Speaker 3: that you could write three books in less than twenty 753 00:37:26,440 --> 00:37:27,040 Speaker 3: four hours. 754 00:37:27,080 --> 00:37:30,000 Speaker 2: God, thank you Garrison as. 755 00:37:29,840 --> 00:37:32,040 Speaker 4: A writer without using cocaine. 756 00:37:32,080 --> 00:37:35,320 Speaker 2: There's well, okay, now i'd say you're a liar. 757 00:37:35,480 --> 00:37:36,800 Speaker 4: That's see a little bit harder. 758 00:37:36,920 --> 00:37:39,080 Speaker 2: Yeah, but I thought you were trying to sell me 759 00:37:39,120 --> 00:37:41,000 Speaker 2: some blow. And I was going to say when we 760 00:37:41,040 --> 00:37:41,839 Speaker 2: turned the mic off. 761 00:37:42,200 --> 00:37:46,080 Speaker 3: With the power of AI, you can write three books 762 00:37:46,120 --> 00:37:48,080 Speaker 3: in six to twenty four hours. 763 00:37:48,560 --> 00:37:52,040 Speaker 2: Wow, that's almost as fast as Stephen King Winny was 764 00:37:52,040 --> 00:37:53,919 Speaker 2: son co. Yeah, there you get not quite. 765 00:37:54,280 --> 00:37:55,640 Speaker 4: So there's this table. 766 00:37:55,920 --> 00:37:58,560 Speaker 3: There was the least abouting table definitely all showstoppers because 767 00:37:58,560 --> 00:38:01,280 Speaker 3: it was it was filled with books with I will. 768 00:38:01,120 --> 00:38:04,839 Speaker 4: Show you the covers here. They all look like this. 769 00:38:06,000 --> 00:38:09,080 Speaker 2: They all oh yeah, no those I mean I'm seeing 770 00:38:09,160 --> 00:38:10,080 Speaker 2: blue and orange. 771 00:38:10,239 --> 00:38:13,040 Speaker 4: It's colors in it's AI generated images. 772 00:38:13,160 --> 00:38:15,480 Speaker 3: Like yeah, it's like every movie poster now in like 773 00:38:15,520 --> 00:38:19,400 Speaker 3: there's no art style behind it. It's very generic and 774 00:38:19,440 --> 00:38:23,040 Speaker 3: they have like, you know, like the most generic font 775 00:38:23,280 --> 00:38:25,600 Speaker 3: for the title, all in the same placement with some 776 00:38:25,719 --> 00:38:26,720 Speaker 3: author's name at the bottom. 777 00:38:26,960 --> 00:38:27,880 Speaker 2: They're very sleepy. 778 00:38:27,920 --> 00:38:30,040 Speaker 3: You could you can find pictures of these covers if 779 00:38:30,080 --> 00:38:34,440 Speaker 3: you Google or bing or you know, maybe chat gpt 780 00:38:35,200 --> 00:38:37,280 Speaker 3: write three books in twenty four hours. 781 00:38:37,400 --> 00:38:38,680 Speaker 4: You can see you can see the cover. 782 00:38:38,840 --> 00:38:41,400 Speaker 2: Finally, I've always wanted to write three books, Garrison. 783 00:38:41,560 --> 00:38:45,800 Speaker 3: So what this is is an app that will help 784 00:38:45,880 --> 00:38:48,359 Speaker 3: you write these books is not going to do it 785 00:38:48,400 --> 00:38:50,600 Speaker 3: all for you. You still need to come up with 786 00:38:50,640 --> 00:38:52,400 Speaker 3: the general idea of the story. 787 00:38:52,560 --> 00:38:55,480 Speaker 4: The hard stuff and the characters really the difficult. The 788 00:38:55,520 --> 00:38:57,160 Speaker 4: world building is always the hardest part. 789 00:38:57,239 --> 00:38:58,880 Speaker 2: Everyone says most of the work on a book is 790 00:38:58,880 --> 00:38:59,920 Speaker 2: done the first six hours. 791 00:39:00,400 --> 00:39:03,160 Speaker 3: The world building is the really hard part. The easy 792 00:39:03,239 --> 00:39:05,719 Speaker 3: part is just getting all those words down. Yeah, so 793 00:39:06,320 --> 00:39:08,920 Speaker 3: you need to create create some characters. Now, could you 794 00:39:09,000 --> 00:39:11,360 Speaker 3: just have some other AI service create these characters? 795 00:39:11,400 --> 00:39:11,840 Speaker 2: Maybe? 796 00:39:12,000 --> 00:39:15,160 Speaker 3: But you should write maybe about a thousand words kind 797 00:39:15,200 --> 00:39:17,720 Speaker 3: of like a story Bible type thing or a character 798 00:39:17,880 --> 00:39:21,480 Speaker 3: character outline and a general direction for the story, and 799 00:39:21,520 --> 00:39:26,360 Speaker 3: you feed that into this app and then within hours 800 00:39:26,400 --> 00:39:30,720 Speaker 3: it will generate not just one book, not just two books, 801 00:39:31,640 --> 00:39:35,840 Speaker 3: but a trilogy wow of books. And it's only a trilogy. 802 00:39:35,960 --> 00:39:40,320 Speaker 3: You cannot generate a single book. The only come in trilogy. 803 00:39:40,360 --> 00:39:42,960 Speaker 2: Look, I get it's George Lucas worked the same way, Garrison. 804 00:39:43,280 --> 00:39:46,120 Speaker 2: Look you're telling me that the greatest machine mind and 805 00:39:46,239 --> 00:39:48,479 Speaker 2: history wouldn't think the same as the greatest human mind 806 00:39:48,520 --> 00:39:52,279 Speaker 2: in history. You know, I bet it'll independently create Jiz 807 00:39:52,360 --> 00:39:54,839 Speaker 2: music too. It only comes in trilogies. 808 00:39:57,520 --> 00:40:02,680 Speaker 3: And I now shall read a sample of this writing, 809 00:40:02,680 --> 00:40:04,880 Speaker 3: and like dying read, there was maybe there was maybe 810 00:40:04,920 --> 00:40:07,440 Speaker 3: like five or six different books with many copies of 811 00:40:07,440 --> 00:40:09,680 Speaker 3: the same book on this table, and I flipped through, 812 00:40:09,719 --> 00:40:12,719 Speaker 3: maybe about half reading like a random page every you know, 813 00:40:12,800 --> 00:40:16,880 Speaker 3: every like twenty fifty pages, and it was it was 814 00:40:17,840 --> 00:40:20,400 Speaker 3: it was too boring that I forgot to take pictures 815 00:40:20,719 --> 00:40:23,239 Speaker 3: of these pages because I was just like it was 816 00:40:23,239 --> 00:40:26,200 Speaker 3: a struggle to finish, to finish each page. But luckily 817 00:40:26,239 --> 00:40:28,640 Speaker 3: on their website they do have some sample pages. I 818 00:40:29,239 --> 00:40:31,399 Speaker 3: talked to one of the one of the guys working 819 00:40:31,440 --> 00:40:33,640 Speaker 3: at the booth, and he said that he tried this 820 00:40:33,800 --> 00:40:36,879 Speaker 3: so we're like he found the service and he first thought, 821 00:40:36,880 --> 00:40:39,120 Speaker 3: you know, sure that this can't be any good. And 822 00:40:39,320 --> 00:40:41,480 Speaker 3: when he when he generated his book, he was surprised 823 00:40:41,560 --> 00:40:42,520 Speaker 3: how good the writing is. 824 00:40:42,560 --> 00:40:43,879 Speaker 4: He said that he probably. 825 00:40:43,560 --> 00:40:48,000 Speaker 3: Wouldn't win a Pulletzer or a Hugo, his two awards 826 00:40:48,000 --> 00:40:51,160 Speaker 3: that he named, but he said it was pretty good. 827 00:40:51,360 --> 00:40:51,520 Speaker 7: Right. 828 00:40:53,560 --> 00:40:56,040 Speaker 3: This guy was so far of the most Tim Robinson 829 00:40:56,200 --> 00:41:00,760 Speaker 3: character I met at the cop Yeah, that's great. So Robert, 830 00:41:00,840 --> 00:41:03,680 Speaker 3: you can pick the genre of sample. We have a 831 00:41:03,719 --> 00:41:10,200 Speaker 3: thriller book, a fantasy, mystery, science fiction, romance, or mainstream 832 00:41:10,280 --> 00:41:11,080 Speaker 3: literary fiction. 833 00:41:11,200 --> 00:41:12,560 Speaker 4: What genre do you want? 834 00:41:12,680 --> 00:41:15,480 Speaker 2: I think I want science fiction science fiction because I 835 00:41:15,480 --> 00:41:18,920 Speaker 2: feel like there's the shortest line between parody and legitimate 836 00:41:18,960 --> 00:41:19,719 Speaker 2: within sci fi. 837 00:41:20,680 --> 00:41:24,200 Speaker 3: All right, this is from a book called I don't 838 00:41:24,200 --> 00:41:25,279 Speaker 3: even want to say this one. 839 00:41:26,080 --> 00:41:27,440 Speaker 2: I'm really curious now. 840 00:41:27,880 --> 00:41:30,480 Speaker 4: Palympsit orbit is what I'm going to say. 841 00:41:30,520 --> 00:41:33,400 Speaker 2: Oh my god, they're starting to be Arthur C. Clark. 842 00:41:33,480 --> 00:41:40,000 Speaker 3: It's called the Polympsit Orbit. Chapter one, Desert Signals. Marrow 843 00:41:40,040 --> 00:41:43,080 Speaker 3: woke with the taste of metal in her mouth and 844 00:41:43,120 --> 00:41:46,200 Speaker 3: a pulse in her temples that felt one notch shy 845 00:41:46,320 --> 00:41:49,600 Speaker 3: of a hangover. The ceiling above her was low and white, 846 00:41:49,800 --> 00:41:53,120 Speaker 3: edged with soft events. A monitor over the bed scrolled 847 00:41:53,160 --> 00:41:57,319 Speaker 3: green numbers in a stylized outline of her lungs thin air. 848 00:41:57,760 --> 00:41:58,960 Speaker 2: She remembered. 849 00:42:01,520 --> 00:42:06,360 Speaker 3: The Atacoma sky, somewhere above concrete and glass. Good morning, 850 00:42:06,360 --> 00:42:11,440 Speaker 3: doctor Ellison, A calm baritone. How's the head? She turned 851 00:42:11,480 --> 00:42:14,520 Speaker 3: toward the voice. A man in the doorway wore a 852 00:42:14,560 --> 00:42:17,880 Speaker 3: slate blue clinic jumper and a badge that caught the 853 00:42:17,960 --> 00:42:22,160 Speaker 3: desert light leaking through the polarized glass. Dark curls threaded 854 00:42:22,200 --> 00:42:25,799 Speaker 3: with gray laugh lines that didn't quite match the tiredness 855 00:42:25,840 --> 00:42:26,600 Speaker 3: around his eyes. 856 00:42:26,760 --> 00:42:29,160 Speaker 4: Hire Man, are you good? Go me to keep you? 857 00:42:29,680 --> 00:42:32,600 Speaker 2: I you know, it's it's again. It's like the it's 858 00:42:32,600 --> 00:42:35,440 Speaker 2: an imitation of like a story. Like it's a scene 859 00:42:35,480 --> 00:42:38,960 Speaker 2: and it's a scene with details to describe people. But 860 00:42:39,040 --> 00:42:42,000 Speaker 2: there's not like you would ideally, I would have something 861 00:42:42,040 --> 00:42:43,960 Speaker 2: of an idea of like what the thrust of the 862 00:42:44,000 --> 00:42:47,880 Speaker 2: story is going to be. Like for example, Bilbo Baggins 863 00:42:47,960 --> 00:42:50,239 Speaker 2: was a hobbit who lived in a house underhill or 864 00:42:50,280 --> 00:42:53,080 Speaker 2: something like that. Forget the exact wording of that, but like, 865 00:42:54,480 --> 00:42:57,120 Speaker 2: I you know, it makes sense. It sounds like it 866 00:42:57,400 --> 00:43:01,560 Speaker 2: sounds remarkably like bad No. I don't insult Nano Remo 867 00:43:01,640 --> 00:43:04,799 Speaker 2: writers that much. It just it just it sounds like 868 00:43:04,920 --> 00:43:08,480 Speaker 2: a story that was generated based on a belief that like, well, 869 00:43:08,520 --> 00:43:10,600 Speaker 2: if we can just like describe enough stuff and use 870 00:43:10,680 --> 00:43:14,680 Speaker 2: enough words to describe a scene, then that counts as plot. Yeah, 871 00:43:14,719 --> 00:43:17,399 Speaker 2: I mean in character, which we don't have any of yet. 872 00:43:17,360 --> 00:43:20,880 Speaker 3: It's all of it was this very generic, empty like 873 00:43:21,120 --> 00:43:24,319 Speaker 3: stuff that's very very common. And if you ever have 874 00:43:24,360 --> 00:43:26,359 Speaker 3: to read through a lot of like AI writing, whether 875 00:43:26,440 --> 00:43:28,319 Speaker 3: for work or let's say, you know, you work in 876 00:43:28,360 --> 00:43:30,440 Speaker 3: a college, so you have students submitting this stuff, or 877 00:43:30,440 --> 00:43:33,560 Speaker 3: you for some reason are online and you feel obligated 878 00:43:33,600 --> 00:43:35,360 Speaker 3: to look at the worst parts of the world, like 879 00:43:35,560 --> 00:43:36,279 Speaker 3: what me and Robert do. 880 00:43:36,400 --> 00:43:38,440 Speaker 4: Sometimes this is all feels very familiar. 881 00:43:38,560 --> 00:43:41,520 Speaker 3: I'll read one other like paragraph from a different book, 882 00:43:42,000 --> 00:43:45,520 Speaker 3: a thriller called The Helix Files, Oh Good, obviously part 883 00:43:45,560 --> 00:43:47,880 Speaker 3: of a trilogy, so who knows where these stories go 884 00:43:48,000 --> 00:43:49,280 Speaker 3: over the course of three books. 885 00:43:49,640 --> 00:43:51,800 Speaker 4: Quote the car heater had. 886 00:43:51,680 --> 00:43:55,279 Speaker 3: Died ten minutes ago, cold leaked through the floorboards into 887 00:43:55,320 --> 00:43:59,799 Speaker 3: Helix's boots. Outside the eastern block industrial belt, slid past 888 00:43:59,840 --> 00:44:05,400 Speaker 3: and gray slabs and rusted steel, wet concrete period, diesel period, 889 00:44:06,080 --> 00:44:10,160 Speaker 3: a straight dog nosing tragh heap outside the road first, 890 00:44:10,200 --> 00:44:12,640 Speaker 3: slick with drizzle. So it's it's something that right, Yeah, 891 00:44:12,640 --> 00:44:14,760 Speaker 3: there's a lot of this sort of like quick punchy 892 00:44:14,880 --> 00:44:18,120 Speaker 3: sentences are common in AI writing at the moment, wet concrete, 893 00:44:18,160 --> 00:44:19,960 Speaker 3: you know, with a period. But like a lot of 894 00:44:20,000 --> 00:44:22,759 Speaker 3: this type of stuff you you see you see in 895 00:44:22,800 --> 00:44:24,759 Speaker 3: a AI writing, you have a lot of a lot 896 00:44:24,800 --> 00:44:27,759 Speaker 3: a lot of character and a lot a lot of 897 00:44:27,760 --> 00:44:30,680 Speaker 3: like m dash sentences. As I was flipping through these books, 898 00:44:30,719 --> 00:44:32,319 Speaker 3: I was like, okay, yeah, like I see what they're doing. 899 00:44:32,320 --> 00:44:33,280 Speaker 2: I see yeah. 900 00:44:33,360 --> 00:44:35,480 Speaker 3: But but now, if if you want to write a 901 00:44:35,600 --> 00:44:37,879 Speaker 3: quote unquote write a trilogy of books, you can pay 902 00:44:37,880 --> 00:44:40,560 Speaker 3: the money and and within six hours you will have 903 00:44:40,600 --> 00:44:44,600 Speaker 3: a trilogy. So what really makes me feel optimistic about 904 00:44:44,600 --> 00:44:47,440 Speaker 3: CES is the way that creativity is being democratized. 905 00:44:47,719 --> 00:44:50,600 Speaker 4: It used to be that no ordinary person could write 906 00:44:50,600 --> 00:44:50,960 Speaker 4: a book. 907 00:44:51,239 --> 00:44:52,960 Speaker 3: You had to have a story and be some sort 908 00:44:53,000 --> 00:44:55,120 Speaker 3: of freak at Oxford maybe like a you need like 909 00:44:55,160 --> 00:44:59,760 Speaker 3: a pencil, maybe a keyboard. Yeah, possible, barriers, it's not possible. 910 00:44:59,800 --> 00:45:03,040 Speaker 3: And now, luckily through AI, as long as you have 911 00:45:03,160 --> 00:45:04,520 Speaker 3: you know, sub money. 912 00:45:04,280 --> 00:45:06,959 Speaker 2: To pay a subscription service at a computer and VC 913 00:45:07,320 --> 00:45:09,759 Speaker 2: fundraising and subsidizing of vice service. 914 00:45:09,480 --> 00:45:13,160 Speaker 3: Ideally, yeah, then you too can be an author of 915 00:45:13,200 --> 00:45:13,800 Speaker 3: a trilogy. 916 00:45:14,800 --> 00:45:17,719 Speaker 2: Well that's that's my plan for the future. I guess 917 00:45:17,760 --> 00:45:20,680 Speaker 2: I want to end by talking about the second to 918 00:45:20,680 --> 00:45:24,399 Speaker 2: the last panel that I sat through, which was at 919 00:45:24,400 --> 00:45:27,719 Speaker 2: the AI house, and was I think yet again, this 920 00:45:27,840 --> 00:45:29,839 Speaker 2: was another one that was about they were largely talking 921 00:45:29,840 --> 00:45:32,399 Speaker 2: about ethics in this one, like AI and ethics and 922 00:45:32,480 --> 00:45:36,719 Speaker 2: like what that actually means. And Eric Pace, who on 923 00:45:36,760 --> 00:45:40,080 Speaker 2: the slide lajer City works at company but it's reassuring, Yeah, 924 00:45:40,239 --> 00:45:42,239 Speaker 2: reassuring who works at Cox Media, which is like a 925 00:45:42,280 --> 00:45:46,399 Speaker 2: big media company, yes, based on Georgia. And he had 926 00:45:46,520 --> 00:45:48,759 Speaker 2: a couple of statements that were interested in me. He 927 00:45:48,800 --> 00:45:50,719 Speaker 2: had one where he said that, like, it's kind of 928 00:45:50,760 --> 00:45:56,880 Speaker 2: incumbent upon people to develop an ethical rubric for how 929 00:45:56,960 --> 00:45:59,759 Speaker 2: and what sources and what AI is they trust and 930 00:45:59,800 --> 00:46:05,080 Speaker 2: want and figure that out. And I think what I 931 00:46:05,160 --> 00:46:07,799 Speaker 2: inferred was that, like, because it's not going to get 932 00:46:07,840 --> 00:46:10,480 Speaker 2: done by anyone else, And it kind of became clude 933 00:46:10,480 --> 00:46:12,080 Speaker 2: to me later. I think it's he also doesn't want 934 00:46:12,080 --> 00:46:13,840 Speaker 2: anyone else to do it. He wants this to be 935 00:46:13,880 --> 00:46:15,800 Speaker 2: an individual project where you have to kind of figure 936 00:46:15,840 --> 00:46:18,680 Speaker 2: that out for yourself. I was kind of unsure as 937 00:46:18,680 --> 00:46:20,960 Speaker 2: to whether he was the evil or just the pragmatic 938 00:46:21,040 --> 00:46:23,360 Speaker 2: version of this, because the pragmatic version is like, literally, 939 00:46:23,400 --> 00:46:25,200 Speaker 2: no one's going to restrict this stuff. You just have 940 00:46:25,280 --> 00:46:28,520 Speaker 2: to try to get by right, which is maybe accurate. 941 00:46:29,160 --> 00:46:31,200 Speaker 2: But there was a really interesting interaction on this panel. 942 00:46:31,239 --> 00:46:33,280 Speaker 2: One of the other people there was doctor Martin Clancy, 943 00:46:33,280 --> 00:46:37,319 Speaker 2: who was an Irish academic and a musician who was 944 00:46:37,440 --> 00:46:39,880 Speaker 2: on the panel again to talk about like creativity and ethics, 945 00:46:40,040 --> 00:46:42,560 Speaker 2: and made a comment that I found was really interesting 946 00:46:42,600 --> 00:46:44,239 Speaker 2: and I don't know, I wouldn't say I agree or 947 00:46:44,280 --> 00:46:46,000 Speaker 2: disagree with it, but I found it really interesting where 948 00:46:46,000 --> 00:46:49,600 Speaker 2: he was like, actually, I'm not at all concerned comparatively 949 00:46:49,800 --> 00:46:53,120 Speaker 2: about having an AI give me medical advice. I'm deeply 950 00:46:53,160 --> 00:46:56,360 Speaker 2: concerned about letting an AI recommend music or movies to me, 951 00:46:56,800 --> 00:46:58,640 Speaker 2: which I found a really interesting attitude and kind of 952 00:46:58,640 --> 00:47:00,600 Speaker 2: a thought provoking yell that it is interesting, which was 953 00:47:00,640 --> 00:47:03,520 Speaker 2: immediately spoiled by Eric Pace going like, well, I don't 954 00:47:03,520 --> 00:47:05,760 Speaker 2: see why anyone would have an issue with an AI doctor. 955 00:47:05,800 --> 00:47:07,480 Speaker 2: Doctors get things wrong all the time. 956 00:47:07,600 --> 00:47:10,680 Speaker 3: And then he just like let that statement set. 957 00:47:09,320 --> 00:47:14,439 Speaker 2: I have heard this before, and ais have a lot 958 00:47:14,480 --> 00:47:17,040 Speaker 2: more data. And they ended it by saying because everyone 959 00:47:17,040 --> 00:47:18,600 Speaker 2: asked like, what were their big wins of the year, 960 00:47:18,680 --> 00:47:21,919 Speaker 2: and his big win was that his wife hated chat 961 00:47:21,960 --> 00:47:23,920 Speaker 2: ept and didn't want to use it, and he convinced 962 00:47:23,960 --> 00:47:26,319 Speaker 2: her to use it to plan their vacation. It kind 963 00:47:26,320 --> 00:47:28,239 Speaker 2: of sounded like he bullied her into it, but that 964 00:47:28,360 --> 00:47:30,600 Speaker 2: was his big win for the year. I didn't like him. 965 00:47:31,080 --> 00:47:31,960 Speaker 1: My win is I. 966 00:47:33,719 --> 00:47:37,160 Speaker 2: My wife into using a chat bot to plan special 967 00:47:37,239 --> 00:47:40,600 Speaker 2: time vacationing together because we're not creative enough to figure 968 00:47:40,600 --> 00:47:42,719 Speaker 2: out how to go on a fucking trip. 969 00:47:42,480 --> 00:47:44,279 Speaker 4: Hashtag AI win Jesus. 970 00:47:44,480 --> 00:47:47,440 Speaker 2: I don't know anyway. I think that's good for episode 971 00:47:47,440 --> 00:47:49,799 Speaker 2: one from Cees. Come back next week. We'll we'll all 972 00:47:49,840 --> 00:47:52,359 Speaker 2: have more or listen to better offline where ed will 973 00:47:52,400 --> 00:47:55,359 Speaker 2: have just a shocking amount of content from a lot 974 00:47:55,440 --> 00:47:59,960 Speaker 2: of the relatively few and constantly shrinking stable of saying 975 00:48:00,120 --> 00:48:01,759 Speaker 2: people reporting on technology. 976 00:48:01,960 --> 00:48:02,680 Speaker 5: See you next week. 977 00:48:06,080 --> 00:48:08,600 Speaker 1: It could Happen Here is a production of cool Zone Media. 978 00:48:08,760 --> 00:48:11,840 Speaker 1: For more podcasts from cool Zone Media, visit our website 979 00:48:11,920 --> 00:48:15,480 Speaker 1: coolzonmedia dot com, or check us out on the iHeartRadio app, 980 00:48:15,560 --> 00:48:19,120 Speaker 1: Apple Podcasts, or wherever you listen to podcasts. You can 981 00:48:19,160 --> 00:48:21,480 Speaker 1: now find sources for it could Happen here, listed directly 982 00:48:21,520 --> 00:48:23,800 Speaker 1: in episode descriptions. Thanks for listening.