1 00:00:01,760 --> 00:00:07,400 Speaker 1: Cool Zone Media. Oh my goodness, it's It Could Happen Here, 2 00:00:07,600 --> 00:00:11,879 Speaker 1: a podcast that is about things falling apart, our dystopian 3 00:00:12,200 --> 00:00:16,040 Speaker 1: now and tomorrow. And for the last several days it has 4 00:00:16,079 --> 00:00:19,400 Speaker 1: been heavily about the Consumer Electronics Show, which is a 5 00:00:19,440 --> 00:00:21,520 Speaker 1: huge event every year where one hundred and twenty to 6 00:00:21,520 --> 00:00:24,680 Speaker 1: one hundred and fifty thousand people flood into Las Vegas 7 00:00:24,720 --> 00:00:26,840 Speaker 1: to show off all of the new gadgets and to 8 00:00:26,880 --> 00:00:30,040 Speaker 1: have big, fancy panels on the future of technology. And 9 00:00:30,040 --> 00:00:32,720 Speaker 1: this has been a particularly good year for the dystopia beat. 10 00:00:32,800 --> 00:00:36,200 Speaker 1: Part of that is because the entire industry is obsessed right 11 00:00:36,240 --> 00:00:40,120 Speaker 1: now with artificial intelligence. Now there's a couple reasons for this. 12 00:00:40,600 --> 00:00:45,120 Speaker 1: Every laptop manufacturer is basically throwing out laptops with AI assistants. 13 00:00:45,200 --> 00:00:49,400 Speaker 1: Microsoft's is Copilot, and they're doing this because laptop sales 14 00:00:49,400 --> 00:00:51,879 Speaker 1: have stalled a lot. People, like... the pandemic was 15 00:00:51,880 --> 00:00:55,600 Speaker 1: great for laptop sales, and then people stopped buying them 16 00:00:55,680 --> 00:00:58,920 Speaker 1: because most people don't need to replace their laptops very often. 17 00:00:59,280 --> 00:01:01,600 Speaker 1: So there's this desperate hope that by scaring everybody 18 00:01:01,680 --> 00:01:05,080 Speaker 1: into thinking they need AI immediately, they can get folks 19 00:01:05,120 --> 00:01:08,080 Speaker 1: to buy a new raft of machines. And outside of that, 20 00:01:08,160 --> 00:01:10,600 Speaker 1: it's just, as I'm sure you're aware, with interest rates 21 00:01:10,640 --> 00:01:14,400 Speaker 1: where they are, companies, tech companies, particularly startups, are having 22 00:01:14,440 --> 00:01:18,560 Speaker 1: trouble getting VC money, venture capital money, invested in them. 23 00:01:18,600 --> 00:01:21,160 Speaker 1: So there's this kind of desperate hope that by plugging 24 00:01:21,240 --> 00:01:24,120 Speaker 1: AI constantly they can fill in the gap. So today... 25 00:01:24,640 --> 00:01:26,480 Speaker 1: we have, probably in a week or two, we're going 26 00:01:26,520 --> 00:01:29,360 Speaker 1: to be putting out a long investigation based on a 27 00:01:29,480 --> 00:01:31,840 Speaker 1: number of panels we went to with executives from Google, 28 00:01:31,880 --> 00:01:35,800 Speaker 1: from, weirdly enough, McDonald's, from Adobe, from Nvidia, from the 29 00:01:35,800 --> 00:01:40,959 Speaker 1: Consumer Electronics Association, and multiple government agencies including DHS, on 30 00:01:41,240 --> 00:01:43,480 Speaker 1: what they see as the future of AI. That's going 31 00:01:43,520 --> 00:01:45,880 Speaker 1: to be some pretty in depth reporting. But today we 32 00:01:46,000 --> 00:01:49,320 Speaker 1: want to talk about the AI products that we've been 33 00:01:49,360 --> 00:01:52,160 Speaker 1: seeing, and as a spoiler, they're basically all the dumbest 34 00:01:52,200 --> 00:01:53,880 Speaker 1: shit you've ever heard of. So I want to introduce 35 00:01:53,920 --> 00:01:58,480 Speaker 1: our panel today.
Coming back after catching a horrible, horrible 36 00:01:58,560 --> 00:02:02,320 Speaker 1: lung infection, throat infection, some kind of infection. Yeah, Garrison 37 00:02:02,360 --> 00:02:05,120 Speaker 1: got strep throat and despite the fact that we've been 38 00:02:05,160 --> 00:02:07,800 Speaker 1: hanging out together, I did not, which does prove I'm 39 00:02:07,840 --> 00:02:12,680 Speaker 1: genetically superior. We also have Tavia Mora coming back our 40 00:02:12,760 --> 00:02:17,280 Speaker 1: technical expert. Hello Tavia, Howdy everybody, and for the first 41 00:02:17,320 --> 00:02:19,480 Speaker 1: time on well, no, not for the first time, for 42 00:02:19,560 --> 00:02:22,560 Speaker 1: the third time on It could happen here. The upcoming 43 00:02:22,600 --> 00:02:26,080 Speaker 1: host of the cool Zone media tech focused show Better 44 00:02:26,160 --> 00:02:28,360 Speaker 1: Offline ed zitron. 45 00:02:28,200 --> 00:02:34,160 Speaker 2: Ed Wi Guan Hello, Hey, yeah, sorry, Hi, Yeah, hit 46 00:02:34,200 --> 00:02:35,680 Speaker 2: my head on the way, and yeah. 47 00:02:35,560 --> 00:02:38,720 Speaker 3: It's a truly awful show this year. 48 00:02:39,120 --> 00:02:41,560 Speaker 2: The thing that I said to Robert yesterday when we 49 00:02:41,560 --> 00:02:43,120 Speaker 2: were talking about the show, and this really stood out 50 00:02:43,160 --> 00:02:44,960 Speaker 2: to me, is if you had told me this was 51 00:02:44,960 --> 00:02:47,520 Speaker 2: twenty twenty one, I'd have believed you. It doesn't feel 52 00:02:47,560 --> 00:02:49,720 Speaker 2: despite the use of the word AI, it does not 53 00:02:49,880 --> 00:02:52,519 Speaker 2: feel like tech has actually moved that far. 54 00:02:53,760 --> 00:02:55,040 Speaker 3: And it's very strange. 55 00:02:55,639 --> 00:02:57,920 Speaker 1: Yeah, there was this period of time after the iPhone 56 00:02:57,919 --> 00:02:59,720 Speaker 1: came out where every year there would be really big 57 00:02:59,840 --> 00:03:02,440 Speaker 1: lead in the tech you saw. And this part of 58 00:03:02,480 --> 00:03:05,079 Speaker 1: I think why they're leaning on AI so heavily is 59 00:03:05,120 --> 00:03:09,520 Speaker 1: otherwise it's just the same laptop, smartphones, speakers, connected gadgets, 60 00:03:09,560 --> 00:03:12,160 Speaker 1: you know, autonomous cars and shit that we've been seeing 61 00:03:12,240 --> 00:03:15,840 Speaker 1: for years and they really haven't jumped forward much. But 62 00:03:16,680 --> 00:03:20,200 Speaker 1: you know, the downside of that is a lot of things. 63 00:03:20,200 --> 00:03:22,840 Speaker 1: But the upside of that is people are increasingly cramming 64 00:03:22,880 --> 00:03:26,639 Speaker 1: AI into insane shit in the hopes that somebody will 65 00:03:26,639 --> 00:03:28,520 Speaker 1: want to buy it. And so I want to start 66 00:03:28,560 --> 00:03:32,280 Speaker 1: off ed since you're you are not just our newest 67 00:03:32,360 --> 00:03:35,600 Speaker 1: host but also a Las Vegas native. I think people 68 00:03:35,680 --> 00:03:39,920 Speaker 1: could probably assume that from your Vegas accents. Yes, natural, Yeah, 69 00:03:39,920 --> 00:03:43,040 Speaker 1: what is your favorite or the first AI product do 70 00:03:43,040 --> 00:03:44,200 Speaker 1: you want to get into today? 71 00:03:44,440 --> 00:03:46,560 Speaker 3: I want to talk about the rabbit. The rabbit on 72 00:03:46,560 --> 00:03:47,360 Speaker 3: one Oh God. 73 00:03:47,480 --> 00:03:47,720 Speaker 1: Yes. 
74 00:03:48,200 --> 00:03:50,960 Speaker 2: So this thing is a square box and I can't 75 00:03:51,000 --> 00:03:54,200 Speaker 2: tell if it acts without your phone or with your phone, 76 00:03:54,240 --> 00:03:56,840 Speaker 2: but it uses AI. You, you speak into it, like 77 00:03:56,840 --> 00:04:00,120 Speaker 2: a walkie-talkie, and it does a series of actions based on 78 00:04:00,280 --> 00:04:01,680 Speaker 2: what you say. So it can do all the things 79 00:04:01,720 --> 00:04:04,800 Speaker 2: that Siri could do five years ago, like change music 80 00:04:04,840 --> 00:04:07,040 Speaker 2: and start it. But it also has like a three hundred 81 00:04:07,040 --> 00:04:10,920 Speaker 2: and sixty degree camera, which can, based on the extremely 82 00:04:11,080 --> 00:04:14,880 Speaker 2: awkward and agonizingly long demo, twenty five minutes 83 00:04:14,920 --> 00:04:17,560 Speaker 2: but to me it felt like an hour, it can 84 00:04:17,600 --> 00:04:20,240 Speaker 2: look at a picture of Rick Astley and, 85 00:04:20,279 --> 00:04:23,840 Speaker 2: after several agonizing seconds, start playing Never Gonna Give You Up. 86 00:04:24,120 --> 00:04:27,279 Speaker 2: It can also, it claims, do a series of nuanced 87 00:04:27,279 --> 00:04:30,039 Speaker 2: actions, like you can say, get me a cab home 88 00:04:30,520 --> 00:04:33,800 Speaker 2: and also put on my tunes and also set the 89 00:04:34,440 --> 00:04:39,719 Speaker 2: air conditioning to seventy four degrees, all in one sentence. 90 00:04:40,200 --> 00:04:43,080 Speaker 2: Now you may think, why do I need to spend 91 00:04:43,120 --> 00:04:45,320 Speaker 2: two hundred dollars on a device to do this? And 92 00:04:45,400 --> 00:04:48,640 Speaker 2: the answer is you don't. You do not need to. 93 00:04:48,920 --> 00:04:51,520 Speaker 2: This thing looks cool and on some level, I'm just 94 00:04:51,560 --> 00:04:52,799 Speaker 2: glad we're getting new tat. 95 00:04:53,240 --> 00:04:55,720 Speaker 1: Yeah, the design is not bad. It's like a square. 96 00:04:55,800 --> 00:04:57,120 Speaker 1: It looks like it's maybe two to two and a 97 00:04:57,160 --> 00:04:59,080 Speaker 1: half inches by two and a half inches or so, 98 00:04:59,240 --> 00:05:02,520 Speaker 1: something like that. Yeah, a little screen. It's like well 99 00:05:02,560 --> 00:05:05,760 Speaker 1: designed from an industrial design standpoint, and I think the 100 00:05:05,800 --> 00:05:08,799 Speaker 1: big... Yeah, it looks like it's just that it's a... 101 00:05:08,880 --> 00:05:12,279 Speaker 1: It's basically a Siri that can use apps. It 102 00:05:12,320 --> 00:05:14,560 Speaker 1: can use Uber, it could book a flight for you. 103 00:05:14,560 --> 00:05:17,000 Speaker 1: One of the things they show is it like planning 104 00:05:17,120 --> 00:05:19,560 Speaker 1: a vacation in London for you, which does seem to 105 00:05:19,600 --> 00:05:21,919 Speaker 1: kind of go against the point of like going somewhere 106 00:05:21,960 --> 00:05:24,039 Speaker 1: new and like figuring out what you want to do there, 107 00:05:24,200 --> 00:05:26,440 Speaker 1: as opposed to it's basically pulling from a list, I'm 108 00:05:26,480 --> 00:05:28,760 Speaker 1: sure, that the AI wrote, of like top ten things 109 00:05:28,800 --> 00:05:29,600 Speaker 1: to do in London. 110 00:05:29,800 --> 00:05:34,080 Speaker 2: And it's just very weird because all of these tech guys, 111 00:05:34,160 --> 00:05:38,520 Speaker 2: who very loudly claim they're free spirits.
They're independent, 112 00:05:38,560 --> 00:05:41,280 Speaker 2: they're not controlled by any authority, they cannot be manipulated, 113 00:05:41,600 --> 00:05:44,240 Speaker 2: all desperately want a machine to tell them exactly what 114 00:05:44,279 --> 00:05:46,680 Speaker 2: the hell to do with their lives. And it's so 115 00:05:46,880 --> 00:05:51,240 Speaker 2: bizarre because they we were discussing the different articles about 116 00:05:51,240 --> 00:05:53,240 Speaker 2: this and people trying to argue other thing these three exists. 117 00:05:53,240 --> 00:05:55,800 Speaker 2: It's like, oh, it takes out the friction between all 118 00:05:55,800 --> 00:05:57,920 Speaker 2: of these apps. I'm sorry, I just don't think there's 119 00:05:57,960 --> 00:05:58,640 Speaker 2: that much friction. 120 00:05:58,800 --> 00:06:01,240 Speaker 1: Pull up my phone. I'm on a yes, right, I 121 00:06:01,279 --> 00:06:03,680 Speaker 1: pull out my phone, I pull up grub Hub, I 122 00:06:03,839 --> 00:06:07,000 Speaker 1: order food. It's very simple. It's remarkably easy. I don't 123 00:06:07,040 --> 00:06:11,240 Speaker 1: see how talking to a square is better, Like it's 124 00:06:11,240 --> 00:06:13,279 Speaker 1: the same, Like I could call someone on the phone 125 00:06:13,680 --> 00:06:15,720 Speaker 1: and do it hands free, or I could text them, 126 00:06:15,720 --> 00:06:18,240 Speaker 1: and I always text them because that's more pleasant. 127 00:06:18,320 --> 00:06:20,080 Speaker 4: I mean, like, I have my phone open to signal 128 00:06:20,160 --> 00:06:22,360 Speaker 4: right now, I can swipe up go to Uber in 129 00:06:22,480 --> 00:06:26,760 Speaker 4: less than a second, saying the words move from signal 130 00:06:26,880 --> 00:06:30,640 Speaker 4: to the Uber app. Takes a whole lot longer than 131 00:06:30,760 --> 00:06:31,960 Speaker 4: just doing it with my thumb. 132 00:06:32,320 --> 00:06:35,240 Speaker 1: I also do love the idea of like completely ruining 133 00:06:35,279 --> 00:06:38,360 Speaker 1: the point of signal, which is an encrypted, extremely secure 134 00:06:38,400 --> 00:06:41,839 Speaker 1: messaging app, to be like, hey, random box, I want 135 00:06:41,880 --> 00:06:44,520 Speaker 1: to feed my private messages through you and have you 136 00:06:44,640 --> 00:06:47,080 Speaker 1: read them out to me as I go about my day. 137 00:06:47,360 --> 00:06:49,800 Speaker 1: I don't know what your data retention policy is or 138 00:06:49,839 --> 00:06:51,760 Speaker 1: what you'll be doing with it. 139 00:06:51,839 --> 00:06:54,560 Speaker 3: They sold down and they made two million dollars. 140 00:06:54,400 --> 00:06:56,120 Speaker 1: Like ten million of them ten thousand. 141 00:06:56,160 --> 00:06:59,359 Speaker 2: Sorry, it's just and it's I've read. I read like 142 00:06:59,520 --> 00:07:02,880 Speaker 2: eleven articles about this thing, because I occasionally drive myself 143 00:07:02,960 --> 00:07:05,520 Speaker 2: insane with these things when I see everyone excited about something, 144 00:07:05,680 --> 00:07:07,480 Speaker 2: but I can't read a single article that tells me 145 00:07:07,520 --> 00:07:11,080 Speaker 2: why I should buy it, even though my rap brain says, oh, 146 00:07:11,120 --> 00:07:13,640 Speaker 2: take with screen I want, but then I want to 147 00:07:13,760 --> 00:07:14,080 Speaker 2: use it. 148 00:07:14,920 --> 00:07:16,520 Speaker 3: But I'll have to explain this to. 
149 00:07:16,520 --> 00:07:18,400 Speaker 2: The normal people in my life why I have this, 150 00:07:18,480 --> 00:07:20,440 Speaker 2: and I don't want to do that if it's useless. 151 00:07:20,480 --> 00:07:23,120 Speaker 2: But on top of that, I just don't think controlling 152 00:07:23,160 --> 00:07:24,760 Speaker 2: my life with voice is that useful. 153 00:07:24,800 --> 00:07:25,800 Speaker 3: Yeah, I don't like that. 154 00:07:26,320 --> 00:07:28,240 Speaker 1: I'm already, and I think a lot of people are 155 00:07:28,280 --> 00:07:31,040 Speaker 1: already kind of fed up with the extent to which 156 00:07:31,120 --> 00:07:33,520 Speaker 1: my smartphone is a part of my life. Yeah, but 157 00:07:33,680 --> 00:07:36,520 Speaker 1: like it does irreplaceable tasks at the moment for me. 158 00:07:36,640 --> 00:07:39,440 Speaker 1: So I have it... this thing is, number one, adding 159 00:07:39,440 --> 00:07:42,040 Speaker 1: a device, because I think it does require your phone. 160 00:07:42,440 --> 00:07:45,480 Speaker 1: But it's also, like you know, in addition to the 161 00:07:45,520 --> 00:07:48,600 Speaker 1: current problems I have with privacy on my smartphone, I 162 00:07:48,640 --> 00:07:52,160 Speaker 1: am adding another company and another device and another set 163 00:07:52,200 --> 00:07:55,280 Speaker 1: of security... potential security flaws to it. 164 00:07:55,480 --> 00:07:57,920 Speaker 2: But on top of that, the thing they have failed 165 00:07:57,920 --> 00:08:01,520 Speaker 2: to explain anywhere, no journalist apparently has interrogated them about 166 00:08:01,520 --> 00:08:04,920 Speaker 2: this, is they claim this thing can log onto your 167 00:08:05,040 --> 00:08:08,600 Speaker 2: Uber and make a flight booking, ostensibly having your passport information, 168 00:08:08,640 --> 00:08:11,680 Speaker 2: your date of birth, and all this stuff. First and foremost, that's, 169 00:08:11,800 --> 00:08:14,040 Speaker 2: like you mentioned, the data retention policy is very strange. 170 00:08:14,040 --> 00:08:16,200 Speaker 2: But where is this crap all happening? Is it happening 171 00:08:16,240 --> 00:08:17,920 Speaker 2: on my phone? Is my phone just doing all this? 172 00:08:18,080 --> 00:08:20,200 Speaker 2: I refuse to believe that. So you're doing this in 173 00:08:20,200 --> 00:08:23,120 Speaker 2: some kind of virtual machine environment? How is that possible? 174 00:08:23,200 --> 00:08:26,960 Speaker 2: Surely these companies are going to have a problem with that. 175 00:08:27,200 --> 00:08:30,040 Speaker 2: Mark Sullivan for Fast Company actually, I think, asked them 176 00:08:30,080 --> 00:08:32,040 Speaker 2: this, and they were like, oh, yeah, they'll be fine 177 00:08:32,080 --> 00:08:34,320 Speaker 2: with it. They just want people using their apps. I 178 00:08:34,360 --> 00:08:36,000 Speaker 2: do not think they're going to be fine with this. 179 00:08:36,120 --> 00:08:39,280 Speaker 2: Companies hate it when they hand off power from the user, 180 00:08:39,679 --> 00:08:41,840 Speaker 2: who will still be liable, to another computer. 181 00:08:42,320 --> 00:08:44,960 Speaker 1: Yeah. Well, the other thing is just that, like part 182 00:08:44,960 --> 00:08:47,680 Speaker 1: of me kind of suspects, and when you watch the video, 183 00:08:47,679 --> 00:08:49,080 Speaker 1: we'll play a clip from it in a second.
The 184 00:08:49,120 --> 00:08:53,040 Speaker 1: CEO of Rabbit very clearly, like a lot of guys 185 00:08:53,080 --> 00:08:55,800 Speaker 1: in tech, wants to be Steve Jobs. And I will 186 00:08:55,840 --> 00:08:58,720 Speaker 1: say one thing I kind of suspect, that might actually 187 00:08:58,760 --> 00:09:01,120 Speaker 1: be... that would be a Steve Jobs move, is he 188 00:09:01,200 --> 00:09:03,760 Speaker 1: may have just been hoping that this thing coming out, 189 00:09:03,920 --> 00:09:06,640 Speaker 1: selling a shitload on pre-order and getting huge buzz, 190 00:09:06,840 --> 00:09:10,000 Speaker 1: would force these companies after the fact to allow integration. 191 00:09:10,400 --> 00:09:12,520 Speaker 1: Like he may just be gambling, like if I get 192 00:09:12,600 --> 00:09:15,080 Speaker 1: enough buzz behind me, Uber and whatnot will come to 193 00:09:15,120 --> 00:09:17,120 Speaker 1: the table and be willing to work with me, because 194 00:09:17,320 --> 00:09:20,200 Speaker 1: suddenly this is like the hippest new gadget. 195 00:09:19,880 --> 00:09:23,719 Speaker 2: Except ten thousand customers is actually not that many. And 196 00:09:24,000 --> 00:09:27,200 Speaker 2: I actually look forward to, I really can't wait for 197 00:09:27,360 --> 00:09:30,120 Speaker 2: two months to pass, people to get this, and someone 198 00:09:30,160 --> 00:09:32,080 Speaker 2: to end up like sending the word penis to their 199 00:09:32,200 --> 00:09:35,040 Speaker 2: company Slack because they wanted to order pizza, and 200 00:09:35,080 --> 00:09:37,679 Speaker 2: on top of that, ordering a flight, ordering an Uber. 201 00:09:37,720 --> 00:09:41,400 Speaker 2: These are actually really nuanced actions. Coming to Mandalay Bay tonight, 202 00:09:42,080 --> 00:09:44,880 Speaker 2: Uber took me to the wrong place because it decided 203 00:09:44,880 --> 00:09:46,719 Speaker 2: it wanted to go to the convention center. I did 204 00:09:46,760 --> 00:09:48,600 Speaker 2: not select that. If you go to the airport, you 205 00:09:48,600 --> 00:09:50,400 Speaker 2: need to put in Southwest Airlines and what have you. 206 00:09:51,080 --> 00:09:53,280 Speaker 2: With Grubhub, you need to do little bits. It's 207 00:09:53,400 --> 00:09:56,959 Speaker 2: just most people don't order lunch. They order something for lunch, 208 00:09:57,040 --> 00:10:00,160 Speaker 2: and I just don't. Ah, this whole thing just feels useless. 209 00:10:00,600 --> 00:10:04,199 Speaker 5: Yeah. See, for me, it's the additional level of abstraction 210 00:10:04,480 --> 00:10:07,160 Speaker 5: on top of these already abstracted apps that we use 211 00:10:07,240 --> 00:10:10,800 Speaker 5: to order our basic necessities like eating and things like that. 212 00:10:11,200 --> 00:10:14,720 Speaker 5: It worries me in sort of like a fantasy dystopic way. 213 00:10:14,840 --> 00:10:19,000 Speaker 5: What happens when people suddenly don't use that after getting 214 00:10:19,040 --> 00:10:20,840 Speaker 5: used to using it? Like, what are they going to know? 215 00:10:20,920 --> 00:10:22,280 Speaker 5: Are they going to know how to operate a DoorDash 216 00:10:22,360 --> 00:10:23,760 Speaker 5: app? Are they going to know how to book 217 00:10:23,800 --> 00:10:25,160 Speaker 5: a flight? That kind of thing.
218 00:10:25,840 --> 00:10:28,120 Speaker 1: Yeah, it is kind of because one of the things 219 00:10:28,120 --> 00:10:30,199 Speaker 1: there was a c neet review that said, like, well, 220 00:10:30,240 --> 00:10:33,880 Speaker 1: the potential of this is that it completely removes physical 221 00:10:34,000 --> 00:10:36,040 Speaker 1: use of a device. So you're using these apps, but 222 00:10:36,080 --> 00:10:38,160 Speaker 1: they're just a part of your life. Uber's just a 223 00:10:38,200 --> 00:10:40,240 Speaker 1: thing you talk to. You never look at anything when 224 00:10:40,280 --> 00:10:43,160 Speaker 1: you do it, And like, is that better? Like I 225 00:10:43,200 --> 00:10:45,840 Speaker 1: don't like the idea that you basically have a robot 226 00:10:46,160 --> 00:10:48,640 Speaker 1: that you treat as like your nanny that plans your 227 00:10:48,640 --> 00:10:52,320 Speaker 1: life for you. Like the amount of hype over there 228 00:10:52,360 --> 00:10:55,240 Speaker 1: will be a more concerted piece about this coming out. 229 00:10:55,320 --> 00:10:57,040 Speaker 1: But the first thing I thought when I looked at 230 00:10:57,040 --> 00:10:58,640 Speaker 1: all these guys talking about how cool it was to 231 00:10:58,640 --> 00:11:00,440 Speaker 1: be able to just tell a robot to book your 232 00:11:00,480 --> 00:11:03,120 Speaker 1: flight and plan your travel and book your hotels for you. 233 00:11:03,640 --> 00:11:07,120 Speaker 1: That's like part of the experience of traveling, and like 234 00:11:07,320 --> 00:11:09,920 Speaker 1: choosing things to do is like one of the things 235 00:11:10,000 --> 00:11:13,600 Speaker 1: that traveling is, and the desire so many people have 236 00:11:13,679 --> 00:11:17,200 Speaker 1: to hand off elements of choice really reminds me of 237 00:11:17,240 --> 00:11:19,200 Speaker 1: like cult dynamics, And I don't think this is a 238 00:11:19,240 --> 00:11:22,679 Speaker 1: consumer thing. I think this is specifically a weird subculture 239 00:11:23,120 --> 00:11:26,040 Speaker 1: of tech people of AI people, a lot of the 240 00:11:26,120 --> 00:11:28,480 Speaker 1: same folks who got into NFTs. But this desire, like 241 00:11:28,920 --> 00:11:31,360 Speaker 1: life is so complex and scary, I want to hand 242 00:11:31,400 --> 00:11:33,520 Speaker 1: over all of my agency to a robot. It's the 243 00:11:33,520 --> 00:11:36,920 Speaker 1: same thing that is behind a lot of like why 244 00:11:37,000 --> 00:11:40,000 Speaker 1: people join cults. And I don't think this is a 245 00:11:40,040 --> 00:11:42,360 Speaker 1: pro societal problem, but I think it is a weird 246 00:11:42,400 --> 00:11:44,520 Speaker 1: problem with the group of people who are most excited 247 00:11:44,559 --> 00:11:45,520 Speaker 1: to have a fucking rabbit. 248 00:11:46,400 --> 00:11:48,560 Speaker 5: It seems like a sad thing to me that folks 249 00:11:48,679 --> 00:11:51,400 Speaker 5: might only attend bars or restaurants that are rated like 250 00:11:51,480 --> 00:11:54,680 Speaker 5: four point five and above that's decided by something else. Yeah, 251 00:11:54,679 --> 00:11:56,640 Speaker 5: and they don't get to have this like experience of 252 00:11:56,679 --> 00:11:59,080 Speaker 5: walking into like the sedious bar you've ever seen in 253 00:11:59,120 --> 00:12:02,240 Speaker 5: your life and have like maybe possibly like a life 254 00:12:02,320 --> 00:12:03,120 Speaker 5: changing experience. 
255 00:12:03,280 --> 00:12:07,600 Speaker 2: I was just in South Korea and we went to 256 00:12:08,400 --> 00:12:11,240 Speaker 2: this fried chicken place that ended up being closed, actually 257 00:12:11,320 --> 00:12:13,280 Speaker 2: it was like, it was open, but nobody was there, which made 258 00:12:13,320 --> 00:12:15,480 Speaker 2: me just want to leave before getting killed. And so 259 00:12:15,720 --> 00:12:19,800 Speaker 2: I just went to a random chicken place across the 260 00:12:19,880 --> 00:12:22,960 Speaker 2: road from my hotel, and I thought, well, it'll feed me. 261 00:12:23,160 --> 00:12:25,800 Speaker 2: It was wonderful, it was delightful. 262 00:12:25,400 --> 00:12:27,760 Speaker 3: And it was, I could not find any reviews for it. 263 00:12:27,760 --> 00:12:31,760 Speaker 2: It was just a flipping place, and I don't, I 264 00:12:31,800 --> 00:12:34,920 Speaker 2: think these people who were desperate for a device like this, 265 00:12:34,920 --> 00:12:37,839 Speaker 2: this kind of weird nanny device. First of all, I 266 00:12:37,840 --> 00:12:39,560 Speaker 2: don't think they think about the practicalities of this. I 267 00:12:39,559 --> 00:12:41,960 Speaker 2: don't think this is quicker or easier or better. But 268 00:12:42,000 --> 00:12:44,000 Speaker 2: also they're like, oh, I wish I could just say 269 00:12:44,000 --> 00:12:46,360 Speaker 2: one thing and all of these things could happen for me. 270 00:12:46,800 --> 00:12:48,480 Speaker 2: Same people, by the way, who were saying that people 271 00:12:48,559 --> 00:12:50,400 Speaker 2: need to pull themselves up by their bootstraps and do 272 00:12:50,480 --> 00:12:53,600 Speaker 2: things for themselves. It's just, I don't know if they'd 273 00:12:53,640 --> 00:12:56,480 Speaker 2: even call it dystopian. It's just weird and sad 274 00:12:56,600 --> 00:12:56,840 Speaker 2: to me. 275 00:12:57,520 --> 00:12:59,440 Speaker 1: Speaking of weird and sad, we're going to move on 276 00:12:59,480 --> 00:13:02,040 Speaker 1: to the next product in a second, but first I 277 00:13:02,040 --> 00:13:05,120 Speaker 1: gotta play everybody, in case you haven't seen it or 278 00:13:05,160 --> 00:13:09,760 Speaker 1: heard it, the CEO of Rabbit trying to rickroll the 279 00:13:09,800 --> 00:13:15,480 Speaker 1: audience with his hell device. Have you seen this, Garrison? Oh? Okay. 280 00:13:15,600 --> 00:13:21,200 Speaker 1: Eyes on the screen, everybody. To activate the eye, just 281 00:13:21,240 --> 00:13:28,760 Speaker 1: double tap the button. Oh, funny seeing you here, Rick. 282 00:13:33,120 --> 00:13:43,640 Speaker 6: Let me take a look. You're never gonna give you up. 283 00:13:45,480 --> 00:13:45,920 Speaker 6: Enjoy it? 284 00:13:49,559 --> 00:13:51,240 Speaker 3: What, am I getting rickrolled? 285 00:13:51,280 --> 00:13:51,640 Speaker 2: In my own? 286 00:13:51,720 --> 00:13:53,679 Speaker 3: Keynote. Let's move on to the next one. 287 00:13:54,080 --> 00:13:57,400 Speaker 2: All right, I have a question real quick. So what 288 00:13:57,559 --> 00:14:00,320 Speaker 2: is the functionality he just activated? Is it that you 289 00:14:00,400 --> 00:14:03,480 Speaker 2: just... you point the eye at something and it 290 00:14:03,559 --> 00:14:04,679 Speaker 2: chooses an action? Why did the
291 00:14:04,640 --> 00:14:07,960 Speaker 1: AI automatically see Rick Astley and choose to play one 292 00:14:08,000 --> 00:14:10,599 Speaker 1: specific song of his? Because that actually doesn't seem like 293 00:14:10,640 --> 00:14:12,440 Speaker 1: a feature. That seems like a bug. 294 00:14:12,600 --> 00:14:17,000 Speaker 2: Yeah, that seems like what happens if it sees certain people. 295 00:14:17,080 --> 00:14:19,880 Speaker 1: Yeah, Jeffrey Epstein. Yeah, what happens if it sees Jeffrey? Yeah, 296 00:14:19,880 --> 00:14:23,120 Speaker 1: plays children screaming. Like, what is... how does this thing work? 297 00:14:23,280 --> 00:14:26,600 Speaker 2: Booking trips to Florida. 298 00:14:28,080 --> 00:14:32,320 Speaker 1: I... maybe it's respectable that they showed how bad the 299 00:14:32,400 --> 00:14:35,440 Speaker 1: lag is, because that moment where there's quiet after he 300 00:14:35,600 --> 00:14:38,520 Speaker 1: like clicks on it, it is like it's loading, it's processing 301 00:14:38,600 --> 00:14:41,400 Speaker 1: for a considerable period of time, and... 302 00:14:41,280 --> 00:14:44,240 Speaker 2: It's just... Also, I feel for the bloke because I 303 00:14:44,240 --> 00:14:46,400 Speaker 2: know he was probably so excited to do this and 304 00:14:46,440 --> 00:14:48,720 Speaker 2: he's like, I'm going to be Steve Jobs. But man, 305 00:14:49,440 --> 00:14:51,880 Speaker 2: when you can't perform, you don't perform, like. 306 00:14:52,240 --> 00:14:53,760 Speaker 1: Yeah, that's bad delivery. 307 00:14:54,320 --> 00:14:57,640 Speaker 2: The "did I just get rickrolled in my own video?" 308 00:14:57,960 --> 00:15:00,520 Speaker 2: It was like that, I forget what the movie is... "Oh 309 00:15:00,560 --> 00:15:01,080 Speaker 2: hi, Mark." 310 00:15:01,360 --> 00:15:04,920 Speaker 1: Yeah, it is, and obviously, like, English is his second language, 311 00:15:04,920 --> 00:15:09,120 Speaker 1: but like, it's a performance. You, like, you practice, right, 312 00:15:09,200 --> 00:15:12,640 Speaker 1: you get coached and stuff because you're trying to represent 313 00:15:12,680 --> 00:15:13,160 Speaker 1: your company. 314 00:15:13,200 --> 00:15:15,680 Speaker 2: Oh, I tell you this from experience, as I've run a 315 00:15:15,720 --> 00:15:19,200 Speaker 2: lot of these. Yeah, that guy actually did practice, because all 316 00:15:19,240 --> 00:15:23,080 Speaker 2: of that was... His actual timing wasn't bad. He just 317 00:15:23,160 --> 00:15:24,720 Speaker 2: does not have that dog in him. 318 00:15:24,960 --> 00:15:29,920 Speaker 1: Yeah. Yeah, you bring in other people to do it like that. Anyway, everybody, anyone, 319 00:15:30,160 --> 00:15:35,040 Speaker 1: anyone's mind on the Rabbit changed having seen that? Absolutely 320 00:15:35,080 --> 00:15:37,680 Speaker 1: not. Garrison has a look on their face. 321 00:15:37,960 --> 00:15:42,080 Speaker 4: No, it's just like, what I've always wanted in a 322 00:15:42,160 --> 00:15:44,280 Speaker 4: tech gadget is to be able to point a three 323 00:15:44,280 --> 00:15:47,840 Speaker 4: hundred and sixty degree camera at a picture of a musician 324 00:15:47,840 --> 00:15:50,320 Speaker 4: and then wait thirty seconds and then have an AI 325 00:15:50,480 --> 00:15:52,800 Speaker 4: pick a random song of theirs. That's always what I 326 00:15:52,840 --> 00:15:53,720 Speaker 4: wanted for the future. 327 00:15:53,960 --> 00:15:58,280 Speaker 1: Yeah, yeah, that's, that's the dream fucking Archimedes had. 328 00:15:58,360 --> 00:15:58,760 Speaker 4: That's right.
329 00:15:58,760 --> 00:16:01,640 Speaker 1: It was when he was building his laser that we 330 00:16:01,680 --> 00:16:05,000 Speaker 1: all saw in the most recent Indiana Jones film. Speaking 331 00:16:05,000 --> 00:16:08,800 Speaker 1: of the most recent Indiana Jones film, this podcast is 332 00:16:08,960 --> 00:16:12,520 Speaker 1: entirely sponsored by that movie. So here's some other ads. 333 00:16:22,680 --> 00:16:25,200 Speaker 4: Why are we giving free advertising to Disney? 334 00:16:25,600 --> 00:16:26,200 Speaker 3: Why are we? 335 00:16:27,800 --> 00:16:31,680 Speaker 1: Why? Because that movie was so close to being worth it. 336 00:16:31,920 --> 00:16:38,760 Speaker 1: That last twenty minutes, though... no, Nazis machine gunning Roman legionnaires, 337 00:16:39,560 --> 00:16:40,360 Speaker 1: pretty funny. 338 00:16:40,640 --> 00:16:42,560 Speaker 4: Well, do you know who would have loved this? 339 00:16:43,440 --> 00:16:45,960 Speaker 4: Archimedes? Probably? Yes, he probably would have, would have 340 00:16:45,960 --> 00:16:53,160 Speaker 4: had a great time. What... what's next, Garrison? What product do 341 00:16:53,200 --> 00:16:54,000 Speaker 4: we want to talk about? 342 00:16:55,040 --> 00:16:57,560 Speaker 1: How about the pet one? Garrison? You saw that? 343 00:16:57,680 --> 00:16:59,360 Speaker 4: All right? So I think, I think, I think me 344 00:16:59,400 --> 00:17:01,960 Speaker 4: and Ed saw ChatGPT for animals. 345 00:17:02,360 --> 00:17:03,520 Speaker 1: Oh yeah, damn it. 346 00:17:03,680 --> 00:17:08,400 Speaker 4: Which is not really what it is. It's like 347 00:17:08,520 --> 00:17:11,720 Speaker 4: it scans a picture of your dog and then tries 348 00:17:11,760 --> 00:17:13,639 Speaker 4: to tell you if it has any health problems 349 00:17:13,680 --> 00:17:17,280 Speaker 4: based on that picture. It's, it's... You're not, you're 350 00:17:17,280 --> 00:17:19,080 Speaker 4: not actually talking to your dog or anything. It just, 351 00:17:19,119 --> 00:17:22,560 Speaker 4: it takes pictures of animals and then it analyzes it 352 00:17:22,680 --> 00:17:25,000 Speaker 4: to tell you how the dog is feeling. Blah blah 353 00:17:25,000 --> 00:17:27,560 Speaker 4: blah blah blah. It's, it's... I saw a product like 354 00:17:27,600 --> 00:17:29,840 Speaker 4: this earlier at CES. I saw a product, I saw 355 00:17:29,840 --> 00:17:32,960 Speaker 4: a product like this last year. They're just calling it 356 00:17:33,080 --> 00:17:35,920 Speaker 4: ChatGPT because it's an AI name. It's, it's, it's, 357 00:17:35,920 --> 00:17:36,560 Speaker 4: it's, it's... 358 00:17:36,400 --> 00:17:39,040 Speaker 1: It's hip, like, because people, people, they're hoping that that 359 00:17:39,080 --> 00:17:40,639 Speaker 1: will make people spend money. 360 00:17:41,160 --> 00:17:43,679 Speaker 2: Every CES I see something that begins to 361 00:17:43,680 --> 00:17:50,080 Speaker 2: make me dissociate, and I walk... I walked past there 362 00:17:50,119 --> 00:17:53,320 Speaker 2: and, lo and behold, the ChatGPT for... and my brain was 363 00:17:53,359 --> 00:17:55,960 Speaker 2: just like, it just, like, started, like, glitching out. 364 00:17:56,200 --> 00:17:57,560 Speaker 2: And then when I went to look it up, as 365 00:17:57,600 --> 00:18:00,720 Speaker 2: Garrison did, I was so disappointed, because I hoped that 366 00:18:00,800 --> 00:18:02,960 Speaker 2: these would just be crackpots who are like, yep, you put the 367 00:18:03,040 --> 00:18:05,600 Speaker 2: microphone to your dog,
now you know what your dog's saying. 368 00:18:06,080 --> 00:18:08,720 Speaker 2: That I would respect even if it didn't work, just 369 00:18:08,760 --> 00:18:10,520 Speaker 2: if you're like, yeah, fuck it. Yeah, your cat said 370 00:18:10,560 --> 00:18:13,480 Speaker 2: he hates it here. Your cat's been radicalized, I'm afraid. 371 00:18:14,600 --> 00:18:17,040 Speaker 1: See, there's a fun product in here, which is you 372 00:18:17,080 --> 00:18:19,840 Speaker 1: sell to rubes a product that, you're like, it 373 00:18:19,920 --> 00:18:24,240 Speaker 1: translates your dog's micro expressions into language. And then the 374 00:18:24,400 --> 00:18:28,119 Speaker 1: actual paying customers are sickos like us, and you just 375 00:18:28,160 --> 00:18:30,520 Speaker 1: take control of somebody's pet's voice. 376 00:18:30,600 --> 00:18:31,480 Speaker 4: That would be so cool. 377 00:18:31,520 --> 00:18:33,960 Speaker 1: You can have their... like, yeah, your cat's racist, now 378 00:18:34,000 --> 00:18:35,760 Speaker 1: your dog's a Nazi, like. 379 00:18:36,320 --> 00:18:39,080 Speaker 4: This is, this is the perfect product for... H.P. Lovecraft 380 00:18:39,200 --> 00:18:39,960 Speaker 4: would have loved this. 381 00:18:43,880 --> 00:18:46,000 Speaker 2: No, if you gave me like the gaslight thing, 382 00:18:46,160 --> 00:18:48,919 Speaker 2: but for dogs, on my phone, I would spend whatever 383 00:18:48,960 --> 00:18:50,960 Speaker 2: you want. A thousand dollars, I will, yeah. 384 00:18:51,440 --> 00:18:54,520 Speaker 1: I wouldn't pay like average West Coast rent prices to 385 00:18:54,600 --> 00:18:57,720 Speaker 1: be able to like gaslight some family into thinking their 386 00:18:57,760 --> 00:18:58,719 Speaker 1: dog is a terrorist. 387 00:18:58,920 --> 00:19:01,639 Speaker 2: See, a friend of mine... Oh, what's, what's wrong, Ed? 388 00:19:03,160 --> 00:19:06,639 Speaker 2: ChatGPT said... it said that my dog's joined 389 00:19:06,680 --> 00:19:11,000 Speaker 2: ISIS, and I don't know, I don't know how he 390 00:19:11,040 --> 00:19:13,520 Speaker 2: did it, but he's been... he's talking about a caliphate, 391 00:19:13,520 --> 00:19:15,800 Speaker 2: according to the app. I don't know... this app 392 00:19:15,840 --> 00:19:18,040 Speaker 2: is bankrupting me. I paid four and a half thousand 393 00:19:18,080 --> 00:19:19,320 Speaker 2: dollars for this app a month. 394 00:19:19,359 --> 00:19:20,560 Speaker 3: I don't know why I need it. 395 00:19:21,240 --> 00:19:25,119 Speaker 4: So, because... so I unfortunately had to miss yesterday. So 396 00:19:25,160 --> 00:19:28,439 Speaker 4: there was probably an endless number of tech innovations that 397 00:19:28,480 --> 00:19:31,000 Speaker 4: I was unable to see because I had to miss 398 00:19:31,000 --> 00:19:33,159 Speaker 4: one day. But with the help of penicillin I 399 00:19:33,200 --> 00:19:35,879 Speaker 4: was able to return today to do one final... 400 00:19:35,720 --> 00:19:37,359 Speaker 1: ChatGPT of antibiotics. 401 00:19:37,400 --> 00:19:40,520 Speaker 4: That's, that is exactly what my doctor said, actually. But 402 00:19:40,840 --> 00:19:43,240 Speaker 4: I did, I did swear revenge on CES.
So I 403 00:19:43,280 --> 00:19:48,000 Speaker 4: just walked around mostly, mostly the Venetian, just seeing 404 00:19:48,080 --> 00:19:50,280 Speaker 4: all of the worst things I could find and documenting 405 00:19:50,359 --> 00:19:53,679 Speaker 4: them so I could get revenge on that twink poisoning 406 00:19:53,760 --> 00:19:57,240 Speaker 4: me with strep throat. So the first really good thing 407 00:19:57,480 --> 00:20:00,840 Speaker 4: is this. I mostly walked around the award winning sections 408 00:20:00,880 --> 00:20:04,320 Speaker 4: because that's where you find only the best. There was 409 00:20:04,359 --> 00:20:08,199 Speaker 4: an award winning speaker called Audio Cu, and all of 410 00:20:08,200 --> 00:20:13,240 Speaker 4: their marketing was built from this horrible, horrible, uh, AI 411 00:20:13,359 --> 00:20:17,679 Speaker 4: image generation of this like extremely busty blonde woman in 412 00:20:17,720 --> 00:20:21,920 Speaker 4: a latex suit, but if you zoom in on her fingernails, 413 00:20:21,640 --> 00:20:24,359 Speaker 4: her, her fingernails are like sticking through the wrong side 414 00:20:24,359 --> 00:20:25,840 Speaker 4: of her fingers. 415 00:20:26,200 --> 00:20:28,440 Speaker 3: There, my god, oh my god. 416 00:20:29,800 --> 00:20:35,080 Speaker 2: It's the woman from that one movie... oh damn it, 417 00:20:35,240 --> 00:20:37,520 Speaker 2: not Skin... the other one, it was the woman 418 00:20:37,560 --> 00:20:39,480 Speaker 2: where the alien was sexy and then she killed people 419 00:20:39,480 --> 00:20:43,000 Speaker 2: when she had sex with them. It's the same thing. Yes, terrifying. Yes, 420 00:20:43,480 --> 00:20:45,560 Speaker 2: rita's cool in and say what that is? 421 00:20:46,359 --> 00:20:49,080 Speaker 4: Yeah, it looks just like that. It says, relax, stick 422 00:20:49,119 --> 00:20:51,439 Speaker 4: it in, which is pretty funny. So that, that was 423 00:20:51,480 --> 00:20:52,080 Speaker 4: pretty bad. 424 00:20:54,680 --> 00:20:57,840 Speaker 1: Now I respect that. I respect... That's, that's a baller 425 00:20:57,840 --> 00:20:58,520 Speaker 1: move right there. 426 00:21:00,080 --> 00:21:02,879 Speaker 4: Again, this is, this is for a speaker company, like... 427 00:21:02,960 --> 00:21:06,399 Speaker 5: DJ Girlfriend? Is it a speaker in the 428 00:21:06,400 --> 00:21:07,200 Speaker 5: shape of a girlfriend? 429 00:21:07,280 --> 00:21:08,800 Speaker 4: No, it's just home theater speakers. 430 00:21:09,240 --> 00:21:13,000 Speaker 1: They just have a horrible AI generated woman as their spokesperson. 431 00:21:13,400 --> 00:21:15,680 Speaker 5: I mean, I would buy it if it was DJ Girlfriend, though. 432 00:21:15,920 --> 00:21:18,199 Speaker 1: DJ Girlfriend is a great idea for a product and 433 00:21:18,359 --> 00:21:20,000 Speaker 1: might stop several shootings. 434 00:21:20,200 --> 00:21:21,800 Speaker 3: AI has brought back sexism. 435 00:21:22,840 --> 00:21:26,239 Speaker 1: If you do DJ Girlfriend right, you could stop at 436 00:21:26,320 --> 00:21:27,520 Speaker 1: least one mass shooting. 437 00:21:27,880 --> 00:21:30,280 Speaker 5: Finally, we have a real solution now. 438 00:21:30,800 --> 00:21:33,400 Speaker 4: Another product that won the CES twenty twenty four Innovation 439 00:21:33,440 --> 00:21:39,199 Speaker 4: Awards is an AI powered coffee brewer and grinder system.
440 00:21:39,040 --> 00:21:42,560 Speaker 4: I'm just gonna read the description from... Coffee's been missing, 441 00:21:42,760 --> 00:21:46,080 Speaker 4: that's right. I know, we wake up every morning, make 442 00:21:46,119 --> 00:21:48,200 Speaker 4: our little French press coffee. That's fine. But you know 443 00:21:48,240 --> 00:21:50,600 Speaker 4: what could be better? An AI system that does it 444 00:21:50,640 --> 00:21:54,560 Speaker 4: for you. I'm going to read the award, the award 445 00:21:54,800 --> 00:21:59,320 Speaker 4: description for this product. Okay, introducing Barista Brew Coffee Brewer 446 00:21:59,359 --> 00:22:03,240 Speaker 4: and Grinder System, a smart coffee system that tailors your 447 00:22:03,320 --> 00:22:07,919 Speaker 4: brew to perfection with AI guided personalization. Easily adjust brewing 448 00:22:07,960 --> 00:22:11,560 Speaker 4: parameters for a custom cup. No expertise needed. Rate to 449 00:22:11,680 --> 00:22:15,840 Speaker 4: track and refine your brews. Brew IQ AI suggestions for 450 00:22:15,880 --> 00:22:20,280 Speaker 4: your ideal taste. Simplify with one touch favorites. Elevate your 451 00:22:20,320 --> 00:22:21,359 Speaker 4: coffee experience. 452 00:22:22,320 --> 00:22:23,879 Speaker 2: Yeah, well, I hear all that. The one thing I 453 00:22:23,960 --> 00:22:27,719 Speaker 2: think is, simplify? That's, that's simple. The movie's Species, by 454 00:22:27,720 --> 00:22:29,440 Speaker 2: the way. I... 455 00:22:29,440 --> 00:22:31,960 Speaker 4: love that movie. One of, one of, one of the 456 00:22:32,000 --> 00:22:34,560 Speaker 4: best H.R. Giger art utilizations. 457 00:22:34,720 --> 00:22:37,800 Speaker 1: Yeah yeah, and easily the horniest movie of the nineteen nineties. 458 00:22:37,960 --> 00:22:38,879 Speaker 1: Like, this... which is a lot of... 459 00:22:39,040 --> 00:22:44,600 Speaker 4: Which is, which is a high, high bar. So on, on... 460 00:22:44,600 --> 00:22:46,960 Speaker 4: On this AI coffee maker, on the front there's a 461 00:22:47,040 --> 00:22:49,920 Speaker 4: little control panel with nine different settings that you can, 462 00:22:50,080 --> 00:22:52,679 Speaker 4: you can change, because they're all in a graph. We 463 00:22:52,720 --> 00:22:59,240 Speaker 4: have, we have citrus, spice, nutty, fruity, balanced, cocoa, floral, herbal, 464 00:22:59,480 --> 00:23:02,960 Speaker 4: and honey. So you can, you can, with 465 00:23:03,000 --> 00:23:05,560 Speaker 4: the, with the ease of a touchpad, start to customize 466 00:23:05,600 --> 00:23:08,480 Speaker 4: your own AI coffee. So that, that is revolutionary. I'm 467 00:23:08,480 --> 00:23:10,280 Speaker 4: going to be getting one for Robert this Christmas. 468 00:23:10,280 --> 00:23:12,760 Speaker 1: Thank you, Garrison. I know, I've always thought, you know, 469 00:23:12,800 --> 00:23:17,399 Speaker 1: what I hate is the experience of exploring new flavors 470 00:23:17,400 --> 00:23:21,480 Speaker 1: on my own and learning new ways of brewing coffee, 471 00:23:21,480 --> 00:23:23,960 Speaker 1: a beverage I consume every day. So I'm glad to 472 00:23:24,000 --> 00:23:26,480 Speaker 1: be handing that whole experience off to a machine. 473 00:23:26,560 --> 00:23:29,160 Speaker 4: That's right. And I know a lot of people used... 474 00:23:29,760 --> 00:23:32,560 Speaker 1: Tavia just brought something up that I think is relevant here.
475 00:23:33,560 --> 00:23:36,240 Speaker 5: It's a Guardian article about an AI smoothie shop that 476 00:23:36,280 --> 00:23:39,800 Speaker 5: opened in San Francisco well before CES. That is 477 00:23:40,040 --> 00:23:42,960 Speaker 5: a combination of, uh, it's being driven forward with AI 478 00:23:43,080 --> 00:23:46,280 Speaker 5: technology as well as five G stuff, that I think 479 00:23:46,359 --> 00:23:48,640 Speaker 5: had opened up and then like three weeks later had 480 00:23:48,640 --> 00:23:49,200 Speaker 5: shut down. 481 00:23:49,520 --> 00:23:52,040 Speaker 1: Oh, that's too... They were like, a robot will pick 482 00:23:52,080 --> 00:23:53,560 Speaker 1: the perfect smoothie for you. 483 00:23:53,960 --> 00:23:56,440 Speaker 2: Well, I actually want to bring... I want to 484 00:23:56,440 --> 00:24:00,360 Speaker 2: bring something up. So I love smoking meat. I've got pellet smokers 485 00:24:00,359 --> 00:24:02,720 Speaker 2: at home. And I saw a few times on this 486 00:24:02,800 --> 00:24:05,960 Speaker 2: show AI grills, and I just looked up one called 487 00:24:06,520 --> 00:24:09,320 Speaker 2: a Brisk It smart grill, and I was like, how 488 00:24:09,320 --> 00:24:12,920 Speaker 2: could you possibly make a thing which is basically maintaining 489 00:24:12,960 --> 00:24:15,720 Speaker 2: hot air in a tube long enough until the food's done? 490 00:24:15,720 --> 00:24:17,560 Speaker 2: And what it is is it has a thing. You 491 00:24:17,560 --> 00:24:20,480 Speaker 2: can ask the grill what seasoning should I add to 492 00:24:20,480 --> 00:24:22,879 Speaker 2: make my chicken skewers spicy, or how do I 493 00:24:22,920 --> 00:24:25,399 Speaker 2: sear a medium rare steak? I don't fucking know. Why 494 00:24:25,440 --> 00:24:26,679 Speaker 2: don't you learn to cook, 495 00:24:26,440 --> 00:24:30,640 Speaker 1: you twat? It's just like... 496 00:24:32,200 --> 00:24:35,679 Speaker 2: The enjoyable part of cooking is the experimentation and learning taste. 497 00:24:35,680 --> 00:24:39,040 Speaker 2: But no, thank you, just like that goddamn coffee thing. Oh, 498 00:24:39,080 --> 00:24:41,280 Speaker 2: I don't want to learn anything. I don't want to 499 00:24:41,280 --> 00:24:42,359 Speaker 2: have a human experience. 500 00:24:42,840 --> 00:24:45,160 Speaker 5: That's the thing with a lot of these AI solutions, 501 00:24:45,200 --> 00:24:47,639 Speaker 5: we'll call them, is I feel like they're robbing people 502 00:24:47,640 --> 00:24:49,080 Speaker 5: of real experiences. 503 00:24:49,359 --> 00:24:53,359 Speaker 1: Yeah, for like... no, but like, there's some stuff that, like, 504 00:24:53,680 --> 00:24:56,960 Speaker 1: you know, the ability of a smartphone to... once, you 505 00:24:57,080 --> 00:24:59,239 Speaker 1: had to be like in a building in order to 506 00:24:59,280 --> 00:25:01,840 Speaker 1: like access a phone or like use a payphone. Now 507 00:25:01,840 --> 00:25:04,920 Speaker 1: you can connect with people everywhere. That's, that's a clear benefit, right, 508 00:25:05,119 --> 00:25:07,920 Speaker 1: there's downsides to it, obviously, but it's a clear benefit. 509 00:25:08,240 --> 00:25:11,080 Speaker 1: But like now you don't have to learn. Now, now 510 00:25:11,080 --> 00:25:13,199 Speaker 1: you don't have to cook. You can let a robot 511 00:25:13,240 --> 00:25:17,400 Speaker 1: do it for you. It's like, well, but why? Cooking 512 00:25:17,520 --> 00:25:19,960 Speaker 1: is pleasurable.
And if I don't want to cook, I 513 00:25:20,000 --> 00:25:22,480 Speaker 1: will go to a restaurant or order food, and it's 514 00:25:22,560 --> 00:25:25,600 Speaker 1: cheaper than buying a several thousand dollar AI device. 515 00:25:25,720 --> 00:25:29,040 Speaker 4: I mean, some things are hard to learn, which brings 516 00:25:29,080 --> 00:25:30,280 Speaker 4: me to the next product. 517 00:25:29,960 --> 00:25:30,600 Speaker 3: That's smoking meat. 518 00:25:30,840 --> 00:25:34,120 Speaker 1: But hard work... 519 00:25:34,160 --> 00:25:36,360 Speaker 4: kind of like, like, like... 520 00:25:36,400 --> 00:25:39,760 Speaker 7: Parenting, right, so good, okay, nice, you know what, Garrison, 521 00:25:39,760 --> 00:25:41,480 Speaker 7: I'm proud of you. That was a good segue. So, 522 00:25:42,040 --> 00:25:46,080 Speaker 7: AI parenting, especially with your infant child. This was also 523 00:25:46,240 --> 00:25:49,000 Speaker 7: in the CES Awards section, so you know it's going 524 00:25:49,040 --> 00:25:51,399 Speaker 7: to be legit. I was able to see a demonstration 525 00:25:51,480 --> 00:25:54,560 Speaker 7: of an AI baby crib that will shake your baby 526 00:25:54,680 --> 00:25:58,919 Speaker 7: up and down based on facial expression analysis done by 527 00:25:58,960 --> 00:25:59,399 Speaker 7: an AI. 528 00:26:00,359 --> 00:26:02,359 Speaker 4: Yeah, that's what it does, and I'm gonna show you, 529 00:26:02,400 --> 00:26:05,479 Speaker 4: show it here. So here are the cutting edge facial 530 00:26:05,480 --> 00:26:08,760 Speaker 4: expressions: we have anger, disgust, fear, happiness, sadness, and surprise, 531 00:26:09,320 --> 00:26:13,520 Speaker 4: and that, basically that data will go into this little 532 00:26:13,600 --> 00:26:15,880 Speaker 4: crib, which will start shaking and moving up and down 533 00:26:15,920 --> 00:26:17,920 Speaker 4: based on what it scans on your baby's face. 534 00:26:18,280 --> 00:26:21,399 Speaker 2: So to be clear, there is a product, the Snoo, that 535 00:26:21,520 --> 00:26:25,440 Speaker 2: exists... whoops, dropped my phone there. There's a product 536 00:26:25,440 --> 00:26:28,040 Speaker 2: called the Snoo, which is like a... for infants, and 537 00:26:28,080 --> 00:26:30,000 Speaker 2: it notices when they're fussing and it kind of like 538 00:26:30,040 --> 00:26:32,520 Speaker 2: lightly rocks them, but the way it rocks them is 539 00:26:32,560 --> 00:26:35,719 Speaker 2: so very light. It is very much a... This is, 540 00:26:35,960 --> 00:26:37,800 Speaker 2: this is what a mother would do with a brand 541 00:26:37,840 --> 00:26:40,800 Speaker 2: new baby, freshly baked. You don't want to move it too much. 542 00:26:41,000 --> 00:26:43,760 Speaker 2: That one has like six pictures from the intro of 543 00:26:43,840 --> 00:26:47,720 Speaker 2: light to me and a heart rate monitor, and it's like, yeah, 544 00:26:47,960 --> 00:26:49,840 Speaker 2: hand over your baby to AI. 545 00:26:50,480 --> 00:26:50,800 Speaker 3: Great. 546 00:26:51,920 --> 00:26:54,600 Speaker 5: Yeah, this product looks like a baby maraca, the pace 547 00:26:54,760 --> 00:26:56,720 Speaker 5: that you shake it, which is dependent on what 548 00:26:56,760 --> 00:26:57,960 Speaker 5: face you make, pretty much. 549 00:26:58,040 --> 00:26:58,200 Speaker 4: Well.
550 00:26:58,200 --> 00:27:01,359 Speaker 1: I love it also because like a real scandal I 551 00:27:01,359 --> 00:27:03,280 Speaker 1: think from the I think it was in the eighties, 552 00:27:03,400 --> 00:27:08,719 Speaker 1: is like Nanny's shaking babies to death, like the idea 553 00:27:08,760 --> 00:27:12,119 Speaker 1: that like again a machine that can only go at 554 00:27:12,160 --> 00:27:13,760 Speaker 1: a certain pace that's very light. 555 00:27:14,720 --> 00:27:14,919 Speaker 8: You know. 556 00:27:14,960 --> 00:27:16,720 Speaker 1: I get that's a labor stave, especially for like a 557 00:27:16,760 --> 00:27:19,119 Speaker 1: single parent or whatnot. Like you know, some some people 558 00:27:19,160 --> 00:27:24,399 Speaker 1: will need that. But I just worry. I worry that 559 00:27:24,440 --> 00:27:26,440 Speaker 1: we're not all that far from our first and AI 560 00:27:26,600 --> 00:27:27,800 Speaker 1: killed my baby. 561 00:27:27,600 --> 00:27:29,400 Speaker 4: You know, I think I think that. I I think 562 00:27:29,400 --> 00:27:31,600 Speaker 4: the real beauty of this product is usually when you 563 00:27:31,600 --> 00:27:33,080 Speaker 4: have a newber, maybe you have to like watch it 564 00:27:33,119 --> 00:27:34,560 Speaker 4: all night becas it'll wake up, you have to like 565 00:27:34,600 --> 00:27:36,400 Speaker 4: pick it up, pat it, make sure it gets back 566 00:27:36,400 --> 00:27:38,240 Speaker 4: to sleep. You can just leave that baby in the bed. 567 00:27:38,320 --> 00:27:40,320 Speaker 4: You can you can like go to the club. Yeah, 568 00:27:40,440 --> 00:27:42,560 Speaker 4: just leave the baby in the bed. If it starts crying, 569 00:27:42,640 --> 00:27:45,320 Speaker 4: don't worry. The AI will take the will take over. 570 00:27:45,440 --> 00:27:47,840 Speaker 1: We are on the verge of beds that can raise 571 00:27:47,840 --> 00:27:50,040 Speaker 1: our children, just like the Venture Brothers. 572 00:27:50,160 --> 00:27:54,159 Speaker 4: That's right, and and those and those kids turned out fine, 573 00:27:54,880 --> 00:28:02,159 Speaker 4: they turned out great, perfect, But I think luckily, luckily 574 00:28:02,280 --> 00:28:04,360 Speaker 4: for you, because I know none of us are babies anymore. 575 00:28:04,400 --> 00:28:10,399 Speaker 4: But we are all you know, eventually going to get old, hopefully, hopefully, 576 00:28:11,200 --> 00:28:14,320 Speaker 4: And there is air products that will also assist us 577 00:28:14,359 --> 00:28:17,360 Speaker 4: as we get older using the same AI baby tech. 578 00:28:17,359 --> 00:28:17,560 Speaker 3: Here. 579 00:28:17,920 --> 00:28:19,479 Speaker 4: One of the one of the one of the places 580 00:28:19,480 --> 00:28:22,720 Speaker 4: that me and Robert stopped by was called Blue Sky AI. 581 00:28:22,760 --> 00:28:28,200 Speaker 4: It's spelled ridiculously offensively, and they refused to do an interview. 582 00:28:28,320 --> 00:28:30,440 Speaker 1: They were not happy. 583 00:28:32,080 --> 00:28:35,600 Speaker 4: But I was able to get a pamphlet and they 584 00:28:35,600 --> 00:28:37,920 Speaker 4: have an AI that I think they're mostly targeted to 585 00:28:37,960 --> 00:28:41,440 Speaker 4: get at like older people. 
But quote by comparing the 586 00:28:41,480 --> 00:28:44,520 Speaker 4: way your facial and vocal behavior changes over time, using 587 00:28:44,520 --> 00:28:47,560 Speaker 4: your facial expressions, facial muscle actions, as well as where 588 00:28:47,600 --> 00:28:49,880 Speaker 4: you are looking, your body posts, and the tone of 589 00:28:49,880 --> 00:28:52,400 Speaker 4: your voice, we have the potential to identify and monitor 590 00:28:52,480 --> 00:28:55,120 Speaker 4: all kinds of medical conditions that manifest in the face 591 00:28:55,240 --> 00:28:59,240 Speaker 4: or voice, So it's it's a facial scanning and voice 592 00:28:59,280 --> 00:29:02,320 Speaker 4: scanning uses AI to try to diagnose you with medical 593 00:29:02,360 --> 00:29:06,840 Speaker 4: conditions specific specifically, the guy told us that it's it's 594 00:29:06,920 --> 00:29:09,400 Speaker 4: useful for Alzheimer's. Then he realized we were journalists and 595 00:29:09,400 --> 00:29:12,959 Speaker 4: the nasses to go away. 596 00:29:14,000 --> 00:29:17,120 Speaker 1: It also, but yeah, that's how you know you've got 597 00:29:17,120 --> 00:29:19,120 Speaker 1: a good medical device. 598 00:29:18,840 --> 00:29:23,080 Speaker 4: A good product. At CES Blue Sky uses a continuous 599 00:29:23,120 --> 00:29:28,560 Speaker 4: approach apparent valiance and arousal to measure to measure expressed emotion. 600 00:29:28,960 --> 00:29:31,600 Speaker 4: This better fits the real human experience of emotional states. 601 00:29:31,720 --> 00:29:34,680 Speaker 4: This approach allows emotion regions to be defined and to 602 00:29:34,680 --> 00:29:38,040 Speaker 4: measure the transitions away from and towards these regions. This 603 00:29:38,080 --> 00:29:41,000 Speaker 4: continuous approach, where appropriate, can be mapped back to a 604 00:29:41,040 --> 00:29:46,160 Speaker 4: much less exact categorical representation. For example, excited, calm or angry? 605 00:29:46,800 --> 00:29:47,640 Speaker 5: Did he have horny? 606 00:29:48,160 --> 00:29:50,440 Speaker 4: They do not have horny, not that I can see. 607 00:29:50,480 --> 00:29:53,600 Speaker 1: Look, if you know old people, one thing they never 608 00:29:53,720 --> 00:29:54,800 Speaker 1: stop doing is fun. 609 00:29:54,840 --> 00:29:58,480 Speaker 4: Now, they do have a list of all human emotions 610 00:29:58,520 --> 00:30:01,920 Speaker 4: here that char you turn it on a map. Finally 611 00:30:02,200 --> 00:30:05,440 Speaker 4: that using AI, we can finally figure out what emotions 612 00:30:05,440 --> 00:30:07,800 Speaker 4: you're feeling based on your face, so you can use 613 00:30:07,800 --> 00:30:10,240 Speaker 4: this just with your with your phone camera, with your 614 00:30:10,480 --> 00:30:13,760 Speaker 4: with your iPad camera. They do data collection, data analysis. 615 00:30:14,080 --> 00:30:16,040 Speaker 4: One of the weird use cases that we saw was, 616 00:30:16,520 --> 00:30:19,160 Speaker 4: I know we saw something similar to this already, but 617 00:30:19,520 --> 00:30:21,880 Speaker 4: just scanning your face when driving to tell you how 618 00:30:21,920 --> 00:30:24,000 Speaker 4: you're feeling, which is just quite fun. 619 00:30:24,160 --> 00:30:26,520 Speaker 1: Yeah, it's a I could talk about that a second. 620 00:30:26,520 --> 00:30:28,640 Speaker 1: What this reminds me of there was a product a 621 00:30:28,640 --> 00:30:31,400 Speaker 1: few years ago. 
It was like a robot for the military, 622 00:30:31,440 --> 00:30:34,280 Speaker 1: and the idea was this robot can run in dangerous 623 00:30:34,320 --> 00:30:36,400 Speaker 1: situations and pick up troops that have been injured and 624 00:30:36,480 --> 00:30:38,600 Speaker 1: run them out, which is probably a thing that will 625 00:30:38,600 --> 00:30:40,880 Speaker 1: exist at some point and might even save lives. Right, 626 00:30:40,920 --> 00:30:42,280 Speaker 1: I can see how that would be a useful thing 627 00:30:42,280 --> 00:30:44,480 Speaker 1: in the military can be very dangerous to retrieve people. 628 00:30:44,720 --> 00:30:46,920 Speaker 1: Much better for a robot to get shot or blown 629 00:30:46,960 --> 00:30:50,200 Speaker 1: up in that situation than another person. But to try 630 00:30:50,240 --> 00:30:53,600 Speaker 1: and comfort the soldiers, they gave the robot the head 631 00:30:53,600 --> 00:30:56,560 Speaker 1: of a Teddy Bear, like a metal Teddy Bear head. 632 00:30:57,160 --> 00:31:01,200 Speaker 1: It looked like a fucking nightmare. It's just like, what 633 00:31:01,200 --> 00:31:04,360 Speaker 1: what do you think did you talk to There's all 634 00:31:04,440 --> 00:31:06,720 Speaker 1: sorts of guys who have been shot in combat. Did 635 00:31:06,760 --> 00:31:08,800 Speaker 1: you talk to one of them? Did you go with 636 00:31:08,920 --> 00:31:11,920 Speaker 1: the experience of having your arm blown off? Corporal have 637 00:31:12,040 --> 00:31:15,640 Speaker 1: been more pleasurable if a giant metal teddy bear it. 638 00:31:16,080 --> 00:31:18,040 Speaker 2: So my first job was working on the characters and 639 00:31:18,040 --> 00:31:23,440 Speaker 2: twisted metal, but then I moved into robotics. It's so 640 00:31:23,640 --> 00:31:28,240 Speaker 2: cool that how many of these products are very clearly made, funded, prototyped, 641 00:31:28,440 --> 00:31:31,600 Speaker 2: r and D, hired PR teams. Everyone's done these big 642 00:31:31,640 --> 00:31:34,320 Speaker 2: presentations without talking to a single fucking human being. 643 00:31:34,720 --> 00:31:35,640 Speaker 3: It's so cool. 644 00:31:35,920 --> 00:31:38,400 Speaker 2: It's so cool how much waste there is at this 645 00:31:38,520 --> 00:31:42,240 Speaker 2: show where not a single human soul there is a 646 00:31:42,280 --> 00:31:46,240 Speaker 2: completely different subject. There was like an AI powered nail 647 00:31:46,280 --> 00:31:49,440 Speaker 2: salon thing as well. I saw, and I'm like, that's 648 00:31:49,480 --> 00:31:52,600 Speaker 2: definitely one where you didn't talk to talk to any 649 00:31:52,640 --> 00:32:00,200 Speaker 2: woman though. Yeah, first and foremost in my experience, one 650 00:32:00,280 --> 00:32:02,280 Speaker 2: is scared of a new nail place for fucking up 651 00:32:02,320 --> 00:32:05,120 Speaker 2: their hands. So are they going to spend eight hundred 652 00:32:05,200 --> 00:32:07,880 Speaker 2: goddamn dollars on this thing to maybe get burned? And 653 00:32:07,920 --> 00:32:10,440 Speaker 2: I saw in this article about it just now that 654 00:32:10,520 --> 00:32:12,400 Speaker 2: their thing they said was, oh, yeah, it's like an 655 00:32:12,480 --> 00:32:16,280 Speaker 2: espresso at home. I've had an espressos break multiple times, 656 00:32:16,960 --> 00:32:18,920 Speaker 2: and I realized it may sound weird. How can you 657 00:32:18,920 --> 00:32:21,040 Speaker 2: break an espresso? I'm just built different. 
658 00:32:21,240 --> 00:32:23,080 Speaker 1: But if I can break it in just like me 659 00:32:23,160 --> 00:32:26,480 Speaker 1: and strep throat, unbelievable. 660 00:32:27,560 --> 00:32:30,200 Speaker 4: So I do have one more product, and then I'm and. 661 00:32:30,240 --> 00:32:32,880 Speaker 1: Well, first, Garrison, I know you have one more product, 662 00:32:32,960 --> 00:32:47,360 Speaker 1: but we also have one more ad break. Ah, we're back, Garrison. 663 00:32:47,600 --> 00:32:48,640 Speaker 1: What's your next product? 664 00:32:48,760 --> 00:32:52,200 Speaker 4: So we already talked about the Handy, which is you know, 665 00:32:52,240 --> 00:32:56,440 Speaker 4: like sure did, which is by all accounts actually like 666 00:32:56,840 --> 00:32:57,760 Speaker 4: works as intended. 667 00:32:57,840 --> 00:33:00,480 Speaker 1: It's a good product. The people, the PR people, and 668 00:33:00,520 --> 00:33:03,320 Speaker 1: we talked to the CEO. We're not just knowledgeable, but 669 00:33:03,520 --> 00:33:07,000 Speaker 1: like remarkably good at keeping a straight face while talking 670 00:33:07,040 --> 00:33:10,240 Speaker 1: about their jack Well, that's professionalism. You have to respect it. 671 00:33:10,600 --> 00:33:13,200 Speaker 5: Honestly, that was the most professional booth I saw the 672 00:33:13,360 --> 00:33:15,640 Speaker 5: entirety of cees. They were really on point. 673 00:33:15,960 --> 00:33:18,360 Speaker 1: If you are looking for a jack off machine, I 674 00:33:18,400 --> 00:33:20,680 Speaker 1: can't recommend anything more highly. 675 00:33:20,480 --> 00:33:24,040 Speaker 4: Well, Robert, except for our next product, which is an 676 00:33:24,080 --> 00:33:26,280 Speaker 4: AI power to jack off machine. 677 00:33:26,880 --> 00:33:27,680 Speaker 1: Thank god. 678 00:33:28,840 --> 00:33:32,240 Speaker 4: So this is called my Hixel. It is the first 679 00:33:32,480 --> 00:33:34,000 Speaker 4: it's the first app. 680 00:33:33,760 --> 00:33:37,120 Speaker 1: That's an appealing names that's a name that sounds like sex. 681 00:33:37,800 --> 00:33:42,480 Speaker 4: It is the first app for climax Control to incorporate AI. 682 00:33:43,200 --> 00:33:46,160 Speaker 4: Now I'm gonna I'm gonna read through their. 683 00:33:46,720 --> 00:33:49,400 Speaker 5: Really redefines edge technology. Huh. 684 00:33:49,600 --> 00:33:51,280 Speaker 1: I want to make a note before you get into it. 685 00:33:51,640 --> 00:33:53,400 Speaker 1: The thing that they're claiming this is useful for. There 686 00:33:53,400 --> 00:33:56,000 Speaker 1: are devices for and it is a real use case, 687 00:33:56,000 --> 00:33:59,200 Speaker 1: which is that like premature ejaculation, this is a serious 688 00:33:59,240 --> 00:34:01,080 Speaker 1: problem for a lot of it. It's like a quality 689 00:34:01,080 --> 00:34:04,920 Speaker 1: of life issue, right, like it stops people from feeling confident. 690 00:34:04,960 --> 00:34:08,600 Speaker 1: It's a serious problem. There are prosthetic devices people can 691 00:34:08,680 --> 00:34:13,440 Speaker 1: use to train themselves. That's fine, they already exist. This 692 00:34:13,600 --> 00:34:16,120 Speaker 1: is basically like what if an AI could teach you 693 00:34:16,200 --> 00:34:17,200 Speaker 1: how to come slower? 694 00:34:17,400 --> 00:34:23,120 Speaker 4: Yes, and we have a six step layout here describing 695 00:34:23,480 --> 00:34:27,080 Speaker 4: why why my hixel is right for you. 
For the 696 00:34:27,200 --> 00:34:32,239 Speaker 4: first step is secure and anonymonized data collection, so you 697 00:34:32,280 --> 00:34:35,080 Speaker 4: can get o good all of your coming data stored, 698 00:34:35,200 --> 00:34:36,800 Speaker 4: but don't worry, it's secure. 699 00:34:37,040 --> 00:34:37,279 Speaker 8: See. 700 00:34:37,320 --> 00:34:39,560 Speaker 1: My first question to that is why is data on 701 00:34:39,640 --> 00:34:42,240 Speaker 1: me masturbating being collected at all? 702 00:34:42,400 --> 00:34:45,000 Speaker 4: Well, it could be because they're putting it towards an 703 00:34:45,040 --> 00:34:47,799 Speaker 4: eight week training program. 704 00:34:49,000 --> 00:34:51,799 Speaker 2: No, So, first and foremost, one of the first things 705 00:34:51,800 --> 00:34:53,880 Speaker 2: on the website for this is just the words happy 706 00:34:53,920 --> 00:34:57,360 Speaker 2: sex here, save sixty dollars in my Hixel Control. But 707 00:34:57,520 --> 00:34:59,120 Speaker 2: happy sex here is going to be something I think 708 00:34:59,160 --> 00:35:01,719 Speaker 2: about for a while. But also it says it has 709 00:35:02,280 --> 00:35:05,720 Speaker 2: my Hixel Care and my Hixel Control, two different things, 710 00:35:05,719 --> 00:35:08,600 Speaker 2: and then my hicksl Academy, and sadly you can't click 711 00:35:08,640 --> 00:35:10,680 Speaker 2: on that because I've never wanted to know more about 712 00:35:11,040 --> 00:35:13,160 Speaker 2: what how much material could that be? 713 00:35:14,080 --> 00:35:16,160 Speaker 5: Unless a masturbation academy. 714 00:35:16,680 --> 00:35:18,680 Speaker 1: Yeah, I thought they just called that eton. I was 715 00:35:18,680 --> 00:35:19,960 Speaker 1: a British public school joke. 716 00:35:20,239 --> 00:35:22,400 Speaker 5: It's okay, I made an edging joke earlier and nobody 717 00:35:22,480 --> 00:35:22,799 Speaker 5: caught it. 718 00:35:23,840 --> 00:35:24,120 Speaker 3: Yeah. 719 00:35:24,280 --> 00:35:27,400 Speaker 2: I there's one thing the eating boys do and they 720 00:35:27,440 --> 00:35:28,400 Speaker 2: don't have sex. 721 00:35:29,120 --> 00:35:30,000 Speaker 1: No masturbation. 722 00:35:30,320 --> 00:35:30,520 Speaker 4: Yeah. 723 00:35:30,719 --> 00:35:31,239 Speaker 3: Sorry. 724 00:35:32,680 --> 00:35:34,799 Speaker 1: Part of what I hate about this is its name 725 00:35:34,920 --> 00:35:37,719 Speaker 1: is so clearly like trying to be respectful and like 726 00:35:37,840 --> 00:35:40,840 Speaker 1: respectable and a tech product name as opposed to like 727 00:35:40,880 --> 00:35:42,840 Speaker 1: one of the things that I respect about the Handy 728 00:35:42,840 --> 00:35:45,240 Speaker 1: people is they just went ahead and called it handy. 729 00:35:45,360 --> 00:35:47,680 Speaker 4: I mean it's weird because like some of their some 730 00:35:47,719 --> 00:35:51,080 Speaker 4: of their free merch were labeled with stuff like download 731 00:35:51,120 --> 00:35:58,440 Speaker 4: the app to control your loads, we bring the game, 732 00:35:58,640 --> 00:36:03,520 Speaker 4: you bring the joystick. The first day you went for 733 00:36:03,560 --> 00:36:06,280 Speaker 4: a run, you couldn't last more than three minutes either. 734 00:36:07,360 --> 00:36:09,640 Speaker 4: So it's weird how they Yeah, I had this very 735 00:36:09,640 --> 00:36:13,680 Speaker 4: like sanitized branding except for their like free merch but yeah, 736 00:36:13,719 --> 00:36:17,040 Speaker 4: it has. 
It has Bluetooth connection, interactive and personalized settings. 737 00:36:17,239 --> 00:36:20,319 Speaker 4: You can monitor your user evolution, and it is, it 738 00:36:20,400 --> 00:36:23,439 Speaker 4: is marked as a medical device. But on, on their 739 00:36:23,480 --> 00:36:28,080 Speaker 4: brochure there's just two really, really good sentences. There's video 740 00:36:28,200 --> 00:36:31,160 Speaker 4: feedback from our sexual health professionals. So after you come, 741 00:36:31,480 --> 00:36:34,200 Speaker 4: you can get on a video chat and talk about it. 742 00:36:34,200 --> 00:36:37,200 Speaker 4: There we go, there we go, looking good. 743 00:36:37,840 --> 00:36:41,239 Speaker 5: It's the pillow talk add-on. 744 00:36:40,960 --> 00:36:43,399 Speaker 2: I'd love to be one of those people. Ah, come on, man. 745 00:36:43,560 --> 00:36:45,800 Speaker 2: Three minutes? You can do better than that. Come on. 746 00:36:46,040 --> 00:36:47,640 Speaker 2: So are you meant to encourage them? 747 00:36:47,920 --> 00:36:48,160 Speaker 1: Yeah? 748 00:36:48,440 --> 00:36:48,640 Speaker 4: Yeah? 749 00:36:48,840 --> 00:36:50,600 Speaker 3: Are you meant to commiserate with them? Yeah? 750 00:36:50,640 --> 00:36:52,160 Speaker 1: What is the goal here? 751 00:36:52,320 --> 00:36:52,680 Speaker 3: Yeah? 752 00:36:52,719 --> 00:36:55,239 Speaker 2: But also I cannot think of a single person I'd 753 00:36:55,239 --> 00:36:56,320 Speaker 2: want to talk about that with. 754 00:36:56,760 --> 00:36:59,200 Speaker 1: Yeah, I'm just imagining like the guy on the other 755 00:36:59,200 --> 00:37:00,799 Speaker 1: end being like no, no, no, no, zoom the camera in 756 00:37:00,800 --> 00:37:02,600 Speaker 1: a little more. I want to see those ropes. No, 757 00:37:02,920 --> 00:37:05,000 Speaker 1: that's not bad, that's not bad. Good consistency. 758 00:37:05,080 --> 00:37:07,160 Speaker 5: Okay, let's move that over. Let's see his O face again. 759 00:37:07,239 --> 00:37:10,840 Speaker 3: Wow, well played, my friend. Your load management 760 00:37:10,920 --> 00:37:11,920 Speaker 3: is very consistent. 761 00:37:14,840 --> 00:37:18,560 Speaker 4: And I think, I think what we're really missing is how 762 00:37:18,680 --> 00:37:21,359 Speaker 4: much, how much AI will assist in this, because they 763 00:37:21,400 --> 00:37:26,640 Speaker 4: claim that, using cutting edge technology? Eh? Eh? MyHixel 764 00:37:26,719 --> 00:37:29,480 Speaker 4: Control is the first solution to include AI and machine 765 00:37:29,560 --> 00:37:34,440 Speaker 4: learning for climax control treatment, which is just really, really reassuring. 766 00:37:34,960 --> 00:37:37,080 Speaker 4: So yeah, it basically looks like a fleshlight that connects 767 00:37:37,080 --> 00:37:42,920 Speaker 4: to your phone, and it's an app with anatomically realistic 768 00:37:42,920 --> 00:37:47,840 Speaker 4: interior design and AI and secure and, and anonymized data. 769 00:37:47,960 --> 00:37:49,480 Speaker 5: I think this is really going to open up some 770 00:37:49,520 --> 00:37:50,840 Speaker 5: avenues for sex workers. 771 00:37:51,600 --> 00:37:54,440 Speaker 4: Yeah, hope, hopefully, hopefully, Tavia. 772 00:37:54,160 --> 00:37:57,479 Speaker 1: It's, it's also like, the design. The Handy is very 773 00:37:57,560 --> 00:38:00,520 Speaker 1: clearly a robot. You stick your dick inside and it 774 00:38:00,960 --> 00:38:05,280 Speaker 1: jacks you off.
This looks like a fleshlight, except 775 00:38:05,680 --> 00:38:07,840 Speaker 1: the back, and like, the front end, we unscrewed 776 00:38:07,840 --> 00:38:09,520 Speaker 1: the top and it's like a fake vagina, looks like 777 00:38:09,520 --> 00:38:13,600 Speaker 1: a fleshlight. The back end looks like an incense diffuser, 778 00:38:14,080 --> 00:38:18,440 Speaker 1: like, like someone decided these two products needed to be combined. 779 00:38:18,680 --> 00:38:22,240 Speaker 1: Like, what if you could fuck your, your aromatherapy bot. 780 00:38:23,680 --> 00:38:27,560 Speaker 4: Finally. So that is, that is most of the, the, 781 00:38:27,560 --> 00:38:30,240 Speaker 4: the, just the groundbreaking AI products that I was able 782 00:38:30,280 --> 00:38:33,400 Speaker 4: to see today. Does anyone else have any AI products 783 00:38:33,400 --> 00:38:34,319 Speaker 4: they would love to talk about? 784 00:38:34,400 --> 00:38:39,200 Speaker 1: It's time to talk about Ganert AI. Okay, Tavia, obviously you 785 00:38:39,239 --> 00:38:41,440 Speaker 1: want to start us off about Ganert. 786 00:38:41,560 --> 00:38:45,080 Speaker 5: Okay, I guess. We attended a panel. Which panel was it, y'all? 787 00:38:45,160 --> 00:38:48,440 Speaker 8: That was the DHS AI one. Yeah, that was, that was 788 00:38:48,480 --> 00:38:50,960 Speaker 8: the AI panel with one of the heads of the 789 00:38:51,000 --> 00:38:54,440 Speaker 8: Department of Homeland Security, who, I can confirm because he 790 00:38:54,520 --> 00:38:57,440 Speaker 8: turned around to take a selfie, has a Hank Hill ass. 791 00:38:58,360 --> 00:38:59,640 Speaker 5: He was very insistent on that. 792 00:39:00,160 --> 00:39:03,520 Speaker 1: No, but absolutely. No, but, and I'm saying this not 793 00:39:03,600 --> 00:39:06,239 Speaker 1: to shame him, but because there are orthotics for that. 794 00:39:06,640 --> 00:39:10,080 Speaker 1: You can get help, sir. That's even a whole episode 795 00:39:10,520 --> 00:39:14,600 Speaker 1: of King of the Hill, one of the better episodes of all time. 796 00:39:15,040 --> 00:39:19,479 Speaker 5: So Ganert AI was announced before this talk that we had, 797 00:39:19,960 --> 00:39:23,840 Speaker 5: and it was, I think the guy announcing both 798 00:39:24,000 --> 00:39:28,520 Speaker 5: this, this event as well as the panel had taken 799 00:39:28,560 --> 00:39:30,200 Speaker 5: some time to really focus on the fact that this 800 00:39:30,360 --> 00:39:31,640 Speaker 5: was his quote unquote. 801 00:39:31,280 --> 00:39:34,840 Speaker 4: Opus. His, his opus. He said the word opus 802 00:39:34,920 --> 00:39:37,640 Speaker 4: like five times. Ganert's what I'll be remembered by. 803 00:39:38,920 --> 00:39:42,200 Speaker 5: This is my legacy. Yeah, and then I guess two 804 00:39:42,239 --> 00:39:44,520 Speaker 5: of the designers had come up, who stuck out like 805 00:39:44,560 --> 00:39:47,200 Speaker 5: a sore thumb compared to, like, the sea of khaki 806 00:39:47,280 --> 00:39:48,839 Speaker 5: and blazers and things like that. 807 00:39:50,120 --> 00:39:53,160 Speaker 1: Yeah. Yeah, they had clearly never ordered a drone strike, 808 00:39:53,280 --> 00:39:55,360 Speaker 1: unlike our hero in Homeland Security. 809 00:39:55,840 --> 00:39:58,120 Speaker 4: One of them had a wide brimmed hat that was 810 00:39:58,239 --> 00:40:01,040 Speaker 4: color matched to the Ganert logo, which is pretty cool. 811 00:40:01,480 --> 00:40:02,799 Speaker 1: What does Ganert stand for?
812 00:40:03,480 --> 00:40:07,000 Speaker 4: Ganert stands for generate. So I think it's actually just 813 00:40:07,600 --> 00:40:12,319 Speaker 4: called generate. They just took out the vowels. But 814 00:40:12,480 --> 00:40:15,840 Speaker 4: this is going to be a three day event or 815 00:40:15,960 --> 00:40:21,960 Speaker 4: a conference held in Arlington, Virginia. They're claiming that it's 816 00:40:21,960 --> 00:40:24,560 Speaker 4: gonna have like two hundred speakers, one hundred and fifty 817 00:40:24,600 --> 00:40:28,320 Speaker 4: AI sessions, more than five hundred startups, one hundred fifty partners, 818 00:40:28,320 --> 00:40:32,439 Speaker 4: one hundred investors, and around five thousand attendees. They're trying 819 00:40:32,480 --> 00:40:38,600 Speaker 4: to target enterprise, governments, platforms, AI tools, AI builders, services, investors, startups, 820 00:40:38,640 --> 00:40:42,640 Speaker 4: and media. It's, it's these three events held simultaneously. 821 00:40:42,760 --> 00:40:47,080 Speaker 4: One's just called Ganert or Generate AI, which is about 822 00:40:47,280 --> 00:40:53,120 Speaker 4: just AI, AI tech. It's about like AI companies, classes, keynotes, funding, 823 00:40:53,120 --> 00:40:55,760 Speaker 4: blah blah blah blah blah. There is then Voice and AI, 824 00:40:56,160 --> 00:41:00,359 Speaker 4: which is about AI language services. And there's also 825 00:41:00,480 --> 00:41:03,279 Speaker 4: Gov AI, which is about the public sector and how the 826 00:41:03,320 --> 00:41:06,040 Speaker 4: government's gonna start integrating AI or regulating AI. And they 827 00:41:06,080 --> 00:41:10,839 Speaker 4: also have one for coding called Code Forward, and it's, 828 00:41:11,000 --> 00:41:13,359 Speaker 4: it's a bummer we can't just play the opening video, 829 00:41:13,400 --> 00:41:16,680 Speaker 4: because the opening video had no, like, voiceover. 830 00:41:16,719 --> 00:41:18,880 Speaker 1: But there's, yeah, there's no voice. I can read it though. 831 00:41:19,440 --> 00:41:22,880 Speaker 1: Ninety seven million new jobs in AI, five hundred billion 832 00:41:22,920 --> 00:41:25,520 Speaker 1: in annual AI spend by twenty twenty seven, two hundred 833 00:41:25,520 --> 00:41:26,840 Speaker 1: and fifty billion in VC. 834 00:41:27,360 --> 00:41:29,320 Speaker 3: Funding by twenty twenty five. 835 00:41:29,760 --> 00:41:33,160 Speaker 1: Ganert, generate for a new world in a new market. 836 00:41:33,440 --> 00:41:37,560 Speaker 1: Ganert connects, informs, elevates and inspires. It all happens 837 00:41:37,600 --> 00:41:39,320 Speaker 1: at Ganert. 838 00:41:39,760 --> 00:41:42,160 Speaker 4: And we cannot emphasize enough how they hyped up 839 00:41:42,560 --> 00:41:46,720 Speaker 4: VC cash there. There was, there was so much build 840 00:41:46,800 --> 00:41:48,040 Speaker 4: up for VC cash. 841 00:41:48,239 --> 00:41:51,840 Speaker 1: I have, I have watched people who are dope sick 842 00:41:52,360 --> 00:41:57,200 Speaker 1: buy heroin with less jittery excitement in their hands and eyes. 843 00:41:58,120 --> 00:42:00,880 Speaker 3: All right. So, what about shit like this? 844 00:42:01,000 --> 00:42:03,440 Speaker 2: So I just did a brief cursory look up of Ganert, 845 00:42:04,040 --> 00:42:10,640 Speaker 2: and it's, and it's, it's the Ganert conference, Voice AI and 846 00:42:10,680 --> 00:42:14,160 Speaker 2: Gov AI and Code Forward, and all of them are claiming 847 00:42:14,200 --> 00:42:20,480 Speaker 2: the following.
They're featuring GitHub, Microsoft, OpenAI, Codeium, Tabnine. 848 00:42:21,320 --> 00:42:24,040 Speaker 2: Their thing on LinkedIn has twenty eight followers, and their 849 00:42:24,040 --> 00:42:27,640 Speaker 2: engagement is like when I post the word twitter on Twitter, 850 00:42:28,120 --> 00:42:29,879 Speaker 2: it's not very good at all. I can get more 851 00:42:29,880 --> 00:42:31,520 Speaker 2: than that doing anything. I could post a picture of my 852 00:42:31,520 --> 00:42:34,000 Speaker 2: asshole and get more than that. But also I cannot 853 00:42:34,040 --> 00:42:38,319 Speaker 2: find a single person claiming to attend this. Despite them 854 00:42:38,320 --> 00:42:40,600 Speaker 2: claiming two hundred plus speakers, one hundred and fifty plus sessions, 855 00:42:40,640 --> 00:42:43,080 Speaker 2: five hundred startups, one hundred fifty partners, one hundred investors, 856 00:42:43,080 --> 00:42:46,080 Speaker 2: five thousand attendees, I can't find a single bit of 857 00:42:46,120 --> 00:42:49,680 Speaker 2: evidence that anyone is ganerting around at all. And also 858 00:42:49,760 --> 00:42:52,840 Speaker 2: they claim to have three different conferences, Code Forward, Gov AI, 859 00:42:52,960 --> 00:42:57,440 Speaker 2: Voice AI and of course Ganert AI, and 860 00:42:57,640 --> 00:43:00,520 Speaker 2: of course all of these are part of the AI 861 00:43:00,760 --> 00:43:01,840 Speaker 2: beta experience. 862 00:43:01,920 --> 00:43:03,680 Speaker 3: I don't know why you put beta. 863 00:43:03,800 --> 00:43:06,640 Speaker 2: People are beta as hell. But also, why have you 864 00:43:06,680 --> 00:43:08,160 Speaker 2: got beta on a conference? 865 00:43:08,239 --> 00:43:10,040 Speaker 3: What are you doing? But also. 866 00:43:11,280 --> 00:43:14,440 Speaker 2: Featuring OpenAI, Nvidia, Microsoft, Google, and Verituan. I'm 867 00:43:14,480 --> 00:43:17,560 Speaker 2: gonna guess that they've got, like, ChatGPT open on 868 00:43:17,560 --> 00:43:22,319 Speaker 2: a computer, an Nvidia GPU and some Microsoft Word, and 869 00:43:22,360 --> 00:43:28,000 Speaker 2: they've used Google. And it's very strange, because I don't 870 00:43:28,040 --> 00:43:29,080 Speaker 2: know what this thing. 871 00:43:29,160 --> 00:43:32,160 Speaker 1: Is, you know. I think what it is is some 872 00:43:32,440 --> 00:43:37,080 Speaker 1: guys who have a degree of, like... like, 873 00:43:37,160 --> 00:43:40,480 Speaker 1: some guys who don't have any 874 00:43:40,480 --> 00:43:42,359 Speaker 1: actual ideas for what to do with AI, so they're 875 00:43:42,360 --> 00:43:44,719 Speaker 1: hoping that if they create a conference and make that 876 00:43:44,800 --> 00:43:47,279 Speaker 1: be like the CES of AI, they can kind of 877 00:43:47,280 --> 00:43:50,080 Speaker 1: force a place for themselves and also attract a bunch, 878 00:43:50,080 --> 00:43:51,239 Speaker 1: suction up a bunch of money. 879 00:43:51,280 --> 00:43:53,440 Speaker 2: I also found some, I found some of the speakers. 880 00:43:53,440 --> 00:43:56,760 Speaker 2: You've got a fellow called Adam Goldberg, who's an account director 881 00:43:56,800 --> 00:43:58,880 Speaker 2: and head of Azure OpenAI enablement on 882 00:43:58,920 --> 00:44:01,200 Speaker 2: the go to market team at OpenAI. They found a 883 00:44:01,239 --> 00:44:03,319 Speaker 2: sales guy from OpenAI and then said they got 884 00:44:03,320 --> 00:44:06,320 Speaker 2: someone from OpenAI.
They got someone from JP Morgan 885 00:44:06,400 --> 00:44:10,080 Speaker 2: Chase's data and AI design. These are all fake jobs. 886 00:44:10,640 --> 00:44:14,480 Speaker 2: These aren't real jobs. And I think that these conferences 887 00:44:14,520 --> 00:44:16,560 Speaker 2: are amazing as well, because all people do at them 888 00:44:16,600 --> 00:44:18,719 Speaker 2: is they go, they watch these things where people go 889 00:44:18,800 --> 00:44:21,560 Speaker 2: up on stage and go, you know, generative AI is 890 00:44:21,600 --> 00:44:24,760 Speaker 2: going to create maybe even trillions of dollars of value 891 00:44:24,760 --> 00:44:28,200 Speaker 2: at some point, and you know, the synergy between generative 892 00:44:28,280 --> 00:44:33,319 Speaker 2: AI and data collection but also data silos is going 893 00:44:33,400 --> 00:44:37,560 Speaker 2: to be truly, truly innovative. And everyone's like, holy fucking shit, 894 00:44:37,960 --> 00:44:41,520 Speaker 2: whoa, holy shit, and then they all post 895 00:44:41,520 --> 00:44:44,120 Speaker 2: it on Twitter and they all forget it ever happened immediately. 896 00:44:44,360 --> 00:44:46,680 Speaker 5: Yeah, we call that the dividend. 897 00:44:47,800 --> 00:44:50,480 Speaker 4: We do call that the dividend. So Ganert's being put 898 00:44:50,520 --> 00:44:54,200 Speaker 4: on by this guy who runs this, like, panel collection 899 00:44:54,360 --> 00:44:57,040 Speaker 4: called BrandsGPT at CES. 900 00:44:56,800 --> 00:44:57,160 Speaker 2: With a Z. 901 00:44:58,440 --> 00:45:00,840 Speaker 1: No, it's not. With a Z. 902 00:45:01,080 --> 00:45:04,200 Speaker 4: It should, it should be. I think me and Robert both 903 00:45:04,239 --> 00:45:07,640 Speaker 4: went to like one or two of these BrandsGPT panels. 904 00:45:07,840 --> 00:45:10,040 Speaker 4: This is the one where Robert got to yell at 905 00:45:10,080 --> 00:45:11,680 Speaker 4: Google and Microsoft and get them mad. 906 00:45:12,400 --> 00:45:15,960 Speaker 1: No, Google and McDonald's. McDonald's head of AI, which 907 00:45:16,000 --> 00:45:16,479 Speaker 1: is a thing. 908 00:45:17,640 --> 00:45:22,600 Speaker 4: So they used to basically just focus on, like, convention programming. 909 00:45:22,840 --> 00:45:24,520 Speaker 4: So now they're trying to put on their own convention 910 00:45:24,600 --> 00:45:28,840 Speaker 4: that they're calling Ganert, instead of just running this Brands 911 00:45:28,920 --> 00:45:33,279 Speaker 4: GPT at CES. So that's, that's the background. It's, it's 912 00:45:33,320 --> 00:45:37,759 Speaker 4: done by Modev Events. That's Mode and the letter V, 913 00:45:37,840 --> 00:45:40,600 Speaker 4: but one word. That's like the parent company for this. 914 00:45:41,200 --> 00:45:43,839 Speaker 4: I'll be interested once we get closer to October. I'll 915 00:45:43,880 --> 00:45:45,480 Speaker 4: be interested to see if this is looking more like 916 00:45:45,480 --> 00:45:47,560 Speaker 4: a real event. It's, it's not going to be that 917 00:45:47,600 --> 00:45:50,520 Speaker 4: far for me to travel. But no, they're, they're promising 918 00:45:51,120 --> 00:45:54,640 Speaker 4: five hundred billion dollars in annual AI spending with two 919 00:45:54,800 --> 00:45:59,000 Speaker 4: hundred and fifty billion in new VC cash investments, which is, 920 00:45:59,040 --> 00:45:59,920 Speaker 4: which is quite the promise.
921 00:46:00,360 --> 00:46:04,160 Speaker 1: Yeah, so hopefully this beta test goes like the last 922 00:46:04,239 --> 00:46:07,000 Speaker 1: video game beta test that I went to, and everybody 923 00:46:07,040 --> 00:46:12,040 Speaker 1: clips through the floor and disappears into a void. Well, I 924 00:46:12,080 --> 00:46:14,800 Speaker 1: think that's gonna do it for us in this episode. 925 00:46:14,800 --> 00:46:17,320 Speaker 1: And I want to leave you all with, well, before 926 00:46:17,680 --> 00:46:19,879 Speaker 1: that we've got one more thing. But before we get into that, 927 00:46:19,960 --> 00:46:22,120 Speaker 1: which, which will be fun, I want to talk about 928 00:46:22,120 --> 00:46:24,560 Speaker 1: something sobering, which is that, as you may get from this, 929 00:46:25,080 --> 00:46:27,719 Speaker 1: nearly one hundred percent of the AI use cases that 930 00:46:27,760 --> 00:46:31,080 Speaker 1: we saw presented were either nonsense or incredibly vague at 931 00:46:31,080 --> 00:46:33,520 Speaker 1: these different panels, where you had people from like Nvidia 932 00:46:33,560 --> 00:46:36,000 Speaker 1: and Adobe and whatnot, and like, they wouldn't say, like, 933 00:46:36,080 --> 00:46:38,359 Speaker 1: we're going to use AI for this specific task. They 934 00:46:38,360 --> 00:46:41,200 Speaker 1: would say we're going to use AI to get more nimble, 935 00:46:41,280 --> 00:46:42,880 Speaker 1: which I think means firing people. 936 00:46:43,360 --> 00:46:43,600 Speaker 3: You know. 937 00:46:43,920 --> 00:46:46,840 Speaker 1: Outside of that, the only real specific use cases that 938 00:46:46,840 --> 00:46:50,160 Speaker 1: were not clearly nonsense were stuff like replacing, you know, 939 00:46:50,360 --> 00:46:53,360 Speaker 1: customer service workers with chatbots, which is bad, and to 940 00:46:53,400 --> 00:46:56,080 Speaker 1: be fair, some also really good stuff, like that telescope 941 00:46:56,080 --> 00:46:58,279 Speaker 1: that used kind of machine learning in order to, like, 942 00:46:58,280 --> 00:47:00,440 Speaker 1: clean up images so that you can get better, better 943 00:47:00,480 --> 00:47:02,320 Speaker 1: images and whatnot when you're in an area with a 944 00:47:02,360 --> 00:47:04,600 Speaker 1: lot of light pollution. There was some stuff like that, 945 00:47:04,920 --> 00:47:08,640 Speaker 1: but usually very vague, the use cases for AI. What was 946 00:47:08,680 --> 00:47:12,480 Speaker 1: always extremely clear were the harms. In the very first 947 00:47:13,040 --> 00:47:15,759 Speaker 1: panel we attended, there's a company called Deloitte. They're a 948 00:47:15,920 --> 00:47:18,680 Speaker 1: huge consulting firm. If you know about McKinsey, because they're 949 00:47:18,680 --> 00:47:21,759 Speaker 1: currently, somewhat rightfully so, a bit of a bugbear on 950 00:47:21,800 --> 00:47:24,839 Speaker 1: the left, Deloitte is a similar kind of organization, right? 951 00:47:24,880 --> 00:47:27,960 Speaker 1: I think they're a bit less toxic, but to a 952 00:47:28,000 --> 00:47:31,839 Speaker 1: marginal degree. They're like a massive consulting firm.
Companies bring 953 00:47:31,880 --> 00:47:34,120 Speaker 1: them in in order to help them streamline and make 954 00:47:34,440 --> 00:47:38,080 Speaker 1: processes more efficient and stuff, and one of their people 955 00:47:38,120 --> 00:47:41,680 Speaker 1: said that according to their internal metrics, they expected half 956 00:47:41,719 --> 00:47:46,759 Speaker 1: a trillion dollars in fraud this year in one year 957 00:47:47,239 --> 00:47:52,920 Speaker 1: due just to voice cloning AI. And that was a 958 00:47:52,960 --> 00:47:56,680 Speaker 1: more specific statement of what AI is going to do 959 00:47:56,760 --> 00:48:00,279 Speaker 1: to change people's lives than absolutely any positive use case 960 00:48:00,320 --> 00:48:01,760 Speaker 1: I heard presented at this conference. 961 00:48:02,040 --> 00:48:05,120 Speaker 4: Could you like explain what you mean by voice cloning 962 00:48:05,440 --> 00:48:06,120 Speaker 4: so AI. 963 00:48:06,400 --> 00:48:08,720 Speaker 1: You know, we did a couple of Bastards episodes talking 964 00:48:08,719 --> 00:48:12,200 Speaker 1: about scams and like how they've contributed to the decline 965 00:48:12,200 --> 00:48:14,640 Speaker 1: of trust in our society. One of the things that 966 00:48:14,760 --> 00:48:16,880 Speaker 1: is in the last year or so become a massive 967 00:48:16,920 --> 00:48:19,840 Speaker 1: problem is there are now AI things that can generate 968 00:48:19,920 --> 00:48:23,960 Speaker 1: a human voice near perfectly to the point where, especially 969 00:48:23,960 --> 00:48:26,120 Speaker 1: if it is a voice of say your kid calls 970 00:48:26,120 --> 00:48:27,720 Speaker 1: you and they're telling you that they have been fucking 971 00:48:27,800 --> 00:48:30,440 Speaker 1: kidnapped or you know, something else has happened and they 972 00:48:30,440 --> 00:48:32,439 Speaker 1: need you to wire them money desperately and you send 973 00:48:32,440 --> 00:48:34,839 Speaker 1: them the money, it's a fucking scam, right that is. 974 00:48:35,120 --> 00:48:37,359 Speaker 1: We had a person from Deloitte, and I think it 975 00:48:37,400 --> 00:48:39,640 Speaker 1: was a person from Adobe, say that they had been 976 00:48:39,719 --> 00:48:43,800 Speaker 1: called by a colleague who had gotten like a call 977 00:48:43,960 --> 00:48:47,000 Speaker 1: thinking it was that seemed to be them asking them 978 00:48:47,000 --> 00:48:49,640 Speaker 1: to buy a bunch of Apple gift cards like shit, 979 00:48:49,800 --> 00:48:51,799 Speaker 1: Like this is extreme and it's only going to get 980 00:48:51,800 --> 00:48:55,080 Speaker 1: more common. You can automate to the writing of the 981 00:48:55,120 --> 00:48:57,960 Speaker 1: scams and the sending of the scams using these AI tools, 982 00:48:58,280 --> 00:49:02,840 Speaker 1: and that is absolutely, in my opinion, much more of 983 00:49:02,880 --> 00:49:05,680 Speaker 1: a direct way in which AI is going to affect 984 00:49:05,680 --> 00:49:09,279 Speaker 1: people than any single product or even cumulatively, all of 985 00:49:09,280 --> 00:49:10,840 Speaker 1: the AI products we saw at CE. 986 00:49:13,600 --> 00:49:17,000 Speaker 4: All that uplifting note Yeah, yeah. 987 00:49:16,719 --> 00:49:19,359 Speaker 1: So that's a bummer, and well, we will be going 988 00:49:19,400 --> 00:49:21,480 Speaker 1: into more depth about that, but I wanted to end. 
989 00:49:21,560 --> 00:49:25,560 Speaker 1: Tavia took notes at all of the buzzwords, particularly the 990 00:49:25,560 --> 00:49:29,200 Speaker 1: AI buzzwords that we heard during the convention, and she's 991 00:49:29,239 --> 00:49:30,680 Speaker 1: going to read that to us now. 992 00:49:31,200 --> 00:49:33,520 Speaker 5: You gotta tell you, this list is incredible. I've worked 993 00:49:33,560 --> 00:49:36,040 Speaker 5: in and out of corporate America, and much like a cult, 994 00:49:36,040 --> 00:49:39,520 Speaker 5: they have their own internal vocabulary that they use, and 995 00:49:39,560 --> 00:49:44,279 Speaker 5: this convention we went to was just filthy with these buzzwords. 996 00:49:44,320 --> 00:49:46,759 Speaker 5: So I'm just going to dig in. The ones that 997 00:49:46,800 --> 00:49:50,120 Speaker 5: I've written down are double down, love that one. That 998 00:49:50,120 --> 00:49:54,480 Speaker 5: one comes up a lot. Versioning, versioning, versioning, which is 999 00:49:54,560 --> 00:49:57,240 Speaker 5: like a legitimate term in software, but I was hearing 1000 00:49:57,239 --> 00:49:59,520 Speaker 5: it used in places where it didn't make much sense 1001 00:49:59,560 --> 00:50:03,320 Speaker 5: to do it. Then our favorite liar's. 1002 00:50:02,960 --> 00:50:06,040 Speaker 4: Dividend by by by far the best term that we've 1003 00:50:06,040 --> 00:50:08,520 Speaker 4: heard at the conference, So flexible. 1004 00:50:08,840 --> 00:50:11,000 Speaker 1: Yeah, I'm using versions of that and everything. You know, 1005 00:50:11,040 --> 00:50:13,080 Speaker 1: it makes me think a lot about the murderer's dividend, 1006 00:50:13,120 --> 00:50:14,560 Speaker 1: which is when you know, I longer I have to 1007 00:50:14,560 --> 00:50:15,760 Speaker 1: deal with an annoying person. 1008 00:50:16,480 --> 00:50:19,399 Speaker 5: We got content credential, which is coming up a lot, 1009 00:50:19,480 --> 00:50:22,439 Speaker 5: especially around the topic of AI. We have data rich 1010 00:50:22,640 --> 00:50:26,520 Speaker 5: and it's sister term problem rich core values, which I 1011 00:50:26,600 --> 00:50:29,279 Speaker 5: heard in every single panel that we were in. 1012 00:50:29,560 --> 00:50:32,520 Speaker 1: Yeah. Usually the context of this was we don't need 1013 00:50:32,600 --> 00:50:36,480 Speaker 1: regulations around how AI can be made and put together. 1014 00:50:37,000 --> 00:50:39,759 Speaker 1: The core values of the companies is what we'll make 1015 00:50:39,800 --> 00:50:43,080 Speaker 1: sure that AI isn't used in a harmful way. 1016 00:50:43,120 --> 00:50:45,960 Speaker 3: Great, that's that's gonna happen. 1017 00:50:47,320 --> 00:50:51,960 Speaker 5: No, very trustworthy, very trustworthy groups. Got risk model. And 1018 00:50:52,000 --> 00:50:54,239 Speaker 5: then my next term is the favorite one. It's so good, 1019 00:50:54,400 --> 00:50:55,640 Speaker 5: I think I'm gonna give this one to you. 1020 00:50:56,200 --> 00:50:58,280 Speaker 1: Yeah, because I don't think we talked about this. Guardian 1021 00:50:58,480 --> 00:51:01,360 Speaker 1: MM or something like that was then it's MM guardian 1022 00:51:01,680 --> 00:51:05,440 Speaker 1: MM Guardian, which is an app you put on It's 1023 00:51:05,520 --> 00:51:07,080 Speaker 1: not it used to be an app. Now it is 1024 00:51:07,120 --> 00:51:09,880 Speaker 1: a phone you buy for your child. It's a modified 1025 00:51:09,920 --> 00:51:15,759 Speaker 1: Samsung Galaxy something or other that there's no seven. 
It 1026 00:51:15,880 --> 00:51:19,080 Speaker 1: gives your It gives you, as the parent, complete access 1027 00:51:19,120 --> 00:51:21,080 Speaker 1: to your kid's phone and everything they're doing. And it 1028 00:51:21,120 --> 00:51:25,160 Speaker 1: automatically monitors, monitors all of their not just their conversations, 1029 00:51:25,200 --> 00:51:28,400 Speaker 1: but their browsing history, and sends you alerts. So like, 1030 00:51:28,520 --> 00:51:31,279 Speaker 1: if someone sends your kid a text that says you 1031 00:51:31,280 --> 00:51:33,720 Speaker 1: should kys you know, kill yourself. This is the example 1032 00:51:33,719 --> 00:51:36,040 Speaker 1: he showed us. You get a message that like there's 1033 00:51:36,080 --> 00:51:40,640 Speaker 1: this suicidal discussion or whatnot going on, we ask them, 1034 00:51:40,920 --> 00:51:44,680 Speaker 1: you know. Hey. Garrison particularly was like, what if this 1035 00:51:44,719 --> 00:51:48,480 Speaker 1: is a situation where a parent is abusive and like 1036 00:51:48,880 --> 00:51:52,480 Speaker 1: using this in order to keep tabs on their kids 1037 00:51:52,600 --> 00:51:55,120 Speaker 1: or like hates you know, is like a child is 1038 00:51:55,120 --> 00:51:58,399 Speaker 1: gay or trans and their parents are not accepting of that. 1039 00:51:58,600 --> 00:52:01,279 Speaker 1: Like does this still can parents still like spy on 1040 00:52:01,320 --> 00:52:03,680 Speaker 1: them over that stuff? Are there any limitations? Are there 1041 00:52:03,680 --> 00:52:06,279 Speaker 1: any sort of safeguards built in in case a parent 1042 00:52:06,400 --> 00:52:09,200 Speaker 1: is being abusive right to like monitor or sin to 1043 00:52:09,239 --> 00:52:11,120 Speaker 1: the authorities of a parent is using this in an 1044 00:52:11,160 --> 00:52:14,680 Speaker 1: abusive way. And their answer was no, We're purely about 1045 00:52:14,719 --> 00:52:18,399 Speaker 1: giving parents more power. And yeah, the term that they 1046 00:52:18,600 --> 00:52:21,400 Speaker 1: used was tech contracts with children. 1047 00:52:23,640 --> 00:52:25,520 Speaker 5: I can't think of anything more dismal. 1048 00:52:26,120 --> 00:52:29,799 Speaker 1: Yeah, that is one of the most dystopian assemblies of 1049 00:52:29,920 --> 00:52:31,000 Speaker 1: words I've ever heard. 1050 00:52:31,080 --> 00:52:35,040 Speaker 4: Should you should you should never say the phrase contracts 1051 00:52:35,080 --> 00:52:38,439 Speaker 4: with children. That's just that's just like if you find 1052 00:52:38,480 --> 00:52:42,360 Speaker 4: yourself ever ever hearing the phrase contracts with children spoken 1053 00:52:42,440 --> 00:52:44,600 Speaker 4: by anyone, run away from that person as fast as 1054 00:52:44,640 --> 00:52:47,279 Speaker 4: you can, maybe maybe maybe punch them in the face first, 1055 00:52:47,560 --> 00:52:49,399 Speaker 4: and then run away as fast as you can. 1056 00:52:50,760 --> 00:52:52,120 Speaker 1: So that's a good one. 1057 00:52:52,760 --> 00:52:54,840 Speaker 3: That's that's some shit you just keep in Florida. I 1058 00:52:54,880 --> 00:52:55,600 Speaker 3: guess now. 1059 00:52:57,360 --> 00:52:58,759 Speaker 1: It's a super Florida app. 1060 00:52:58,920 --> 00:52:59,200 Speaker 4: That is. 1061 00:52:59,280 --> 00:53:00,960 Speaker 1: That is the scent of this business. 
1062 00:53:01,920 --> 00:53:05,080 Speaker 5: Moving on, We've got other terms called like visionary and 1063 00:53:05,120 --> 00:53:07,400 Speaker 5: thought leaders, which comes up a lot in these types. 1064 00:53:07,200 --> 00:53:10,680 Speaker 2: Of I mean, the pr shit people love saying thought leader. 1065 00:53:11,000 --> 00:53:11,480 Speaker 1: I love it. 1066 00:53:12,680 --> 00:53:18,160 Speaker 5: Eat it up. We also have Edge Computing I know. 1067 00:53:19,200 --> 00:53:21,400 Speaker 1: Yeah again handy, great company. 1068 00:53:21,200 --> 00:53:25,479 Speaker 5: Incredible company, very very excellent product. We have Digital Twin 1069 00:53:26,080 --> 00:53:27,040 Speaker 5: Horizon Scan. 1070 00:53:27,360 --> 00:53:31,000 Speaker 2: So Digital Twin's really good because it means like eight 1071 00:53:31,040 --> 00:53:35,000 Speaker 2: different things. It can mean literally a copy of something, 1072 00:53:35,280 --> 00:53:37,840 Speaker 2: or it can mean a digital version of something. It 1073 00:53:37,840 --> 00:53:40,399 Speaker 2: can mean like a metaverse thing. And these are all 1074 00:53:40,480 --> 00:53:42,719 Speaker 2: different industries using it, and no one can agree on 1075 00:53:42,760 --> 00:53:43,160 Speaker 2: the meaning. 1076 00:53:43,480 --> 00:53:45,680 Speaker 5: Yeah, that's just tradition. That's just like what they do. 1077 00:53:46,560 --> 00:53:49,120 Speaker 5: They have horizon scan. I actually kind of liked that one. 1078 00:53:49,239 --> 00:53:51,120 Speaker 5: Was the first time I heard that one. When they're 1079 00:53:51,160 --> 00:53:53,279 Speaker 5: just like looking into the future, I think they're calling 1080 00:53:53,360 --> 00:53:57,239 Speaker 5: that horizon scan use case, which came up a lot 1081 00:53:57,320 --> 00:54:00,759 Speaker 5: because everyone was groping for use cases for their technology 1082 00:54:00,800 --> 00:54:03,319 Speaker 5: and didn't seem to have any that they could bring up. 1083 00:54:04,040 --> 00:54:08,880 Speaker 4: The next one I heard way more than I wanted 1084 00:54:08,960 --> 00:54:12,520 Speaker 4: to hear, which was accelerate. Yes, always always a great 1085 00:54:12,640 --> 00:54:16,440 Speaker 4: term to hear in tech. There was there was so 1086 00:54:16,680 --> 00:54:20,759 Speaker 4: much accelerate and accelerating relating to their tech development and 1087 00:54:20,800 --> 00:54:24,520 Speaker 4: their tech use cases. For another one of those terms 1088 00:54:24,520 --> 00:54:25,719 Speaker 4: that Tavia just read off. 1089 00:54:25,800 --> 00:54:27,920 Speaker 1: Now, this next term is a real thing and an 1090 00:54:27,960 --> 00:54:30,520 Speaker 1: important thing, and not a thing that anyone in the 1091 00:54:30,520 --> 00:54:34,000 Speaker 1: tech industry wants or cares about the right to be forgotten. 1092 00:54:34,480 --> 00:54:36,520 Speaker 1: This has actually been legislated. The reason they have to 1093 00:54:36,560 --> 00:54:38,879 Speaker 1: care about this to some extent is it's been legislated 1094 00:54:38,880 --> 00:54:41,480 Speaker 1: in the EU right, and it should be everywhere. 
I 1095 00:54:41,520 --> 00:54:44,759 Speaker 1: actually think this is an incredibly important concept, and it's 1096 00:54:44,760 --> 00:54:48,040 Speaker 1: basically the you know, we have people go viral that 1097 00:54:48,200 --> 00:54:51,279 Speaker 1: become a main character on whatever app for being a 1098 00:54:51,280 --> 00:54:54,040 Speaker 1: piece of shit sometimes or sometimes doing something stupid or 1099 00:54:54,080 --> 00:54:56,719 Speaker 1: sometimes doing something innocuous that for no reason at all 1100 00:54:57,040 --> 00:54:57,799 Speaker 1: makes a huge number. 1101 00:54:58,160 --> 00:54:59,320 Speaker 3: He's actually a really good example. 1102 00:54:59,360 --> 00:55:01,680 Speaker 2: There was a kid who posted a video of himself 1103 00:55:01,960 --> 00:55:05,080 Speaker 2: and it was like four point zero gpa had a job, 1104 00:55:05,239 --> 00:55:08,280 Speaker 2: brais money didn't get into Harvard or something. He didn't 1105 00:55:08,320 --> 00:55:10,239 Speaker 2: mean it in this way, but someone took it and 1106 00:55:10,280 --> 00:55:12,480 Speaker 2: then turned it into ay why kids are being kept 1107 00:55:12,520 --> 00:55:14,839 Speaker 2: off Harvard thing? And he dm them was like, you're 1108 00:55:14,920 --> 00:55:18,120 Speaker 2: ruining my fucking life. Yeah, this is how this, like 1109 00:55:18,160 --> 00:55:20,360 Speaker 2: the right to be forgotten, should be everywhere. 1110 00:55:20,440 --> 00:55:21,000 Speaker 3: Yeah, is not. 1111 00:55:21,320 --> 00:55:24,560 Speaker 1: It is a hugely important thing, and you know, I 1112 00:55:24,880 --> 00:55:26,680 Speaker 1: actually give the EU a lot of credit for the 1113 00:55:26,719 --> 00:55:29,279 Speaker 1: fact that that has to some extent been legislated. All 1114 00:55:29,280 --> 00:55:31,600 Speaker 1: of that needs to be more common in other countries 1115 00:55:31,680 --> 00:55:36,120 Speaker 1: and more vigorously enforced. I don't I say that I 1116 00:55:36,160 --> 00:55:37,880 Speaker 1: have no idea how you do it with the internet. 1117 00:55:37,920 --> 00:55:40,200 Speaker 1: Working the way it does. Some of this, I actually 1118 00:55:40,280 --> 00:55:43,080 Speaker 1: do think is a values thing where we all need 1119 00:55:43,120 --> 00:55:45,760 Speaker 1: to be more okay with the fact that people, even 1120 00:55:45,800 --> 00:55:49,600 Speaker 1: people who can do something shitty online, deserve to not 1121 00:55:49,680 --> 00:55:53,560 Speaker 1: have that necessarily define the rest of their lives, especially 1122 00:55:54,120 --> 00:55:55,399 Speaker 1: you know, teenagers. 1123 00:55:56,800 --> 00:55:59,000 Speaker 2: And the next one is one that I like to 1124 00:55:59,040 --> 00:56:03,480 Speaker 2: associate with my posts, data poisoning. I believe every time 1125 00:56:03,520 --> 00:56:06,839 Speaker 2: I interact with Twitter or blue Sky, that is what 1126 00:56:06,880 --> 00:56:09,640 Speaker 2: I am doing. I have some data poisoning gap, or 1127 00:56:09,719 --> 00:56:13,120 Speaker 2: I am data poisoning as a verb, or I am 1128 00:56:13,239 --> 00:56:14,480 Speaker 2: data poisoning myself. 1129 00:56:14,560 --> 00:56:16,920 Speaker 1: Yeah. 1130 00:56:17,400 --> 00:56:17,680 Speaker 5: Uh. 1131 00:56:17,719 --> 00:56:20,640 Speaker 1: And then we've got oh, Garrison, you want to do 1132 00:56:20,680 --> 00:56:21,080 Speaker 1: this one? 1133 00:56:21,160 --> 00:56:21,440 Speaker 3: Sure? 
1134 00:56:21,640 --> 00:56:23,480 Speaker 4: These these are the last three that I got from 1135 00:56:23,520 --> 00:56:27,640 Speaker 4: an AI ethics panel. We have data silos, how data 1136 00:56:27,719 --> 00:56:31,200 Speaker 4: is all separated. We have data harmonization, kind of the 1137 00:56:31,239 --> 00:56:32,640 Speaker 4: opposite of data silos. 1138 00:56:32,719 --> 00:56:35,560 Speaker 1: Yeah, that's basically using AI to generate pictures of dan 1139 00:56:35,600 --> 00:56:36,319 Speaker 1: harmon Right. 1140 00:56:37,400 --> 00:56:42,239 Speaker 4: Yes, Then we have the last term, which I will 1141 00:56:42,360 --> 00:56:48,000 Speaker 4: I will describe for you the speed capacity gap. So 1142 00:56:48,760 --> 00:56:50,240 Speaker 4: the speed capacity. 1143 00:56:49,800 --> 00:56:51,680 Speaker 1: Gap, I know I can answer that for you. So 1144 00:56:51,800 --> 00:56:55,920 Speaker 1: sometimes when I'm doing a shitload of amphetamines that I 1145 00:56:56,000 --> 00:56:59,640 Speaker 1: purchased from some Turkish website via the dark web. You know, 1146 00:56:59,680 --> 00:57:02,320 Speaker 1: I'm doing them with a friend and they od because 1147 00:57:02,360 --> 00:57:04,920 Speaker 1: there's a day there's a speed capacity gap, which we 1148 00:57:05,040 --> 00:57:05,759 Speaker 1: need two of us. 1149 00:57:06,560 --> 00:57:09,480 Speaker 4: Yeah, that's what that uh, that's what that DHS guy 1150 00:57:09,560 --> 00:57:11,919 Speaker 4: was talking about. For using AI to monitor dark web 1151 00:57:11,960 --> 00:57:15,480 Speaker 4: purchases is going to really get on that one. No 1152 00:57:15,800 --> 00:57:19,440 Speaker 4: speed capacity gap the gap between tech acceleration and the 1153 00:57:19,480 --> 00:57:22,760 Speaker 4: capacity of society to keep up and make informed decisions 1154 00:57:22,800 --> 00:57:27,120 Speaker 4: about that technology, which is actually kind of a useful terms. 1155 00:57:27,160 --> 00:57:28,880 Speaker 4: It's it's just one of those you know, it sounds 1156 00:57:28,880 --> 00:57:31,040 Speaker 4: like a silly tech term, but when the win it's 1157 00:57:31,040 --> 00:57:32,919 Speaker 4: actually explained like, oh, that's actually a really good way 1158 00:57:32,920 --> 00:57:36,040 Speaker 4: to think about the way AI is being pushed in 1159 00:57:36,080 --> 00:57:38,760 Speaker 4: all of these new ways, and are we actually as 1160 00:57:38,800 --> 00:57:41,440 Speaker 4: a society, whether that's like as a government or just 1161 00:57:41,480 --> 00:57:44,160 Speaker 4: like culturally, able to actually make inform decisions about how 1162 00:57:44,160 --> 00:57:46,480 Speaker 4: we want this tech to be integrated into our lives. 1163 00:57:47,120 --> 00:57:49,640 Speaker 4: And now the dark side of this term, the speed 1164 00:57:49,720 --> 00:57:53,320 Speaker 4: capacity gap. For the to to kind of solve this gap, 1165 00:57:53,320 --> 00:57:56,520 Speaker 4: we can either slow down a development or we can 1166 00:57:56,560 --> 00:58:00,440 Speaker 4: speed up our capacity. And the panelists obviously we preferred 1167 00:58:00,480 --> 00:58:04,040 Speaker 4: the latter, and so we should just speed up our 1168 00:58:04,080 --> 00:58:05,080 Speaker 4: cultural capacity. 1169 00:58:05,760 --> 00:58:07,240 Speaker 5: Did they propose a solution for. 1170 00:58:07,240 --> 00:58:12,920 Speaker 4: That, Well, kind of, but it's it's a little unclear. 
1171 00:58:12,960 --> 00:58:15,080 Speaker 4: We can go through my recording at a later date 1172 00:58:15,080 --> 00:58:18,720 Speaker 4: once we do our full AI episode. But their rationale 1173 00:58:18,760 --> 00:58:21,240 Speaker 4: for why we should, instead of, instead of slowing down 1174 00:58:21,240 --> 00:58:24,560 Speaker 4: tech development, instead speed up our cultural capacity, is because 1175 00:58:24,640 --> 00:58:28,640 Speaker 4: of the many benefits, that tech improvements can be made 1176 00:58:29,360 --> 00:58:32,520 Speaker 4: via tech iterations. Right, the more iterations you get of 1177 00:58:32,520 --> 00:58:35,320 Speaker 4: a technology, the more benefits you're able to get from 1178 00:58:35,360 --> 00:58:39,440 Speaker 4: said technology. Ah, versioning. Versioning exactly, which brings us 1179 00:58:39,480 --> 00:58:41,120 Speaker 4: all the way back to versioning. There we go. 1180 00:58:41,280 --> 00:58:43,840 Speaker 1: Yeah, which brings us all the way back to Turkish amphetamines. 1181 00:58:43,880 --> 00:58:46,520 Speaker 1: Because I've been, for the last twenty years, trying different 1182 00:58:46,760 --> 00:58:51,840 Speaker 1: versions of Turkish amphetamines, and the blue pills, man. You know, 1183 00:58:51,960 --> 00:58:55,720 Speaker 1: normally you don't hallucinate on speed, but when you take enough, 1184 00:58:55,800 --> 00:58:58,320 Speaker 1: it turns out you can. And so I think what 1185 00:58:58,360 --> 00:59:01,240 Speaker 1: I'd like to leave everyone with is the knowledge that 1186 00:59:01,360 --> 00:59:03,600 Speaker 1: Turkish amphetamines are a thing you can purchase on the 1187 00:59:03,680 --> 00:59:06,160 Speaker 1: dark web, and there's no health consequences to it 1188 00:59:06,200 --> 00:59:06,480 Speaker 1: at all. 1189 00:59:07,480 --> 00:59:11,200 Speaker 2: I'm not part of this. Better Offline does not support 1190 00:59:11,240 --> 00:59:12,520 Speaker 2: illegal drug purchases. 1191 00:59:12,680 --> 00:59:13,880 Speaker 3: Respectful podcast. 1192 00:59:14,520 --> 00:59:17,600 Speaker 1: They're not illegal if they're so new that the DEA 1193 00:59:17,760 --> 00:59:21,680 Speaker 1: hasn't banned them yet. That's innovation exactly, exactly. 1194 00:59:21,960 --> 00:59:29,000 Speaker 2: That's versioning, and that is the speed capacity gap, folks. 1195 00:59:32,120 --> 00:59:34,800 Speaker 4: The DEA can't keep up with the tech improvements. 1196 00:59:36,400 --> 00:59:38,440 Speaker 1: All right, everybody, that's going to do it for us 1197 00:59:38,480 --> 00:59:40,840 Speaker 1: here at Cool Zone. Before we leave, I want to 1198 00:59:40,840 --> 00:59:44,600 Speaker 1: give Tavia and Ed both chances to plug their pluggables. Ed, 1199 00:59:44,600 --> 00:59:46,200 Speaker 1: people are going to be hearing from you every week 1200 00:59:46,240 --> 00:59:48,520 Speaker 1: on your new show, Better Offline, which is launching 1201 00:59:48,880 --> 00:59:51,680 Speaker 1: in what I'm sure you'll agree is a frighteningly 1202 00:59:51,760 --> 00:59:52,600 Speaker 1: short time frame. 1203 00:59:52,720 --> 00:59:52,959 Speaker 4: Soon. 1204 00:59:53,720 --> 00:59:56,160 Speaker 2: It is going to be the best weekly tech show. 1205 00:59:56,560 --> 00:59:58,280 Speaker 2: It is going to do the job that no one 1206 00:59:58,360 --> 01:00:00,880 Speaker 2: is strong enough to do, which is ask questions, listen 1207 01:00:00,920 --> 01:00:03,919 Speaker 2: to the answers, then actually ask a question that follows them.
1208 01:00:04,320 --> 01:00:06,680 Speaker 2: I'm very much looking forward to this and very excited 1209 01:00:06,680 --> 01:00:10,720 Speaker 2: to work with the Cool Zone team. And Tavia. 1210 01:00:11,360 --> 01:00:15,400 Speaker 5: Oh. You can find me on Twitter at cutma, and 1211 01:00:15,440 --> 01:00:17,360 Speaker 5: if you want to learn a little bit more about 1212 01:00:17,400 --> 01:00:20,120 Speaker 5: my interactive and immersive work, you can see that at 1213 01:00:20,160 --> 01:00:22,080 Speaker 5: Tabimora dot com. 1214 01:00:22,120 --> 01:00:23,760 Speaker 2: Now you may wonder why I didn't give you any 1215 01:00:23,840 --> 01:00:28,160 Speaker 2: links to anything, and that was a deliberate thing called subterfuge. 1216 01:00:28,160 --> 01:00:29,840 Speaker 2: But you can find me at where's your ed dot 1217 01:00:29,880 --> 01:00:34,800 Speaker 2: at, at Ed Zitron on Twitter and X, and 1218 01:00:34,880 --> 01:00:37,680 Speaker 2: of course Bluesky, Zitron dot bsky dot social. 1219 01:00:37,880 --> 01:00:44,520 Speaker 1: Yeah, and you can find my profile on here. All right, 1220 01:00:44,560 --> 01:00:45,480 Speaker 1: we're fucking done here. 1221 01:00:51,240 --> 01:00:53,600 Speaker 3: It Could Happen Here is a production of Cool Zone Media. 1222 01:00:53,840 --> 01:00:56,480 Speaker 3: For more podcasts from Cool Zone Media, visit our website 1223 01:00:56,560 --> 01:00:58,720 Speaker 3: coolzonemedia dot com, or check us out on the 1224 01:00:58,800 --> 01:01:02,200 Speaker 3: iHeartRadio app, Apple Podcasts, or wherever you listen to podcasts. 1225 01:01:02,640 --> 01:01:04,720 Speaker 4: You can find sources for It Could Happen Here, updated 1226 01:01:04,840 --> 01:01:07,800 Speaker 4: monthly at coolzonemedia dot com slash sources. 1227 01:01:08,040 --> 01:01:08,840 Speaker 5: Thanks for listening.