1 00:00:01,720 --> 00:00:03,200 Speaker 1: All media. 2 00:00:03,320 --> 00:00:05,720 Speaker 2: Hey everybody, Robert Evans here, and I wanted to let 3 00:00:05,800 --> 00:00:09,440 Speaker 2: you know this is a compilation episode. So every episode 4 00:00:09,560 --> 00:00:12,680 Speaker 2: of the week that just happened is here in one 5 00:00:12,800 --> 00:00:16,520 Speaker 2: convenient and with somewhat fewer ads package for you to 6 00:00:16,560 --> 00:00:19,040 Speaker 2: listen to in a long stretch if you want. If 7 00:00:19,040 --> 00:00:21,000 Speaker 2: you've been listening to the episodes every day this week, 8 00:00:21,040 --> 00:00:22,759 Speaker 2: there's going to be nothing new here for you, but 9 00:00:23,120 --> 00:00:24,760 Speaker 2: you can make your own decisions. 10 00:00:25,480 --> 00:00:25,600 Speaker 3: Hi. 11 00:00:25,640 --> 00:00:29,000 Speaker 4: Everyone, it's James coming at you with a pretty nasty cold here. 12 00:00:29,320 --> 00:00:32,440 Speaker 4: I wanted to share with you that wildfires have swept 13 00:00:32,479 --> 00:00:35,240 Speaker 4: through Los Angeles in the last couple of days. While 14 00:00:35,240 --> 00:00:39,640 Speaker 4: I'm recording this, thousands of people have been displaced. Five 15 00:00:39,720 --> 00:00:43,479 Speaker 4: people have died that we know of so far, thousands 16 00:00:43,520 --> 00:00:46,120 Speaker 4: of structures have been burned, and many, many people in 17 00:00:46,200 --> 00:00:48,720 Speaker 4: LA will be finding themselves out of their homes with 18 00:00:48,840 --> 00:00:51,960 Speaker 4: nowhere to go, with very few resources. If you'd like 19 00:00:52,040 --> 00:00:54,440 Speaker 4: to help, we've come up with some mutual aid groups 20 00:00:54,680 --> 00:00:57,040 Speaker 4: who you can donate to, and we'll be interviewing one 21 00:00:57,040 --> 00:00:59,800 Speaker 4: of them on this show next week.
So if you'd 22 00:01:00,160 --> 00:01:02,680 Speaker 4: like to help, the three places where we suggest you 23 00:01:02,680 --> 00:01:08,280 Speaker 4: would donate some cash: The Sidewalk Project, that's the Sidewalk 24 00:01:08,560 --> 00:01:13,800 Speaker 4: Project dot org. K Town for All, that's 25 00:01:13,880 --> 00:01:16,720 Speaker 4: K T O W N F O R A L 26 00:01:16,840 --> 00:01:21,000 Speaker 4: L dot O R G. And Aetna Street Solidarity. You 27 00:01:21,000 --> 00:01:24,160 Speaker 4: can find them on Venmo or I think on Instagram 28 00:01:24,200 --> 00:01:28,200 Speaker 4: as well. That's A E T N A S T 29 00:01:28,400 --> 00:01:32,280 Speaker 4: R E E T S O L I D A 30 00:01:32,720 --> 00:01:33,679 Speaker 4: R I T Y. 31 00:01:34,200 --> 00:01:39,480 Speaker 2: All right, I'm gonna go rest my voice. Order in 32 00:01:39,520 --> 00:01:43,480 Speaker 2: the court! Order in the court! Justice Robert Evans presiding. 33 00:01:43,600 --> 00:01:46,520 Speaker 2: I see we have a fine jury here to take 34 00:01:46,600 --> 00:01:50,480 Speaker 2: questions from the audience of our daily news show, 35 00:01:51,040 --> 00:01:54,400 Speaker 2: which is also my courtroom. Everybody, everybody get it? 36 00:01:54,400 --> 00:01:57,080 Speaker 2: Because I'm a judge now. Really, because that's how the 37 00:01:57,120 --> 00:01:58,160 Speaker 2: legal system works. 38 00:01:58,400 --> 00:02:02,280 Speaker 5: All those rubers finally have true. 39 00:02:02,960 --> 00:02:04,720 Speaker 2: No, municipal Judge, Garrison. 40 00:02:04,800 --> 00:02:09,919 Speaker 5: That's okay, okay, that's get You're right, You're right, You're 41 00:02:09,960 --> 00:02:10,839 Speaker 5: right here. 42 00:02:10,919 --> 00:02:12,640 Speaker 2: I will now, for the rest of my life, be 43 00:02:12,639 --> 00:02:15,400 Speaker 2: able to say, when people ask questions, well, as a 44 00:02:15,440 --> 00:02:19,040 Speaker 2: man of the law, which I'm very much looking forward to.
45 00:02:19,760 --> 00:02:22,720 Speaker 4: Not only able to say, Robert, quite likely 46 00:02:22,560 --> 00:02:26,320 Speaker 2: to say. Anyway, that's all I got. 47 00:02:26,680 --> 00:02:28,919 Speaker 1: All right, this is the It Could Happen Here 48 00:02:29,000 --> 00:02:31,360 Speaker 1: Q and A episode. We've got... what are we 49 00:02:31,400 --> 00:02:33,400 Speaker 1: calling you now, Robert Evans? What's your title? 50 00:02:34,080 --> 00:02:37,200 Speaker 2: The Honorable Robert Evans. And actually, the 51 00:02:37,520 --> 00:02:39,880 Speaker 2: judge who made me a judge sent me a gavel, 52 00:02:40,200 --> 00:02:42,120 Speaker 2: but I didn't grab it for this one. So I 53 00:02:42,240 --> 00:02:45,359 Speaker 2: just used... I have the barrel and lower 54 00:02:45,440 --> 00:02:48,560 Speaker 2: receiver from an antique sawed-off shotgun that belonged to 55 00:02:48,560 --> 00:02:50,880 Speaker 2: a bootlegger, and I just sort of slammed that into 56 00:02:50,880 --> 00:02:51,280 Speaker 2: my table. 57 00:02:51,600 --> 00:02:53,280 Speaker 5: I'm sure our editor will love that. 58 00:02:53,720 --> 00:02:57,240 Speaker 4: Yeah, yeah, but before we broadcast, so you have a 59 00:02:57,280 --> 00:02:58,000 Speaker 4: sawed-off shotgun. 60 00:02:58,320 --> 00:03:00,880 Speaker 2: It's not, it's not functional, it's been destroyed. 61 00:03:01,120 --> 00:03:03,079 Speaker 4: I see, I see, good. Didn't want a little really 62 00:03:03,120 --> 00:03:03,680 Speaker 4: rich moment. 63 00:03:03,919 --> 00:03:07,560 Speaker 1: Yeah, we've got Mia Wong, Garrison Davis, James Stout and 64 00:03:07,639 --> 00:03:09,880 Speaker 1: the dishonorable Robert Evans. 65 00:03:09,840 --> 00:03:13,400 Speaker 5: And Sophie Lichterman. Oh yes, I mean, yeah, we're 66 00:03:13,240 --> 00:03:15,320 Speaker 1: gonna do the... We're gonna do some questions we posted 67 00:03:15,360 --> 00:03:16,840 Speaker 1: on our Blue Sky.
If you're not following us 68 00:03:16,840 --> 00:03:18,440 Speaker 1: on Blue Sky, we are on there. 69 00:03:18,639 --> 00:03:21,200 Speaker 2: Blue Sky? One does not post on Blue Sky. So 70 00:03:21,280 --> 00:03:21,960 Speaker 2: if you want... 71 00:03:21,760 --> 00:03:24,600 Speaker 1: Skeets? I really hope that's not true, because that's really 72 00:03:24,840 --> 00:03:28,800 Speaker 2: embarrassing. Unfortunately, they really tried to get that off the ground. 73 00:03:28,840 --> 00:03:30,880 Speaker 2: I don't see anyone actually using skeet. 74 00:03:31,040 --> 00:03:33,240 Speaker 4: I saw someone using it in French and it was 75 00:03:33,280 --> 00:03:36,400 Speaker 4: a real moment. How are you, Garrison? 76 00:03:36,840 --> 00:03:39,080 Speaker 5: Instead of saying send a tweet, now I just say 77 00:03:39,200 --> 00:03:42,240 Speaker 5: send skeet in conversation. Everyone loves it. M hm. 78 00:03:42,640 --> 00:03:44,840 Speaker 2: Do you reskeet? Is that a thing? Yeah? 79 00:03:44,880 --> 00:03:47,280 Speaker 5: I guess you do. I guess you do, mm hm. 80 00:03:47,320 --> 00:03:50,360 Speaker 1: And we're moving on. I'm just gonna throw out some 81 00:03:50,440 --> 00:03:53,160 Speaker 1: of the questions we received online. I'm not even 82 00:03:53,200 --> 00:03:55,000 Speaker 1: gonna say the name of the app again because I'm 83 00:03:55,000 --> 00:03:59,920 Speaker 1: afraid of being labeled as an old. Garrison's embarrassed by me. 84 00:04:00,160 --> 00:04:02,920 Speaker 1: I can tell. I didn't say that, but you thought it. 85 00:04:03,080 --> 00:04:05,600 Speaker 5: But you thought it. I didn't think that you did. 86 00:04:06,400 --> 00:04:08,800 Speaker 1: Any advice for someone with a desire to do some 87 00:04:08,840 --> 00:04:11,720 Speaker 1: hobby or freelance journalism in the coming few years? I 88 00:04:11,800 --> 00:04:15,640 Speaker 1: want to actively fight for equality.
Also, thank you for 89 00:04:15,720 --> 00:04:16,719 Speaker 1: your questions, everyone. 90 00:04:17,160 --> 00:04:20,120 Speaker 2: Hm, I don't thank you for your questions. I'm actively 91 00:04:20,240 --> 00:04:21,640 Speaker 2: angry at you for your question. 92 00:04:21,760 --> 00:04:23,360 Speaker 1: Yeah, that's why you're the dishonorable. 93 00:04:23,560 --> 00:04:26,840 Speaker 4: Yeah, start rich if you want to be a freelance journalist, 94 00:04:26,880 --> 00:04:28,599 Speaker 4: because you'll progressively become poorer. 95 00:04:30,520 --> 00:04:34,000 Speaker 2: It's my... I have funded my journalism. I 96 00:04:34,040 --> 00:04:37,159 Speaker 2: love whenever people ask me questions like, how did you 97 00:04:37,200 --> 00:04:40,800 Speaker 2: convince Cracked to send you to Iraq? I didn't. I 98 00:04:40,839 --> 00:04:44,520 Speaker 2: bought plane tickets. Like, being an entertainer has always been 99 00:04:44,560 --> 00:04:46,000 Speaker 2: what's funded my journalism. 100 00:04:46,480 --> 00:04:50,679 Speaker 5: I guess my advice would be get really autistic about 101 00:04:50,680 --> 00:04:55,320 Speaker 5: something problematic, just like one thing, this one thing. 102 00:04:55,360 --> 00:04:58,479 Speaker 5: Get like really into it to the point where it 103 00:04:58,560 --> 00:05:01,320 Speaker 5: kind of takes over your life. Your personal life starts 104 00:05:01,320 --> 00:05:04,160 Speaker 5: fading away, it kind of blends into your whole state 105 00:05:04,200 --> 00:05:08,400 Speaker 5: of existence. And only then will you actually get good 106 00:05:08,400 --> 00:05:12,680 Speaker 5: at that thing. Yep, that's my advice. And then you 107 00:05:12,720 --> 00:05:14,440 Speaker 5: just take one thing at a time and every few 108 00:05:14,480 --> 00:05:17,279 Speaker 5: years you kind of change the scope of the thing 109 00:05:17,480 --> 00:05:20,440 Speaker 5: you're getting really autistic about.
But that's kind of how 110 00:05:20,480 --> 00:05:22,720 Speaker 5: I've rolled, and it's been, it's been okay. 111 00:05:23,200 --> 00:05:26,320 Speaker 2: Yeah, you just finished thirty six hours of digging into 112 00:05:26,400 --> 00:05:31,159 Speaker 2: the life of a school shooter. And I also built 113 00:05:31,200 --> 00:05:34,040 Speaker 2: the back of my career spending hours and hours digging 114 00:05:34,080 --> 00:05:38,000 Speaker 2: through the online lives of mass shooters. And you don't 115 00:05:38,000 --> 00:05:40,560 Speaker 2: have to do that, but you do have to do 116 00:05:40,640 --> 00:05:43,359 Speaker 2: that thing, which is, yeah, exactly what Garrison said. You 117 00:05:43,400 --> 00:05:47,000 Speaker 2: have to pick a very narrow thing and make it 118 00:05:47,080 --> 00:05:49,520 Speaker 2: your life, and not just a random thing, but like 119 00:05:49,520 --> 00:05:51,800 Speaker 2: a thing that you think is important. Yeah, and that 120 00:05:51,880 --> 00:05:54,440 Speaker 2: other people don't understand how important it is. 121 00:05:54,960 --> 00:05:58,239 Speaker 2: And if you make yourself... there's a fella, his blog 122 00:05:58,320 --> 00:06:00,880 Speaker 2: is called We Hunted the Mammoth, David Futrelle, who's been 123 00:06:00,880 --> 00:06:04,040 Speaker 2: covering what we call the manosphere for like more than 124 00:06:04,040 --> 00:06:08,640 Speaker 2: a decade before anybody else in journalism was taking it seriously. Yep, 125 00:06:08,760 --> 00:06:11,080 Speaker 2: you got to do that kind of thing.
If you 126 00:06:11,120 --> 00:06:14,000 Speaker 2: do that kind of thing, you build a name for yourself, 127 00:06:14,000 --> 00:06:16,719 Speaker 2: and that can allow you, when the thing that 128 00:06:16,760 --> 00:06:21,359 Speaker 2: you're obsessed with becomes a big story, being first to 129 00:06:21,680 --> 00:06:24,440 Speaker 2: have something meaningful to say about it can provide you 130 00:06:24,480 --> 00:06:28,359 Speaker 2: eventually with the opportunity to cover other things. Yeah, and 131 00:06:28,440 --> 00:06:29,040 Speaker 2: it's good advice. 132 00:06:29,080 --> 00:06:31,120 Speaker 4: I would say, if you want to get started freelancing, 133 00:06:31,600 --> 00:06:34,880 Speaker 4: it's a good idea to join the IWW Freelance Journalists Union. 134 00:06:35,040 --> 00:06:37,560 Speaker 4: You can learn a lot from people who are freelancing there. 135 00:06:37,600 --> 00:06:39,760 Speaker 4: You can learn who not to pitch, which editors are 136 00:06:39,760 --> 00:06:43,080 Speaker 4: toxic as fuck, which is a surprisingly large amount. Yeah, 137 00:06:43,120 --> 00:06:45,120 Speaker 4: you can learn which email to send your pitches to 138 00:06:45,240 --> 00:06:47,240 Speaker 4: and how to pitch, if you're not familiar with how 139 00:06:47,240 --> 00:06:50,159 Speaker 4: to pitch. I also sometimes teach journalism workshops at a 140 00:06:50,160 --> 00:06:52,960 Speaker 4: community college. So if you have a community college near you, 141 00:06:52,960 --> 00:06:55,320 Speaker 4: you might be able to get some either free or 142 00:06:55,440 --> 00:06:58,760 Speaker 4: very cheap advice on the real, like, nuts 143 00:06:58,800 --> 00:07:01,360 Speaker 4: and bolts of journalism, like sending pitches and stuff like that. 144 00:07:02,240 --> 00:07:02,480 Speaker 2: Cool. 145 00:07:03,360 --> 00:07:06,479 Speaker 1: What is the consensus on what the next Trump administration 146 00:07:06,600 --> 00:07:10,120 Speaker 1: will do?
On the first day or first week, all 147 00:07:10,160 --> 00:07:12,119 Speaker 1: of us just look like we're in pain. 148 00:07:12,760 --> 00:07:21,560 Speaker 5: Oof, like, it's chaos. Yeah. 149 00:07:21,080 --> 00:07:21,320 Speaker 2: I'm not... 150 00:07:21,480 --> 00:07:23,680 Speaker 5: I'm not foreseeing good things. There'll be a lot 151 00:07:23,680 --> 00:07:27,160 Speaker 5: of executive orders that are, you know, probably bad, you know, 152 00:07:27,280 --> 00:07:28,200 Speaker 5: things that aren't great. 153 00:07:28,920 --> 00:07:34,000 Speaker 2: Yeah, I think that, uh, he's going to try to 154 00:07:34,080 --> 00:07:36,560 Speaker 2: do as much of what he's promised to do, in 155 00:07:36,680 --> 00:07:38,760 Speaker 2: terms of, in particular, not everything he's promised, 156 00:07:38,800 --> 00:07:42,560 Speaker 2: but in terms of going after immigrants. Yeah, he's going 157 00:07:42,600 --> 00:07:44,200 Speaker 2: to do as much of what he's promised to do 158 00:07:44,240 --> 00:07:46,840 Speaker 2: as he possibly can. Now that doesn't mean he's going 159 00:07:46,880 --> 00:07:49,960 Speaker 2: to actually deport millions of people. There are like some 160 00:07:50,160 --> 00:07:54,080 Speaker 2: just practical limitations based on the capacity of the institutions 161 00:07:54,040 --> 00:07:56,640 Speaker 2: he'll be using to do this, and 162 00:07:56,840 --> 00:07:58,760 Speaker 2: there's a very good chance things will get bogged down 163 00:07:58,800 --> 00:08:02,760 Speaker 2: and whatnot, but like, he will try. Yeah, that's my take. Yeah, 164 00:08:02,840 --> 00:08:03,120 Speaker 2: I think... 165 00:08:03,120 --> 00:08:04,679 Speaker 6: I think the other thing that's going to happen pretty 166 00:08:04,760 --> 00:08:06,840 Speaker 6: quickly is I think he's gonna start moving on tariffs 167 00:08:07,080 --> 00:08:07,920 Speaker 6: very very fast. 168 00:08:08,240 --> 00:08:10,960 Speaker 2: Yeah.
If you're planning to buy a computer, go ahead 169 00:08:10,960 --> 00:08:13,240 Speaker 2: and grab that fucker now if you can. 170 00:08:13,640 --> 00:08:16,280 Speaker 5: If you're getting anything from overseas, you should get it 171 00:08:16,520 --> 00:08:18,080 Speaker 5: in the few weeks that you still can. 172 00:08:18,160 --> 00:08:19,240 Speaker 2: Yeah, if it has a battery... 173 00:08:19,640 --> 00:08:20,200 Speaker 3: It ain't made here. 174 00:08:20,400 --> 00:08:24,760 Speaker 1: I had my annual physical today, because otherwise our insurance 175 00:08:24,760 --> 00:08:27,680 Speaker 1: screws us over, and my doctor was like, you should 176 00:08:27,680 --> 00:08:30,600 Speaker 1: try to get as many prescriptions filled before the end 177 00:08:30,640 --> 00:08:33,880 Speaker 1: of the year, before things go up, just in case. 178 00:08:34,160 --> 00:08:37,520 Speaker 1: There you go, and you know, that's not terrible advice. 179 00:08:38,000 --> 00:08:38,679 Speaker 3: Yeah. 180 00:08:38,760 --> 00:08:42,320 Speaker 4: I think in terms of executive orders, he will try 181 00:08:42,320 --> 00:08:47,520 Speaker 4: and further restrict access to asylum, try and further change that. 182 00:08:47,600 --> 00:08:49,280 Speaker 4: There are things he can do by executive order with 183 00:08:49,320 --> 00:08:52,400 Speaker 4: ICE and CBP, in terms of how they operate, that 184 00:08:52,440 --> 00:08:55,360 Speaker 4: he will try and do. It's not impossible that they 185 00:08:55,360 --> 00:08:58,600 Speaker 4: will try and again immediately mobilize public health law against 186 00:08:58,679 --> 00:09:01,400 Speaker 4: migrants like he did in twenty twenty. Right? Yeah, those 187 00:09:01,480 --> 00:09:04,720 Speaker 4: things could all be done without congressional support. We might 188 00:09:04,840 --> 00:09:06,840 Speaker 4: be wrong about this, but Stephen Miller suggested that 189 00:09:06,840 --> 00:09:08,000 Speaker 4: they might do some of those things.
190 00:09:08,000 --> 00:09:11,640 Speaker 2: So, yeah, not impossible. Probably won't be a great day. 191 00:09:12,000 --> 00:09:16,440 Speaker 1: Somebody's getting fired the first week, probably first day. 192 00:09:16,840 --> 00:09:19,760 Speaker 2: Yeah. I mean, I've seen the fact that the FBI 193 00:09:19,840 --> 00:09:24,160 Speaker 2: director is stepping down pushed as like an act of 194 00:09:24,200 --> 00:09:28,840 Speaker 2: resistance, because it means that Trump now has to actually 195 00:09:28,840 --> 00:09:31,440 Speaker 2: go through like Congress to get it done. I don't 196 00:09:31,440 --> 00:09:33,960 Speaker 2: know how much I buy that, how much I 197 00:09:34,000 --> 00:09:36,040 Speaker 2: think that. I think a lot of what I'm seeing 198 00:09:36,120 --> 00:09:38,760 Speaker 2: right now from establishment people, and maybe this isn't true 199 00:09:38,800 --> 00:09:41,480 Speaker 2: of Wray, because I did find some of the arguments 200 00:09:41,480 --> 00:09:43,360 Speaker 2: there compelling, but a lot of what I've seen from 201 00:09:43,480 --> 00:09:48,199 Speaker 2: establishment people in politics is they're scared and just really 202 00:09:48,240 --> 00:09:51,880 Speaker 2: trying not to make waves. Yeah, and I think 203 00:09:51,880 --> 00:09:54,240 Speaker 2: that's what you're going to see overwhelmingly. I think that 204 00:09:54,400 --> 00:09:59,200 Speaker 2: he probably will not immediately act against the press 205 00:09:59,400 --> 00:10:02,199 Speaker 2: in a legal sense as the president; they will 206 00:10:02,240 --> 00:10:04,600 Speaker 2: do that, but I think he's going to... he's already 207 00:10:04,800 --> 00:10:07,080 Speaker 2: suing differently, and I think that that's going to be 208 00:10:07,600 --> 00:10:11,000 Speaker 2: kind of his, his focus there for a while, just 209 00:10:11,040 --> 00:10:13,360 Speaker 2: because there's a lot on his plate.
But I think 210 00:10:13,400 --> 00:10:16,520 Speaker 2: there will be attempts to, like, fuck with libel 211 00:10:16,559 --> 00:10:18,480 Speaker 2: laws and stuff, especially as things go on. 212 00:10:19,200 --> 00:10:24,280 Speaker 1: Okay, several of you have asked about the Android ad 213 00:10:24,320 --> 00:10:27,120 Speaker 1: free version subscription channel, and I want you all to 214 00:10:27,200 --> 00:10:30,040 Speaker 1: know that it will happen next year. I have been 215 00:10:30,559 --> 00:10:35,199 Speaker 1: trying to get this to happen for two years now, 216 00:10:35,440 --> 00:10:43,679 Speaker 1: and for unforeseen reasons, it just keeps getting roadblocked. But 217 00:10:43,880 --> 00:10:46,959 Speaker 1: it is happening. We're just waiting on a couple of 218 00:10:47,000 --> 00:10:50,480 Speaker 1: final things to get into place, so that will be happening, 219 00:10:50,679 --> 00:10:55,320 Speaker 1: hopefully very soon into twenty twenty five. I will update 220 00:10:55,360 --> 00:10:58,040 Speaker 1: everybody as soon as that's possible. And I'm so sorry 221 00:10:58,080 --> 00:10:59,679 Speaker 1: it's taken so long. I want you to know I 222 00:10:59,720 --> 00:11:05,360 Speaker 1: have worked so unbelievably hard on this, miserably hard. 223 00:11:06,200 --> 00:11:09,880 Speaker 2: Yeah, we've seen it, Sophie has. It's been a nightmare, 224 00:11:09,960 --> 00:11:13,800 Speaker 2: harder than I have worked on anything else this year. Like, 225 00:11:13,920 --> 00:11:16,800 Speaker 2: it's been nuts. Yeah. And here's the thing: it sucks 226 00:11:16,920 --> 00:11:18,960 Speaker 2: for no reason. No reason. Not that there's no 227 00:11:19,040 --> 00:11:21,040 Speaker 2: reason to launch the app, there's a great reason; there's 228 00:11:21,040 --> 00:11:23,480 Speaker 2: no reason it should have taken this long.
Correct. But 229 00:11:23,840 --> 00:11:27,520 Speaker 2: we can't say any more, for reasons that are also equally frustrating. 230 00:11:27,960 --> 00:11:30,120 Speaker 2: I'd like to say in general, folks, there's a few 231 00:11:30,160 --> 00:11:31,720 Speaker 2: things that get brought up a lot. It's like, why 232 00:11:31,760 --> 00:11:33,760 Speaker 2: haven't they done this yet? Why haven't they done this yet? 233 00:11:33,800 --> 00:11:36,880 Speaker 2: We're talking like technical things, or like, you know, things 234 00:11:36,960 --> 00:11:39,000 Speaker 2: like a, like a paid subscription, and they're like, why 235 00:11:39,040 --> 00:11:41,080 Speaker 2: haven't they gotten around to it yet? And the answer 236 00:11:41,120 --> 00:11:45,240 Speaker 2: is always some infuriating bullshit based 237 00:11:44,920 --> 00:11:47,320 Speaker 1: on some bureaucracy bullshit. 238 00:11:47,160 --> 00:11:49,200 Speaker 2: Some bureaucratic, some legal shit, where you're like, you 239 00:11:49,200 --> 00:11:52,320 Speaker 2: don't actually realize it's illegal to do this if you 240 00:11:52,360 --> 00:11:55,080 Speaker 2: do it this way or whatever, like some sort of 241 00:11:55,120 --> 00:11:58,600 Speaker 2: bullshit that makes it impossible. We want 242 00:11:58,640 --> 00:12:01,760 Speaker 2: to make it as easy as possible for people 243 00:12:01,760 --> 00:12:04,720 Speaker 2: to have the best listening experience that we can afford 244 00:12:04,760 --> 00:12:07,760 Speaker 2: to provide them. But there's a lot of annoying bullshit 245 00:12:07,920 --> 00:12:10,439 Speaker 2: that exists for reasons beyond our comprehension. 246 00:12:10,640 --> 00:12:15,240 Speaker 1: Sorry. Anyways, here's ads. Unless you have an iPhone and 247 00:12:15,280 --> 00:12:28,760 Speaker 1: subscribe to Cooler Zone Media on Apple. All right, we're back.
248 00:12:30,080 --> 00:12:33,520 Speaker 1: How do you each motivate yourself to write or do 249 00:12:33,640 --> 00:12:36,280 Speaker 1: your jobs? I get asked that question all the time, 250 00:12:36,679 --> 00:12:39,560 Speaker 1: but I'll let each of you tackle it. While this 251 00:12:39,640 --> 00:12:43,200 Speaker 1: is a communally hosted show, I feel like each of 252 00:12:43,240 --> 00:12:47,160 Speaker 1: you do very different things, so your answers are going 253 00:12:47,200 --> 00:12:50,920 Speaker 1: to be all over the place. So, Garrison. Oh. 254 00:12:50,800 --> 00:12:52,960 Speaker 5: Well, I mean, paying rent's a great motivator. 255 00:12:53,080 --> 00:12:59,079 Speaker 2: Sure, yes, yes, understated. This is a big thing that 256 00:12:59,160 --> 00:13:01,480 Speaker 2: a lot of people who want to be writers but 257 00:13:01,520 --> 00:13:03,800 Speaker 2: have never done it for a living miss, is that 258 00:13:03,840 --> 00:13:05,959 Speaker 2: all of your favorite writers who do it for a living, 259 00:13:06,800 --> 00:13:09,240 Speaker 2: a big part of how they get over fucking writer's 260 00:13:09,280 --> 00:13:12,640 Speaker 2: block is they have to pay rent or mortgage. Yeah, 261 00:13:13,000 --> 00:13:14,120 Speaker 2: it turns out that helps. 262 00:13:14,400 --> 00:13:17,880 Speaker 5: It's, it's a quite compelling motivator, and sometimes it 263 00:13:17,880 --> 00:13:23,560 Speaker 5: has required the assistance of, you know, caffeine or other things. 264 00:13:24,160 --> 00:13:27,360 Speaker 5: I have a variety of playlists to help me 265 00:13:27,600 --> 00:13:31,120 Speaker 5: when I'm in like different moods. I definitely will, about 266 00:13:31,160 --> 00:13:34,679 Speaker 5: you know, maybe twice a month, just do a complete, 267 00:13:34,960 --> 00:13:38,000 Speaker 5: like a complete body check to my sleep schedule to 268 00:13:38,040 --> 00:13:40,520 Speaker 5: get a special project finished.
And that's just kind of 269 00:13:40,520 --> 00:13:42,719 Speaker 5: part of the deal, at least in terms of how 270 00:13:42,720 --> 00:13:45,840 Speaker 5: I work, and not everyone does it this way. Though 271 00:13:46,120 --> 00:13:47,720 Speaker 5: maybe, maybe people are more healthy than me. 272 00:13:48,520 --> 00:13:49,080 Speaker 2: Yeah. For me... 273 00:13:50,000 --> 00:13:52,520 Speaker 6: Okay. So the easiest way something gets done is just 274 00:13:52,600 --> 00:13:56,760 Speaker 6: pure rage. I can just do it, like... 275 00:13:56,760 --> 00:13:59,600 Speaker 2: It just comes out. The anger is a great motivator. 276 00:14:00,840 --> 00:14:04,520 Speaker 6: The other fun one is pure joy at something funny happening, 277 00:14:04,600 --> 00:14:07,920 Speaker 6: like the Shinzo Abe assassination. The easiest writing 278 00:14:07,960 --> 00:14:09,600 Speaker 6: I've ever done in my lifetime. 279 00:14:09,600 --> 00:14:10,480 Speaker 2: It just flows. 280 00:14:11,080 --> 00:14:13,560 Speaker 7: Yeah. 281 00:14:13,720 --> 00:14:16,440 Speaker 6: Other times it's just like, there's a deadline and everyone 282 00:14:16,480 --> 00:14:18,200 Speaker 6: is counting on me, and I have to get it out, 283 00:14:18,440 --> 00:14:21,000 Speaker 6: and I've gotten to the right level of sleep deprivation 284 00:14:21,080 --> 00:14:22,040 Speaker 6: where I can just do it. 285 00:14:22,400 --> 00:14:23,240 Speaker 5: That's right, that's right. 286 00:14:23,480 --> 00:14:23,760 Speaker 2: Yeah. 287 00:14:23,800 --> 00:14:28,000 Speaker 6: But I also think, you know, there's obviously like health insurance, 288 00:14:28,040 --> 00:14:30,240 Speaker 6: which is sort of a joke given our health insurance.
289 00:14:30,360 --> 00:14:34,040 Speaker 6: But yeah, and then the last thing, and this is 290 00:14:34,080 --> 00:14:38,120 Speaker 6: sort of the serious one, is that, like, this... 291 00:14:38,680 --> 00:14:40,880 Speaker 6: you know, I mean, I do some organizing stuff too, 292 00:14:40,960 --> 00:14:44,400 Speaker 6: but this, this is the thing that I have 293 00:14:44,480 --> 00:14:47,400 Speaker 6: to do that can materially affect the world. Which is 294 00:14:47,440 --> 00:14:50,080 Speaker 6: a very, very weird thing to say about a podcast. 295 00:14:50,160 --> 00:14:52,640 Speaker 6: But I've seen it happen, right? I've seen all of 296 00:14:52,680 --> 00:14:56,880 Speaker 6: you go and do things that wouldn't have happened. 297 00:14:56,920 --> 00:14:59,760 Speaker 6: You know, it's a weird situation, right, because 298 00:14:59,880 --> 00:15:02,400 Speaker 6: my motivation for doing this stuff is the chance 299 00:15:02,440 --> 00:15:05,760 Speaker 6: that it will make the world better. But I've seen 300 00:15:05,800 --> 00:15:08,640 Speaker 6: it happen, and I have to continue to believe that 301 00:15:08,760 --> 00:15:10,400 Speaker 6: the thing that I've been doing for all these years, 302 00:15:10,400 --> 00:15:14,120 Speaker 6: this project of building a very large hammer and deploying 303 00:15:14,120 --> 00:15:17,360 Speaker 6: it against our enemies, can work and will work. And 304 00:15:17,440 --> 00:15:19,800 Speaker 6: that is, you know, that's how I get out of 305 00:15:19,840 --> 00:15:22,360 Speaker 6: bed every morning: we're building the hammer and we're 306 00:15:22,360 --> 00:15:22,840 Speaker 6: swinging it. 307 00:15:22,960 --> 00:15:24,880 Speaker 2: Yeah, that's a great way to put it. 308 00:15:25,000 --> 00:15:27,760 Speaker 4: Very Large Hammer would be a banging name for a podcast. 309 00:15:27,880 --> 00:15:30,440 Speaker 1: I agree. Yeah, I agree.
310 00:15:30,560 --> 00:15:33,680 Speaker 2: Yeah, there's a, there's a great speech in the comic 311 00:15:33,760 --> 00:15:37,920 Speaker 2: series Transmetropolitan about how journalism is a gun that 312 00:15:37,960 --> 00:15:40,920 Speaker 2: you wire up to your eyes and your ears 313 00:15:41,160 --> 00:15:44,400 Speaker 2: and several other organs in order to shoot at the world. 314 00:15:44,520 --> 00:15:48,360 Speaker 2: And that's, I think, a good way to keep yourself 315 00:15:48,520 --> 00:15:50,680 Speaker 2: doing it when it feels like you're just shouting into 316 00:15:50,720 --> 00:15:51,120 Speaker 2: a void. 317 00:15:51,560 --> 00:15:51,880 Speaker 8: Yeah. 318 00:15:52,120 --> 00:15:53,840 Speaker 2: I really like the process of writing. 319 00:15:53,880 --> 00:15:56,680 Speaker 4: I like telling stories, like that makes me happy, and 320 00:15:56,760 --> 00:15:57,640 Speaker 4: I feel 321 00:15:57,400 --> 00:15:59,400 Speaker 2: so lucky I can do it for my job. 322 00:15:59,640 --> 00:16:03,800 Speaker 4: I don't particularly like receiving trauma, which I also 323 00:16:03,800 --> 00:16:07,560 Speaker 4: do for my job, but 324 00:16:06,640 --> 00:16:11,640 Speaker 2: like, really, it can be... sometimes I can't sleep. 325 00:16:12,160 --> 00:16:14,600 Speaker 4: So many people trusted me with their stories, especially this year, 326 00:16:14,640 --> 00:16:17,360 Speaker 4: that they didn't have to, and sometimes at great personal risk, 327 00:16:18,000 --> 00:16:20,840 Speaker 4: and it's a massive privilege that they trusted me with 328 00:16:20,840 --> 00:16:22,840 Speaker 4: those stories, and I think I owe it to them 329 00:16:22,920 --> 00:16:25,360 Speaker 4: to do my best to tell those stories as well 330 00:16:25,400 --> 00:16:29,760 Speaker 4: as I can.
Yeah, and like as Mia said, it 331 00:16:29,800 --> 00:16:33,000 Speaker 4: has materially changed the world, like the amount of people 332 00:16:33,000 --> 00:16:34,880 Speaker 4: who listened to our podcasts and came to the border 333 00:16:34,920 --> 00:16:37,600 Speaker 4: to help last year when we really desperately needed help, 334 00:16:38,040 --> 00:16:40,880 Speaker 4: people who just, like on Sunday night, gave their money, 335 00:16:40,880 --> 00:16:42,760 Speaker 4: which I know none of us have enough money right 336 00:16:42,800 --> 00:16:46,000 Speaker 4: now, to help people who are displaced in Rojava. Like, 337 00:16:46,960 --> 00:16:49,840 Speaker 4: all that stuff really makes it feel like if you 338 00:16:49,880 --> 00:16:52,320 Speaker 4: tell a good enough story, people will care. That's always 339 00:16:52,360 --> 00:16:54,360 Speaker 4: what I felt: like, if you could just get people 340 00:16:54,400 --> 00:16:56,480 Speaker 4: to see it, if people could be there, they would care, 341 00:16:57,040 --> 00:17:00,320 Speaker 4: and if they care enough, they'll do something. And I've 342 00:17:00,360 --> 00:17:03,640 Speaker 4: seen that be true with people who listen to the show, 343 00:17:03,680 --> 00:17:05,800 Speaker 4: and that really makes me happy, so I want to 344 00:17:05,880 --> 00:17:06,359 Speaker 4: keep doing that. 345 00:17:06,600 --> 00:17:09,960 Speaker 1: Yeah, for me, it's a two-part answer. The first part 346 00:17:10,080 --> 00:17:14,440 Speaker 1: is that I genuinely give a shit about everything that 347 00:17:14,760 --> 00:17:21,000 Speaker 1: we put out, and what we do is not really... 348 00:17:22,240 --> 00:17:27,720 Speaker 1: while it is a job, it matters so much.
And 349 00:17:28,000 --> 00:17:31,639 Speaker 1: the second part is if I don't do my job, 350 00:17:31,880 --> 00:17:36,280 Speaker 1: the amount of people's lives that that impacts is a 351 00:17:36,320 --> 00:17:40,320 Speaker 1: lot of fucking people, and I give a shit about 352 00:17:40,359 --> 00:17:44,240 Speaker 1: each and every one of them. So I'm gonna keep 353 00:17:44,280 --> 00:17:47,160 Speaker 1: doing my job so that everybody else can keep doing 354 00:17:47,200 --> 00:17:49,760 Speaker 1: their job and maybe we make a difference in this world, 355 00:17:50,560 --> 00:17:55,879 Speaker 1: this fucked up, crumbly world. Robert, did you have anything 356 00:17:55,880 --> 00:17:58,399 Speaker 1: to add? You were speaking and then Mia talked. 357 00:17:58,480 --> 00:17:59,960 Speaker 2: Did I already not give an answer? 358 00:18:00,200 --> 00:18:02,000 Speaker 1: You gave an answer, that's why, but you were starting 359 00:18:02,040 --> 00:18:02,439 Speaker 1: to speak. 360 00:18:02,800 --> 00:18:05,320 Speaker 2: Oh yeah, I do it for the fame, baby. 361 00:18:05,560 --> 00:18:15,240 Speaker 1: Great. Next: what episode or episodes were your favorite this 362 00:18:15,320 --> 00:18:16,840 Speaker 1: year, to make or otherwise? 363 00:18:17,880 --> 00:18:18,720 Speaker 4: Yeah? 364 00:18:18,760 --> 00:18:22,600 Speaker 1: My favorites this year were definitely James's series from the 365 00:18:22,640 --> 00:18:27,640 Speaker 1: Darien Gap. That was an incredible series. I'm so unbelievably 366 00:18:27,680 --> 00:18:30,400 Speaker 1: proud of it. Yeah, James had been trying to do 367 00:18:30,600 --> 00:18:33,439 Speaker 1: that work for a long time, and I'm I'm happy 368 00:18:33,600 --> 00:18:36,119 Speaker 1: that we were able to fund it and James was 369 00:18:36,160 --> 00:18:40,240 Speaker 1: able to do the incredible reporting that he did. 
I'm 370 00:18:40,240 --> 00:18:43,520 Speaker 1: also quite proud of Robert, Garrison, and I surviving the 371 00:18:43,680 --> 00:18:45,879 Speaker 1: RNC and DNC. 372 00:18:46,080 --> 00:18:52,119 Speaker 2: RNC was a good time, like legitimately. 373 00:18:49,920 --> 00:18:53,479 Speaker 1: Great time, polling the worst people in the world. 374 00:18:53,600 --> 00:18:57,560 Speaker 2: It was the DNC that fucked me up. Yeah, yeah, same. 375 00:18:58,040 --> 00:19:00,320 Speaker 5: I was like destroyed emotionally after the DNC. 376 00:19:00,440 --> 00:19:03,720 Speaker 1: Yeah, the DNC was really a huge bummer. And then 377 00:19:04,160 --> 00:19:07,600 Speaker 1: Mia's covered some of the most important labor stories that 378 00:19:07,680 --> 00:19:13,159 Speaker 1: like nobody covers absolutely yeah, and like without those, genuinely, 379 00:19:13,200 --> 00:19:17,640 Speaker 1: like nobody covers like small labor stories or big labor stories, 380 00:19:17,680 --> 00:19:20,600 Speaker 1: and she's always on top of that beat. And yeah, 381 00:19:20,640 --> 00:19:26,040 Speaker 1: I also really just liked Robert's Don't Panic episode. Some, 382 00:19:27,040 --> 00:19:31,160 Speaker 1: some great writing, my friend. I answered; now everybody else 383 00:19:31,200 --> 00:19:34,200 Speaker 1: has to. Well, I'll start with Mia. 384 00:19:34,640 --> 00:19:38,480 Speaker 6: There's weirdly a few this year, which there normally isn't. 385 00:19:38,640 --> 00:19:40,240 Speaker 6: I liked the Boeing ones, that was fun. 386 00:19:40,520 --> 00:19:41,400 Speaker 5: Yeah. 387 00:19:41,640 --> 00:19:44,280 Speaker 6: The one that was most emotionally impactful for me was 388 00:19:44,400 --> 00:19:48,560 Speaker 6: getting to interview Dr. Julia Serano, who, if you haven't 389 00:19:48,560 --> 00:19:50,160 Speaker 6: listened to that episode, go listen to it. 390 00:19:50,280 --> 00:19:50,840 Speaker 1: Great book. 
391 00:19:51,359 --> 00:19:55,760 Speaker 6: Yeah, Whipping Girl is the book that literally created a 392 00:19:55,800 --> 00:19:59,879 Speaker 6: bunch of the, like, like the concept of misgendering is 393 00:20:00,240 --> 00:20:02,720 Speaker 6: from that book, right? Like, like the language that we 394 00:20:02,800 --> 00:20:07,240 Speaker 6: use to talk about transness today is directly her, and 395 00:20:07,520 --> 00:20:09,240 Speaker 6: so few people have ever read the book, so 396 00:20:09,240 --> 00:20:12,520 Speaker 6: few people even know who she is, and getting a 397 00:20:12,600 --> 00:20:15,919 Speaker 6: chance to talk to her was like incredible. And I'm 398 00:20:15,920 --> 00:20:18,000 Speaker 6: also really happy about the organizing one that I did, 399 00:20:18,000 --> 00:20:20,440 Speaker 6: because I've gotten so many messages from people who were 400 00:20:20,480 --> 00:20:23,600 Speaker 6: just like, oh, wait, my knitting is useful to organizing, 401 00:20:23,640 --> 00:20:26,200 Speaker 6: and I'm like, yes, yes it is, your knitting is 402 00:20:26,200 --> 00:20:28,119 Speaker 6: so incredibly, staggeringly useful. 403 00:20:28,560 --> 00:20:29,960 Speaker 2: Yeah, so I'm proud of that one. 404 00:20:30,600 --> 00:20:34,520 Speaker 1: Yeah, let's take a quick break. Then Garrison, Robert, James, 405 00:20:35,000 --> 00:20:48,879 Speaker 1: you can answer that question. And we're back. James, how 406 00:20:48,880 --> 00:20:49,280 Speaker 1: about you? 407 00:20:53,000 --> 00:20:55,280 Speaker 4: I'm proud of doing the Darien ones. I think, like, 408 00:20:55,560 --> 00:20:59,000 Speaker 4: I'm so happy that we finally got to a place 409 00:20:59,040 --> 00:21:01,840 Speaker 4: where like we could do that, where we could fund that, 410 00:21:02,240 --> 00:21:04,160 Speaker 4: Like I've been trying to do that, like I said, 411 00:21:04,160 --> 00:21:05,119 Speaker 4: for nearly 412 00:21:05,000 --> 00:21:06,520 Speaker 2: a decade, and. 
413 00:21:07,960 --> 00:21:10,760 Speaker 4: Yeah, it's been hard, and it continues to be hard, 414 00:21:10,840 --> 00:21:12,760 Speaker 4: like one of the people you heard from in those 415 00:21:12,800 --> 00:21:16,840 Speaker 4: episodes got deported last week, so like it continues 416 00:21:16,880 --> 00:21:20,320 Speaker 4: to kind of be emotionally difficult. But I really liked 417 00:21:20,359 --> 00:21:22,480 Speaker 4: how many people messaged me and were like, I sent 418 00:21:22,560 --> 00:21:27,399 Speaker 4: this to my father, my uncle, not just dudes, aunts and 419 00:21:27,440 --> 00:21:30,240 Speaker 4: their mums too, I'm sure, and I think non-binary relatives, 420 00:21:30,280 --> 00:21:32,320 Speaker 4: but like, well, maybe not, because they sent it to 421 00:21:32,359 --> 00:21:36,080 Speaker 4: their right-wing relatives and they, like, learned some compassion. 422 00:21:36,760 --> 00:21:38,480 Speaker 4: That's always what you want to do. Like I said before, 423 00:21:38,520 --> 00:21:40,160 Speaker 4: you want people to see it so that they care 424 00:21:40,359 --> 00:21:42,000 Speaker 4: and so they understand it and they don't just get 425 00:21:42,000 --> 00:21:45,080 Speaker 4: this stupid Fox News bullshit racism stuff, and. 426 00:21:45,040 --> 00:21:46,520 Speaker 2: So yeah, that made me really happy. 427 00:21:46,680 --> 00:21:48,360 Speaker 4: The reason we're all different on this, by the way, 428 00:21:48,440 --> 00:21:50,440 Speaker 4: is because we have not done a Come twenty twenty 429 00:21:50,440 --> 00:21:52,600 Speaker 4: four episode, and if we had, this would have been 430 00:21:52,640 --> 00:21:54,000 Speaker 4: a much much shorter segment. 431 00:21:54,640 --> 00:21:57,120 Speaker 2: James, let me just tell you, I think we can 432 00:21:57,160 --> 00:21:59,520 Speaker 2: all look forward to an all-white Christmas this year. 433 00:22:00,520 --> 00:22:01,879 Speaker 1: Jesus mother. 
434 00:22:03,680 --> 00:22:07,159 Speaker 2: I set him up. It's my own fault. 435 00:22:11,560 --> 00:22:16,240 Speaker 5: Wow, I guess I'll go now. I'll just sort of clean 436 00:22:16,240 --> 00:22:17,800 Speaker 5: out the aftertaste of that. 437 00:22:19,240 --> 00:22:21,359 Speaker 2: So it's worse. 438 00:22:24,160 --> 00:22:28,040 Speaker 5: I think I started out pretty strong with police drones, 439 00:22:28,440 --> 00:22:31,280 Speaker 5: even more topical as we record this now, as New 440 00:22:31,359 --> 00:22:33,879 Speaker 5: Jersey is about to get completely abducted, I think, by 441 00:22:34,480 --> 00:22:35,320 Speaker 5: alien aircraft. 442 00:22:35,600 --> 00:22:37,399 Speaker 2: Yeah, there's no one left in New Jersey. 443 00:22:37,440 --> 00:22:41,320 Speaker 5: Now they've all been taken away by all these unidentified drones. 444 00:22:41,000 --> 00:22:43,359 Speaker 2: That actually happened three days ago. It just took a 445 00:22:43,400 --> 00:22:45,320 Speaker 2: long time for the rest of the country to notice 446 00:22:45,400 --> 00:22:49,320 Speaker 2: or care. Bruce Springsteen hasn't made a song about it. 447 00:22:49,680 --> 00:22:50,720 Speaker 2: We have no way of knowing. 448 00:22:51,280 --> 00:22:54,480 Speaker 5: Besides the mass hysteria of the New Jersey drone panic, 449 00:22:54,960 --> 00:22:57,239 Speaker 5: police drones are a real problem, and those are going 450 00:22:57,320 --> 00:23:00,359 Speaker 5: to be increasingly so. I was happy with my reporting 451 00:23:01,080 --> 00:23:04,080 Speaker 5: on that at CES, and then, I guess, I mean, 452 00:23:04,400 --> 00:23:07,719 Speaker 5: to echo Sophie, I had a great time at the RNC. 453 00:23:08,280 --> 00:23:11,399 Speaker 5: It's fun, a sentence I never thought I would say, yeah, 454 00:23:11,440 --> 00:23:13,920 Speaker 5: And particularly the RNC Grindr episode, I still 455 00:23:13,920 --> 00:23:15,480 Speaker 5: think is pretty good. 
456 00:23:15,680 --> 00:23:19,439 Speaker 1: It's pretty great. The amount of places that Garrison and 457 00:23:19,480 --> 00:23:22,840 Speaker 1: I snuck into at the RNC. It was a time. 458 00:23:23,560 --> 00:23:25,879 Speaker 5: It was really dangerous too, because I was having to 459 00:23:25,960 --> 00:23:28,760 Speaker 5: like do my RNC research next to Robert and Sophie 460 00:23:28,800 --> 00:23:31,520 Speaker 5: the whole time, and oh boy, it's like a minefield 461 00:23:31,600 --> 00:23:32,200 Speaker 5: scrolling through. 462 00:23:32,240 --> 00:23:37,840 Speaker 1: That app is an experience, to say the least. Any 463 00:23:37,920 --> 00:23:40,679 Speaker 1: thoughts on the proposed twenty twenty eight general strike? How 464 00:23:40,720 --> 00:23:42,160 Speaker 1: are people feeling about that? 465 00:23:42,640 --> 00:23:45,240 Speaker 6: I'll start with Mia. Yeah, I mean, it's it's a 466 00:23:45,240 --> 00:23:48,880 Speaker 6: pretty good idea. Like, there's definitely sort of, and I'm 467 00:23:48,920 --> 00:23:51,440 Speaker 6: immediately going into this naysaying a little bit, there's definitely 468 00:23:51,760 --> 00:23:53,720 Speaker 6: problems with it. It's going to be extremely hard to 469 00:23:53,800 --> 00:23:56,879 Speaker 6: execute, because we just don't have a modern history of 470 00:23:56,920 --> 00:23:58,320 Speaker 6: doing that in the US, and even some of the 471 00:23:58,320 --> 00:24:00,919 Speaker 6: successful ones in the last decade that people have 472 00:24:00,960 --> 00:24:04,240 Speaker 6: pulled off haven't been that effective. But on the other hand, 473 00:24:04,280 --> 00:24:06,359 Speaker 6: as something that we, you know, a concrete thing that 474 00:24:06,400 --> 00:24:10,239 Speaker 6: we have to organize towards that has a bunch of 475 00:24:10,600 --> 00:24:14,879 Speaker 6: like pretty large unions behind it already. I did an 476 00:24:14,880 --> 00:24:17,760 Speaker 6: episode about that a few weeks ago. 
I don't know, 477 00:24:17,800 --> 00:24:19,600 Speaker 6: a couple months ago. I don't remember when I did 478 00:24:19,600 --> 00:24:22,760 Speaker 6: this episode. I'm sorry, I can't remember anything you've ever done. 479 00:24:23,359 --> 00:24:26,719 Speaker 6: But I think, I think it's a good opportunity to 480 00:24:26,760 --> 00:24:29,280 Speaker 6: connect a whole bunch of different kinds of organizing together, 481 00:24:30,080 --> 00:24:32,160 Speaker 6: both in terms of sort of labor and in terms 482 00:24:32,200 --> 00:24:35,240 Speaker 6: of the support work you need for that. So yeah, 483 00:24:35,240 --> 00:24:36,280 Speaker 6: cautiously optimistic. 484 00:24:36,680 --> 00:24:38,359 Speaker 1: Anyone else have anything they want to add? 485 00:24:38,680 --> 00:24:41,000 Speaker 5: The time to start figuring out those logistics, like, is 486 00:24:41,040 --> 00:24:43,520 Speaker 5: now; it's not waiting till twenty twenty seven. 487 00:24:43,720 --> 00:24:47,199 Speaker 2: Yeah, I agree, Garrison. I think that the fact that 488 00:24:47,240 --> 00:24:51,320 Speaker 2: there are serious people who represent serious unions talking about 489 00:24:51,320 --> 00:24:54,200 Speaker 2: it is part of why it's one of the things 490 00:24:54,200 --> 00:24:56,480 Speaker 2: that does give me a degree of hope. We're going 491 00:24:56,520 --> 00:24:59,080 Speaker 2: to have to start working now towards it. It's not 492 00:24:59,160 --> 00:25:01,560 Speaker 2: going to be easy in any way, shape, or form. 493 00:25:01,640 --> 00:25:04,159 Speaker 2: If they see it coming, they are going to start 494 00:25:04,200 --> 00:25:08,359 Speaker 2: trying to criminalize things preemptively. If it is something that 495 00:25:08,400 --> 00:25:11,640 Speaker 2: even looks like a real possibility, they're going to come 496 00:25:11,680 --> 00:25:13,679 Speaker 2: after it with everything they've got. 
And it's one of 497 00:25:13,680 --> 00:25:20,240 Speaker 2: those things where maybe, if the midterms go well for Democrats, 498 00:25:20,480 --> 00:25:24,040 Speaker 2: maybe Democrats stop that. But it's just as plausible, and 499 00:25:24,080 --> 00:25:27,280 Speaker 2: probably more plausible, that Democrats line up with Republicans to 500 00:25:27,320 --> 00:25:29,040 Speaker 2: attempt to criminalize something like that. 501 00:25:29,440 --> 00:25:33,360 Speaker 4: Yeah, it's strange to be seeing something like this organized 502 00:25:33,400 --> 00:25:34,480 Speaker 4: so far off, like. 503 00:25:35,080 --> 00:25:37,280 Speaker 2: Yes, it's not something any of us are familiar with, 504 00:25:37,400 --> 00:25:39,720 Speaker 2: which it has to be, to be clear. Yeah, yeah, yeah, 505 00:25:39,720 --> 00:25:40,200 Speaker 2: it has to. 506 00:25:40,160 --> 00:25:43,520 Speaker 4: Be. Barring, like, an actual coup, that's the only way 507 00:25:43,560 --> 00:25:45,960 Speaker 4: you get a general strike, right? Like, either something so 508 00:25:47,160 --> 00:25:50,280 Speaker 4: earth-shattering that everyone's ready to risk 509 00:25:50,280 --> 00:25:52,520 Speaker 4: it because they're already in danger, or, yeah, you take 510 00:25:52,560 --> 00:25:54,040 Speaker 4: the time and you plan it and you do it properly. 511 00:25:54,400 --> 00:25:56,080 Speaker 4: But it's just not something we're familiar with. I love 512 00:25:56,080 --> 00:25:58,439 Speaker 4: the general strike. I'm always going to support a general strike. 513 00:25:59,080 --> 00:26:01,720 Speaker 4: I'm excited to see a general strike. But yeah, we 514 00:26:01,760 --> 00:26:02,800 Speaker 4: have to put in the work now. 515 00:26:03,400 --> 00:26:07,879 Speaker 2: Yeah, the only responsible way to characterize the organized left 516 00:26:07,960 --> 00:26:10,880 Speaker 2: in the United States is as a complete and utter failure. 
517 00:26:11,200 --> 00:26:14,080 Speaker 2: Like, it has been a calamity for the causes that 518 00:26:14,160 --> 00:26:18,960 Speaker 2: it seeks to represent. And a lot of that is 519 00:26:19,000 --> 00:26:22,960 Speaker 2: because of, like, fucking bullshit online clicktivism. You know, we're 520 00:26:22,960 --> 00:26:25,400 Speaker 2: all going to do a general strike, everybody, get ready, 521 00:26:25,480 --> 00:26:27,400 Speaker 2: next week, we're going to do it. You know, shit 522 00:26:27,600 --> 00:26:31,239 Speaker 2: like that is, it's just so deeply unserious. And if 523 00:26:31,240 --> 00:26:34,240 Speaker 2: we're going to take the momentum and the energy that 524 00:26:34,359 --> 00:26:37,719 Speaker 2: exists in the number of people who are angry, and, 525 00:26:37,760 --> 00:26:40,000 Speaker 2: you know, that number of people will be 526 00:26:40,080 --> 00:26:45,080 Speaker 2: increasing as the consequences of conservative policies hit home, by 527 00:26:45,119 --> 00:26:49,439 Speaker 2: twenty twenty eight, like, it has to be something taken 528 00:26:49,920 --> 00:26:53,960 Speaker 2: deadly seriously by very serious people who are thinking through 529 00:26:54,720 --> 00:26:58,040 Speaker 2: the consequences and what's necessary in order to make this feasible. 530 00:26:58,119 --> 00:27:02,919 Speaker 1: You know. And last, do each of you have, you 531 00:27:02,960 --> 00:27:06,440 Speaker 1: know, a movie or a book or something you would 532 00:27:06,480 --> 00:27:07,280 Speaker 1: like to recommend? 533 00:27:08,119 --> 00:27:10,760 Speaker 4: In twenty twenty five, when I finish my book, you 534 00:27:10,760 --> 00:27:12,600 Speaker 4: should buy it, yes, but. 535 00:27:12,760 --> 00:27:13,800 Speaker 2: Read General Strike. 
536 00:27:13,840 --> 00:27:17,280 Speaker 4: I've been reading a book called Platentive, which is in English, 537 00:27:17,320 --> 00:27:20,119 Speaker 4: but it's about how San Francisco dock workers blocked the 538 00:27:20,200 --> 00:27:24,200 Speaker 4: shipment of weapons to El Salvador, and it just seems like 539 00:27:24,240 --> 00:27:27,080 Speaker 4: a very relevant book. And they did it to Pinochet 540 00:27:27,119 --> 00:27:30,159 Speaker 4: as well. It's easy to read, and, like, it just 541 00:27:30,240 --> 00:27:32,720 Speaker 4: reminded me how important labor organizing is going to be 542 00:27:32,800 --> 00:27:34,600 Speaker 4: in the next four years and how powerful it can 543 00:27:34,680 --> 00:27:36,840 Speaker 4: be too. So I'll give that one a little plug. 544 00:27:37,200 --> 00:27:37,680 Speaker 2: Excellent. 545 00:27:38,000 --> 00:27:40,200 Speaker 4: There's a film called The End Will Be Spectacular, which 546 00:27:40,280 --> 00:27:44,320 Speaker 4: is about the Kurdish youth movement in Northern Kurdistan in Turkey. 547 00:27:44,680 --> 00:27:47,639 Speaker 4: It's a really good film. I think it'll 548 00:27:47,680 --> 00:27:50,400 Speaker 4: help you understand the Kurdish freedom movement, and it's worth 549 00:27:50,400 --> 00:27:52,600 Speaker 4: a watch. It's not necessarily a happy, feel 550 00:27:52,600 --> 00:27:55,000 Speaker 4: good film, but I think it's worth a watch, especially 551 00:27:55,160 --> 00:27:57,399 Speaker 4: if you've recently become interested in that because of what 552 00:27:57,440 --> 00:27:58,440 Speaker 4: you've heard on the podcast. 553 00:27:58,960 --> 00:28:02,679 Speaker 6: Yeah, yeah, I have a couple. So I'm trans-fiction-pilled 554 00:28:02,840 --> 00:28:06,080 Speaker 6: right now, so I'm going to give you fiction by trans authors. 555 00:28:05,960 --> 00:28:08,480 Speaker 2: Would you say you're transfixed? Wow. 556 00:28:09,200 --> 00:28:09,840 Speaker 5: I walked. 
557 00:28:09,920 --> 00:28:15,480 Speaker 6: I walked right into that one, like drove directly into it, 558 00:28:15,560 --> 00:28:17,119 Speaker 6: like JFK's head into that bullet. 559 00:28:17,320 --> 00:28:18,520 Speaker 5: Oh my god. 560 00:28:20,520 --> 00:28:20,760 Speaker 2: Wow. 561 00:28:20,840 --> 00:28:22,560 Speaker 1: We spend a lot of time with each other. 562 00:28:25,800 --> 00:28:26,040 Speaker 2: Yeah. 563 00:28:26,040 --> 00:28:28,280 Speaker 6: The first one I wanted to talk about is The 564 00:28:28,280 --> 00:28:30,800 Speaker 6: Gunrunner and Her Hound by Maria Ying, which is 565 00:28:30,800 --> 00:28:33,800 Speaker 6: the pen name of a couple of authors. Okay, so 566 00:28:33,920 --> 00:28:36,920 Speaker 6: this is, this is an absolutely unhinged lesbian 567 00:28:36,960 --> 00:28:39,560 Speaker 6: book about a lesbian crime lord and her 568 00:28:39,640 --> 00:28:42,840 Speaker 6: bodyguard, who is also a lesbian, and it rules. Uh, 569 00:28:43,000 --> 00:28:46,320 Speaker 6: there's a whole sort of like post-apocalypse US thing 570 00:28:46,360 --> 00:28:48,720 Speaker 6: going on, but they're still in like civilized Hong Kong. 571 00:28:49,240 --> 00:28:52,400 Speaker 6: It's awesome, it's great. If you need more 572 00:28:52,480 --> 00:28:55,960 Speaker 6: unhinged lesbians in your life, go read this. The, 573 00:28:55,960 --> 00:28:58,720 Speaker 6: the other one is One of the Boys. This is 574 00:28:58,920 --> 00:29:02,520 Speaker 6: forthcoming; it's going to release May thirteenth, twenty twenty five, 575 00:29:03,240 --> 00:29:07,000 Speaker 6: by Victoria Zeller, and it's about a trans girl who's 576 00:29:07,040 --> 00:29:09,920 Speaker 6: like the kicker on her football team, and she has 577 00:29:09,920 --> 00:29:12,160 Speaker 6: to like leave the team because she transitions, but then 578 00:29:12,440 --> 00:29:14,440 Speaker 6: the team needs her back. 
They don't have a kicker, 579 00:29:14,760 --> 00:29:17,160 Speaker 6: and it's, it's fun, it's, it's a good time, so 580 00:29:17,200 --> 00:29:18,880 Speaker 6: you should get that when it comes out. 581 00:29:19,240 --> 00:29:21,720 Speaker 2: Yeah. So I'm actually right now in the middle of 582 00:29:21,720 --> 00:29:23,760 Speaker 2: a book that I found myself surprised by how much 583 00:29:23,800 --> 00:29:26,800 Speaker 2: I've liked. It's called When Paris Went Dark, and it 584 00:29:26,880 --> 00:29:30,640 Speaker 2: is a history of the occupation of Paris under the Nazis. 585 00:29:30,840 --> 00:29:35,440 Speaker 2: That is a really fascinating social history by Ronald 586 00:29:35,480 --> 00:29:39,280 Speaker 2: Rosbottom that I found very, like, emotionally affecting, especially in 587 00:29:39,400 --> 00:29:43,040 Speaker 2: light of, you know, some things going on, and yeah, 588 00:29:43,160 --> 00:29:46,880 Speaker 2: just kind of a fascinating look at the psychology of 589 00:29:46,920 --> 00:29:50,520 Speaker 2: a people, of, like, of a, of an entire people 590 00:29:51,040 --> 00:29:54,240 Speaker 2: kind of grappling with what's about to happen to them 591 00:29:54,400 --> 00:29:56,760 Speaker 2: in the wake of the failure of the French army, 592 00:29:56,840 --> 00:29:59,680 Speaker 2: and then what happens next. And then I would also 593 00:29:59,720 --> 00:30:03,640 Speaker 2: recommend Setting the Desert on Fire by James Barr, 594 00:30:03,960 --> 00:30:07,840 Speaker 2: which is one of the books about T.E. Lawrence that 595 00:30:07,920 --> 00:30:12,479 Speaker 2: I cited in the T.E. Lawrence episodes, if you are 596 00:30:12,520 --> 00:30:15,480 Speaker 2: at all interested in the realities of needing to fight 597 00:30:15,560 --> 00:30:16,440 Speaker 2: an insurgent war. 598 00:30:17,040 --> 00:30:20,720 Speaker 5: Here, I guess, just two recent things I've enjoyed. 
Finally 599 00:30:20,960 --> 00:30:27,440 Speaker 5: finished Steppenwolf by Hermann Hesse. Yes, I enjoyed that deeply. 600 00:30:28,200 --> 00:30:31,000 Speaker 5: It kind of, it kind of picked at my Twin 601 00:30:31,000 --> 00:30:33,880 Speaker 5: Peaks: The Return brain. So that was, that was pleasant. 602 00:30:34,560 --> 00:30:39,600 Speaker 5: And for a more recent release, Luca Guadagnino's new 603 00:30:39,680 --> 00:30:43,880 Speaker 5: movie Queer, adapting the novella by William S. 604 00:30:43,880 --> 00:30:44,240 Speaker 2: Burroughs. 605 00:30:44,520 --> 00:30:49,680 Speaker 5: I found this movie to be utterly fascinating and transfixing, 606 00:30:49,840 --> 00:30:53,440 Speaker 5: to use the term from you, Robert. I don't have 607 00:30:53,520 --> 00:30:55,440 Speaker 5: much else to say about it, because I would rather 608 00:30:55,480 --> 00:30:58,080 Speaker 5: people just watch it and take away what they want 609 00:30:58,120 --> 00:31:00,520 Speaker 5: to themselves. But it got me thinking a lot about 610 00:31:00,960 --> 00:31:05,640 Speaker 5: the lack of meaning inherent to identity and why I 611 00:31:05,680 --> 00:31:08,920 Speaker 5: hate the term queer bodies. So yeah, good movie. 612 00:31:08,720 --> 00:31:13,000 Speaker 1: Awesome. I just have one movie to recommend, and 613 00:31:13,040 --> 00:31:16,120 Speaker 1: it's one of my favorite movies of all time, the 614 00:31:16,720 --> 00:31:21,200 Speaker 1: original nineteen seventy three, seventy two, no, seventy three, seventy three, 615 00:31:21,680 --> 00:31:25,280 Speaker 1: The Wicker Man. Not the fucking Nicolas Cage version, the 616 00:31:25,400 --> 00:31:29,520 Speaker 1: original version. And if you have a local theater that 617 00:31:29,560 --> 00:31:32,440 Speaker 1: plays old movies, a lot of times they'll play it 618 00:31:32,480 --> 00:31:36,080 Speaker 1: in theaters, and I highly recommend that experience. 
It's really fun, 619 00:31:36,160 --> 00:31:39,360 Speaker 1: especially at the end. I see it in theaters or 620 00:31:39,400 --> 00:31:41,360 Speaker 1: watch it at least once or twice a year, and 621 00:31:42,400 --> 00:31:44,760 Speaker 1: vibes are good. Yeah, that's it for a Q and 622 00:31:44,800 --> 00:32:04,520 Speaker 1: A episode. Thanks for submitting, and goodbye. Welcome to It 623 00:32:04,560 --> 00:32:08,000 Speaker 1: Could Happen Here. This is our twenty twenty five Predictions episode. 624 00:32:08,080 --> 00:32:11,760 Speaker 1: We were starting to bicker off mic about what we 625 00:32:11,800 --> 00:32:13,680 Speaker 1: predicted last year, and I was talking about the things 626 00:32:13,680 --> 00:32:16,360 Speaker 1: we predicted, and one of the things I predicted early on, 627 00:32:16,400 --> 00:32:19,560 Speaker 1: I was like, I think Kim Kardashian will be part 628 00:32:19,600 --> 00:32:25,320 Speaker 1: of the Trump cabinet, and like, honestly, goals at this point. 629 00:32:25,720 --> 00:32:29,960 Speaker 1: But I was not that far off, though, because essentially what 630 00:32:30,120 --> 00:32:32,320 Speaker 1: he has done is he's basically tried to go for 631 00:32:32,400 --> 00:32:34,800 Speaker 1: people that are good on TV. It's true, it's true, 632 00:32:35,000 --> 00:32:39,120 Speaker 1: and like going off of that reality TV energy. 633 00:32:40,040 --> 00:32:42,760 Speaker 4: Finally we will acknowledge the Armenian genocide. 634 00:32:43,920 --> 00:32:46,280 Speaker 1: I was, I was vibing, okay, James, I. 635 00:32:46,280 --> 00:32:50,520 Speaker 4: Was vibing genocide. 636 00:32:50,640 --> 00:32:57,760 Speaker 1: Just James. All right, vibe-a-cide. God. All right, Mia 637 00:32:57,800 --> 00:33:01,360 Speaker 1: Wong's here, I'm here, Garrison's here, James Stout's here, and 638 00:33:01,440 --> 00:33:03,640 Speaker 1: the dishonorable Robert Evans is also here. 
639 00:33:04,040 --> 00:33:06,880 Speaker 2: I judge that nickname bad. Jesus Christ. 640 00:33:07,000 --> 00:33:11,080 Speaker 5: Wow. Let's, let's go over some of our terrible twenty 641 00:33:11,120 --> 00:33:15,200 Speaker 5: twenty four predictions, just briefly. Now, unfortunately there was a 642 00:33:15,200 --> 00:33:17,520 Speaker 5: lot of election ones, which were very sad to listen to. 643 00:33:17,760 --> 00:33:22,560 Speaker 5: Oh no. Now, we were correct about many things. We did, 644 00:33:22,600 --> 00:33:25,600 Speaker 5: we did talk about how Harris would probably be a 645 00:33:25,640 --> 00:33:28,640 Speaker 5: really bad candidate to run against Trump. Totally forgot about that. 646 00:33:29,240 --> 00:33:34,800 Speaker 2: We did. We did. Huge for us. Yeah, for the country. 647 00:33:35,880 --> 00:33:38,320 Speaker 2: That brief period of time when Biden stepped down, it 648 00:33:38,360 --> 00:33:40,560 Speaker 2: really felt like it might be... I mean, she did 649 00:33:40,560 --> 00:33:43,080 Speaker 2: better than he would have done. Yeah. 650 00:33:43,120 --> 00:33:45,120 Speaker 5: Well, I think that's just because we were still just 651 00:33:45,160 --> 00:33:49,400 Speaker 5: reeling from that debate, because it was so bad that like anything 652 00:33:49,560 --> 00:33:51,280 Speaker 5: was like, oh my god, there's like life. 653 00:33:51,280 --> 00:33:54,200 Speaker 2: Look at how she can walk thirty, forty feet at 654 00:33:54,200 --> 00:33:57,640 Speaker 2: a time. Exact, exact sentence. Good God. 
655 00:33:58,040 --> 00:34:01,000 Speaker 5: None of us picked Vance specifically at that point in time, 656 00:34:01,040 --> 00:34:04,960 Speaker 5: but we did pinpoint Trump's orbit and his, like, campaign 657 00:34:05,080 --> 00:34:10,000 Speaker 5: crew pretty well. Like, Mia predicted that RFK 658 00:34:10,160 --> 00:34:12,480 Speaker 5: Junior could be a Trump VP pick, and though he 659 00:34:12,520 --> 00:34:15,279 Speaker 5: didn't become VP, he essentially kind of took over the 660 00:34:15,400 --> 00:34:18,279 Speaker 5: VP, like, campaigning role from Vance in, 661 00:34:18,200 --> 00:34:21,239 Speaker 2: like, August. He was so bad at it. 662 00:34:22,000 --> 00:34:24,439 Speaker 5: We all decided that, like, Vivek was simply, like, way 663 00:34:24,480 --> 00:34:27,000 Speaker 5: too loud and, like, obnoxious, so Trump would, like, find 664 00:34:27,000 --> 00:34:29,839 Speaker 5: some other spot for him. Stand by that, and that's 665 00:34:29,840 --> 00:34:31,640 Speaker 5: what happened. He's still in the orbit, but he's not 666 00:34:31,680 --> 00:34:35,560 Speaker 5: super close. Sophie talked about possibly Kristi Noem 667 00:34:36,280 --> 00:34:39,520 Speaker 5: getting linked up with Trump, maybe for VP. Now 668 00:34:39,520 --> 00:34:43,200 Speaker 5: that didn't happen for VP, but Kristi Noem is in the cabinet. 669 00:34:43,560 --> 00:34:44,759 Speaker 1: Good job, past me. 670 00:34:45,120 --> 00:34:47,520 Speaker 5: Yeah, and Robert said that he would not be shocked 671 00:34:47,600 --> 00:34:56,240 Speaker 5: if Trump got close with Tulsi Gabbard. And other less 672 00:34:56,239 --> 00:34:59,759 Speaker 5: good predictions: I predicted that a Daily Wire host would 673 00:34:59,760 --> 00:35:04,560 Speaker 5: get... unfortunately did not come to pass. There's still time. 674 00:35:04,640 --> 00:35:06,840 Speaker 6: It's still twenty twenty four, right? 675 00:35:07,000 --> 00:35:08,640 Speaker 2: Not when this airs, not when this airs. 
676 00:35:10,160 --> 00:35:13,920 Speaker 5: Yes, Kim Kardashian getting into politics didn't really happen; she 677 00:35:14,000 --> 00:35:16,880 Speaker 5: kind of stayed at her regular coasting level. Sorry, Sophie. 678 00:35:17,360 --> 00:35:17,839 Speaker 5: So far. 679 00:35:18,160 --> 00:35:21,480 Speaker 1: Trust me. She did all those things when Trump was 680 00:35:21,480 --> 00:35:23,360 Speaker 1: elected the first time, where all of a sudden she was, 681 00:35:23,440 --> 00:35:26,440 Speaker 1: like, with other lawyers, trying to get people out of 682 00:35:26,520 --> 00:35:29,040 Speaker 1: jail by utilizing Trump. 683 00:35:29,200 --> 00:35:31,080 Speaker 5: Yeah, I mean, and she was doing that with the 684 00:35:31,120 --> 00:35:34,240 Speaker 5: Biden campaign as well, not as visibly with the Harris campaign. 685 00:35:34,600 --> 00:35:37,360 Speaker 5: She was meeting with Harris multiple times. She kind of 686 00:35:37,360 --> 00:35:40,040 Speaker 5: stayed at this, like, distant but, like, talkative place. 687 00:35:40,640 --> 00:35:43,680 Speaker 1: That's the Kardashian way: distant and talkative. 688 00:35:45,280 --> 00:35:48,799 Speaker 5: Speaking of, speaking of, your other prediction was that people 689 00:35:48,800 --> 00:35:52,040 Speaker 5: would start forgetting about the Nazi stuff and Kanye would 690 00:35:52,040 --> 00:35:57,160 Speaker 5: put out a well-received album, which kind of happened. Yeah, yeah, yeah, 691 00:35:57,200 --> 00:35:57,680 Speaker 5: a little bit. 692 00:35:58,000 --> 00:36:01,080 Speaker 1: God, I haven't thought about Kanye in so many months. It 693 00:36:01,160 --> 00:36:05,480 Speaker 1: was really nice. Well, really nice. Thanks, Garrison. 694 00:36:06,000 --> 00:36:10,480 Speaker 5: Lastly, my failed prediction is that if Trump won the election, 695 00:36:10,600 --> 00:36:12,879 Speaker 5: there would be two solid weeks of rioting, which 696 00:36:12,920 --> 00:36:13,920 Speaker 5: simply did not happen. 
697 00:36:14,120 --> 00:36:15,400 Speaker 2: Yes, nothing happened. 698 00:36:15,600 --> 00:36:18,640 Speaker 5: I think it's actually kind of interesting, and we will 699 00:36:18,680 --> 00:36:21,880 Speaker 5: maybe unpack that in the coming months as Trump's second 700 00:36:21,960 --> 00:36:24,560 Speaker 5: term kind of settles in. I'm sure we will kind 701 00:36:24,600 --> 00:36:28,240 Speaker 5: of revisit why we think this did not happen. Certainly 702 00:36:28,280 --> 00:36:30,960 Speaker 5: I'm curious about what Inauguration Day will look like. But, 703 00:36:30,960 --> 00:36:35,920 Speaker 5: but yeah, that was a lot, so sorry. Morrissey is 704 00:36:35,920 --> 00:36:39,560 Speaker 5: still alive, David Attenborough is still alive, Putin is 705 00:36:39,640 --> 00:36:42,920 Speaker 5: still alive, and though James did say that Assad would 706 00:36:42,960 --> 00:36:46,879 Speaker 5: eat it, and though Assad didn't die, he kind 707 00:36:46,880 --> 00:36:47,640 Speaker 5: of did eat it. 708 00:36:49,480 --> 00:36:52,480 Speaker 2: Well, yeah, I mean, James—yeah, that's, that's gotta 709 00:36:52,520 --> 00:36:56,399 Speaker 2: be the biggest dub of the year. Yeah, that's right. 710 00:36:56,680 --> 00:37:00,560 Speaker 4: Damn, I forgot all about that. Really happy with myself. 711 00:37:00,160 --> 00:37:02,240 Speaker 1: Now, James, I'm so proud of you, buddy. 712 00:37:02,840 --> 00:37:05,839 Speaker 2: You got to pick another one this year. Yeah, Min 713 00:37:05,960 --> 00:37:07,800 Speaker 2: Aung Hlaing, baby, he's next. 714 00:37:09,000 --> 00:37:12,480 Speaker 5: Let's—I guess let's start with some kind of dictator predictions. 715 00:37:12,600 --> 00:37:15,000 Speaker 5: What do we think will happen to, like, a dictator 716 00:37:15,440 --> 00:37:16,960 Speaker 5: in twenty twenty five? 717 00:37:17,120 --> 00:37:18,880 Speaker 4: Which, as in who's gonna die? 
Do we think, or 718 00:37:19,000 --> 00:37:20,480 Speaker 4: just general dictator predictions? 719 00:37:20,600 --> 00:37:24,279 Speaker 5: Dictator predictions. It can be—maybe we get a new one, 720 00:37:24,360 --> 00:37:26,200 Speaker 5: you know, maybe we get a new fancy one. 721 00:37:26,280 --> 00:37:28,920 Speaker 2: Yeah, well, I don't know. Yeah, something's happening in January. 722 00:37:29,600 --> 00:37:31,879 Speaker 6: I have two. Well, one of them, I mean, it's 723 00:37:31,920 --> 00:37:33,520 Speaker 6: kind of a hack one, but I don't think, I 724 00:37:33,520 --> 00:37:35,560 Speaker 6: don't think the junta in Myanmar makes it out of twenty 725 00:37:35,600 --> 00:37:39,000 Speaker 6: twenty five. Yeah, I think not in the version it is today. Yeah, 726 00:37:39,040 --> 00:37:41,640 Speaker 6: that's the hack one. The, the other one is another 727 00:37:41,640 --> 00:37:44,000 Speaker 6: Assad one. I think someone actually does assassinate 728 00:37:44,080 --> 00:37:46,560 Speaker 6: Assad. Like, he, he gets too full 729 00:37:46,600 --> 00:37:48,239 Speaker 6: of himself and he goes to Abu Dhabi and some 730 00:37:48,280 --> 00:37:50,000 Speaker 6: Muslim Brotherhood guy just whacks him. 731 00:37:50,239 --> 00:37:51,440 Speaker 2: Yep, okay. 732 00:37:51,680 --> 00:37:54,520 Speaker 5: My Assad prediction is he becomes a Russia Today host. 733 00:37:54,920 --> 00:37:56,640 Speaker 5: That's my Assad prediction. 734 00:37:57,000 --> 00:37:58,480 Speaker 1: Oh my gosh. 735 00:37:58,680 --> 00:38:02,120 Speaker 4: Yeah, he's going to open his ophthalmology clinic. 736 00:38:02,760 --> 00:38:05,120 Speaker 2: No, I mean, I think he's going to get signed 737 00:38:05,520 --> 00:38:09,040 Speaker 2: to host a podcast by a little, a little network 738 00:38:09,080 --> 00:38:13,400 Speaker 2: you might have heard of called Cool Zone Media. Congratulations, guys, 739 00:38:13,640 --> 00:38:15,319 Speaker 2: let's bring him on. 
So if you get him on— 740 00:38:15,360 --> 00:38:16,960 Speaker 2: get him on the Zoom, tell him he can 741 00:38:17,120 --> 00:38:24,719 Speaker 2: kick it—hop in the room now, for sure, baby. 742 00:38:23,120 --> 00:38:26,240 Speaker 5: We are merging with Tenet Media to bring on our friend. 743 00:38:28,400 --> 00:38:33,399 Speaker 2: Yeah, welcome to the pod, Bashar. He's actually doing a whole 744 00:38:33,440 --> 00:38:35,839 Speaker 2: media tour with the Pod Save guys next week. That's 745 00:38:35,840 --> 00:38:36,879 Speaker 2: got to be fascinating. 746 00:38:38,360 --> 00:38:42,640 Speaker 4: Pod Save Ba'athist Syria, the most cursed podcast in the world. 747 00:38:44,040 --> 00:38:48,160 Speaker 1: My dictator slash world leader prediction is that this is 748 00:38:48,200 --> 00:38:52,160 Speaker 1: Netanyahu's—I was thinking, yeah—last ride. 749 00:38:52,120 --> 00:38:56,440 Speaker 2: From your mouth to whatever fucking clot is working its 750 00:38:56,480 --> 00:38:59,800 Speaker 2: way through his coronary system in a year. 751 00:39:00,280 --> 00:39:03,640 Speaker 1: I really fucking hope I'm right. I really fucking hope I'm right. 752 00:39:05,400 --> 00:39:05,880 Speaker 1: We all do. 753 00:39:05,920 --> 00:39:07,280 Speaker 2: I don't know what else to say there. 754 00:39:07,440 --> 00:39:11,319 Speaker 4: Yeah, yeah, yeah, yeah, yeah, that's a big thing for 755 00:39:11,360 --> 00:39:11,640 Speaker 4: the world. 756 00:39:12,280 --> 00:39:12,480 Speaker 2: Yeah. 757 00:39:12,520 --> 00:39:15,640 Speaker 5: I mean, we are verging into not doing predictions, just 758 00:39:15,640 --> 00:39:16,640 Speaker 5: doing hopes and dreams. 759 00:39:16,880 --> 00:39:19,000 Speaker 4: Yeah. Well, I did Morrissey like that last year. We 760 00:39:19,040 --> 00:39:20,040 Speaker 4: didn't get it, and I'm sad. 761 00:39:20,160 --> 00:39:22,200 Speaker 5: We need some hopes and dreams out in the world. 762 00:39:22,239 --> 00:39:24,040 Speaker 1: Fair enough. 
763 00:39:24,400 --> 00:39:28,240 Speaker 5: Yeah. Do you know what else we need? Money 764 00:39:28,440 --> 00:39:40,840 Speaker 5: from these advertisers. That's right. And we 765 00:39:40,920 --> 00:39:44,320 Speaker 1: are back. All right, Garrison, what's next? 766 00:39:44,680 --> 00:39:48,000 Speaker 5: So usually in the middle of these prediction episodes, I 767 00:39:48,400 --> 00:39:53,080 Speaker 5: like doing our, like, third annual death segment: who do 768 00:39:53,080 --> 00:39:54,920 Speaker 5: we think will die? And I guess we kind of, we 769 00:39:55,000 --> 00:39:57,120 Speaker 5: kind of touched on this briefly, but I don't think 770 00:39:57,160 --> 00:39:59,240 Speaker 5: we actually secured death for any of those people 771 00:39:59,480 --> 00:40:01,480 Speaker 5: in our predictions, just that they would, you know, have 772 00:40:01,600 --> 00:40:05,719 Speaker 5: circumstances change. Though for this year's death segment we have, 773 00:40:05,880 --> 00:40:09,360 Speaker 5: we have a bit of a twist. So it turns 774 00:40:09,360 --> 00:40:13,920 Speaker 5: out about two years ago, on Spotify Wrapped Day, we 775 00:40:14,040 --> 00:40:17,800 Speaker 5: all woke up to the news both that Angelo Badalamenti 776 00:40:17,920 --> 00:40:21,759 Speaker 5: was embarrassingly my number one Spotify artist that year, but 777 00:40:21,960 --> 00:40:28,280 Speaker 5: also that Henry Kissinger died. And this Spotify Wrapped Day, 778 00:40:29,200 --> 00:40:32,880 Speaker 5: we woke up to the news that the UnitedHealthcare 779 00:40:33,239 --> 00:40:39,200 Speaker 5: CEO was gunned down in New York City. So, Spotify 780 00:40:39,360 --> 00:40:41,960 Speaker 5: Wrapped twenty twenty five: who's dying? 781 00:40:42,040 --> 00:40:43,759 Speaker 2: Who's dying on 782 00:40:45,960 --> 00:40:49,040 Speaker 5: Spotify Wrapped Day? So this is, what, late November, 783 00:40:49,160 --> 00:40:55,480 Speaker 5: early December? We don't really know. 
Spotify Wrapped Death Day predictions. 784 00:40:55,520 --> 00:40:59,960 Speaker 1: So long, farewell, auf Wiedersehen, goodbye, Mitch McConnell. 785 00:41:00,600 --> 00:41:05,520 Speaker 2: Oh, that's a good one. That's an easy one. But okay, 786 00:41:05,800 --> 00:41:06,640 Speaker 2: I'll give it to you. 787 00:41:06,880 --> 00:41:09,879 Speaker 5: I'm thinking, like, who's got to get through most of 788 00:41:09,920 --> 00:41:13,280 Speaker 5: the year but not finish it out? You know, it's tough. 789 00:41:13,719 --> 00:41:16,560 Speaker 2: I'm gonna make my call Recep Tayyip Erdoğan. 790 00:41:16,719 --> 00:41:19,960 Speaker 2: You know, that's, that's, that's my hope. That's a long shot, 791 00:41:20,000 --> 00:41:21,960 Speaker 2: I know. Yes, he doesn't seem like he's in bad health, 792 00:41:22,040 --> 00:41:22,959 Speaker 2: but that's a big one. 793 00:41:23,400 --> 00:41:26,360 Speaker 5: Kissinger was a long shot too, because he was, like, 794 00:41:26,520 --> 00:41:27,760 Speaker 5: arguably immortal. 795 00:41:27,800 --> 00:41:29,960 Speaker 2: He'd kept living for so fucking long. 796 00:41:30,320 --> 00:41:36,960 Speaker 1: Ah, so long, farewell, auf Wiedersehen, goodbye, Musk. 797 00:41:37,600 --> 00:41:39,840 Speaker 2: I was gonna say that I think he might die. 798 00:41:40,040 --> 00:41:41,719 Speaker 5: You think we're finally gonna get that drug overdose? 799 00:41:41,880 --> 00:41:44,920 Speaker 4: So, I just—he just seems to be spiraling so 800 00:41:45,120 --> 00:41:45,840 Speaker 4: hard right 801 00:41:45,680 --> 00:41:47,120 Speaker 1: now. The spiral's mad real. 802 00:41:47,239 --> 00:41:49,520 Speaker 2: Yeah, he's getting everything he wants, though. But I mean, 803 00:41:49,520 --> 00:41:50,480 Speaker 2: that, that also— 804 00:41:50,320 --> 00:41:51,400 Speaker 5: It's, it's true. 
805 00:41:52,280 --> 00:41:56,680 Speaker 2: Sometimes that's dangerous. Yeah, especially if you are addicted to 806 00:41:56,840 --> 00:41:59,960 Speaker 2: a drug that you can get in unlimited pure quantities 807 00:42:00,080 --> 00:42:01,880 Speaker 2: and no one will ever say no to handing it 808 00:42:01,920 --> 00:42:02,120 Speaker 2: to you. 809 00:42:02,600 --> 00:42:05,520 Speaker 5: We have some more Musk predictions for later in the episode. Okay, 810 00:42:05,960 --> 00:42:08,840 Speaker 5: but I can see some of—you know, famously, 811 00:42:08,920 --> 00:42:12,479 Speaker 5: the Secret Service, you know, not, not great at hiding 812 00:42:12,520 --> 00:42:15,440 Speaker 5: their own drug problems. I can, I can see possibly, 813 00:42:15,680 --> 00:42:18,440 Speaker 5: with Musk entering a new level of comfort, maybe the 814 00:42:18,520 --> 00:42:21,480 Speaker 5: spiraling going a little, a little too far out of his control. 815 00:42:21,600 --> 00:42:24,319 Speaker 2: He and two Secret Service agents are found dead with 816 00:42:24,440 --> 00:42:25,760 Speaker 2: fentanyl-laced blow. 817 00:42:28,920 --> 00:42:35,920 Speaker 5: Maybe a SpaceX launch goes really wrong. Who's to say? 818 00:42:36,440 --> 00:42:39,960 Speaker 5: Who's to say? Damn, I gotta think of who, who 819 00:42:40,040 --> 00:42:42,520 Speaker 5: my, who my Spotify Wrapped Day death is. 820 00:42:42,719 --> 00:42:47,320 Speaker 6: I have a long shot. Oh yeah. My long shot 821 00:42:47,520 --> 00:42:51,760 Speaker 6: is that sometime on Spotify Wrapped Day, J.K. Rowling sees 822 00:42:51,800 --> 00:42:54,279 Speaker 6: a trans woman just, like, existing and gets so mad 823 00:42:54,360 --> 00:42:55,560 Speaker 6: she has an aneurysm and dies. 
824 00:42:55,560 --> 00:42:59,960 Speaker 5: No, she's looking through the Spotify Wrapped lists and she knows 825 00:43:00,440 --> 00:43:03,200 Speaker 5: that trans women make the best music, and she, she 826 00:43:03,320 --> 00:43:06,719 Speaker 5: sees it, gets so mad she just, she just keels over. 827 00:43:08,080 --> 00:43:12,239 Speaker 4: She transvestigates every single female artist on the Spotify Wrapped 828 00:43:12,280 --> 00:43:14,759 Speaker 4: list and dies of sleep deprivation doing so. 829 00:43:15,000 --> 00:43:17,160 Speaker 6: Her, her own fans start transvestigating her. 830 00:43:17,920 --> 00:43:20,360 Speaker 2: This is the edge. 831 00:43:20,840 --> 00:43:24,120 Speaker 5: Okay, I have a real long shot here, but I 832 00:43:24,160 --> 00:43:27,080 Speaker 5: can see how it could happen. So we're, we're in, 833 00:43:27,160 --> 00:43:29,840 Speaker 5: we're in, like, what, like, month ten, eleven of 834 00:43:29,920 --> 00:43:35,120 Speaker 5: Trump term two, right? The right-wing Nazi content creators 835 00:43:35,120 --> 00:43:38,600 Speaker 5: are settling, are settling into their, into their kind of groove. 836 00:43:38,920 --> 00:43:41,840 Speaker 5: Some of them aren't really happy at Trump not, like, 837 00:43:42,320 --> 00:43:44,319 Speaker 5: you know, carrying out all of, all of his big 838 00:43:44,360 --> 00:43:49,920 Speaker 5: lofty promises. And one, one disgruntled fan of Nick Fuentes 839 00:43:50,920 --> 00:43:56,000 Speaker 5: does something crazy on Spotify Wrapped Day. And that's, that's, 840 00:43:56,040 --> 00:43:59,800 Speaker 5: that's my prediction: that somehow something, like, really weird— 841 00:43:59,840 --> 00:44:05,120 Speaker 5: like, like a stalker or fan does something to, to Mister Fuentes. 
842 00:44:05,520 --> 00:44:08,360 Speaker 5: Just pure prediction on, like, just, like, what would be 843 00:44:08,360 --> 00:44:11,680 Speaker 5: the oddest, oddest thing to happen, but something that could 844 00:44:11,719 --> 00:44:14,279 Speaker 5: totally make sense. Maybe it's, like, an old, like, Kanye fan, 845 00:44:14,440 --> 00:44:16,320 Speaker 5: you know, from Kanye and Nick— 846 00:44:16,360 --> 00:44:17,360 Speaker 2: From his Nazi era. 847 00:44:17,880 --> 00:44:20,960 Speaker 5: Yeah, yeah. I don't know. I feel like it's, it's— 848 00:44:21,480 --> 00:44:24,120 Speaker 5: his fandom's getting close enough to pull some, like, weird, 849 00:44:24,160 --> 00:44:26,239 Speaker 5: crazy shit like that, on, like a weird, like, on, 850 00:44:26,320 --> 00:44:30,040 Speaker 5: like a, on, like a deeply parasocially destructive level, like 851 00:44:30,080 --> 00:44:34,319 Speaker 5: Stephen King's Misery. A Misery happens to Nick Fuentes, but 852 00:44:34,360 --> 00:44:36,520 Speaker 5: he doesn't—but he doesn't make it, he doesn't make 853 00:44:36,560 --> 00:44:39,400 Speaker 5: it out. That's, that's my Spotify Wrapped prediction. 854 00:44:40,000 --> 00:44:43,120 Speaker 2: I have said for years Nick Fuentes is going to 855 00:44:43,239 --> 00:44:47,320 Speaker 2: go down, probably—maybe, maybe live. He's gonna go 856 00:44:47,360 --> 00:44:50,000 Speaker 2: down like George Lincoln Rockwell. It is not going to 857 00:44:50,080 --> 00:44:52,279 Speaker 2: be, like, an enemy of his that does it. It's 858 00:44:52,320 --> 00:44:55,760 Speaker 2: going to be a result of his incredibly messy personal life. Yeah, 859 00:44:55,840 --> 00:44:59,160 Speaker 2: like, someone is going to take him down. 860 00:44:59,280 --> 00:45:02,239 Speaker 2: It's, it's that—yeah, yeah, that feels right. 861 00:45:03,040 --> 00:45:06,680 Speaker 1: Do we have non-categorized predictions? Is it that 862 00:45:06,719 --> 00:45:07,320 Speaker 1: time yet? 
863 00:45:07,440 --> 00:45:07,760 Speaker 2: Sure. 864 00:45:08,080 --> 00:45:11,239 Speaker 5: Now that we have, we have finished our Spotify Wrapped predictions. 865 00:45:11,600 --> 00:45:13,799 Speaker 5: And I do not know who my top artist will be. 866 00:45:14,640 --> 00:45:18,880 Speaker 5: This last year it was Trent Reznor, so salute. That's, like, okay, 867 00:45:19,040 --> 00:45:23,240 Speaker 5: Garrison. The Challengers soundtrack, that thing fucking bops. 868 00:45:23,440 --> 00:45:26,160 Speaker 1: I tried to make Robert watch that on the way 869 00:45:26,200 --> 00:45:28,120 Speaker 1: to—oh, is it the DNC or the 870 00:45:28,239 --> 00:45:31,200 Speaker 1: RNC? I don't remember. But he wouldn't 871 00:45:31,239 --> 00:45:33,880 Speaker 1: watch it with headphones, and so it was just on, 872 00:45:33,880 --> 00:45:35,800 Speaker 1: on the plane. I think it was the DNC. 873 00:45:36,560 --> 00:45:37,320 Speaker 5: That's terrible. 874 00:45:37,680 --> 00:45:39,880 Speaker 2: Yeah, I think, I think I was reading Nick Land. 875 00:45:43,280 --> 00:45:47,160 Speaker 5: Honestly, that's a vibe that actually pairs quite well. 876 00:45:47,440 --> 00:45:52,439 Speaker 2: I landed completely deranged. It was great. Ready to work. 877 00:45:53,560 --> 00:45:56,520 Speaker 1: A prediction, a prediction that I have is that, like, 878 00:45:57,400 --> 00:46:01,200 Speaker 1: Trump basically tries to move a lot of the main 879 00:46:01,640 --> 00:46:06,520 Speaker 1: time he spends to Mar-a-Lago versus the White House. 
Like, 880 00:46:06,640 --> 00:46:09,560 Speaker 1: I feel like he's going to make Mar-a-Lago 881 00:46:09,760 --> 00:46:13,799 Speaker 1: some, like, national monument type shit so that he can 882 00:46:13,880 --> 00:46:16,560 Speaker 1: take whatever the fuck documents he wants from the White 883 00:46:16,600 --> 00:46:18,560 Speaker 1: House to Mar-a-Lago and spend as much time 884 00:46:18,560 --> 00:46:22,000 Speaker 1: there as he wants and make that, like, a national 885 00:46:22,080 --> 00:46:23,160 Speaker 1: residence or some shit. 886 00:46:23,840 --> 00:46:28,600 Speaker 5: The Winter White House. Yeah, the Whiter House, we could 887 00:46:28,640 --> 00:46:29,120 Speaker 5: call it. 888 00:46:29,239 --> 00:46:36,360 Speaker 2: Yeah, so true, so true, Garrison. No, I'm kind of interested 889 00:46:36,440 --> 00:46:39,719 Speaker 2: to watch what happens with AOC over the next year, 890 00:46:39,880 --> 00:46:43,840 Speaker 2: because she has definitely become, to a lot of folks 891 00:46:43,880 --> 00:46:46,399 Speaker 2: that are progressive and on the left, like, a villain over 892 00:46:46,440 --> 00:46:51,040 Speaker 2: the last year. And I kind of wouldn't be surprised if, like— 893 00:46:51,320 --> 00:46:55,680 Speaker 2: assuming there's still politics in twenty years—when we're 894 00:46:55,719 --> 00:46:59,600 Speaker 2: talking to young people, they think of her like Pelosi. 895 00:46:59,719 --> 00:47:02,640 Speaker 2: And, like, oh, you've got to understand, when things started out, 896 00:47:02,640 --> 00:47:05,960 Speaker 2: this was a very different person. Yeah, yeah, yeah. And 897 00:47:06,040 --> 00:47:08,440 Speaker 2: I'm not saying that's a fair way to characterize her 898 00:47:08,480 --> 00:47:10,400 Speaker 2: now or where she'll go. 
I'm just saying, like, I 899 00:47:10,440 --> 00:47:12,719 Speaker 2: wouldn't be shocked if that's the way a lot of 900 00:47:12,719 --> 00:47:15,600 Speaker 2: folks are looking at it in, fucking, a few years, 901 00:47:15,600 --> 00:47:17,279 Speaker 2: because I'm saying I'm hearing a lot of that now. Yeah, 902 00:47:17,320 --> 00:47:21,399 Speaker 2: people are very angry at her over, largely, Gaza. But yeah, 903 00:47:21,480 --> 00:47:23,799 Speaker 2: also the fact that she and Bernie both tried to 904 00:47:23,800 --> 00:47:30,200 Speaker 2: back Biden kind of, yeah, late in his, uh, senescence. Yeah. Okay. 905 00:47:30,200 --> 00:47:32,319 Speaker 6: My, my big one for the year is this is 906 00:47:32,320 --> 00:47:34,560 Speaker 6: the, this is the year the economy finally collapses. Like, 907 00:47:34,600 --> 00:47:36,600 Speaker 6: this is the year you find out that no company 908 00:47:36,600 --> 00:47:40,000 Speaker 6: has made any fucking money in a decade. It's all 909 00:47:40,040 --> 00:47:44,200 Speaker 6: been being pumped up by, like, a deranged combination of 910 00:47:44,239 --> 00:47:48,160 Speaker 6: interest rate bullshit, a bunch of fucking money from, like, 911 00:47:48,239 --> 00:47:51,640 Speaker 6: overnight repo purchases keeping the banks propped up. And I 912 00:47:51,880 --> 00:47:53,279 Speaker 6: don't know if it's gonna be the trade war that 913 00:47:53,320 --> 00:47:55,280 Speaker 6: fucking blows it up, although I think that will instantly 914 00:47:55,280 --> 00:47:56,800 Speaker 6: detonate everything. 915 00:47:56,880 --> 00:47:57,160 Speaker 2: I don't know. 916 00:47:57,200 --> 00:47:58,920 Speaker 6: Maybe it's, maybe it's the Chinese housing bubble, maybe the 917 00:47:58,920 --> 00:48:01,080 Speaker 6: tech bubble finally collapses. Maybe all three of them hit 918 00:48:01,120 --> 00:48:03,040 Speaker 6: at the same time. This is the year it fucking goes. 
919 00:48:03,480 --> 00:48:05,920 Speaker 6: I've never actually put my name down, down on this, 920 00:48:06,000 --> 00:48:07,680 Speaker 6: on the show, on any other fucking year. This is 921 00:48:07,719 --> 00:48:11,520 Speaker 6: the year the zombie economy will fall over dead. The 922 00:48:11,560 --> 00:48:12,840 Speaker 6: necromancy cannot hold. 923 00:48:14,160 --> 00:48:16,560 Speaker 2: I guess my prediction is that the economy is going 924 00:48:16,600 --> 00:48:20,960 Speaker 2: to be basically identical to the Biden economy, in that 925 00:48:21,120 --> 00:48:25,040 Speaker 2: we're going to get, like, fucked-up inflation and people 926 00:48:25,040 --> 00:48:27,560 Speaker 2: are going to be very angry, and the number will 927 00:48:27,560 --> 00:48:29,680 Speaker 2: continue to go up on the stock market, because that's 928 00:48:29,719 --> 00:48:32,000 Speaker 2: kind of what it's designed to do. That's my theory. 929 00:48:32,239 --> 00:48:34,759 Speaker 1: And the housing market will still be trash. 930 00:48:35,080 --> 00:48:38,120 Speaker 2: Yeah, we will never afford homes, and the housing's just 931 00:48:38,160 --> 00:48:41,360 Speaker 2: gonna get more expensive. It will be interesting to see 932 00:48:41,840 --> 00:48:44,800 Speaker 2: Trump's entire—all of his backers and his whole media. 933 00:48:45,080 --> 00:48:47,120 Speaker 2: Like, one thing that will be easier for the left 934 00:48:47,600 --> 00:48:51,120 Speaker 2: is really hitting conservatives on inflation as it gets horrible 935 00:48:51,160 --> 00:48:55,480 Speaker 2: again or continues to suck, because that's, you know, at 936 00:48:55,480 --> 00:49:00,160 Speaker 2: this point, just a factor of the economy working as intended. Yeah, 937 00:49:00,200 --> 00:49:03,000 Speaker 2: that they all have to pretend it isn't. Yeah. 
938 00:49:03,120 --> 00:49:04,680 Speaker 1: And before we go to a break, I just want 939 00:49:04,680 --> 00:49:06,799 Speaker 1: to say the price of eggs will go up. 940 00:49:07,239 --> 00:49:09,479 Speaker 2: I need to get chickens now. Oh yeah, this bird 941 00:49:09,480 --> 00:49:11,799 Speaker 2: flu thing is not gonna help with eggs. Oh boy, 942 00:49:12,520 --> 00:49:15,200 Speaker 2: oh boy. Get your eggs now. Buy one hundred, buy 943 00:49:15,280 --> 00:49:16,760 Speaker 2: thousands of dollars of eggs 944 00:49:16,800 --> 00:49:19,880 Speaker 4: now. There was some kind of device to make eggs 945 00:49:19,920 --> 00:49:22,640 Speaker 4: that you could have in your own—God, oh— 946 00:49:22,440 --> 00:49:31,920 Speaker 1: my goodness, it's time for ads. 947 00:49:35,480 --> 00:49:38,680 Speaker 5: I guess to piggyback off of Robert and Mia's predictions 948 00:49:38,719 --> 00:49:42,560 Speaker 5: there on the economy, my prediction is that once I 949 00:49:42,800 --> 00:49:46,120 Speaker 5: finally launch Cool Zone Coin this year, I'm gonna make 950 00:49:46,120 --> 00:49:52,000 Speaker 5: it big. If the economy is gonna go down, I 951 00:49:52,040 --> 00:49:54,359 Speaker 5: am gonna be going up. Everyone's gonna start buying Cool 952 00:49:54,440 --> 00:49:58,239 Speaker 5: Zone Coin because the US dollar becomes worthless. Bitcoin's gonna 953 00:49:58,239 --> 00:50:01,720 Speaker 5: crash too. It's fake, but Cool Zone Coin has real, 954 00:50:01,920 --> 00:50:02,960 Speaker 5: fungible value. 955 00:50:03,480 --> 00:50:05,799 Speaker 2: Well, yeah, the thing about Cool Zone Coin 
that makes 956 00:50:05,840 --> 00:50:08,640 Speaker 2: it different from all of the other crypto coins is 957 00:50:08,680 --> 00:50:11,800 Speaker 2: that it is really based on a fundamentally limited and 958 00:50:11,920 --> 00:50:14,840 Speaker 2: valuable resource, which is movies from the nineties that I 959 00:50:14,960 --> 00:50:18,879 Speaker 2: showed Garrison and they actually liked. So, you know, there's, 960 00:50:19,080 --> 00:50:22,480 Speaker 2: there's only so many Cool Zone Coins that can be 961 00:50:22,520 --> 00:50:23,359 Speaker 2: in circulation. 962 00:50:24,719 --> 00:50:26,799 Speaker 5: We're lucky I was in Portland this Christmas, because we 963 00:50:26,840 --> 00:50:29,040 Speaker 5: really stocked up a few more of those nineties classics 964 00:50:29,040 --> 00:50:30,960 Speaker 5: to bump up the price of Cool Zone Coin going 965 00:50:31,000 --> 00:50:31,879 Speaker 5: into twenty twenty five. 966 00:50:31,920 --> 00:50:36,120 Speaker 4: That's right, everybody. Wow. Sell your house, buy Cool Zone Coin. 967 00:50:36,800 --> 00:50:40,160 Speaker 2: Have you seen Hook, Garrison? I have seen Hook. I liked it, 968 00:50:40,480 --> 00:50:43,040 Speaker 2: of course. Good. Yeah, a classic. 969 00:50:43,560 --> 00:50:45,799 Speaker 1: Have you seen The Wicker Man, nineteen seventy three? 970 00:50:46,320 --> 00:50:48,520 Speaker 5: You know, I actually haven't. I've been waiting to catch 971 00:50:48,560 --> 00:50:49,320 Speaker 5: it in the theater. 972 00:50:49,680 --> 00:50:52,359 Speaker 1: We will make this happen at some point, if necessary. 973 00:50:52,680 --> 00:50:54,719 Speaker 5: I would, I would love to. I would love to. 974 00:50:55,280 --> 00:50:58,760 Speaker 4: I'll bet one thing I think is very predictable: border stuff. 
975 00:50:58,800 --> 00:51:02,399 Speaker 4: They will stunt on a caravan of migrants, and 976 00:51:02,440 --> 00:51:04,400 Speaker 4: I think it's pretty easy for them to kind of 977 00:51:04,480 --> 00:51:06,840 Speaker 4: organize that and make that happen, and it will be 978 00:51:07,280 --> 00:51:10,480 Speaker 4: a way for Trump to flex his border fascism, yeah, 979 00:51:10,560 --> 00:51:12,359 Speaker 4: much like he did in twenty eighteen. Maybe they'll wait 980 00:51:12,400 --> 00:51:15,240 Speaker 4: till the midterms again. There's always a fun border disaster 981 00:51:15,320 --> 00:51:16,080 Speaker 4: for the midterms. 982 00:51:16,320 --> 00:51:19,040 Speaker 1: Could I just do one that might not be a prediction, 983 00:51:19,160 --> 00:51:20,520 Speaker 1: but, like, a Sophie hope? 984 00:51:20,800 --> 00:51:22,600 Speaker 2: Sure, yeah, get it. 985 00:51:22,719 --> 00:51:25,440 Speaker 1: Something has to happen to those Paul brothers. 986 00:51:25,480 --> 00:51:29,120 Speaker 2: Oh, Sophie. Oh yeah, that's possible. Yeah. My prediction for 987 00:51:29,200 --> 00:51:31,719 Speaker 2: the Paul brothers is that one of them dies within 988 00:51:31,760 --> 00:51:33,520 Speaker 2: the next five years, and one of them lives to 989 00:51:33,560 --> 00:51:34,560 Speaker 2: be one hundred and seven. 990 00:51:34,719 --> 00:51:36,960 Speaker 5: Oh, that tracks, sure. 991 00:51:37,040 --> 00:51:39,040 Speaker 4: Yeah, they decide to take on Bob Dylan in a 992 00:51:39,080 --> 00:51:41,560 Speaker 4: boxing match and only one of them survives. 993 00:51:41,600 --> 00:51:43,600 Speaker 2: I think Bob Dylan will live through this next year. 994 00:51:43,680 --> 00:51:47,120 Speaker 4: But I've just found Bob Dylan's tweets the purest thing. 995 00:51:47,920 --> 00:51:49,480 Speaker 4: He just tweets about what he's doing. 996 00:51:51,080 --> 00:51:51,680 Speaker 2: What a hero. 
997 00:51:52,520 --> 00:51:57,680 Speaker 1: Netflix paying Jake Paul billions of dollars to fight 998 00:51:57,840 --> 00:52:01,280 Speaker 1: nine-hundred-year-old Mike Tyson, and then Jake Paul 999 00:52:01,800 --> 00:52:06,920 Speaker 1: coming in on, like, a vintage car and spraying his 1000 00:52:07,160 --> 00:52:11,719 Speaker 1: product, and it having higher streaming numbers than the Super Bowl? 1001 00:52:11,920 --> 00:52:12,399 Speaker 5: Is that real? 1002 00:52:12,600 --> 00:52:12,839 Speaker 1: Yes. 1003 00:52:13,719 --> 00:52:15,960 Speaker 6: To be fair, that was a rancid Super Bowl. 1004 00:52:17,000 --> 00:52:21,160 Speaker 1: Rancid Super Bowl? This, this cannot—this cannot be. 1005 00:52:21,640 --> 00:52:23,319 Speaker 4: Most of us just tuned in on the off chance 1006 00:52:23,360 --> 00:52:25,000 Speaker 4: that Jake Paul would die. 1007 00:52:25,040 --> 00:52:25,960 Speaker 5: Yes, that is true. 1008 00:52:26,120 --> 00:52:29,160 Speaker 4: That is true. Or at least get bitten. 1009 00:52:29,440 --> 00:52:32,760 Speaker 1: Like, yeah, all of us were hoping that Mike Tyson 1010 00:52:33,080 --> 00:52:36,400 Speaker 1: was not in fact sixty years old. But he is 1011 00:52:36,640 --> 00:52:42,320 Speaker 1: sixty years old. So, uh, yeah—God, yeah, something, 1012 00:52:42,520 --> 00:52:44,919 Speaker 1: something's gotta give. Oh, and there won't be a left- 1013 00:52:44,960 --> 00:52:46,359 Speaker 1: wing Joe Rogan. Thank you so much. 1014 00:52:46,440 --> 00:52:47,800 Speaker 2: Oh, I don't know, Sophie. 1015 00:52:47,480 --> 00:52:50,279 Speaker 5: I can—as soon as we—logicals so good, I 1016 00:52:50,320 --> 00:52:50,879 Speaker 5: think we can. 1017 00:52:51,000 --> 00:52:51,239 Speaker 2: Really? 1018 00:52:51,600 --> 00:52:55,960 Speaker 1: Oh, there'll be, there'll be somebody trying to be. 1019 00:52:56,840 --> 00:52:59,560 Speaker 2: Oh, Sophie, there already is. 
By the way, it's time for 1020 00:52:59,600 --> 00:53:02,239 Speaker 2: me to do our new ad plug. You've heard of 1021 00:53:02,239 --> 00:53:04,719 Speaker 2: how good elk meat is for you, and you've heard 1022 00:53:04,719 --> 00:53:08,400 Speaker 2: of how liver is a superfood. Well, now try new 1023 00:53:08,560 --> 00:53:12,200 Speaker 2: elk liver steaks. It's just, just ground-up liver shoved 1024 00:53:12,239 --> 00:53:14,959 Speaker 2: inside a steak. I send it through the mail, through 1025 00:53:14,960 --> 00:53:19,960 Speaker 2: FedEx five-day delivery. It is not refrigerated in any way. 1026 00:53:19,880 --> 00:53:22,759 Speaker 5: No refrigeration. It's better at room temp, better at 1027 00:53:22,760 --> 00:53:37,640 Speaker 4: room temp to get the healthy bacteria. It gives you mystical powers. 1028 00:53:33,880 --> 00:53:37,200 Speaker 5: One of my, I guess, more hopes and still 1029 00:53:37,200 --> 00:53:40,400 Speaker 5: partial predictions is that the National Guard gets into a scuffle 1030 00:53:40,440 --> 00:53:42,520 Speaker 5: with Border Patrol in some kind of blue state. 1031 00:53:42,680 --> 00:53:43,880 Speaker 2: Yeah, yeah, good chance. 1032 00:53:43,960 --> 00:53:47,840 Speaker 5: Some brave and strong governor is gonna, is 1033 00:53:47,840 --> 00:53:51,640 Speaker 5: gonna salute the troops and send out our proud National 1034 00:53:51,640 --> 00:53:55,160 Speaker 5: Guard boys to fight off ICE. That's just a battle 1035 00:53:55,160 --> 00:53:57,120 Speaker 5: I would love to see. I've wanted to see that 1036 00:53:57,160 --> 00:53:59,920 Speaker 5: ever since Portland twenty twenty. I've wanted to see National 1037 00:54:00,080 --> 00:54:03,160 Speaker 5: Guard troops fight against federal forces. 1038 00:54:02,760 --> 00:54:04,759 Speaker 2: Two groups of men who don't really know how to 1039 00:54:04,880 --> 00:54:09,719 Speaker 2: use their guns, using their guns. No, it's gonna be amazing. 
1040 00:54:09,920 --> 00:54:12,400 Speaker 5: It's a battle I've wanted to see for, like, five years. 1041 00:54:12,560 --> 00:54:15,680 Speaker 2: Whose plate carriers are at the top, closer to their nipples? 1042 00:54:17,120 --> 00:54:18,200 Speaker 2: It's anyone's game. 1043 00:54:19,400 --> 00:54:21,560 Speaker 5: I need to see it. I need to see it. 1044 00:54:22,040 --> 00:54:22,399 Speaker 5: Come on. 1045 00:54:23,040 --> 00:54:24,759 Speaker 4: I would like to see it from a distance, because 1046 00:54:24,760 --> 00:54:26,359 Speaker 4: that would be a shit show. 1047 00:54:26,440 --> 00:54:28,200 Speaker 2: Yeah, from a sizable distance. 1048 00:54:28,520 --> 00:54:30,279 Speaker 5: Yeah. General Whitmer, let's go. 1049 00:54:30,440 --> 00:54:34,359 Speaker 2: Let's go. Expensing a fucking telescope for that firefight. Yeah, yeah, 1050 00:54:34,800 --> 00:54:38,239 Speaker 2: a periscope, maybe. I trust the Iraqi Army more than 1051 00:54:38,280 --> 00:54:39,680 Speaker 2: either of those sides. 1052 00:54:41,239 --> 00:54:43,520 Speaker 4: I've seen a lot of dudes fire guns while ducking 1053 00:54:43,560 --> 00:54:44,799 Speaker 4: behind a KB, holding the gun. 1054 00:54:44,960 --> 00:54:45,400 Speaker 1: I love you. 1055 00:54:45,480 --> 00:54:47,359 Speaker 2: And then—it does look fun. It does look fun. 1056 00:54:47,960 --> 00:54:48,239 Speaker 5: It is. 1057 00:54:48,360 --> 00:54:51,000 Speaker 4: Yeah, definitely. I would like to do that, but they kick 1058 00:54:51,000 --> 00:54:53,760 Speaker 4: me out of the range every time because of woke. How sad. 1059 00:54:53,880 --> 00:54:57,200 Speaker 5: Well, not anymore, James. Yeah, that's also the casualties yet. 1060 00:54:57,239 --> 00:54:58,800 Speaker 5: Not anymore, James. Woke is beaten. 1061 00:55:00,160 --> 00:55:02,359 Speaker 2: That's right. Yeah, they went woke, they went broke. 
1062 00:55:02,440 --> 00:55:04,879 Speaker 4: I'm going to buy the range, that's right, and we'll 1063 00:55:04,920 --> 00:55:06,920 Speaker 4: all fire from behind the buttress. 1064 00:55:06,560 --> 00:55:09,240 Speaker 5: Now, oh mama. 1065 00:55:09,520 --> 00:55:09,680 Speaker 3: Yeah. 1066 00:55:09,760 --> 00:55:10,160 Speaker 2: Yeah. 1067 00:55:10,440 --> 00:55:13,840 Speaker 4: Other predictions, maybe we'll get a good, solid couple of 1068 00:55:13,840 --> 00:55:15,919 Speaker 4: weeks of rioting again, and, like Garrison said, maybe 1069 00:55:15,920 --> 00:55:17,600 Speaker 4: it'll let me take a year or two this time. 1070 00:55:18,600 --> 00:55:20,360 Speaker 5: I don't think that anymore. 1071 00:55:20,560 --> 00:55:23,799 Speaker 2: Something will have to change. Yeah, there will have to 1072 00:55:23,840 --> 00:55:30,720 Speaker 2: be a material change in either organizing or social conditions, 1073 00:55:31,160 --> 00:55:34,200 Speaker 2: because people will need to either be vastly more desperate 1074 00:55:34,239 --> 00:55:36,279 Speaker 2: than they are right now, or they will need to 1075 00:55:36,280 --> 00:55:38,680 Speaker 2: have a specific reason to think, well, this time, getting 1076 00:55:38,719 --> 00:55:41,880 Speaker 2: out in the street might do something. Yeah. 1077 00:55:41,960 --> 00:55:45,520 Speaker 5: I think we're gonna kind of continue the trends that 1078 00:55:45,520 --> 00:55:48,440 Speaker 5: we've been seeing, which point towards a bit 1079 00:55:48,480 --> 00:55:52,960 Speaker 5: of an apathy towards, like, big popular mobilizations and 1080 00:55:53,120 --> 00:55:57,799 Speaker 5: more towards kind of bizarre lone wolf attacks, something that, 1081 00:55:57,880 --> 00:56:00,680 Speaker 5: you know, could be even slightly probable, or, 1082 00:56:00,719 --> 00:56:03,480 Speaker 5: you know, possibly darker predictions.
I think we'll have like 1083 00:56:03,520 --> 00:56:10,520 Speaker 5: a really bad Luigi copycat within the next like four months. Sure, yes, 1084 00:56:10,880 --> 00:56:13,239 Speaker 5: the year of Luigi. Like, it's not gonna be good. 1085 00:56:13,600 --> 00:56:16,239 Speaker 2: It's not gonna be good. There's probably gonna be a 1086 00:56:16,280 --> 00:56:19,200 Speaker 2: situation where some guy... either the best case is 1087 00:56:19,200 --> 00:56:22,080 Speaker 2: that he gets killed immediately by the dude's security. The 1088 00:56:22,120 --> 00:56:24,840 Speaker 2: worst case is there's a big public firefight and a 1089 00:56:24,880 --> 00:56:26,239 Speaker 2: whole fuck-load of people get hit. 1090 00:56:26,400 --> 00:56:28,279 Speaker 4: Yeah, didn't I predict that there would be a big 1091 00:56:28,320 --> 00:56:30,520 Speaker 4: public crime with a three-D printed gun last year? 1092 00:56:30,920 --> 00:56:33,400 Speaker 5: I think that was the year before we talked about that. 1093 00:56:34,120 --> 00:56:37,400 Speaker 2: Oh damn, okay, so close, and. 1094 00:56:37,920 --> 00:56:40,399 Speaker 5: Yeah, you know, I mean, this certainly does kind 1095 00:56:40,400 --> 00:56:43,359 Speaker 5: of fit that mold. We'll see how much that, like, 1096 00:56:43,719 --> 00:56:46,640 Speaker 5: gets focused on in the trial and, like, continued reporting. 1097 00:56:47,080 --> 00:56:50,120 Speaker 4: Yeah, and in that legislation too, I missed a death. 1098 00:56:50,160 --> 00:56:52,959 Speaker 4: We can also include it in the hope section. Matthew Yglesias, 1099 00:56:53,040 --> 00:56:58,280 Speaker 4: that motherfucker, motherfucker has been spouting bullshit for twenty years. 1100 00:56:58,719 --> 00:57:01,600 Speaker 4: It just cannot... he knows he's lost the 1101 00:57:01,680 --> 00:57:03,520 Speaker 4: juice a little bit. I think he's on the 1102 00:57:03,520 --> 00:57:03,879 Speaker 4: way out.
1103 00:57:04,120 --> 00:57:07,040 Speaker 2: All right. Something very funny did just happen that we 1104 00:57:07,040 --> 00:57:11,440 Speaker 2: should talk about as a team. Senator Doug Mastriano, 1105 00:57:11,920 --> 00:57:14,640 Speaker 2: a thirty-year US Army veteran who taught at the 1106 00:57:14,680 --> 00:57:19,680 Speaker 2: War College, just tweeted an indignant, furious tweet about the 1107 00:57:19,800 --> 00:57:23,040 Speaker 2: US government not being honest with Americans about, like, what's 1108 00:57:23,040 --> 00:57:25,480 Speaker 2: happening with these drones? And yeah, the picture of 1109 00:57:25,560 --> 00:57:28,720 Speaker 2: the crashed drone is a TIE fighter. That's like a 1110 00:57:28,720 --> 00:57:30,880 Speaker 2: model TIE fighter on the bed of a flatbed. 1111 00:57:31,880 --> 00:57:36,680 Speaker 5: Yes, we've all lost our minds. 1112 00:57:36,920 --> 00:57:41,360 Speaker 2: Taught at the US Army War College. They're not sending 1113 00:57:41,400 --> 00:57:46,360 Speaker 2: their best people. Oh fuck, that's funny. Amazing stuff. That's 1114 00:57:46,400 --> 00:57:51,000 Speaker 2: one of the best things I've seen out here. Oh, 1115 00:57:51,320 --> 00:57:51,880 Speaker 2: goodness me. 1116 00:57:53,760 --> 00:57:56,200 Speaker 5: Finally, I'd like to close our predictions a little bit on 1117 00:57:56,480 --> 00:57:59,680 Speaker 5: Trump's cabinet. I think it's pretty safe to say, 1118 00:57:59,680 --> 00:58:03,000 Speaker 5: considering his last presidency, we'll have at least one-third 1119 00:58:03,120 --> 00:58:06,080 Speaker 5: cabinet turnover by the end of the year. Yeah, this 1120 00:58:06,120 --> 00:58:08,280 Speaker 5: is something that we've been talking about a lot.
When 1121 00:58:08,320 --> 00:58:11,000 Speaker 5: do we think Musk is gonna get the boot? And 1122 00:58:11,440 --> 00:58:13,520 Speaker 5: based on the way Trump's kind of positioned him, I'm 1123 00:58:13,520 --> 00:58:14,840 Speaker 5: not sure if it's gonna be as soon as what 1124 00:58:14,880 --> 00:58:17,360 Speaker 5: we all kind of initially thought, because Trump has kept 1125 00:58:17,440 --> 00:58:21,360 Speaker 5: him out of his inner orbit but pretty solidly in 1126 00:58:21,440 --> 00:58:25,919 Speaker 5: his middle orbit. Like, he's not in any, like, real position, right? Yeah, 1127 00:58:25,960 --> 00:58:27,400 Speaker 5: he has DOGE. Like, come on. 1128 00:58:27,800 --> 00:58:29,160 Speaker 2: It just came out that he's not going to be 1129 00:58:29,200 --> 00:58:31,680 Speaker 2: able to get the highest security clearance. There 1130 00:58:31,720 --> 00:58:34,200 Speaker 2: you go. That's funny. 1131 00:58:34,240 --> 00:58:36,880 Speaker 1: But, like, he has him sitting next to his family 1132 00:58:36,960 --> 00:58:38,280 Speaker 1: at Thanksgiving, totally. 1133 00:58:38,400 --> 00:58:41,800 Speaker 5: Yeah, yeah, no, no, totally. And especially in, like, the 1134 00:58:42,000 --> 00:58:44,680 Speaker 5: three weeks after the election, they were, like, they were 1135 00:58:44,720 --> 00:58:48,439 Speaker 5: like, honeymooning, right? They were neck and neck, and that's something 1136 00:58:48,480 --> 00:58:51,480 Speaker 5: that's gonna, like, start dissipating. Musk can't get fully booted 1137 00:58:51,480 --> 00:58:55,120 Speaker 5: out because, like, you know, the federal government needs SpaceX, 1138 00:58:55,200 --> 00:58:58,960 Speaker 5: unlike Musk's other, like, technologies, so, like, they 1139 00:58:58,960 --> 00:59:01,080 Speaker 5: will remain friendly, but, like, they're not going to be 1140 00:59:01,120 --> 00:59:04,040 Speaker 5: in the close position that they are now.
I initially 1141 00:59:04,200 --> 00:59:07,600 Speaker 5: put that date at March twentieth, twenty twenty-five, 1142 00:59:07,760 --> 00:59:10,760 Speaker 5: you know, two months after inauguration day. It's 1143 00:59:10,840 --> 00:59:13,200 Speaker 5: enough time, you know, for someone like 1144 00:59:13,200 --> 00:59:16,760 Speaker 5: Trump to get tired of Musk's, like, personality. But I 1145 00:59:17,000 --> 00:59:19,479 Speaker 5: think I might stretch that out a little bit more 1146 00:59:19,600 --> 00:59:22,600 Speaker 5: now than my initial prediction. I think they 1147 00:59:22,680 --> 00:59:24,680 Speaker 5: might do a little bit more of a long-term 1148 00:59:24,880 --> 00:59:27,760 Speaker 5: game here. But that also means that Musk maybe 1149 00:59:27,760 --> 00:59:31,240 Speaker 5: will not have as much, like, constant influence as what 1150 00:59:31,280 --> 00:59:33,000 Speaker 5: it was first looking like in those, you know, 1151 00:59:33,040 --> 00:59:34,640 Speaker 5: three months after the election. 1152 00:59:34,720 --> 00:59:37,760 Speaker 2: I think that RFK Junior is probably pushed out 1153 00:59:37,760 --> 00:59:39,200 Speaker 2: of the picture before Musk is. 1154 00:59:39,880 --> 00:59:41,840 Speaker 1: Yeah, if he tries to get rid of the fucking 1155 00:59:41,960 --> 00:59:45,880 Speaker 1: polio vaccine, it's gonna be a real quick trip to 1156 00:59:45,920 --> 00:59:47,919 Speaker 1: the unemployment line for Bobby Boy. 1157 00:59:48,920 --> 00:59:53,760 Speaker 2: Yeah, I really... I don't think Trump's that reckless. No, 1158 00:59:54,040 --> 00:59:56,960 Speaker 2: like, that would be quite a line, to 1159 00:59:57,360 --> 00:59:59,440 Speaker 2: get rid of the polio vaccine. 1160 00:59:59,680 --> 01:00:03,800 Speaker 4: Trump is still so old, like, he remembers. He's old.
1161 01:00:04,920 --> 01:00:07,560 Speaker 1: But if, but if RFK Junior could get the wheat 1162 01:00:07,800 --> 01:00:10,800 Speaker 1: ingredient out of the McDonald's fries, I'd be most obliged. 1163 01:00:11,280 --> 01:00:13,960 Speaker 5: Oh yeah, no, I'm sure that he's 1164 01:00:13,960 --> 01:00:17,160 Speaker 5: gonna reverse one hundred years of corn subsidies and get 1165 01:00:17,200 --> 01:00:19,920 Speaker 5: corn syrup out of our Coca-Cola. I believe in RFK. 1166 01:00:20,280 --> 01:00:24,000 Speaker 2: Yeah, I feel pretty good about the continuing legality of 1167 01:00:24,080 --> 01:00:26,240 Speaker 2: kratom as long as he's the HHS head. 1168 01:00:26,720 --> 01:00:27,280 Speaker 5: There you go. 1169 01:00:27,320 --> 01:00:29,680 Speaker 2: All it's gonna take is one of Joe Rogan's friends 1170 01:00:29,760 --> 01:00:30,720 Speaker 2: speaking in his ear. 1171 01:00:32,360 --> 01:00:35,400 Speaker 5: We'll be all right. We're gonna have legally required DMT 1172 01:00:35,640 --> 01:00:37,120 Speaker 5: for everyone in the country. 1173 01:00:38,400 --> 01:00:41,200 Speaker 2: Yeah, why not? I think we need, and I've been 1174 01:00:41,240 --> 01:00:43,480 Speaker 2: saying this for years, we need to put 1175 01:00:43,480 --> 01:00:45,640 Speaker 2: the lithium back in the water. We also need to 1176 01:00:45,720 --> 01:00:48,360 Speaker 2: use those crop-dusting planes and just, like, fill them 1177 01:00:48,360 --> 01:00:54,320 Speaker 2: with Xanax. Just, just calm everyone down, take everything 1178 01:00:54,360 --> 01:00:55,600 Speaker 2: back a couple of steps. 1179 01:00:56,400 --> 01:00:59,440 Speaker 1: All right, I'm gonna go pet some dogs, so the 1180 01:00:59,480 --> 01:01:03,720 Speaker 1: podcast is over. Happy New Year, everyone. Happy, happy new year, everyone.
1181 01:01:04,080 --> 01:01:06,760 Speaker 1: I do want everyone to pick one thing that 1182 01:01:06,760 --> 01:01:09,360 Speaker 1: they're gonna do this year that will improve their life, 1183 01:01:09,720 --> 01:01:10,560 Speaker 1: however small. 1184 01:01:10,840 --> 01:01:13,400 Speaker 5: For me, I'm gonna get a new mirror. We're gonna 1185 01:01:13,400 --> 01:01:15,960 Speaker 5: all pick one thing. We call that Project twenty twenty five. 1186 01:01:16,240 --> 01:01:18,439 Speaker 5: It's the one thing we can do to improve our 1187 01:01:18,520 --> 01:01:21,480 Speaker 5: lives and, you know, then by extension, the lives 1188 01:01:21,520 --> 01:01:23,680 Speaker 5: of everyone else around us. So make sure everyone has 1189 01:01:23,720 --> 01:01:26,200 Speaker 5: their own personal Project twenty twenty five going into this 1190 01:01:26,240 --> 01:01:28,040 Speaker 5: next year. I think we will need it. 1191 01:01:28,120 --> 01:01:31,200 Speaker 1: Yeah, I'm holding my Project twenty twenty five in my 1192 01:01:31,360 --> 01:01:31,960 Speaker 1: arms right now. 1193 01:01:32,040 --> 01:01:34,160 Speaker 5: Your new dog, your new dog. 1194 01:01:34,320 --> 01:01:37,960 Speaker 1: I adopted Anderson a sibling, and her name is Truman. 1195 01:01:37,880 --> 01:01:41,320 Speaker 2: Lovely, after our greatest US president. 1196 01:01:41,520 --> 01:01:44,280 Speaker 1: Not after the greatest US president. I would never name a child 1197 01:01:44,280 --> 01:01:45,600 Speaker 1: of mine after a president. 1198 01:01:46,160 --> 01:01:49,320 Speaker 5: After the sheriff in Twin Peaks, that's right. Also, no, 1199 01:01:49,480 --> 01:01:50,840 Speaker 5: all right, well, we love that. 1200 01:01:51,600 --> 01:01:54,800 Speaker 4: After the guy who grew up in the 1201 01:01:54,840 --> 01:01:58,000 Speaker 4: Truman Show? Gaetz? Is it Gaetz? 1202 01:01:58,080 --> 01:01:59,520 Speaker 2: Yeah, you named her after Matt Gaetz.
1203 01:01:59,560 --> 01:02:03,920 Speaker 5: His childhood is like a totally invented job. Now that's so funny. 1204 01:02:04,000 --> 01:02:07,200 Speaker 1: It's very funny. It's very, very funny, and I feel 1205 01:02:07,240 --> 01:02:09,800 Speaker 1: like we should end on that note. 1206 01:02:09,840 --> 01:02:15,880 Speaker 1: So ha to Matt Gaetz. Anyways, Anderson, Truman, let's get the 1207 01:02:15,880 --> 01:02:16,640 Speaker 1: fuck out. 1208 01:02:16,440 --> 01:02:33,360 Speaker 2: Of here. Welcome back to It Could Happen Here, a 1209 01:02:33,440 --> 01:02:36,520 Speaker 2: podcast about it, which in this week's case is: the 1210 01:02:36,560 --> 01:02:41,160 Speaker 2: Consumer Electronics Show is happening here. And yeah, we're here 1211 01:02:41,200 --> 01:02:43,680 Speaker 2: to talk about things falling apart, and again, in this 1212 01:02:43,760 --> 01:02:47,200 Speaker 2: case that's the tech industry, because the story this CES, 1213 01:02:47,320 --> 01:02:49,440 Speaker 2: as it has been for the last several CESes, is 1214 01:02:49,440 --> 01:02:52,960 Speaker 2: the continuing degradation of big tech as it 1215 01:02:53,080 --> 01:02:56,960 Speaker 2: seeks more places to get money from while providing less 1216 01:02:57,000 --> 01:02:59,080 Speaker 2: and less utility to the people that it needs to 1217 01:02:59,120 --> 01:03:02,080 Speaker 2: give it money. And every CES, at some point I 1218 01:03:02,120 --> 01:03:04,800 Speaker 2: find myself face to face with something that makes me say, 1219 01:03:05,000 --> 01:03:08,840 Speaker 2: I've now seen the silliest thing I've ever seen. And 1220 01:03:08,920 --> 01:03:11,920 Speaker 2: this year, that experience happened for the first time within 1221 01:03:12,080 --> 01:03:14,680 Speaker 2: thirty minutes of the first half day.
And I'm going 1222 01:03:14,720 --> 01:03:17,680 Speaker 2: to talk about that and show some videos to my 1223 01:03:18,440 --> 01:03:21,560 Speaker 2: panelists here, which of course are the great Ed Zitron, 1224 01:03:21,720 --> 01:03:24,560 Speaker 2: It's me, I'm here, the pretty good Garrison Davis. 1225 01:03:24,640 --> 01:03:29,680 Speaker 5: Okay, thanks, okay, all right, all right. 1226 01:03:28,960 --> 01:03:32,280 Speaker 2: And the supernumerate... supernumerary. I'm sorry, I messed up the 1227 01:03:32,320 --> 01:03:34,400 Speaker 2: word I was using as a superlative to praise you. 1228 01:03:35,000 --> 01:03:36,440 Speaker 5: I'll take it anyways. 1229 01:03:36,480 --> 01:03:40,280 Speaker 2: Thanks, Ed, thank you so much for joining us. Everybody, 1230 01:03:40,320 --> 01:03:42,560 Speaker 2: are you ready to see some of the dumbest AI- 1231 01:03:42,640 --> 01:03:43,560 Speaker 2: generated videos? 1232 01:03:43,600 --> 01:03:45,960 Speaker 7: Sure, fill me with more pleasure. 1233 01:03:46,160 --> 01:03:47,120 Speaker 2: Excellent, excellent. 1234 01:03:47,240 --> 01:03:48,400 Speaker 8: Nothing fills me with pleasure. 1235 01:03:48,520 --> 01:03:51,000 Speaker 2: So the first panel I sat down with today at 1236 01:03:51,040 --> 01:03:54,480 Speaker 2: ten a.m. in the goddamn morning, Jesus, was the Hollywood 1237 01:03:54,480 --> 01:03:58,400 Speaker 2: Trajectory: Generative AI Timeline, twenty twenty-five to twenty thirty. 1238 01:03:58,480 --> 01:04:01,200 Speaker 5: Oh boy, I am fascinated for what they think will 1239 01:04:01,200 --> 01:04:02,480 Speaker 5: happen in twenty thirty. 1240 01:04:03,400 --> 01:04:06,720 Speaker 2: Everything's just gonna get better, Garrison.
This panel featured a 1241 01:04:06,800 --> 01:04:10,600 Speaker 2: number of luminary thinkers, including Mary Hamilton, a managing director 1242 01:04:10,640 --> 01:04:13,960 Speaker 2: at Accenture, who announced her company's three-billion-dollar 1243 01:04:14,040 --> 01:04:16,200 Speaker 2: investment in AI by dropping this gem. 1244 01:04:16,600 --> 01:04:17,880 Speaker 8: I have a digital twin. 1245 01:04:17,760 --> 01:04:21,200 Speaker 5: And she's constantly evolving in how she gets used and 1246 01:04:21,240 --> 01:04:22,360 Speaker 5: what she says, and, 1247 01:04:23,120 --> 01:04:25,880 Speaker 9: you know, there's, you know, big educations around that. 1248 01:04:26,040 --> 01:04:28,720 Speaker 5: So I think this is a really exciting space to 1249 01:04:28,800 --> 01:04:29,480 Speaker 5: be thinking. 1250 01:04:29,200 --> 01:04:32,440 Speaker 7: That... she just stole Holly Herndon's thing. 1251 01:04:32,520 --> 01:04:37,360 Speaker 10: But okay, if they said that to a doctor, they'd 1252 01:04:37,440 --> 01:04:38,560 Speaker 10: think they had a concussion. 1253 01:04:38,600 --> 01:04:41,400 Speaker 2: The sheer... what this person needs? 1254 01:04:41,400 --> 01:04:42,640 Speaker 5: Like, psychologically? Yeah, you. 1255 01:04:42,640 --> 01:04:43,600 Speaker 2: Shouldn't be allowed to drive. 1256 01:04:44,960 --> 01:04:46,480 Speaker 11: You need a trying kid. 1257 01:04:46,560 --> 01:04:49,240 Speaker 8: Okay, let's get you, let's get you sat down, and. 1258 01:04:49,440 --> 01:04:52,720 Speaker 2: We're taking the phone away from you now. I think 1259 01:04:52,720 --> 01:04:55,120 Speaker 2: this is very silly, because again, I think, yeah, it's 1260 01:04:55,160 --> 01:04:57,680 Speaker 2: just a fundamental mismatch in what people might want from 1261 01:04:57,680 --> 01:05:00,160 Speaker 2: an AI agent and, like, the way in which 1262 01:05:00,200 --> 01:05:01,040 Speaker 2: they get talked about.
1263 01:05:01,080 --> 01:05:03,400 Speaker 8: But also they use digital twin, which is enterprise 1264 01:05:03,440 --> 01:05:04,160 Speaker 8: software. 1265 01:05:04,240 --> 01:05:07,320 Speaker 2: Shit. Yeah, oh my god. Yeah, it's... I'm 1266 01:05:07,360 --> 01:05:09,800 Speaker 2: excited to go see some digital twin technology that I'm 1267 01:05:09,800 --> 01:05:13,080 Speaker 2: sure will make a cheap gag about code switching. 1268 01:05:14,080 --> 01:05:15,800 Speaker 5: This was the first thing I reported on 1269 01:05:15,880 --> 01:05:18,760 Speaker 5: at CES: the digital twin. Back 1270 01:05:18,800 --> 01:05:21,640 Speaker 5: in, like, twenty twenty-two, twenty twenty-one, there was, like, 1271 01:05:21,760 --> 01:05:23,800 Speaker 5: one single company in all of CES 1272 01:05:23,800 --> 01:05:26,120 Speaker 5: promising, like, a digital twin, and now it's, like, every 1273 01:05:26,120 --> 01:05:26,720 Speaker 5: other company. 1274 01:05:27,080 --> 01:05:28,800 Speaker 2: Yes. It means so many different things. 1275 01:05:28,840 --> 01:05:31,240 Speaker 8: It means literally a digital representation of anything. 1276 01:05:31,280 --> 01:05:32,720 Speaker 2: It doesn't even mean an AI agent. 1277 01:05:32,760 --> 01:05:34,240 Speaker 8: The fact that they're using it in the wrong place 1278 01:05:34,320 --> 01:05:35,280 Speaker 8: is very annoying to me. 1279 01:05:35,440 --> 01:05:37,280 Speaker 2: Yeah. I keep seeing, like, they can now make an 1280 01:05:37,400 --> 01:05:40,360 Speaker 2: AI chatbot trained off of your social media presence 1281 01:05:40,360 --> 01:05:42,040 Speaker 2: that's eighty-five percent accurate. 1282 01:05:42,120 --> 01:05:45,040 Speaker 12: Oh, as all twins are. 1283 01:05:45,400 --> 01:05:47,760 Speaker 2: And I want to say I know they can't.
But 1284 01:05:47,800 --> 01:05:50,160 Speaker 2: then you talk to the average person at CES or 1285 01:05:50,200 --> 01:05:53,479 Speaker 2: the average panelist on this particular panel, and I'm like, yes, 1286 01:05:53,520 --> 01:05:55,600 Speaker 2: I do believe, in fact, everyone on that panel, 1287 01:05:55,640 --> 01:05:58,880 Speaker 2: you could accurately get eighty-five percent of 1288 01:05:58,880 --> 01:06:02,160 Speaker 2: their personality with the AI bot, for a bit, you know, 1289 01:06:03,960 --> 01:06:08,040 Speaker 2: maybe a lot higher. An improvement. Yeah. Yeah. So I will say, 1290 01:06:08,120 --> 01:06:11,080 Speaker 2: like, that was silly. That's not the silliest thing I saw. 1291 01:06:11,720 --> 01:06:14,280 Speaker 2: The silliest thing I saw came courtesy of another panelist, 1292 01:06:14,440 --> 01:06:19,040 Speaker 2: Jason Zada, founder of Secret Level and COO of the company. 1293 01:06:19,400 --> 01:06:22,360 Speaker 2: The videos that Jason came to CES to brag about 1294 01:06:22,480 --> 01:06:25,880 Speaker 2: were a collection of the laziest AI slop ever to 1295 01:06:25,920 --> 01:06:29,440 Speaker 2: stain human eyeballs. His most recent big success, which you 1296 01:06:29,440 --> 01:06:31,920 Speaker 2: could just see radiating off of him how proud he 1297 01:06:32,040 --> 01:06:35,720 Speaker 2: was of this, was Coca-Cola's annual Christmas ad, which 1298 01:06:35,920 --> 01:06:39,560 Speaker 2: last year was produced for the first time entirely with AI. 1299 01:06:40,040 --> 01:06:42,120 Speaker 2: And I'm just gonna... if you haven't seen this, who 1300 01:06:42,200 --> 01:06:43,920 Speaker 2: here has seen Coca-Cola's AI ad? 1301 01:06:46,520 --> 01:06:49,800 Speaker 5: Yeah, I've seen pictures. I think I've watched one. 1302 01:06:50,600 --> 01:06:55,160 Speaker 2: Okay, well, let's watch it a few times. We're 1303 01:06:55,160 --> 01:06:57,200 Speaker 2: gonna play.
There's three different versions of this. 1304 01:06:57,160 --> 01:06:59,200 Speaker 8: So why... we're just gonna... I mean, that's what 1305 01:06:59,240 --> 01:06:59,680 Speaker 8: it's about. 1306 01:06:59,680 --> 01:07:01,840 Speaker 10: Oh my God, if there's three different versions, that's 1307 01:07:01,760 --> 01:07:03,560 Speaker 10: just... they saved the... 1308 01:07:09,440 --> 01:07:18,880 Speaker 8: Every one is the same length of shot. 1309 01:07:20,360 --> 01:07:24,120 Speaker 2: Can you believe this song's AI-generated? I can't believe... 1310 01:07:24,880 --> 01:07:27,280 Speaker 2: could they teach a computer to write the lyrics? 1311 01:07:27,280 --> 01:07:28,320 Speaker 2: "Holidays are coming"? 1312 01:07:29,720 --> 01:07:31,800 Speaker 10: I just can't believe we finally have the technology to 1313 01:07:31,840 --> 01:07:33,880 Speaker 10: have three trucks driving somewhere. 1314 01:07:33,680 --> 01:07:37,560 Speaker 2: And a dog wagging its tail with dead eyes, too. 1315 01:07:37,640 --> 01:07:44,920 Speaker 2: Horrible. Squirrels, more trucks with Coca-Cola, them driving 1316 01:07:44,960 --> 01:07:47,400 Speaker 2: down not-a-street. Raccoons. 1317 01:07:48,000 --> 01:07:50,360 Speaker 8: What the... why is there a satellite? 1318 01:07:50,360 --> 01:07:51,960 Speaker 10: Are they going to drop the ion cannon on the 1319 01:07:52,000 --> 01:07:57,680 Speaker 10: polar bears? 1320 01:07:58,920 --> 01:08:02,840 Speaker 2: It's all clearly... it's all glowing, like the city 1321 01:08:02,880 --> 01:08:07,280 Speaker 2: shots of, like, snow-covered villages. With that, as we're 1322 01:08:07,320 --> 01:08:09,920 Speaker 2: going to see in later videos, AI loves putting 1323 01:08:09,960 --> 01:08:13,080 Speaker 2: smoke and random fires where there should not be smoke. 1324 01:08:13,200 --> 01:08:16,840 Speaker 5: Random fire.
Christ, that's such a bad omen for four 1325 01:08:16,920 --> 01:08:19,720 Speaker 5: more years of a Trump presidency. It's bleak that 1326 01:08:19,800 --> 01:08:23,639 Speaker 5: we have, like, even uglier Thomas Kinkade-esque artwork. 1327 01:08:24,080 --> 01:08:27,360 Speaker 2: That's what every frame looks like, animated. 1328 01:08:27,920 --> 01:08:30,639 Speaker 5: It's like they just generated a Thomas Kinkade-like frame 1329 01:08:30,720 --> 01:08:32,400 Speaker 5: and then, like, badly animated it. 1330 01:08:32,320 --> 01:08:34,800 Speaker 10: And the way that they move is very weird, like, it 1331 01:08:34,880 --> 01:08:37,320 Speaker 10: looks kind of right but kind of wrong, looks very strange. 1332 01:08:37,400 --> 01:08:39,080 Speaker 2: It does that in all of the scenes, because it's, like, 1333 01:08:39,120 --> 01:08:41,120 Speaker 2: showing you a bunch of... you see, like, a polar bear, 1334 01:08:41,200 --> 01:08:43,519 Speaker 2: obviously, it's a Coca-Cola Christmas ad. You see, like, 1335 01:08:43,560 --> 01:08:46,599 Speaker 2: a fucking reindeer, you see squirrels, you see a dog. 1336 01:08:46,800 --> 01:08:49,280 Speaker 2: But it always is, like, this very AI shot where 1337 01:08:49,320 --> 01:08:52,719 Speaker 2: it just pans across the animal, and it's, like, glowing 1338 01:08:52,760 --> 01:08:57,720 Speaker 2: and kind of glossy and staring, moving, but they're not 1339 01:08:57,840 --> 01:08:59,080 Speaker 2: going anywhere with the movement. 1340 01:08:59,120 --> 01:09:01,280 Speaker 8: It's just like they're doing something, and that's it. 1341 01:09:01,520 --> 01:09:01,760 Speaker 2: Yeah. 1342 01:09:01,880 --> 01:09:03,880 Speaker 10: You think in ten years they're still gonna have these commercials? 1343 01:09:04,080 --> 01:09:06,400 Speaker 10: Yeah, no, because where's the smoke? It's just a polar 1344 01:09:06,400 --> 01:09:07,280 Speaker 10: bear walking around, like.
1345 01:09:08,240 --> 01:09:11,640 Speaker 2: System1, which tests emotional responses to ads, claims that 1346 01:09:11,680 --> 01:09:14,919 Speaker 2: the initial response to their Christmas ad was overwhelmingly positive. 1347 01:09:15,040 --> 01:09:17,720 Speaker 5: I don't think they're lying about that. I think if 1348 01:09:17,760 --> 01:09:20,000 Speaker 5: you walked up to someone, like, randomly on the street 1349 01:09:20,000 --> 01:09:22,200 Speaker 5: and showed them this, I think they'd be like, oh, yeah, 1350 01:09:22,400 --> 01:09:22,960 Speaker 5: it looks fine. 1351 01:09:24,600 --> 01:09:24,840 Speaker 2: Yeah. 1352 01:09:25,080 --> 01:09:28,240 Speaker 8: No one's watching a Coca-Cola ad and being like, yeah, wow, 1353 01:09:28,720 --> 01:09:30,080 Speaker 8: I've never had one of these before. 1354 01:09:30,240 --> 01:09:33,200 Speaker 2: Yeah, it's never a new experience. Not yet. 1355 01:09:33,439 --> 01:09:34,759 Speaker 7: We need an ad man. 1356 01:09:34,800 --> 01:09:37,559 Speaker 8: Need an ad man for the Coke holdouts. We 1357 01:09:37,520 --> 01:09:38,839 Speaker 7: need an AI Don Draper. 1358 01:09:39,040 --> 01:09:41,000 Speaker 5: Yeah, well, do not give them ideas. 1359 01:09:42,640 --> 01:09:43,280 Speaker 2: What if a 1360 01:09:43,160 --> 01:09:44,479 Speaker 8: company lost five billion dollars? 1361 01:09:44,600 --> 01:09:45,880 Speaker 2: It's just an ad that doesn't work. 1362 01:09:45,880 --> 01:09:47,840 Speaker 7: Instead of going to the movies like Don Draper does 1363 01:09:47,880 --> 01:09:48,439 Speaker 7: throughout all 1364 01:09:48,360 --> 01:09:50,479 Speaker 12: of Mad Men, it just doesn't work and doesn't respond to 1365 01:09:50,479 --> 01:09:51,360 Speaker 12: any of your queries. 1366 01:09:51,479 --> 01:09:55,720 Speaker 2: Just Don Draper spending hours watching that looping Christmas video, 1367 01:09:56,280 --> 01:09:57,600 Speaker 8: staring at nothingness.
1368 01:09:57,920 --> 01:10:01,280 Speaker 2: Yeah. So there was, like, a pretty immediate backlash 1369 01:10:01,320 --> 01:10:03,559 Speaker 2: to this. Like, all of the responses, if you go 1370 01:10:03,560 --> 01:10:05,960 Speaker 2: to any of, like, where these things live on YouTube, 1371 01:10:06,000 --> 01:10:09,720 Speaker 2: it's just people shitting on them, which Jason did acknowledge 1372 01:10:09,840 --> 01:10:12,520 Speaker 2: by saying the video was "very debated." 1373 01:10:13,000 --> 01:10:14,920 Speaker 8: Yes, classic thing with commercials. 1374 01:10:15,120 --> 01:10:16,520 Speaker 2: We love debating commercials. 1375 01:10:16,560 --> 01:10:18,920 Speaker 5: Many things are very debated these days. 1376 01:10:18,680 --> 01:10:19,799 Speaker 7: A lot of people are saying. 1377 01:10:19,880 --> 01:10:22,360 Speaker 2: And then he showed us next an AI-generated video, 1378 01:10:22,479 --> 01:10:26,160 Speaker 2: The Heist, which was entirely made from a text script 1379 01:10:26,160 --> 01:10:29,240 Speaker 2: that itself was mostly written by ChatGPT. And here's 1380 01:10:29,240 --> 01:10:32,400 Speaker 2: how Jason describes the workflow for what you're about to see: 1381 01:10:32,840 --> 01:10:35,519 Speaker 2: It took thousands of generations to get the final film, 1382 01:10:35,560 --> 01:10:39,560 Speaker 2: but I'm absolutely blown away by the quality, the consistency, 1383 01:10:39,920 --> 01:10:43,599 Speaker 2: and adherence to the original prompt. When I described gritty 1384 01:10:43,680 --> 01:10:47,840 Speaker 2: New York City in the eighties, it delivered in spades, consistently. 1385 01:10:48,280 --> 01:10:51,800 Speaker 2: While this is not perfect, it is hands down the 1386 01:10:51,840 --> 01:10:56,200 Speaker 2: best video generation model out there by a long shot.
Additionally, 1387 01:10:56,479 --> 01:11:00,160 Speaker 2: it's important: no VFX, no cleanup, no color correction has 1388 01:11:00,160 --> 01:11:03,639 Speaker 2: been added. Everything is straight out of Veo 2, Google 1389 01:11:03,680 --> 01:11:06,759 Speaker 2: DeepMind. So the model is Veo 2, Google 1390 01:11:06,760 --> 01:11:08,040 Speaker 2: DeepMind's, I think, is what he's saying. 1391 01:11:08,080 --> 01:11:09,880 Speaker 8: It is. So, I thought that I had another one. 1392 01:11:10,000 --> 01:11:11,960 Speaker 8: By the way, I'm sure what you're about to show 1393 01:11:12,040 --> 01:11:13,200 Speaker 8: me looks like a dog's. 1394 01:11:12,960 --> 01:11:15,240 Speaker 2: As it looks like, yeah, New York, exactly like New 1395 01:11:15,320 --> 01:11:18,840 Speaker 2: York under Giuliani, right before he came in and cleaned it up. 1396 01:11:20,560 --> 01:11:23,360 Speaker 5: So this is, like, the competitor to Sora, I guess. 1397 01:11:23,400 --> 01:11:26,559 Speaker 5: That's the other big, like, video-generation brand now. 1398 01:11:26,600 --> 01:11:29,639 Speaker 2: I don't buy it for a fucking second, and I'm not impressed. 1399 01:11:29,680 --> 01:11:31,080 Speaker 2: But we'll see what you guys think. Okay, I don't 1400 01:11:31,080 --> 01:11:33,400 Speaker 2: want to poison your... I wouldn't. 1401 01:11:34,840 --> 01:11:37,720 Speaker 8: Oh god, okay, there is fire in 1402 01:11:37,680 --> 01:11:39,800 Speaker 2: this. The last time you're gonna see the sack full 1403 01:11:39,840 --> 01:11:41,760 Speaker 2: of money. It does not show up again. 1404 01:11:41,840 --> 01:11:44,360 Speaker 5: It's a lot of... a lot of fire, random fire. 1405 01:11:44,439 --> 01:11:45,599 Speaker 5: And the wheels go 1406 01:11:45,560 --> 01:11:47,120 Speaker 8: backwards when they're driving forwards. 1407 01:11:47,880 --> 01:11:51,040 Speaker 2: The wheels again. Another street fire. 1408 01:11:52,120 --> 01:11:53,800 Speaker 8: I would love to do freeze frames on this.
1409 01:11:54,320 --> 01:11:58,960 Speaker 2: Actually it's in Gosthel. Why are there so many fires? 1410 01:11:59,040 --> 01:12:00,760 Speaker 7: Just all right, let's take a shot every time? 1411 01:12:01,400 --> 01:12:04,040 Speaker 2: Oh my god, and also take a shot every time. 1412 01:12:04,280 --> 01:12:07,280 Speaker 2: He is wearing different clothing and has a clearly different face. 1413 01:12:07,280 --> 01:12:10,080 Speaker 2: The car has changed. Come on, he's praising the consistency and 1414 01:12:10,120 --> 01:12:13,360 Speaker 2: he is dressed completely differently every scene. 1415 01:12:14,800 --> 01:12:17,680 Speaker 5: His jacket has has has changed since the last one. 1416 01:12:17,760 --> 01:12:21,000 Speaker 2: Yeah, yeah, and again the cop car the cars. When 1417 01:12:21,000 --> 01:12:23,519 Speaker 2: it shows the cars dragging across the screen, they're kind 1418 01:12:23,520 --> 01:12:26,200 Speaker 2: of doing the same thing usually that the animals doing 1419 01:12:26,240 --> 01:12:26,839 Speaker 2: the coke. 1420 01:12:26,680 --> 01:12:29,839 Speaker 7: And minimal motion at the best. 1421 01:12:30,720 --> 01:12:34,880 Speaker 2: Yeah. I also love this. Can you believe this music? 1422 01:12:36,439 --> 01:12:38,519 Speaker 10: I also want to just say when he swerved and hit 1423 01:12:38,600 --> 01:12:40,920 Speaker 10: that thing, he was driving like half a mile on it. 1424 01:12:41,200 --> 01:12:43,679 Speaker 5: Yeah, that's how I run. 1425 01:12:43,840 --> 01:12:46,439 Speaker 2: Yeah, look an obviously different man. 1426 01:12:46,560 --> 01:12:52,040 Speaker 5: That's by the way he runs was like he had 1427 01:12:52,040 --> 01:12:52,400 Speaker 5: his arms. 1428 01:12:52,560 --> 01:12:56,680 Speaker 2: Looks cops are three kes actually, look how they run. 1429 01:12:57,360 --> 01:12:58,519 Speaker 5: The running is very funny. 1430 01:12:58,600 --> 01:13:02,040 Speaker 2: Yeah, the bond his seat Yeah different, okay.
1431 01:13:01,840 --> 01:13:04,240 Speaker 5: What is going on with his feet and that, different 1432 01:13:04,320 --> 01:13:09,720 Speaker 5: levels of facial hair, different different jackets, he's wearing different colors, jackets. 1433 01:13:09,360 --> 01:13:12,960 Speaker 12: Vaguely definiteness and in this SA just move. 1434 01:13:13,000 --> 01:13:15,639 Speaker 2: What the fuck is going on? Oh my god, I 1435 01:13:15,680 --> 01:13:19,599 Speaker 2: got me? Yeah, directed by Jason Zada in big flaming 1436 01:13:19,680 --> 01:13:22,840 Speaker 2: words, because again the AI only knows how to put 1437 01:13:22,920 --> 01:13:23,840 Speaker 2: random fires on. 1438 01:13:24,000 --> 01:13:26,200 Speaker 10: Wow, I'm so glad that we have the technology that 1439 01:13:26,240 --> 01:13:28,040 Speaker 10: there a thing where a guy gets chased by the police. 1440 01:13:28,240 --> 01:13:30,479 Speaker 2: Yeah, we couldn't. This would have been impossible before. 1441 01:13:30,520 --> 01:13:32,880 Speaker 10: As he runs at anywhere from one to one hundred 1442 01:13:32,880 --> 01:13:33,479 Speaker 10: miles an hour. 1443 01:13:33,680 --> 01:13:36,439 Speaker 5: I assume they just trained they like this was specifically 1444 01:13:36,479 --> 01:13:39,360 Speaker 5: like pulling on like Scorsese movies a lot. 1445 01:13:39,800 --> 01:13:42,080 Speaker 10: I just want to know about these thousands of generations 1446 01:13:42,080 --> 01:13:43,160 Speaker 10: of script because. 1447 01:13:42,920 --> 01:13:43,599 Speaker 5: That is interesting. 1448 01:13:43,680 --> 01:13:46,000 Speaker 10: I am very curious because I just don't believe that 1449 01:13:46,040 --> 01:13:49,320 Speaker 10: for did he just uh read there? 1450 01:13:49,400 --> 01:13:51,320 Speaker 12: Yeah no, that's the opening crawl to just like some 1451 01:13:51,600 --> 01:13:55,840 Speaker 12: uh generated Star Wars.
1452 01:13:56,560 --> 01:13:58,800 Speaker 5: It seems like shot by shot, right, each each shot 1453 01:13:58,880 --> 01:13:59,880 Speaker 5: is going to require a lot of. 1454 01:13:59,800 --> 01:14:01,559 Speaker 2: Like iteration the script. 1455 01:14:02,280 --> 01:14:05,160 Speaker 5: It's just yeah, I mean again, like it unpacking. What 1456 01:14:05,240 --> 01:14:07,560 Speaker 5: he actually is saying is unclear. 1457 01:14:07,280 --> 01:14:09,960 Speaker 2: Because I went to the YouTube video for this and 1458 01:14:10,120 --> 01:14:13,040 Speaker 2: the first five or four comments are looks like we 1459 01:14:13,120 --> 01:14:15,439 Speaker 2: found the new king of video. Jesus Christ, give it 1460 01:14:15,479 --> 01:14:19,320 Speaker 2: a rest. Clothes change in every shot. Four to six 1461 01:14:19,400 --> 01:14:22,000 Speaker 2: year old boys are gonna love it. And still lacks 1462 01:14:22,080 --> 01:14:25,440 Speaker 2: character and vehicle consistency. But we're getting close. 1463 01:14:27,520 --> 01:14:28,719 Speaker 5: Which is which is the exact? 1464 01:14:30,640 --> 01:14:33,839 Speaker 2: By twenty thirty, you'll make a man wear the same 1465 01:14:33,920 --> 01:14:36,280 Speaker 2: clothes for an entire video. Oh this is This has 1466 01:14:36,320 --> 01:14:37,920 Speaker 2: happened before with Sora. 1467 01:14:38,040 --> 01:14:40,439 Speaker 10: When they put Sora out there, like check out Air Head 1468 01:14:41,160 --> 01:14:45,240 Speaker 10: on your man God, and the balloon changes every single shot. 1469 01:14:45,280 --> 01:14:47,719 Speaker 8: It's a different size and color each time. 1470 01:14:47,920 --> 01:14:50,800 Speaker 10: There are just people running in the background sometimes and 1471 01:14:50,840 --> 01:14:52,519 Speaker 10: then they made a new one. You're like, oh, this 1472 01:14:52,560 --> 01:14:55,280 Speaker 10: is gonna be good.
It was worse and less consistent, 1473 01:14:55,560 --> 01:14:57,800 Speaker 10: and this is what they think of us. They're like, 1474 01:14:57,840 --> 01:14:59,400 Speaker 10: these pigs will slop up anything. 1475 01:14:59,760 --> 01:14:59,800 Speaker 4: You. 1476 01:15:00,000 --> 01:15:03,720 Speaker 2: I can't expect technology to do something as complicated as 1477 01:15:03,800 --> 01:15:05,880 Speaker 2: dress a man in clothing and have him stay in 1478 01:15:05,920 --> 01:15:08,800 Speaker 2: that same clothing over multiple scenes. Hollywood never figured it 1479 01:15:09,040 --> 01:15:09,599 Speaker 2: out. So cool. 1480 01:15:09,439 --> 01:15:12,360 Speaker 10: And this costs like so much money as well, just 1481 01:15:12,400 --> 01:15:16,240 Speaker 10: burning. There's some fucking GPU melting in a data 1482 01:15:16,360 --> 01:15:17,520 Speaker 10: center in Arizona. 1483 01:15:17,120 --> 01:15:19,480 Speaker 2: The strain learning North Carolina. 1484 01:15:19,880 --> 01:15:22,959 Speaker 12: It is also there's gonna be like thirty forty companies 1485 01:15:23,000 --> 01:15:26,200 Speaker 12: trying to recreate the same misshapen wheel you know, for 1486 01:15:26,280 --> 01:15:27,200 Speaker 12: the next five days. 1487 01:15:27,240 --> 01:15:30,400 Speaker 10: Also, the little pigs that watch Star Wars, including myself, 1488 01:15:30,600 --> 01:15:33,519 Speaker 10: they'll notice every minor inconsistency. Do you think that they're 1489 01:15:33,560 --> 01:15:37,680 Speaker 10: going to tolerate Luke Skywalker and Watto and all their favorite 1490 01:15:37,360 --> 01:15:39,360 Speaker 8: characters they're going to drive? Do you think that they're 1491 01:15:39,360 --> 01:15:41,200 Speaker 8: going to be happy off with a Cybertruck? 1492 01:15:41,560 --> 01:15:45,120 Speaker 2: That's a Cybertruck situation. I think the issues 1493 01:15:45,160 --> 01:15:47,479 Speaker 2: are twofold, which is like number one.
In order to 1494 01:15:47,520 --> 01:15:50,000 Speaker 2: make this shit sell to the people who watch movies, 1495 01:15:50,040 --> 01:15:53,599 Speaker 2: you have to dramatically reduce the average intelligence of people 1496 01:15:53,640 --> 01:15:56,519 Speaker 2: watching movies. You have to give everyone brain damage, which 1497 01:15:56,520 --> 01:15:57,599 Speaker 2: except they are. 1498 01:15:57,439 --> 01:15:59,000 Speaker 7: Working in Giant Yeah. 1499 01:15:59,320 --> 01:16:01,240 Speaker 2: And the other thing is the models have to get 1500 01:16:01,400 --> 01:16:03,320 Speaker 2: much better. And Jason made a point that like, look, 1501 01:16:03,400 --> 01:16:05,519 Speaker 2: every time people would like talk about the criticism and 1502 01:16:05,600 --> 01:16:08,240 Speaker 2: be like, look, this is the worst it's gonna look, guys. 1503 01:16:08,840 --> 01:16:12,040 Speaker 2: And I was just looking into it. GPT-4 took 1504 01:16:12,160 --> 01:16:15,720 Speaker 2: fifty times as many resources and like fifty times as 1505 01:16:15,760 --> 01:16:19,680 Speaker 2: much energy to train as GPT-3 did. So 1506 01:16:19,760 --> 01:16:22,720 Speaker 2: these are the kind of like exponential increases that 1507 01:16:22,760 --> 01:16:25,120 Speaker 2: we're looking at. So like, if it took them so 1508 01:16:25,880 --> 01:16:29,320 Speaker 2: many millions, billions of dollars of investment to get to 1509 01:16:29,360 --> 01:16:31,519 Speaker 2: the point where they can make this shitty video, to 1510 01:16:31,600 --> 01:16:35,040 Speaker 2: make anything close to watchable.
You're talking about again just 1511 01:16:35,080 --> 01:16:39,000 Speaker 2: like lighting on fire, billions of dollars to do what 1512 01:16:39,240 --> 01:16:41,640 Speaker 2: to make a scene that you could already get like 1513 01:16:41,680 --> 01:16:44,519 Speaker 2: a twenty six year old dude who grew up watching 1514 01:16:44,520 --> 01:16:48,519 Speaker 2: fucking Quentin Tarantino movies and taking cocaine, And you could 1515 01:16:48,520 --> 01:16:50,880 Speaker 2: give them sixty thousand dollars and he'll film that shit 1516 01:16:50,920 --> 01:16:52,439 Speaker 2: for you with an old car, Like. 1517 01:16:52,479 --> 01:16:55,040 Speaker 5: Yeah, I mean you could. You could even like animate it. 1518 01:16:55,520 --> 01:16:57,559 Speaker 12: M I mean, look, you give me a PS four 1519 01:16:57,760 --> 01:17:00,160 Speaker 12: and somebody's grandmother and I will make them think that 1520 01:17:00,200 --> 01:17:01,000 Speaker 12: they're watching that. 1521 01:17:01,160 --> 01:17:03,280 Speaker 5: No, seriously, seriously that six. 1522 01:17:03,520 --> 01:17:05,479 Speaker 10: But also this, I just want to read out some 1523 01:17:05,520 --> 01:17:08,120 Speaker 10: of the fucking people that use this model. We started 1524 01:17:08,160 --> 01:17:10,960 Speaker 10: working with creatives like Donald Glover, who I said was 1525 01:17:10,960 --> 01:17:11,880 Speaker 10: washed ten years ago. 1526 01:17:11,960 --> 01:17:13,200 Speaker 2: I'm fucking sick of people. 1527 01:17:13,840 --> 01:17:16,120 Speaker 7: My Love was a was a good album. 1528 01:17:16,240 --> 01:17:18,120 Speaker 5: America is an objectively bad song. 1529 01:17:18,680 --> 01:17:20,200 Speaker 2: It's a bad song with a great video. 1530 01:17:20,400 --> 01:17:22,600 Speaker 8: Yeaheah, I thought he's like kind of bar and be 1531 01:17:22,680 --> 01:17:23,080 Speaker 8: stuff is. 
1532 01:17:23,120 --> 01:17:26,080 Speaker 10: Very interesting anyway, moving on. And of course the Weeknd. 1533 01:17:26,200 --> 01:17:32,080 Speaker 10: It's so Weeknd. And someone called great. We worked with 1534 01:17:32,160 --> 01:17:34,800 Speaker 10: creators on Veo 1 to inform the development of Veo 2, 1535 01:17:34,840 --> 01:17:36,760 Speaker 10: and we look forward to working with trusted testers and 1536 01:17:36,760 --> 01:17:38,519 Speaker 10: creators to get feedback on this new model. 1537 01:17:38,600 --> 01:17:40,759 Speaker 8: How long are you going to get fucking feedback? It stinks. 1538 01:17:41,080 --> 01:17:44,160 Speaker 2: We've got some feedback from Yeah, I got a few thoughts. 1539 01:17:44,200 --> 01:17:47,080 Speaker 5: Hopefully those people are are just getting paid to tell 1540 01:17:47,120 --> 01:17:49,360 Speaker 5: them words and be like yeah, sure, I'll take your money. 1541 01:17:49,680 --> 01:17:52,200 Speaker 2: Yeah, if they give me twenty million dollars, I'm flipping 1542 01:17:52,240 --> 01:17:55,200 Speaker 2: the whole, like, just no, I will turn on a dime. 1543 01:17:55,640 --> 01:17:58,759 Speaker 2: Speaking of turning on a dime for money, here's. 1544 01:17:58,600 --> 01:18:11,960 Speaker 13: Ads Ah, we're back. 1545 01:18:12,960 --> 01:18:16,439 Speaker 2: So the next video that our friend I now feel 1546 01:18:16,479 --> 01:18:19,120 Speaker 2: he's like a brother to me, Jason, puts on was 1547 01:18:19,160 --> 01:18:22,880 Speaker 2: of an AI generated fictional elderly rock star talking about 1548 01:18:22,960 --> 01:18:29,879 Speaker 2: death. The dude is plastic and incapable of dynamic expression 1549 01:18:29,880 --> 01:18:32,599 Speaker 2: as he guzzles randomly from bottles of liquor that flash 1550 01:18:32,680 --> 01:18:35,200 Speaker 2: in and out of existence.
Sometimes he lies on his 1551 01:18:35,280 --> 01:18:38,160 Speaker 2: back in empty streets while talking about all the all 1552 01:18:38,200 --> 01:18:41,000 Speaker 2: of the CGI featureless women that he has loved in 1553 01:18:41,080 --> 01:18:45,040 Speaker 2: his exciting life. Other times he plays stadium shows while 1554 01:18:45,080 --> 01:18:48,280 Speaker 2: obvious GPT written dialogue about aging and death drones on. 1555 01:18:48,680 --> 01:18:51,680 Speaker 2: When the video ends, everybody in the room claps. And 1556 01:18:51,720 --> 01:18:54,439 Speaker 2: as you watch this, I need you to imagine seeing the 1557 01:18:54,479 --> 01:18:56,640 Speaker 2: thing that I'm about to show you all in a 1558 01:18:56,720 --> 01:18:59,479 Speaker 2: room with like two hundred people in it, all clapping 1559 01:18:59,560 --> 01:19:04,320 Speaker 2: enthusiastically. I don't think I did. I did it. 1560 01:19:04,560 --> 01:19:07,599 Speaker 2: I did. I said, come the fuck on, as flat 1561 01:19:07,640 --> 01:19:11,759 Speaker 2: as I could rise, a skywalk up. Yeah. So here's 1562 01:19:12,120 --> 01:19:16,320 Speaker 2: Fade Out and an old man. Yea, it looks a 1563 01:19:16,360 --> 01:19:19,559 Speaker 2: little bit like George Carlin. It's the end. Okay, three? 1564 01:19:20,040 --> 01:19:23,840 Speaker 6: Can you chest like the world's just you, God damn 1565 01:19:23,880 --> 01:19:29,600 Speaker 6: big and you're just to go passing through them? 1566 01:19:29,640 --> 01:19:30,280 Speaker 2: What's he doing? 1567 01:19:30,439 --> 01:19:34,679 Speaker 10: He carried my heart concerts, Granddad, calm down, It scattered. 1568 01:19:34,920 --> 01:19:38,080 Speaker 10: I love these slash cuts, the fast cuts, these fosse cuts, 1569 01:19:38,120 --> 01:19:39,680 Speaker 10: because the next frame was unusable. 1570 01:19:40,760 --> 01:19:41,960 Speaker 5: Yes, actually yes.
1571 01:19:42,000 --> 01:19:44,360 Speaker 2: Like that he drank and the bottle changed in his hand. 1572 01:19:44,400 --> 01:19:47,680 Speaker 2: You could see it starting to happen? What is just 1573 01:19:47,760 --> 01:19:52,360 Speaker 2: anonymous with destroyed It? Just a beautiful music. Listen to 1574 01:19:52,400 --> 01:19:53,760 Speaker 2: that lived It to the bow? 1575 01:19:54,200 --> 01:19:55,320 Speaker 5: Could you believe. 1576 01:19:56,640 --> 01:19:59,840 Speaker 2: By firing a Roman candles time? 1577 01:20:01,040 --> 01:20:01,760 Speaker 5: I like so? 1578 01:20:01,880 --> 01:20:05,200 Speaker 10: The old man does look very different each time, very 1579 01:20:05,200 --> 01:20:05,880 Speaker 10: different old man. 1580 01:20:06,400 --> 01:20:08,160 Speaker 5: That's a different that's a different guy. 1581 01:20:08,680 --> 01:20:15,840 Speaker 2: Yeah, that's the Emperor from the first Gladiator, trotting get running. 1582 01:20:15,560 --> 01:20:18,920 Speaker 5: Away from this the way this model generates running diesel. 1583 01:20:20,880 --> 01:20:25,040 Speaker 2: There he is drinking on the fire, old rock star, 1584 01:20:25,160 --> 01:20:29,720 Speaker 2: drinking in front of a flaming house. The AI 1585 01:20:29,760 --> 01:20:30,679 Speaker 2: loves burning buildings. 1586 01:20:30,760 --> 01:20:33,360 Speaker 5: What is this voice? I would love to track his 1587 01:20:33,479 --> 01:20:35,000 Speaker 5: tattoos from three. 1588 01:20:35,320 --> 01:20:37,880 Speaker 10: We'll say he's about to eat the microphone, different, I've 1589 01:20:37,920 --> 01:20:40,720 Speaker 10: done it, yum. 1590 01:20:40,840 --> 01:20:47,719 Speaker 2: Now he's sleeping in a broken Mustang, the classic Ferrari 1591 01:20:47,800 --> 01:20:51,479 Speaker 2: Mustang Mustang. It's in like a pool in front of 1592 01:20:51,479 --> 01:20:53,400 Speaker 2: a mansion, but he clearly isn't connected to it.
The 1593 01:20:53,400 --> 01:20:58,080 Speaker 2: car is hovering slightly over the pool. Like I love this, 1594 01:20:58,240 --> 01:21:00,639 Speaker 2: I love this, I love him. And he tells us. 1595 01:21:00,680 --> 01:21:02,560 Speaker 2: He tells us during this, as if we're supposed to 1596 01:21:02,560 --> 01:21:06,960 Speaker 2: be impressed, that ChatGPT wrote seventy five percent of it, and that's fucking hell. 1597 01:21:08,800 --> 01:21:10,280 Speaker 5: I can't believe that. 1598 01:21:10,439 --> 01:21:16,360 Speaker 12: Frankly, as a bartender, I regret walking into the room 1599 01:21:16,400 --> 01:21:17,559 Speaker 12: to see if people want drinks. 1600 01:21:17,600 --> 01:21:19,120 Speaker 8: This is a bartender. 1601 01:21:19,240 --> 01:21:22,519 Speaker 2: I apologize. I apologize that you had to hit a drink. 1602 01:21:22,680 --> 01:21:24,760 Speaker 5: I also would like, actually, can I have a drink too? 1603 01:21:24,960 --> 01:21:26,519 Speaker 2: We are in the line. 1604 01:21:26,360 --> 01:21:28,920 Speaker 10: CES suite and we're all drinking because I just want 1605 01:21:29,000 --> 01:21:32,040 Speaker 10: to say, I'm fucking disassociating after that, I'm so fucking 1606 01:21:32,080 --> 01:21:35,400 Speaker 10: sick of every year of doing this nonsense, and I 1607 01:21:35,479 --> 01:21:37,720 Speaker 10: look at these shit eaters and they show us that, 1608 01:21:37,760 --> 01:21:39,360 Speaker 10: and they like slurp down the slop. 1609 01:21:39,600 --> 01:21:43,439 Speaker 2: Oh my god, it's it's it's hitting the easiest. 1610 01:21:42,960 --> 01:21:45,240 Speaker 8: things to find an old man that drinks. 1611 01:21:44,920 --> 01:21:46,920 Speaker 2: For an idea of like how real this company is. 1612 01:21:46,920 --> 01:21:48,800 Speaker 2: Obviously they were one of the companies.
They were not 1613 01:21:48,800 --> 01:21:50,559 Speaker 2: the only people who made that Coca-Cola ad. They 1614 01:21:50,560 --> 01:21:52,400 Speaker 2: were one of like three or four companies. 1615 01:21:52,439 --> 01:21:55,479 Speaker 5: It takes four companies to take companies to make that. 1616 01:21:55,600 --> 01:21:59,080 Speaker 2: Can't believe it. They have six hundred and twenty two 1617 01:21:59,080 --> 01:22:02,080 Speaker 2: followers on Twitter, Hell yes or not? Twitter, on YouTube, 1618 01:22:02,160 --> 01:22:04,840 Speaker 2: on YouTube, on YouTube, on YouTube, I don't know than 1619 01:22:05,320 --> 01:22:08,000 Speaker 2: I post this karaoke song and this this Fade Out 1620 01:22:08,040 --> 01:22:10,800 Speaker 2: is there or sorry. The Heist is their most successful video, 1621 01:22:10,840 --> 01:22:13,960 Speaker 2: with fifty six thousand views. Fade Out, which we just watched, 1622 01:22:14,000 --> 01:22:17,200 Speaker 2: has less than five thousand views. They're not ready, so 1623 01:22:17,240 --> 01:22:19,080 Speaker 2: they're not they're not quite. 1624 01:22:19,120 --> 01:22:20,360 Speaker 5: It's only going to get better. 1625 01:22:20,479 --> 01:22:22,640 Speaker 2: Yeah, it's only going to get get only going to 1626 01:22:22,720 --> 01:22:23,080 Speaker 2: get better. 1627 01:22:23,160 --> 01:22:25,439 Speaker 5: Obviously, things will only get. 1628 01:22:25,280 --> 01:22:28,120 Speaker 7: Read floor for a small price of one billion dollars. 1629 01:22:28,200 --> 01:22:30,440 Speaker 8: It's just like one hundred thousand dollars to compute. 1630 01:22:30,880 --> 01:22:33,599 Speaker 2: Yeah, imagine how good it would be much a billion 1631 01:22:33,720 --> 01:22:38,320 Speaker 2: will only get worth more. I mean, I get now, Garrison, 1632 01:22:38,680 --> 01:22:40,599 Speaker 2: I do think you should invest all of your salary. 1633 01:22:40,600 --> 01:22:42,960 Speaker 8: I just did sixteen minutes talking about this.
1634 01:22:43,640 --> 01:22:46,160 Speaker 10: I think I would rather Hawk Tuah has a more 1635 01:22:46,560 --> 01:22:49,519 Speaker 10: obvious use case than this shit. Hey, do you want 1636 01:22:49,560 --> 01:22:51,679 Speaker 10: to spend way more money to get something way worse? 1637 01:22:51,760 --> 01:22:54,920 Speaker 10: I actually can't get over the seventy five percent ChatGPT. 1638 01:22:55,720 --> 01:22:57,679 Speaker 2: That should be more twenty. 1639 01:22:57,920 --> 01:23:00,679 Speaker 8: No, it should be Theoretically it should be it should 1640 01:23:00,680 --> 01:23:00,880 Speaker 8: be one. 1641 01:23:00,840 --> 01:23:04,000 Speaker 10: hundred, should be a hundred percent, which means that a quarter 1642 01:23:04,439 --> 01:23:06,680 Speaker 10: a quarter of it was just fucking unusable. 1643 01:23:06,800 --> 01:23:07,000 Speaker 2: No. 1644 01:23:07,360 --> 01:23:10,519 Speaker 5: Absolutely, they're generating like individual shots that they're like stitching 1645 01:23:10,560 --> 01:23:13,320 Speaker 5: together and like who knows how how long it takes 1646 01:23:13,320 --> 01:23:15,400 Speaker 5: to like get like the prompt right for that shot 1647 01:23:15,439 --> 01:23:15,880 Speaker 5: to work. 1648 01:23:16,280 --> 01:23:18,760 Speaker 2: However long it takes, it was too long because it 1649 01:23:18,800 --> 01:23:20,920 Speaker 2: looks like shit. We're gonna watch a video I haven't 1650 01:23:20,920 --> 01:23:22,960 Speaker 2: seen yet, or at least of course, because it's five minutes, 1651 01:23:22,960 --> 01:23:23,800 Speaker 2: so we're not watching all this. 1652 01:23:23,840 --> 01:23:24,400 Speaker 5: Oh my god. 1653 01:23:24,479 --> 01:23:27,360 Speaker 2: It's two hundred and fifty two views and came out 1654 01:23:27,360 --> 01:23:29,640 Speaker 2: a week ago. It's called Minimonade. 1655 01:23:30,680 --> 01:23:34,400 Speaker 5: What Yeah, it's a word.
1656 01:23:34,600 --> 01:23:36,840 Speaker 10: Now it's like when you find your cat's vomited on 1657 01:23:36,880 --> 01:23:37,639 Speaker 10: the floor again. 1658 01:23:38,240 --> 01:23:41,040 Speaker 2: So first we see a diner called Minimonade that appears 1659 01:23:41,080 --> 01:23:42,360 Speaker 2: to be both on fire. 1660 01:23:42,200 --> 01:23:44,200 Speaker 8: Blade Runner yeah, Blade Runner, oh god. 1661 01:23:44,360 --> 01:23:46,720 Speaker 2: And an old lady rises up out of a 1662 01:23:46,760 --> 01:23:49,479 Speaker 2: pile of ashes. That's how mouths work. 1663 01:23:51,680 --> 01:23:52,679 Speaker 5: Where am I. 1664 01:23:54,120 --> 01:23:55,160 Speaker 2: Great AI voice? 1665 01:23:55,240 --> 01:23:59,479 Speaker 8: What is this phantasmagoria of AI voice acting? It's me, 1666 01:23:59,640 --> 01:24:06,400 Speaker 8: Harrison Ford. The fuck is going on with the teeth. 1667 01:24:06,920 --> 01:24:09,840 Speaker 2: What I think this is death? This old lady is dead. 1668 01:24:10,479 --> 01:24:10,920 Speaker 2: That's how I. 1669 01:24:14,160 --> 01:24:14,320 Speaker 5: Now. 1670 01:24:14,400 --> 01:24:20,000 Speaker 2: She's tripping on tomatoes. The decaying, sandy diner that exploded 1671 01:24:20,040 --> 01:24:24,320 Speaker 2: has turned into a lively fifties diner off off. 1672 01:24:24,160 --> 01:24:25,679 Speaker 8: Denis Villeneuve news. 1673 01:24:25,760 --> 01:24:27,000 Speaker 7: Is this a segregated diner? 1674 01:24:28,720 --> 01:24:32,240 Speaker 2: I only see why going back to the good old days. 1675 01:24:32,320 --> 01:24:33,640 Speaker 5: Yeah, yeah, I know. 1676 01:24:33,640 --> 01:24:35,920 Speaker 2: There's a little Indian boy. He is. He is the 1677 01:24:36,000 --> 01:24:43,360 Speaker 2: help though. Yeah mm hmmm, oh that's not. The little 1678 01:24:43,400 --> 01:24:45,439 Speaker 2: kid just fell down, and the way it shows falling 1679 01:24:45,520 --> 01:24:47,439 Speaker 2: is that he just sort of deflates.
1680 01:24:48,680 --> 01:24:52,719 Speaker 7: And he's up again. The action is staring at. 1681 01:24:52,720 --> 01:24:55,600 Speaker 2: Well, that's terrible. We don't need to watch any more of that. 1682 01:24:55,960 --> 01:24:57,840 Speaker 2: No one, no one, no one would watch 1683 01:24:57,880 --> 01:24:59,080 Speaker 2: this and have a positive reaction. 1684 01:24:59,160 --> 01:25:01,080 Speaker 8: They should, They should keep you in a holding cell. 1685 01:25:01,200 --> 01:25:04,439 Speaker 2: Yeah, I'm deeply unhappy at the time we already spent watching. 1686 01:25:04,560 --> 01:25:06,320 Speaker 8: Yeah, like, we don't know what you're gonna do next. 1687 01:25:06,680 --> 01:25:08,200 Speaker 7: We're building a facility for you. 1688 01:25:08,439 --> 01:25:12,519 Speaker 2: Yeah. The phrase reality distortion field gets used a lot 1689 01:25:12,560 --> 01:25:15,240 Speaker 2: when we talk about tech, but I really tasted it 1690 01:25:15,280 --> 01:25:17,880 Speaker 2: in that room, because all anyone on stage could talk 1691 01:25:17,880 --> 01:25:20,080 Speaker 2: about is how good it looks. And every one of 1692 01:25:20,160 --> 01:25:24,240 Speaker 2: these videos people are like clapping. They're like, wow, this 1693 01:25:24,439 --> 01:25:25,440 Speaker 2: is amazing. 1694 01:25:25,680 --> 01:25:28,200 Speaker 5: Why do you think they think it looks good? 1695 01:25:28,760 --> 01:25:30,400 Speaker 8: It looks better than an Xbox. 1696 01:25:30,960 --> 01:25:33,400 Speaker 10: Yeah. And the idea is you talk to the thing 1697 01:25:33,479 --> 01:25:35,640 Speaker 10: and now a thing came out, and that's magical. 1698 01:25:35,320 --> 01:25:37,920 Speaker 7: So by virtue of not having humans work on it? 1699 01:25:37,920 --> 01:25:41,280 Speaker 2: It's so it's better than you'd have Yeah, Okay.
There 1700 01:25:41,320 --> 01:25:44,280 Speaker 2: was a moment after this where Jason like joked about 1701 01:25:44,280 --> 01:25:46,840 Speaker 2: how like I don't like obviously I don't want to 1702 01:25:46,880 --> 01:25:52,720 Speaker 2: replace actors yet yeah yeah, and another panelist was like, 1703 01:25:52,720 --> 01:25:54,640 Speaker 2: I think we're gonna have to make some you have 1704 01:25:54,680 --> 01:25:57,240 Speaker 2: to see how some decisions go as to fair use, 1705 01:25:57,280 --> 01:26:00,720 Speaker 2: because obviously this is cribbing from a bunch of fucking Scorsese, 1706 01:26:01,120 --> 01:26:04,280 Speaker 2: like it kind of looked like nine. Yeah, and Thomas 1707 01:26:04,360 --> 01:26:05,080 Speaker 2: can get and. 1708 01:26:05,280 --> 01:26:08,240 Speaker 5: Blade Runner 2049, and Denis Villeneuve in general, 1709 01:26:08,280 --> 01:26:09,960 Speaker 5: like all of his films have been like a massive 1710 01:26:10,000 --> 01:26:15,200 Speaker 5: source for for these motion and still generations, so much 1711 01:26:15,200 --> 01:26:17,120 Speaker 5: so that like I think, like Blade Runner twenty forty 1712 01:26:17,160 --> 01:26:19,160 Speaker 5: nine is like one of the easiest films to like 1713 01:26:19,160 --> 01:26:22,640 Speaker 5: like replicate film stills almost exactly for, based on like 1714 01:26:22,720 --> 01:26:25,200 Speaker 5: how like how like load bearing that film has been 1715 01:26:25,240 --> 01:26:27,600 Speaker 5: for a whole bunch of these models. That could be 1716 01:26:27,640 --> 01:26:28,719 Speaker 5: due to a number of factors. 1717 01:26:28,960 --> 01:26:31,360 Speaker 2: Now I know what you're wondering, How soon until we 1718 01:26:31,400 --> 01:26:34,040 Speaker 2: can get a full ninety minute movie that looks like this? 1719 01:26:34,160 --> 01:26:34,280 Speaker 1: Oh? 1720 01:26:34,439 --> 01:26:35,759 Speaker 5: I mean I'm guessing days away.
1721 01:26:36,200 --> 01:26:38,479 Speaker 2: No, no, Jason said, probably not at least for a 1722 01:26:38,560 --> 01:26:39,240 Speaker 2: decade or so. 1723 01:26:39,479 --> 01:26:42,400 Speaker 5: Really, okay, years, that's interesting. 1724 01:26:42,680 --> 01:26:45,440 Speaker 2: I don't want to wait that long. What a worthwhile endeavor. 1725 01:26:45,080 --> 01:26:48,360 Speaker 5: Though, because he could have said shorter. That actually is interesting. 1726 01:26:48,400 --> 01:26:51,080 Speaker 8: He could have said anything those chumps believed. 1727 01:26:51,320 --> 01:26:53,439 Speaker 2: I think it is like he did have to spend 1728 01:26:53,680 --> 01:26:57,479 Speaker 2: probably hundreds of hours of his precious one human life 1729 01:26:57,840 --> 01:27:01,080 Speaker 2: stitching those those turds together, and he's like, it's nowhere 1730 01:27:01,120 --> 01:27:03,280 Speaker 2: near ready. There's no way it could make it nice. 1731 01:27:03,280 --> 01:27:03,960 Speaker 5: It's giving him. 1732 01:27:06,080 --> 01:27:11,040 Speaker 12: Because I've only really seen one interesting generative video thing. 1733 01:27:11,080 --> 01:27:15,040 Speaker 12: But it wasn't a generative video thing. It was they filmed, uh, Brian, 1734 01:27:15,120 --> 01:27:18,720 Speaker 12: you know, filmed a documentary and they created, you know, 1735 01:27:18,800 --> 01:27:22,680 Speaker 12: some backend software so that they would be able to 1736 01:27:23,680 --> 01:27:27,720 Speaker 12: do cuts of existing footage and try to focus on 1737 01:27:27,840 --> 01:27:32,120 Speaker 12: the parts of the documentary. But I never ever see 1738 01:27:32,160 --> 01:27:35,800 Speaker 12: anything interesting in like constructing narratives or, it's like, you 1739 01:27:35,880 --> 01:27:39,920 Speaker 12: know, teasing out other aspects of the creative process.
It's only 1740 01:27:40,280 --> 01:27:42,760 Speaker 12: let's try to replace, right, Let's try to so you 1741 01:27:42,840 --> 01:27:44,000 Speaker 12: can't do narrative with it. 1742 01:27:44,080 --> 01:27:46,080 Speaker 2: And that's the thing. If I if I'd sat down 1743 01:27:46,120 --> 01:27:48,240 Speaker 2: there because I'm sitting I said this. I was sitting 1744 01:27:48,280 --> 01:27:49,880 Speaker 2: next to a guy from USC who was one of 1745 01:27:49,880 --> 01:27:51,600 Speaker 2: the only people in the room who was like similarly 1746 01:27:51,640 --> 01:27:55,080 Speaker 2: critical to me of what we were seeing on stage. 1747 01:27:55,080 --> 01:27:56,760 Speaker 2: It was like, look, if they had come down and 1748 01:27:56,760 --> 01:27:58,240 Speaker 2: been like, look, this is how we can plug a 1749 01:27:58,280 --> 01:28:00,439 Speaker 2: script in and it can create a storyboard, 1750 01:28:00,800 --> 01:28:02,799 Speaker 2: and you can like kind of see like a crude 1751 01:28:02,800 --> 01:28:04,800 Speaker 2: CGI animation of how the shots will look, and that 1752 01:28:04,840 --> 01:28:08,200 Speaker 2: can help you like plan out, like like that's legitimately useful. 1753 01:28:08,240 --> 01:28:10,759 Speaker 2: That's the thing that adds value and can cut costs 1754 01:28:10,800 --> 01:28:13,479 Speaker 2: in a meaningful way to like the production of good 1755 01:28:13,600 --> 01:28:16,920 Speaker 2: TV and movies. But that's not as sexy as like 1756 01:28:17,120 --> 01:28:19,479 Speaker 2: I'm and they were all talking. There was this this 1757 01:28:20,000 --> 01:28:24,599 Speaker 2: like very weird moment where one of the panelists, Leslie Shannon, 1758 01:28:24,800 --> 01:28:27,479 Speaker 2: who's head of innovation for Nokia, a company that used 1759 01:28:27,479 --> 01:28:29,479 Speaker 2: to make phones and now makes panelists who pretend to 1760 01:28:29,520 --> 01:28:30,720 Speaker 2: be entertained by awkward data.
1761 01:28:30,760 --> 01:28:33,080 Speaker 5: They also like make cameras and. 1762 01:28:33,320 --> 01:28:35,080 Speaker 2: They make a lot of stuff. I was just shitting 1763 01:28:35,080 --> 01:28:38,400 Speaker 2: on Nokia. She's like, can we use neuroscience to see 1764 01:28:38,479 --> 01:28:41,840 Speaker 2: how people are reacting to AI generated videos? And then 1765 01:28:41,880 --> 01:28:45,080 Speaker 2: adjust the ending to be like, you know, let's make 1766 01:28:45,120 --> 01:28:47,719 Speaker 2: this resonate better. That way, we're helping the creative. 1767 01:28:47,760 --> 01:28:49,719 Speaker 2: And I was like, are you out of your fucking mind? 1768 01:28:49,840 --> 01:28:53,120 Speaker 8: We attached electrodes to people's skulls. 1769 01:28:53,520 --> 01:28:56,760 Speaker 2: I would I would have supported electrodes in their skulls. Yes, 1770 01:28:57,200 --> 01:28:59,880 Speaker 2: Jesus Christ, we should do the monkey Neuralink thing to 1771 01:29:00,120 --> 01:29:02,800 Speaker 2: perhaps a pair of calipers. 1772 01:29:04,560 --> 01:29:04,920 Speaker 8: Skulls. 1773 01:29:05,240 --> 01:29:07,400 Speaker 2: I am fascinated by the skull shapes of that fucking. 1774 01:29:07,800 --> 01:29:10,360 Speaker 10: To say that is there's so many things that you've 1775 01:29:10,360 --> 01:29:13,040 Speaker 10: said that just they wouldn't survive a deposition. 1776 01:29:13,479 --> 01:29:16,160 Speaker 2: Speaking of things that wouldn't survive a deposition: the sponsors 1777 01:29:16,200 --> 01:29:30,920 Speaker 2: of this podcast. Okay, so that first panel was a 1778 01:29:30,960 --> 01:29:33,840 Speaker 2: real moment for me. I went through a couple of more, 1779 01:29:34,080 --> 01:29:36,640 Speaker 2: one of which was on like advertising and AI and 1780 01:29:36,840 --> 01:29:41,400 Speaker 2: was mostly mostly pretty boring.
The third panel I went 1781 01:29:41,439 --> 01:29:46,000 Speaker 2: through though, was called AI Cinematic Spatial and XR and 1782 01:29:46,120 --> 01:29:47,680 Speaker 2: I just want to actually play you guys, you'll have 1783 01:29:47,720 --> 01:29:48,360 Speaker 2: to cluster around. 1784 01:29:48,400 --> 01:29:50,880 Speaker 10: I would actually believe that was generated with ChatGPT 1785 01:29:52,000 --> 01:29:53,120 Speaker 10: GPT two point zero. 1786 01:29:53,680 --> 01:29:56,479 Speaker 5: So let's start with this one. AI will be more 1787 01:29:56,560 --> 01:29:57,840 Speaker 5: impactful than. 1788 01:29:57,720 --> 01:30:08,760 Speaker 14: The Internet, maybe it's a trick question because it is 1789 01:30:08,880 --> 01:30:11,000 Speaker 14: then that was that was. 1790 01:30:11,840 --> 01:30:14,799 Speaker 1: The Internet, although it can be wrong about the Internet. 1791 01:30:15,080 --> 01:30:16,800 Speaker 6: So I'm like, oh, yeah, there you go. 1792 01:30:16,800 --> 01:30:19,440 Speaker 5: All right, what's what when you impact. 1793 01:30:20,600 --> 01:30:20,960 Speaker 9: AI? 1794 01:30:21,600 --> 01:30:26,320 Speaker 5: AI is going to result in astronomical job losses? 1795 01:30:28,040 --> 01:30:32,800 Speaker 2: True, false, there will be an evolution of jobs. 1796 01:30:34,600 --> 01:30:37,400 Speaker 7: Next redistribution of. 1797 01:30:39,800 --> 01:30:42,559 Speaker 2: M That was the scene I wanted you to hear, where 1798 01:30:42,600 --> 01:30:44,000 Speaker 2: they're like, we don't want to say it out loud, 1799 01:30:44,040 --> 01:30:45,280 Speaker 2: and then everyone chuckles. 1800 01:30:45,360 --> 01:30:47,160 Speaker 8: These people are too fucking smug. 1801 01:30:47,479 --> 01:30:50,120 Speaker 10: Yeah, these people sound too confident and too chummy and 1802 01:30:50,160 --> 01:30:51,640 Speaker 10: too happy to say things like this. 1803 01:30:52,360 --> 01:30:54,479 Speaker 2: That's not good.
I don't like these people laughing about 1804 01:30:54,520 --> 01:30:55,400 Speaker 2: people losing jobs. 1805 01:30:55,520 --> 01:30:56,880 Speaker 8: No one should have jobs. 1806 01:30:56,960 --> 01:30:59,840 Speaker 2: That's that's a good place to start. Yeah, I don't 1807 01:30:59,920 --> 01:31:02,240 Speaker 2: like that either. And the people you're hearing from. Let 1808 01:31:02,240 --> 01:31:04,040 Speaker 2: me let me tell you who's on this fucking panel 1809 01:31:04,120 --> 01:31:07,920 Speaker 2: who was just laughing about like, well, there will be 1810 01:31:08,000 --> 01:31:14,519 Speaker 2: a uh an evolution of jobs. Yeah. So the motherfuckers who 1811 01:31:14,520 --> 01:31:17,479 Speaker 2: are on that panel laughing about people losing their jobs. 1812 01:31:17,960 --> 01:31:24,040 Speaker 2: Ted Schilowitz, literally his name is Schilowitz, futurist at Cinemersion, Inc. 1813 01:31:24,479 --> 01:31:25,080 Speaker 5: That's like a j. 1814 01:31:30,280 --> 01:31:35,160 Speaker 2: Rebecca Barkin, co-founder and CEO of Lamina1, Aaron Luber, 1815 01:31:37,080 --> 01:31:45,759 Speaker 2: Director of Partnerships at Google, IPG Media Lab, Layla Emirsadegi, Principal 1816 01:31:45,760 --> 01:31:50,559 Speaker 2: Program Manager, Engineering, at Microsoft, and Katie Henson, SVP 1817 01:31:50,720 --> 01:31:53,479 Speaker 2: of Post Production at Fear Studios. So those are the people 1818 01:31:53,520 --> 01:31:53,960 Speaker 2: who are. 1819 01:31:53,840 --> 01:31:57,800 Speaker 5: All sad, all laughing, and like it's like generative 1820 01:31:57,840 --> 01:32:01,360 Speaker 5: AI is like good at like one thing creatively, it's 1821 01:32:01,439 --> 01:32:06,120 Speaker 5: good at like streamlining VFX, like workflow to the workflow 1822 01:32:06,160 --> 01:32:09,439 Speaker 5: of of how to do like it is it is 1823 01:32:09,439 --> 01:32:12,439 Speaker 5: like there's aspect.
Famously, the only useful thing it's been 1824 01:32:12,520 --> 01:32:15,400 Speaker 5: used for is making people's eyes blue in Dune Part Two. 1825 01:32:15,479 --> 01:32:17,200 Speaker 2: It's not worth one hundred billion dollars. 1826 01:32:17,920 --> 01:32:20,880 Speaker 5: And like it is applicable for like changing objects into 1827 01:32:20,920 --> 01:32:23,360 Speaker 5: other objects on screen. It can produce really like kind 1828 01:32:23,360 --> 01:32:26,160 Speaker 5: of odd, like uncanny effects that could be utilized by 1829 01:32:26,200 --> 01:32:29,439 Speaker 5: a team of human artists really well. What it can't 1830 01:32:29,479 --> 01:32:31,960 Speaker 5: do is generate a short film that is in any 1831 01:32:31,960 --> 01:32:37,680 Speaker 5: way compelling, piece, well, that is in any way compelling as a 1832 01:32:37,680 --> 01:32:39,800 Speaker 5: piece of art. Okay, And the fact that they're like 1833 01:32:39,880 --> 01:32:42,800 Speaker 5: laughing at how much how much have you lost? 1834 01:32:42,920 --> 01:32:44,719 Speaker 2: Enough jobs they have not. 1835 01:32:45,200 --> 01:32:49,160 Speaker 10: Or that had structures full to the beauty of the flame. 1836 01:32:49,880 --> 01:32:53,800 Speaker 2: Right, although the AI keeps keeps foreboding coming for them 1837 01:32:54,160 --> 01:32:58,000 Speaker 2: and wants somethings. I'm going to end on a happy 1838 01:32:58,040 --> 01:33:00,639 Speaker 2: note because the last panel I went to was actually 1839 01:33:00,720 --> 01:33:03,799 Speaker 2: really cool. It was AI and the Crisis of Creative Rights: 1840 01:33:04,040 --> 01:33:07,280 Speaker 2: Deep Fakes, Ethics and the Law, and it featured the 1841 01:33:07,400 --> 01:33:10,000 Speaker 2: first intelligent person that I've seen at CES this year, 1842 01:33:10,479 --> 01:33:13,720 Speaker 2: Moiya McTier, who is a folklorist and senior advisor at 1843 01:33:13,720 --> 01:33:18,679 Speaker 2: the Human Artistry Campaign.
It also featured Duncan Crabtree-Ireland, 1844 01:33:18,680 --> 01:33:21,920 Speaker 2: who's the national executive director and chief negotiator of SAG-AFTRA. 1845 01:33:22,040 --> 01:33:24,799 Speaker 2: There we go, There we go, and this was no bullshit. 1846 01:33:24,840 --> 01:33:26,760 Speaker 2: It was talking about all of the different lawsuits that 1847 01:33:26,800 --> 01:33:29,280 Speaker 2: are going on right now, all of the litigation around AI, 1848 01:33:29,800 --> 01:33:32,920 Speaker 2: and like the actual strategy for litigating, and like there 1849 01:33:33,000 --> 01:33:34,760 Speaker 2: was a couple of points where like Duncan was like, 1850 01:33:35,080 --> 01:33:37,519 Speaker 2: a lot is going to hinge on some very brave, 1851 01:33:37,840 --> 01:33:41,880 Speaker 2: very famous people choosing to throw down some big dollar lawsuits, 1852 01:33:42,280 --> 01:33:44,679 Speaker 2: like that's what we need right now. They did talk 1853 01:33:44,720 --> 01:33:47,559 Speaker 2: about the No Fakes Act, which has bipartisan support and 1854 01:33:47,560 --> 01:33:49,640 Speaker 2: gives some legal force to allow people to push for 1855 01:33:49,720 --> 01:33:52,720 Speaker 2: AI copies of themselves to be taken down. And they 1856 01:33:52,720 --> 01:33:55,879 Speaker 2: think there's also some bipartisan possibility to get AI labeling 1857 01:33:56,600 --> 01:33:57,680 Speaker 2: legislation. 1858 01:33:57,920 --> 01:33:59,479 Speaker 10: The thing is, any of these things would be fucking 1859 01:33:59,479 --> 01:34:02,080 Speaker 10: hard to enforce, because if you have to remove something 1860 01:34:02,080 --> 01:34:04,240 Speaker 10: from a model, how the fuck do we do that? 1861 01:34:04,360 --> 01:34:05,120 Speaker 2: Yeah, we don't know how. 1862 01:34:05,120 --> 01:34:06,840 Speaker 5: You have to throw away the entire model, you. 1863 01:34:06,800 --> 01:34:09,080 Speaker 8: have to retrain, like there's no way around it.
1864 01:34:09,560 --> 01:34:12,599 Speaker 2: Yeah, And there was a really good point where kind 1865 01:34:12,600 --> 01:34:14,000 Speaker 2: of at the end of this part of what I 1866 01:34:14,000 --> 01:34:16,200 Speaker 2: appreciate is again there was no bullshit. Like Moya at 1867 01:34:16,200 --> 01:34:18,360 Speaker 2: one point was like, I think it is absolutely, it 1868 01:34:18,560 --> 01:34:21,640 Speaker 2: being generative AI, is absolutely a net negative for the 1869 01:34:21,720 --> 01:34:24,160 Speaker 2: artistic community. The point is, the point is not to 1870 01:34:24,200 --> 01:34:26,000 Speaker 2: get something out as quick as possible to like make 1871 01:34:26,160 --> 01:34:26,640 Speaker 2: art right. 1872 01:34:26,720 --> 01:34:29,120 Speaker 5: And she has to be like one of maybe five 1873 01:34:29,200 --> 01:34:31,400 Speaker 5: people who are doing panels at CES who's like willing 1874 01:34:31,439 --> 01:34:32,040 Speaker 5: to say that. 1875 01:34:31,960 --> 01:34:34,040 Speaker 2: Yes, and Duncan was like, look, you can't 1876 01:34:34,040 --> 01:34:36,479 Speaker 2: stop the technology from being invented, So the best path 1877 01:34:36,520 --> 01:34:38,760 Speaker 2: forward is to like try and channel this into a 1878 01:34:38,800 --> 01:34:41,720 Speaker 2: direction that like is at least better for artists. Like 1879 01:34:42,000 --> 01:34:44,040 Speaker 2: there were there was very little for most of the 1880 01:34:44,080 --> 01:34:46,559 Speaker 2: people on the panel, very little bullshit. There was some 1881 01:34:46,680 --> 01:34:50,000 Speaker 2: bullshit from one person on the panel, Jenny Katzman, Senior 1882 01:34:50,040 --> 01:34:54,120 Speaker 2: Director of Government Affairs at Microsoft. Oh I bet that 1883 01:34:54,280 --> 01:34:56,800 Speaker 2: was fun.
So after there's this whole point where like 1884 01:34:56,880 --> 01:34:58,360 Speaker 2: everyone else in the panel is like, yeah, I think it's 1885 01:34:58,360 --> 01:35:02,120 Speaker 2: probably a net negative for artists on the whole, and Jenny comes 1886 01:35:02,160 --> 01:35:04,719 Speaker 2: on, she's like, actually, I think it's a net positive. 1887 01:35:04,960 --> 01:35:07,240 Speaker 2: And her example of this is, well, you know, think 1888 01:35:07,320 --> 01:35:09,360 Speaker 2: there's a lot of stuff that you couldn't do before that. 1889 01:35:09,439 --> 01:35:12,160 Speaker 2: Thanks to AI, you could do like de-aging Harrison 1890 01:35:12,240 --> 01:35:16,360 Speaker 2: Ford for the Indiana Jones movie, something that went over 1891 01:35:16,640 --> 01:35:17,160 Speaker 2: very well. 1892 01:35:18,920 --> 01:35:21,880 Speaker 5: Everyone everyone loved and a great creative. 1893 01:35:22,640 --> 01:35:24,200 Speaker 2: This is the fucking problem with all of this. 1894 01:35:24,280 --> 01:35:26,240 Speaker 10: On top of how shit it is and how expensive 1895 01:35:26,240 --> 01:35:28,800 Speaker 10: it is, which kind of AI are we talking about? 1896 01:35:28,800 --> 01:35:29,400 Speaker 8: That dip shit. 1897 01:35:29,520 --> 01:35:34,200 Speaker 10: That's not generative AI, that's not what that fucking waste. 1898 01:35:33,760 --> 01:35:36,360 Speaker 5: And it also stops us from being able to cast 1899 01:35:36,439 --> 01:35:40,960 Speaker 5: a young River Phoenix, that lovely thing. 1900 01:35:41,720 --> 01:35:44,680 Speaker 2: He's getting cast in more stuff, Garrett. I'm very unfair. 1901 01:35:45,400 --> 01:35:47,280 Speaker 5: Well, luckily, with the power of AI. 1902 01:35:48,080 --> 01:35:52,320 Speaker 2: Look, I can put him into every newspaper sequentially starting in 1903 01:35:52,360 --> 01:35:54,800 Speaker 2: eighteen thirty four, So I've not gotten to the end 1904 01:35:54,840 --> 01:35:57,320 Speaker 2: of Phoenix.
It would be a really long career. 1905 01:35:57,760 --> 01:35:58,559 Speaker 5: It would be really cool. 1906 01:35:58,640 --> 01:36:00,160 Speaker 8: Sleeping guy. 1907 01:36:00,360 --> 01:36:03,120 Speaker 2: I think he's got the bold ideas. This is gonna 1908 01:36:03,120 --> 01:36:05,200 Speaker 2: work out really well for Germany. 1909 01:36:05,400 --> 01:36:07,280 Speaker 5: Won't it be really cool that instead of just doing 1910 01:36:07,320 --> 01:36:09,840 Speaker 5: Young Harrison Ford, they just do a River Phoenix deep 1911 01:36:09,880 --> 01:36:16,880 Speaker 5: fake for young him. Look, it's canonical. 1912 01:36:17,080 --> 01:36:18,639 Speaker 2: Yeah, great. 1913 01:36:18,880 --> 01:36:21,280 Speaker 10: Oh, I love the movies in the future of them too. 1914 01:36:21,479 --> 01:36:23,200 Speaker 10: This is so good. This is so bad. 1915 01:36:23,320 --> 01:36:25,160 Speaker 5: James Mangold, you're a hack. So. 1916 01:36:25,320 --> 01:36:28,000 Speaker 2: I gotta say it was very funny because she also 1917 01:36:28,040 --> 01:36:31,679 Speaker 2: suggested, Jenny, that we can use animals without causing harm 1918 01:36:31,680 --> 01:36:33,639 Speaker 2: thanks to AI, a thing that no one had figured 1919 01:36:33,640 --> 01:36:35,920 Speaker 2: out how to do before. Nobody had ever figured out 1920 01:36:35,920 --> 01:36:39,160 Speaker 2: how to just like not hurt animals in movies, that didn't 1921 01:36:39,200 --> 01:36:40,839 Speaker 2: exist before AI, thank. 1922 01:36:40,720 --> 01:36:44,640 Speaker 5: God, thankfully AI will never do any harm to animals or 1923 01:36:44,680 --> 01:36:45,280 Speaker 5: the environment. 1924 01:36:46,160 --> 01:36:48,960 Speaker 12: Nobody asked the lobbyist for Microsoft what else the company 1925 01:36:49,000 --> 01:36:52,600 Speaker 12: is doing with AI? Right, with police deployments or with 1926 01:36:53,080 --> 01:36:54,280 Speaker 12: fossil fuel companies? 1927 01:36:54,360 --> 01:36:54,719 Speaker 8: Yeah?
1928 01:36:54,880 --> 01:36:56,080 Speaker 2: Is that bad for animals? 1929 01:36:56,439 --> 01:36:56,679 Speaker 5: No? 1930 01:36:56,720 --> 01:37:02,160 Speaker 12: Actually, it's really good. They they needed it. They yearn 1931 01:37:02,280 --> 01:37:02,960 Speaker 12: for the month. 1932 01:37:03,160 --> 01:37:08,160 Speaker 2: They love datasets. Great for their habitats. She said, there's 1933 01:37:08,200 --> 01:37:11,040 Speaker 2: issues with employment, but there's lots of issues that fall 1934 01:37:11,080 --> 01:37:13,559 Speaker 2: around that, and I do think you need a balance. 1935 01:37:13,920 --> 01:37:15,600 Speaker 2: And at the end of it, the guy running the 1936 01:37:15,640 --> 01:37:19,800 Speaker 2: panel just says, Okay. 1937 01:37:18,880 --> 01:37:20,560 Speaker 7: It sounds like you guys are saying a bunch of 1938 01:37:20,680 --> 01:37:21,519 Speaker 7: woke shit. 1939 01:37:21,479 --> 01:37:25,920 Speaker 2: On this panel. All right, all right, Microsoft. Just once, 1940 01:37:26,000 --> 01:37:27,720 Speaker 2: I'd like on the panel someone to go and say, 1941 01:37:27,960 --> 01:37:32,479 Speaker 2: what the fuck do you mean? That's the closest to that 1942 01:37:32,520 --> 01:37:33,360 Speaker 2: you were going to get. 1943 01:37:33,520 --> 01:37:35,240 Speaker 5: I think we do need a balance of some people 1944 01:37:35,280 --> 01:37:38,200 Speaker 5: being fired, like these people, and other people keeping their 1945 01:37:38,280 --> 01:37:39,240 Speaker 5: jobs, like everyone else. 1946 01:37:39,800 --> 01:37:42,880 Speaker 7: Like Moya said, somebody has to lose and somebody has. 1947 01:37:42,760 --> 01:37:46,679 Speaker 10: To work exactly that's their entire. Somebody has the guns, 1948 01:37:46,720 --> 01:37:47,439 Speaker 10: somebody doesn't. 1949 01:37:47,600 --> 01:37:49,639 Speaker 2: Somebody knows the way the maze works and something. 1950 01:37:49,800 --> 01:37:51,679 Speaker 7: He's gonna what, we shouldn't have guns.
1951 01:37:51,800 --> 01:37:54,000 Speaker 8: We shouldn't have a maze and one of them knows 1952 01:37:54,040 --> 01:37:54,840 Speaker 8: the maze and has a gun. 1953 01:37:55,000 --> 01:37:59,000 Speaker 2: We should have a gun maze, you're talking about the gun maze now. Look, 1954 01:37:59,520 --> 01:38:01,519 Speaker 2: we all like keeping a couple of people in a 1955 01:38:01,560 --> 01:38:05,000 Speaker 2: maze beneath our house. Yeah, there's nothing wrong with this. 1956 01:38:05,000 --> 01:38:07,120 Speaker 5: This is just the dormant next this we just we 1957 01:38:07,280 --> 01:38:11,560 Speaker 5: keep doing it. 1958 01:38:11,560 --> 01:38:14,200 Speaker 2: It's a nice maze under my house they have. 1959 01:38:14,520 --> 01:38:15,720 Speaker 8: It's nice to run some of them. 1960 01:38:18,680 --> 01:38:23,040 Speaker 10: The minotaur gets them only sometimes. I'm the minotaur. 1961 01:38:24,479 --> 01:38:25,880 Speaker 2: Anyway, the gun maze isn't real. 1962 01:38:26,080 --> 01:38:28,920 Speaker 10: But also most of their arguments kind of mostly just 1963 01:38:28,920 --> 01:38:31,719 Speaker 10: come down to, well, you can't make an omelet without breaking, 1964 01:38:31,880 --> 01:38:33,320 Speaker 10: like you have to make people. 1965 01:38:33,360 --> 01:38:36,680 Speaker 2: You have to break the human drive to create art. Obviously, 1966 01:38:36,760 --> 01:38:42,439 Speaker 2: to make a, not taste good, an omelet-esque food, 1967 01:38:43,120 --> 01:38:45,639 Speaker 2: it's a piss omelet. Like there's piss in the omelet, 1968 01:38:46,080 --> 01:38:47,960 Speaker 2: and we had to We had to burn down the 1969 01:38:48,000 --> 01:38:51,880 Speaker 2: Sistine Chapel to make the piss omelet. Computer made it, though, yeah, 1970 01:38:52,840 --> 01:38:56,639 Speaker 2: and clap for the computer. We did firebomb the Louvre. 1971 01:38:56,880 --> 01:39:01,599 Speaker 2: But look look at look at this rock star.
1972 01:39:03,439 --> 01:39:04,000 Speaker 11: Oh god. 1973 01:39:04,520 --> 01:39:06,840 Speaker 2: All right, well that's the episode. That's all I got, folks. 1974 01:39:06,840 --> 01:39:10,280 Speaker 2: That was my first day at CES twenty twenty five. Huzzah. Yeah, 1975 01:39:10,320 --> 01:39:11,400 Speaker 2: this is just my first day. 1976 01:39:11,439 --> 01:39:12,719 Speaker 8: Better Offline's here all week. 1977 01:39:13,120 --> 01:39:15,120 Speaker 10: I'm gonna hear about stuff like this all week, and 1978 01:39:15,160 --> 01:39:17,200 Speaker 10: I think I'm gonna be fully Jokerfied. 1979 01:39:17,240 --> 01:39:19,439 Speaker 2: I'm gonna wake up in the clown makeup on Friday. 1980 01:39:19,479 --> 01:39:21,960 Speaker 7: I'm gonna find the funnest thing to bring back for you. 1981 01:39:22,200 --> 01:39:25,400 Speaker 10: I'm gonna find an artist to put me in full Joker. 1982 01:39:25,439 --> 01:39:29,400 Speaker 2: Now, I'm gonna try to steal that AI enhanced grill. Yeah, 1983 01:39:30,560 --> 01:39:33,719 Speaker 2: grill that texts you. Can I just like move this around? 1984 01:39:33,720 --> 01:39:35,880 Speaker 2: I just want to test that would roll se ail, 1985 01:39:36,000 --> 01:39:37,240 Speaker 2: open the door, open the door. 1986 01:39:37,520 --> 01:39:39,920 Speaker 10: As someone who's done a lot of like grilling, done 1987 01:39:39,920 --> 01:39:42,600 Speaker 10: a lot of smoked barbecue, I don't know what 1988 01:39:42,720 --> 01:39:43,120 Speaker 10: AI would do. 1989 01:39:43,200 --> 01:39:44,599 Speaker 8: Is it gonna talk to me in the sixth. 1990 01:39:44,760 --> 01:39:47,200 Speaker 2: Wait till you are you? Are you trying to tell 1991 01:39:47,280 --> 01:39:50,639 Speaker 2: us here at Zra, yes, that you have grilled meat 1992 01:39:50,800 --> 01:39:53,720 Speaker 2: without a robot texting you about it? Because I just 1993 01:39:53,720 --> 01:39:54,280 Speaker 2: don't believe it.
1994 01:39:54,360 --> 01:39:57,360 Speaker 8: I don't know how I did it, but I did it. 1995 01:39:57,600 --> 01:40:03,840 Speaker 2: You're never always way took the robots. It was impossible. 1996 01:40:04,280 --> 01:40:06,320 Speaker 8: Oh god, we're at the death of innovation. 1997 01:40:06,680 --> 01:40:09,000 Speaker 2: Yeah, at the end of a lot of things maybe, 1998 01:40:09,280 --> 01:40:11,080 Speaker 2: and the end of the episode. Yeah, and the end 1999 01:40:11,080 --> 01:40:14,920 Speaker 2: of the episode. Thank god. You know, everyone else be 2000 01:40:15,040 --> 01:40:34,479 Speaker 2: the cyber truck in the oh, welcome back to It 2001 01:40:34,520 --> 01:40:37,519 Speaker 2: Could Happen Here, a podcast about it happening here, which 2002 01:40:37,560 --> 01:40:40,839 Speaker 2: is really true in a lot of ways. Tonight, Garrison 2003 01:40:40,920 --> 01:40:45,040 Speaker 2: Davis and I are seated at the glorious, majestical hotel, 2004 01:40:45,200 --> 01:40:48,840 Speaker 2: name redacted, on the Las Vegas Strip. We had a 2005 01:40:48,960 --> 01:40:53,160 Speaker 2: long day at CES, listening to panels, catching up 2006 01:40:53,160 --> 01:40:56,439 Speaker 2: with the latest tech news, trying gadgets, and also at 2007 01:40:56,439 --> 01:40:59,440 Speaker 2: the same time texting our dear friends in Los Angeles 2008 01:40:59,520 --> 01:41:04,240 Speaker 2: as unprecedented fires sweep them from their homes. Literally the 2009 01:41:04,280 --> 01:41:08,480 Speaker 2: Getty is threatened. Pasadena and Santa Monica are both being evacuated 2010 01:41:08,520 --> 01:41:11,479 Speaker 2: at once. It's a real one-two punch of 2011 01:41:11,520 --> 01:41:14,720 Speaker 2: America's favorite tech show and the apocalypse today. How are 2012 01:41:14,720 --> 01:41:15,120 Speaker 2: you feeling? 2013 01:41:15,160 --> 01:41:15,320 Speaker 6: Gare? 2014 01:41:15,479 --> 01:41:17,440 Speaker 5: It's an average day in America, average.
2015 01:41:17,200 --> 01:41:24,040 Speaker 2: Day in America. Temperature's not coming down anytime soon. No, no, well, 2016 01:41:24,040 --> 01:41:26,519 Speaker 2: just take a moment to breathe with that. So you 2017 01:41:26,560 --> 01:41:28,400 Speaker 2: want to start us off with what you did this morning? 2018 01:41:28,479 --> 01:41:31,559 Speaker 2: I was panel guy yesterday. You were a man of 2019 01:41:31,600 --> 01:41:35,240 Speaker 2: action walking around and mostly trying all the free massage chairs. 2020 01:41:35,360 --> 01:41:36,519 Speaker 2: What did you see this morning? 2021 01:41:36,680 --> 01:41:39,640 Speaker 5: I saw so many AI panels, half of which I 2022 01:41:39,720 --> 01:41:41,840 Speaker 5: left halfway through because I knew they 2023 01:41:41,720 --> 01:41:43,519 Speaker 2: weren't gonna be useful for me, just dogshit. 2024 01:41:43,680 --> 01:41:46,559 Speaker 5: The other half I took notes on and just got sad. 2025 01:41:47,080 --> 01:41:50,080 Speaker 5: But no, today was full panels starting bright and early 2026 01:41:50,120 --> 01:41:52,400 Speaker 5: in the morning, where I walked into a panel where 2027 01:41:52,400 --> 01:41:57,120 Speaker 5: I heard augmentation and not replacement about twenty times in 2028 01:41:57,160 --> 01:41:58,719 Speaker 5: the span of like twenty minutes. 2029 01:41:58,840 --> 01:42:01,479 Speaker 2: Yeah, I keep hearing a lot of that too. In the 2030 01:42:01,520 --> 01:42:03,800 Speaker 2: Hollywood panels, they would be like, yeah, we want to 2031 01:42:03,800 --> 01:42:05,680 Speaker 2: develop a machine that can read the brains of our 2032 01:42:05,760 --> 01:42:08,040 Speaker 2: viewers and alter the endings of movies, you know, but 2033 01:42:08,040 --> 01:42:10,759 Speaker 2: we see this as a way of augmenting the artist's work. 2034 01:42:11,000 --> 01:42:13,960 Speaker 5: Yes.
And the biggest thing that I noticed across multiple 2035 01:42:13,960 --> 01:42:17,920 Speaker 5: panels today is an almost like anxiety among these tech 2036 01:42:17,960 --> 01:42:24,920 Speaker 5: executives about consumers rejecting the AI slopification of everything, and 2037 01:42:25,400 --> 01:42:27,880 Speaker 5: they're trying to find ways to like actually force people 2038 01:42:27,880 --> 01:42:30,679 Speaker 5: to start like using these products or having them like 2039 01:42:30,680 --> 01:42:33,759 Speaker 5: like it. Yeah, and I haven't really sensed that anxiety before. 2040 01:42:33,800 --> 01:42:36,519 Speaker 5: It's all been very, very positive. 2041 01:42:36,080 --> 01:42:37,880 Speaker 2: And I think it's a mix of Number One, the 2042 01:42:37,960 --> 01:42:41,240 Speaker 2: money still isn't there where they need it to be. 2043 01:42:41,479 --> 01:42:44,000 Speaker 2: It has not started like blooming to the extent that 2044 01:42:44,000 --> 01:42:45,880 Speaker 2: they were expecting it by now. And the other part 2045 01:42:45,920 --> 01:42:49,599 Speaker 2: is people are still not happy with this stuff. I'm 2046 01:42:49,640 --> 01:42:52,280 Speaker 2: glad you felt that too, because that almost was like, 2047 01:42:52,560 --> 01:42:55,880 Speaker 2: especially after the election, like, I don't trust my feelings 2048 01:42:55,920 --> 01:42:58,040 Speaker 2: on this, that they're really scared, but I really do 2049 01:42:58,160 --> 01:42:59,920 Speaker 2: think there's a piece of that coming through it. 2050 01:43:00,080 --> 01:43:02,519 Speaker 5: Now, a phrase one of the panelists used this 2051 01:43:02,680 --> 01:43:06,800 Speaker 5: morning was the AI ick. Like how do we how 2052 01:43:06,840 --> 01:43:09,000 Speaker 5: do we beat the AI ick? And if you're ever 2053 01:43:09,040 --> 01:43:11,599 Speaker 5: saying to yourself, how do I stop having people feel an 2054 01:43:11,600 --> 01:43:15,320 Speaker 5: ick around me?
Maybe you should really look inwards. Yeah, 2055 01:43:15,400 --> 01:43:17,479 Speaker 5: maybe the problem is you, not them. 2056 01:43:18,200 --> 01:43:20,200 Speaker 2: You know who doesn't need to worry about quote unquote 2057 01:43:20,200 --> 01:43:23,479 Speaker 2: ick for their product's market? It's people who make things 2058 01:43:23,479 --> 01:43:24,240 Speaker 2: that people like. 2059 01:43:25,280 --> 01:43:27,320 Speaker 5: So but I heard a lot about, you know, in 2060 01:43:27,320 --> 01:43:28,840 Speaker 5: trying to get people to use these products, like 2061 01:43:28,880 --> 01:43:31,519 Speaker 5: making sure artists don't feel like they're being replaced, instead 2062 01:43:31,840 --> 01:43:36,519 Speaker 5: having their like art production process be augmented with AI, 2063 01:43:36,600 --> 01:43:40,040 Speaker 5: and how how that can make art easier to make 2064 01:43:40,080 --> 01:43:42,759 Speaker 5: while still keeping the human at the center of AI tools. 2065 01:43:43,200 --> 01:43:44,680 Speaker 5: And this is just what they talked about for like 2066 01:43:44,680 --> 01:43:47,040 Speaker 5: a while, while reiterating that lots of the developments they 2067 01:43:47,080 --> 01:43:49,559 Speaker 5: need to see on AI they already have on the 2068 01:43:49,560 --> 01:43:52,120 Speaker 5: tech side. What they need to rely on is consumer 2069 01:43:52,160 --> 01:43:54,400 Speaker 5: acceptance to really drive the innovation. To see like what 2070 01:43:54,439 --> 01:43:56,640 Speaker 5: they can get away with, like how much will the 2071 01:43:56,680 --> 01:44:01,719 Speaker 5: consumer accept the slopification of art, entertainment, and customer 2072 01:44:01,800 --> 01:44:04,400 Speaker 5: service and all these things they're trying to cram AI 2073 01:44:04,520 --> 01:44:05,120 Speaker 5: into, and like.
2074 01:44:05,280 --> 01:44:08,400 Speaker 2: How much worse can you make the world before people 2075 01:44:08,960 --> 01:44:11,880 Speaker 2: stand up and stop you with their fists or guns. 2076 01:44:11,960 --> 01:44:13,920 Speaker 5: And you mentioned something about like trying to like tailor 2077 01:44:13,960 --> 01:44:17,040 Speaker 5: like movie endings for specific people, and I definitely heard 2078 01:44:17,040 --> 01:44:19,479 Speaker 5: some stuff about that. There's this one guy who is 2079 01:44:19,560 --> 01:44:22,360 Speaker 5: who was like the panel's resident like content creator who's 2080 01:44:22,360 --> 01:44:27,479 Speaker 5: supposed to represent like the artist bloc, even though he's like, eh, yeah, 2081 01:44:27,560 --> 01:44:30,320 Speaker 5: you know, some kind of like AI friendly content creator 2082 01:44:30,320 --> 01:44:32,160 Speaker 5: though on this panel, and he talked about how like 2083 01:44:32,479 --> 01:44:34,960 Speaker 5: back in the day, you need to have friends that 2084 01:44:35,000 --> 01:44:39,000 Speaker 5: would like recommend your music, and like the Spotify algorithm 2085 01:44:39,040 --> 01:44:40,640 Speaker 5: is is too based on like an echo chamber of 2086 01:44:40,640 --> 01:44:44,200 Speaker 5: what you already like, but now with agentic AI 2087 01:44:44,439 --> 01:44:47,080 Speaker 5: this allows trust between the consumer and the machine to 2088 01:44:47,160 --> 01:44:50,160 Speaker 5: recommend new music. And like again, like so much of 2089 01:44:50,160 --> 01:44:52,200 Speaker 5: these AI products is just trying to like replace friendship. 2090 01:44:52,800 --> 01:44:55,920 Speaker 2: People. Have you tried friends? Have you tried people? 2091 01:44:56,160 --> 01:44:59,839 Speaker 5: How can you engage with like art and culture without friends?
2092 01:45:00,080 --> 01:45:02,200 Speaker 5: How can you like learn more about like what your 2093 01:45:02,240 --> 01:45:04,360 Speaker 5: friends are into, what they like? How can you discover 2094 01:45:04,479 --> 01:45:08,240 Speaker 5: new music just like without that, instead replacing that beautifully 2095 01:45:08,280 --> 01:45:09,120 Speaker 5: human process. 2096 01:45:09,400 --> 01:45:12,200 Speaker 2: Every year at CES, there are points in time where 2097 01:45:12,240 --> 01:45:14,360 Speaker 2: I get that, like, oh yeah, twenty twenty really fucked 2098 01:45:14,439 --> 01:45:17,240 Speaker 2: us up a lot. Like twenty twenty really did some 2099 01:45:17,360 --> 01:45:21,160 Speaker 2: lasting damage. Like I know it was that was happening 2100 01:45:21,320 --> 01:45:24,760 Speaker 2: with the younger generation before the iPad kid generation, but 2101 01:45:24,920 --> 01:45:28,480 Speaker 2: like that that really did a number on some folks. 2102 01:45:29,080 --> 01:45:33,400 Speaker 5: Someone from Meta, right, of Facebook, specifically their like metaverse division, 2103 01:45:33,400 --> 01:45:34,679 Speaker 5: which they're still trying to push 2104 01:45:34,560 --> 01:45:36,280 Speaker 2: for, by the way. Oh yeah, now, I mean they're 2105 01:45:36,280 --> 01:45:38,920 Speaker 2: still calling it Meta, which honestly there's a degree to which 2106 01:45:38,960 --> 01:45:42,120 Speaker 2: I almost respect, because we are not buying it, 2107 01:45:42,360 --> 01:45:42,920 Speaker 2: no one is. 2108 01:45:43,320 --> 01:45:46,120 Speaker 5: But she talked about how they can like blend the 2109 01:45:46,160 --> 01:45:50,640 Speaker 5: metaverse and AI to make customized personal experiences. Say that 2110 01:45:50,720 --> 01:45:55,240 Speaker 5: you're watching an immersive live concert in mixed reality, something 2111 01:45:55,240 --> 01:45:56,880 Speaker 5: that both me and Robert do all the time, and.
2112 01:45:56,960 --> 01:46:02,320 Speaker 2: A mixed reality Harry Styles? Mixed reality concerts. We're seeing 2113 01:46:02,360 --> 01:46:04,240 Speaker 2: the 100 gecs you 2114 01:46:04,200 --> 01:46:06,880 Speaker 5: know, honestly, a 100 gecs mixed reality concert could 2115 01:46:06,880 --> 01:46:07,360 Speaker 5: go crazy. 2116 01:46:07,400 --> 01:46:09,639 Speaker 2: Here, We'll finally I'll finally get you pilled on Reel 2117 01:46:09,680 --> 01:46:10,720 Speaker 2: Big Fish. 2118 01:46:10,800 --> 01:46:15,000 Speaker 5: But basically, as you're in this like metaverse concert, they 2119 01:46:15,000 --> 01:46:17,400 Speaker 5: can have an AI that will sense your own excitement 2120 01:46:17,800 --> 01:46:20,760 Speaker 5: and personalize the ending of the experience based on your 2121 01:46:20,760 --> 01:46:23,800 Speaker 5: favorite songs or artists. So as they're getting excited, 2122 01:46:23,800 --> 01:46:27,240 Speaker 5: like AI Taylor Swift can like finish the song like 2123 01:46:27,360 --> 01:46:29,679 Speaker 5: for you based on like your own like musical tastes, 2124 01:46:29,720 --> 01:46:31,879 Speaker 5: based on what the AI knows about you. And it's 2125 01:46:31,880 --> 01:46:33,920 Speaker 5: about creating these customized experiences.
2126 01:46:33,920 --> 01:46:35,800 Speaker 2: It's such a you can clearly tell that none of 2127 01:46:35,840 --> 01:46:37,960 Speaker 2: these people have souls, right, It's such a mismatch of 2128 01:46:38,000 --> 01:46:41,240 Speaker 2: what people get from music, because they think that like, oh, 2129 01:46:41,320 --> 01:46:43,400 Speaker 2: this is just like a if I see that like 2130 01:46:43,479 --> 01:46:46,000 Speaker 2: this specific beat line is, I can just sort of 2131 01:46:46,040 --> 01:46:48,120 Speaker 2: like plug this in, and like I don't know, Like 2132 01:46:48,640 --> 01:46:51,160 Speaker 2: what makes people react to musicians and artists is that 2133 01:46:51,200 --> 01:46:54,120 Speaker 2: they like make things that make them feel something like 2134 01:46:54,680 --> 01:46:57,920 Speaker 2: That's why people get like really into artists, is they 2135 01:46:58,000 --> 01:47:01,880 Speaker 2: feel seen and identify with a piece of art 2136 01:47:02,200 --> 01:47:05,120 Speaker 2: as opposed to like, oh, oh, that guy really liked 2137 01:47:05,240 --> 01:47:09,679 Speaker 2: the first opening bars to fucking Octopus's Garden, Like, let's 2138 01:47:09,880 --> 01:47:12,240 Speaker 2: let's just like really turn up the octopus. A lot 2139 01:47:12,520 --> 01:47:16,040 Speaker 2: more octopuses? How many more octopuses can we fit in 2140 01:47:16,040 --> 01:47:17,960 Speaker 2: this fucking in this track? 2141 01:47:18,240 --> 01:47:18,280 Speaker 1: No. 2142 01:47:19,000 --> 01:47:21,120 Speaker 5: Another panel I went to later in the day was about, 2143 01:47:21,120 --> 01:47:24,519 Speaker 5: like, how do you market to Gen Z? Very funny panel.
Yeah, 2144 01:47:24,520 --> 01:47:28,000 Speaker 5: and and they're talking about how like authenticity is so important, 2145 01:47:28,080 --> 01:47:30,240 Speaker 5: like you need to partner with influencers that have like 2146 01:47:30,360 --> 01:47:33,599 Speaker 5: have like an authentic brand, and it's funny having that 2147 01:47:34,040 --> 01:47:36,599 Speaker 5: juxtaposition with like these like these like AI slop panels 2148 01:47:36,600 --> 01:47:39,000 Speaker 5: where like you need like an AI Taylor Swift to 2149 01:47:39,040 --> 01:47:41,599 Speaker 5: come like boost the excitement for all these kids who 2150 01:47:41,600 --> 01:47:45,680 Speaker 5: are in their metaverse concerts. Oh boy, But no, like 2151 01:47:45,720 --> 01:47:49,240 Speaker 5: personalized content, like like targeting, like AI-generated content 2152 01:47:49,400 --> 01:47:52,640 Speaker 5: specifically for certain people, for certain users, whether that's on 2153 01:47:52,680 --> 01:47:54,960 Speaker 5: social media, whether that's on, you know, the metaverse. Like 2154 01:47:54,960 --> 01:47:57,040 Speaker 5: some of these people talk about someone on the panel 2155 01:47:57,080 --> 01:47:59,720 Speaker 5: from Adobe, who's you know, Adobe's integrating a whole bunch 2156 01:47:59,720 --> 01:48:02,400 Speaker 5: of gen AI into their like suite of products, right, 2157 01:48:02,520 --> 01:48:06,519 Speaker 5: like Photoshop, Premiere, After Effects, right, big big company 2158 01:48:06,520 --> 01:48:08,560 Speaker 5: in the creative space.
He said that, like, the 2159 01:48:08,680 --> 01:48:12,000 Speaker 5: personalized content is always the most impactful, like content that 2160 01:48:12,040 --> 01:48:15,120 Speaker 5: a person feels like a genuine connection to, and that 2161 01:48:15,240 --> 01:48:17,120 Speaker 5: connection could be formed by just being like, you know, 2162 01:48:17,320 --> 01:48:20,360 Speaker 5: a compelling artist where you can recognize shared experiences of 2163 01:48:20,560 --> 01:48:24,200 Speaker 5: humanity. But now you don't need that 2164 01:48:24,360 --> 01:48:27,519 Speaker 5: artist part anymore. He said they only need three parts 2165 01:48:27,520 --> 01:48:31,160 Speaker 5: to create a pipeline. You need data, you need compelling 2166 01:48:31,200 --> 01:48:34,040 Speaker 5: like journeys to take the user on, and you need 2167 01:48:34,080 --> 01:48:36,719 Speaker 5: the content itself. And the goal is to create content 2168 01:48:36,760 --> 01:48:40,400 Speaker 5: at scale that's highly personalized. He said, quote, we're good 2169 01:48:40,400 --> 01:48:43,040 Speaker 5: at the first two parts, now we just need to 2170 01:48:43,080 --> 01:48:46,400 Speaker 5: improve the actual content side. Which I don't even think 2171 01:48:46,439 --> 01:48:48,559 Speaker 5: that's true. I don't think AI is good at creating 2172 01:48:48,680 --> 01:48:50,000 Speaker 5: compelling human journeys. 2173 01:48:50,000 --> 01:48:51,840 Speaker 2: I had it. So the video I didn't play you 2174 01:48:51,880 --> 01:48:55,120 Speaker 2: guys, from my terrible fucking AI-generated videos, was this.
2175 01:48:55,520 --> 01:48:58,000 Speaker 2: It was like a girl coming to college, we see 2176 01:48:58,040 --> 01:48:59,400 Speaker 2: a picture of her dad, and it was like a 2177 01:48:59,520 --> 01:49:03,000 Speaker 2: narration of her life with her father, who like is 2178 01:49:03,120 --> 01:49:06,360 Speaker 2: dead, that she misses, and all that she learned from Dad, 2179 01:49:06,600 --> 01:49:08,400 Speaker 2: and it like it's a mix of like all these 2180 01:49:08,400 --> 01:49:10,120 Speaker 2: different Like there's a chunk where it looks like a 2181 01:49:10,200 --> 01:49:13,160 Speaker 2: Disney animated picture, there's a chunk where it looks like anime. 2182 01:49:13,720 --> 01:49:16,240 Speaker 2: She and her dad having these like adventures around the world. 2183 01:49:16,280 --> 01:49:17,719 Speaker 2: There's a bit of it that looks like a Marvel 2184 01:49:17,800 --> 01:49:19,960 Speaker 2: movie, and he's like, we can do all of these 2185 01:49:20,000 --> 01:49:23,120 Speaker 2: different, you know, animation styles and they're seamless and like, 2186 01:49:23,320 --> 01:49:25,400 Speaker 2: you know, the audience really goes on a journey with this, 2187 01:49:25,520 --> 01:49:28,120 Speaker 2: and it's it's like, but there's there was no girl 2188 01:49:28,240 --> 01:49:30,720 Speaker 2: who lost her dad. Nobody lost their dad here. This 2189 01:49:30,840 --> 01:49:33,640 Speaker 2: is you just had a computer generate text about a 2190 01:49:33,720 --> 01:49:37,640 Speaker 2: dad dying. Like there's nothing underpinning this, right, nobody has 2191 01:49:37,680 --> 01:49:39,840 Speaker 2: anything they're trying to get across, Like you just know 2192 01:49:40,000 --> 01:49:42,719 Speaker 2: in this one they look like Marvel heroes for some reason. 2193 01:49:42,960 --> 01:49:45,280 Speaker 2: In this one they look like Zulu warriors kind of 2194 01:49:45,280 --> 01:49:47,920 Speaker 2: done up in a slightly racist Lion King style.
Like 2195 01:49:48,160 --> 01:49:51,920 Speaker 2: what is being transmitted other than like, look at all 2196 01:49:51,960 --> 01:49:54,040 Speaker 2: of the different art styles we can rip off. 2197 01:49:54,160 --> 01:49:56,680 Speaker 5: No, they do not have a journey. But even they 2198 01:49:56,720 --> 01:49:59,519 Speaker 5: themselves admit that they still don't have the content. The 2199 01:49:59,560 --> 01:50:01,639 Speaker 5: content itself still isn't even there. And that's something 2200 01:50:01,680 --> 01:50:03,479 Speaker 5: like they even acknowledge. And this is like a hurdle 2201 01:50:03,520 --> 01:50:05,360 Speaker 5: to this is this is a hurdle to get over. 2202 01:50:05,640 --> 01:50:07,240 Speaker 5: What they do have is the data. And like this 2203 01:50:07,320 --> 01:50:09,400 Speaker 5: is like something that Adobe has done, because if you 2204 01:50:09,479 --> 01:50:12,200 Speaker 5: use Adobe products, now some of the most used creative products, 2205 01:50:12,280 --> 01:50:14,640 Speaker 5: Adobe trains all the all of their AI systems on 2206 01:50:14,680 --> 01:50:17,439 Speaker 5: the stuff that you make using their products, which, you know, 2207 01:50:17,479 --> 01:50:20,679 Speaker 5: he really just blazed past that point, because that's that's 2208 01:50:20,720 --> 01:50:23,559 Speaker 5: a whole other discussion. But even they know that they 2209 01:50:23,560 --> 01:50:26,439 Speaker 5: don't have like the actual products, and this is still 2210 01:50:26,479 --> 01:50:30,080 Speaker 5: reliant on like consumer acceptance.
As as they said before, 2211 01:50:30,520 --> 01:50:32,519 Speaker 5: someone from Meta, the same person on the panel that 2212 01:50:32,680 --> 01:50:35,439 Speaker 5: talked about how like a few days ago on Instagram 2213 01:50:35,479 --> 01:50:39,400 Speaker 5: they tried to announce like you'll have like AI profiles right, 2214 01:50:39,520 --> 01:50:42,680 Speaker 5: like like completely AI-generated picture profiles, like you know, 2215 01:50:42,720 --> 01:50:45,760 Speaker 5: like fake people who have their own accounts, and this 2216 01:50:45,840 --> 01:50:48,519 Speaker 5: created such a big backlash that they rolled this back, 2217 01:50:48,840 --> 01:50:51,680 Speaker 5: and they simply announced this before CES. 2218 01:50:51,400 --> 01:50:54,759 Speaker 2: One of these accounts was literally like I'm a mother 2219 01:50:54,880 --> 01:50:58,439 Speaker 2: of two, queer Black woman. You know, yeah, I got 2220 01:50:58,439 --> 01:51:00,200 Speaker 2: a lot to say about the world. 2221 01:51:00,240 --> 01:51:03,559 Speaker 5: Someone call up the Situationists, please. And some like people 2222 01:51:03,600 --> 01:51:05,400 Speaker 5: started talking to her like, were there any Black people at 2223 01:51:05,439 --> 01:51:07,760 Speaker 5: all involved in like making this chatbot? 2224 01:51:07,920 --> 01:51:10,080 Speaker 2: She's like, well no, and that's a real problem. That 2225 01:51:10,240 --> 01:51:12,120 Speaker 2: is a real problem, Okay. 2226 01:51:11,960 --> 01:51:14,720 Speaker 5: Yes. And the excuse that this person from Meta gave 2227 01:51:14,760 --> 01:51:17,680 Speaker 5: is that the market just isn't ready yet. It's not 2228 01:51:17,720 --> 01:51:20,600 Speaker 5: that the actual product itself is like bad or like 2229 01:51:20,680 --> 01:51:23,719 Speaker 5: no one really wants it. The market's not ready yet. 2230 01:51:23,960 --> 01:51:27,360 Speaker 2: Well, they're so used to everything that they've done so far.
2231 01:51:27,960 --> 01:51:31,519 Speaker 2: They've kept getting money, right, and like it slowed down 2232 01:51:31,640 --> 01:51:34,680 Speaker 2: and they've had to do layoffs, but like nobody's just 2233 01:51:34,800 --> 01:51:39,439 Speaker 2: made them stop at any point, which, honestly, you know, 2234 01:51:39,720 --> 01:51:42,559 Speaker 2: I made a comment about healthcare executives a while back, 2235 01:51:42,680 --> 01:51:45,240 Speaker 2: needing like a fucking retirement plan paid in millimeters. So 2236 01:51:45,240 --> 01:51:48,200 Speaker 2: I'm not going to make that same comment about tech 2237 01:51:48,200 --> 01:51:51,040 Speaker 2: industry ghouls, because, you know, we all know what's in 2238 01:51:51,080 --> 01:51:55,200 Speaker 2: the news. But something has to be done to force 2239 01:51:55,280 --> 01:52:00,200 Speaker 2: these people to stop moving in this direction. And I 2240 01:52:00,240 --> 01:52:02,599 Speaker 2: don't know how to get across, and like they're already 2241 01:52:02,600 --> 01:52:05,360 Speaker 2: at this point, like they seem to really not 2242 01:52:05,520 --> 01:52:07,519 Speaker 2: want this, and we have to find a way. They're 2243 01:52:07,560 --> 01:52:08,960 Speaker 2: just not ready. We have to find a way to 2244 01:52:08,960 --> 01:52:11,400 Speaker 2: force this on them. Ideas? I don't know how to 2245 01:52:11,479 --> 01:52:14,400 Speaker 2: get across to them in a peaceful manner. Oh oh sorry, 2246 01:52:14,800 --> 01:52:17,919 Speaker 2: people don't want this. I'm a man of peace, Garrison. 2247 01:52:18,000 --> 01:52:20,000 Speaker 2: I'm a man of peace. I'm not a plumber.
2248 01:52:23,840 --> 01:52:25,040 Speaker 5: The last thing I had to add out of this 2249 01:52:25,040 --> 01:52:27,280 Speaker 5: panel, just in terms of how much this stuff is 2250 01:52:27,320 --> 01:52:29,040 Speaker 5: just actually taking over more and more of the market 2251 01:52:29,040 --> 01:52:31,599 Speaker 5: even if people don't want it, is that the guy 2252 01:52:31,640 --> 01:52:33,880 Speaker 5: from Adobe announced that in the fourth quarter of last year 2253 01:52:34,240 --> 01:52:36,479 Speaker 5: they were able to boost all of Adobe's, like, 2254 01:52:36,520 --> 01:52:38,800 Speaker 5: you know, emails. If you send like an email to Adobe, right, 2255 01:52:39,200 --> 01:52:41,000 Speaker 5: you have a problem, like you need help. But like 2256 01:52:41,160 --> 01:52:43,800 Speaker 5: everything that they do on emails is now one hundred 2257 01:52:43,840 --> 01:52:47,080 Speaker 5: percent generated by AI. And this was boosted from fifty 2258 01:52:47,120 --> 01:52:48,840 Speaker 5: percent at the start of last year. Now it's one 2259 01:52:48,880 --> 01:52:51,840 Speaker 5: hundred percent of all of their email content is now 2260 01:52:51,880 --> 01:52:54,440 Speaker 5: done by AI with some moderation. 2261 01:52:55,320 --> 01:52:58,440 Speaker 2: That's like when the company itself is like communicating 2262 01:52:58,520 --> 01:52:59,639 Speaker 2: with customers through email? 2263 01:52:59,760 --> 01:53:01,320 Speaker 5: That's that's what it sounded like. 2264 01:53:01,400 --> 01:53:05,320 Speaker 2: Yes, they're still writing emails sometimes to each other, or is it 2265 01:53:05,800 --> 01:53:07,120 Speaker 2: AI for that too? 2266 01:53:07,520 --> 01:53:09,920 Speaker 5: He described it as like email content, so I'm pretty 2267 01:53:09,960 --> 01:53:13,280 Speaker 5: sure it is like content, then customer service stuff, like marketing, 2268 01:53:13,320 --> 01:53:16,160 Speaker 5: maybe like outreach, like certain like outreach things.
But yeah, 2269 01:53:17,040 --> 01:53:21,200 Speaker 5: now generated by AI with some human like moderation. But yeah, 2270 01:53:21,439 --> 01:53:23,559 Speaker 5: that is where things are moving. And that's how I 2271 01:53:23,600 --> 01:53:24,839 Speaker 5: started my morning. 2272 01:53:25,080 --> 01:53:28,000 Speaker 2: Well, better than a cup of coffee is that sense 2273 01:53:28,040 --> 01:53:30,439 Speaker 2: of creeping dread that, like, wow, I just saw a 2274 01:53:30,479 --> 01:53:33,960 Speaker 2: bunch of people who probably would rather kill the 2275 01:53:34,040 --> 01:53:38,880 Speaker 2: world than be stopped from shoveling AI slop into people's mouths, 2276 01:53:38,920 --> 01:53:41,280 Speaker 2: because this is the only future they can imagine, is 2277 01:53:41,280 --> 01:53:43,960 Speaker 2: one in which they work for a company that feeds 2278 01:53:43,960 --> 01:53:47,640 Speaker 2: the planet poison and kills the human concept of creativity 2279 01:53:47,960 --> 01:53:50,120 Speaker 2: so that they can buy a house in San Francisco. 2280 01:53:50,680 --> 01:53:53,559 Speaker 5: Do you know what I want to feed the concept of? 2281 01:53:54,040 --> 01:54:06,559 Speaker 2: Yeah, we'll talk about that, but here's some ants. We're back. 2282 01:54:07,160 --> 01:54:09,200 Speaker 2: What was part two of this episode? Let's see, buddy, 2283 01:54:09,200 --> 01:54:11,320 Speaker 2: I'm a... oh, let's talk about that helicopter. 2284 01:54:11,600 --> 01:54:14,559 Speaker 5: No, yeah, I think as I was going from panel 2285 01:54:14,640 --> 01:54:18,280 Speaker 5: to panel scribbling notes on AI, some very exciting 2286 01:54:18,320 --> 01:54:20,640 Speaker 5: news stories dropped that we'll talk about later. What were 2287 01:54:20,680 --> 01:54:21,800 Speaker 5: you up to, Robert? 2288 01:54:21,520 --> 01:54:24,639 Speaker 2: Well, I was.
I was trawling the show floor, as 2289 01:54:24,680 --> 01:54:27,120 Speaker 2: I oft do at some point in a CES, 2290 01:54:27,800 --> 01:54:31,559 Speaker 2: and I came across a number of majestic products. You know, 2291 01:54:31,600 --> 01:54:33,560 Speaker 2: a lot of it was AI based, and we'll talk 2292 01:54:33,640 --> 01:54:35,840 Speaker 2: some more about that here, but I ran into something 2293 01:54:35,920 --> 01:54:38,480 Speaker 2: that, thank god, had nothing to do with AI, 2294 01:54:38,800 --> 01:54:41,360 Speaker 2: and it's a death trap. Every every one of these. 2295 01:54:41,200 --> 01:54:43,360 Speaker 5: There's like some sort... yes, we find a new death trap. 2296 01:54:43,880 --> 01:54:45,640 Speaker 2: There's a lot of connected vehicles. There were a lot 2297 01:54:45,680 --> 01:54:48,560 Speaker 2: of EVs last year. There were a ton of different 2298 01:54:48,680 --> 01:54:52,400 Speaker 2: flying taxi type options. People that were really trying to. 2299 01:54:52,560 --> 01:54:53,840 Speaker 5: You don't see it at all this year. 2300 01:54:53,920 --> 01:54:56,560 Speaker 2: Nothing this year, nothing this year, because it's a terrible idea. 2301 01:54:56,680 --> 01:54:59,400 Speaker 2: It's a terrible idea. The people who are rich enough 2302 01:54:59,400 --> 01:55:01,640 Speaker 2: to pay for flying vehicles don't want it to 2303 01:55:01,680 --> 01:55:06,160 Speaker 2: be a taxi, and the people who can't afford their 2304 01:55:06,160 --> 01:55:09,880 Speaker 2: own flying vehicles also can't afford... anyway. So this is, 2305 01:55:10,240 --> 01:55:14,960 Speaker 2: instead of any of that, Rictor. Rictor, R I C T O 2306 01:55:15,280 --> 01:55:19,840 Speaker 2: R, which is a Chinese company. Their ads say, and I quote, 2307 01:55:20,040 --> 01:55:25,240 Speaker 2: why be normal? Also saying the future of travel 2308 01:55:25,320 --> 01:55:29,160 Speaker 2: will not be on the ground.
And the Rictor is 2309 01:55:29,200 --> 01:55:33,960 Speaker 2: a hybrid. It is like a Smart car sized vehicle. 2310 01:55:33,960 --> 01:55:36,880 Speaker 2: It's like it only has two wheels, though it looks 2311 01:55:36,920 --> 01:55:39,200 Speaker 2: more like a scooter. It's more like a weird little scooter, 2312 01:55:40,640 --> 01:55:43,360 Speaker 2: but it's fully enclosed, and in addition to having its 2313 01:55:43,360 --> 01:55:45,200 Speaker 2: wheels and being able to travel about on the ground, 2314 01:55:45,320 --> 01:55:49,120 Speaker 2: it has four like quadcopter-style rotors. Because it is 2315 01:55:49,160 --> 01:55:52,879 Speaker 2: an aquatic flying car. Aquatic flying... I saw no evidence 2316 01:55:52,920 --> 01:55:54,120 Speaker 2: that it could actually go in the water. 2317 01:55:54,400 --> 01:55:55,800 Speaker 5: How high can these things go up? 2318 01:55:56,000 --> 01:55:58,000 Speaker 2: Less than two hundred meters. You know why, Garrison? 2319 01:55:58,080 --> 01:55:58,240 Speaker 5: Why? 2320 01:55:58,400 --> 01:55:59,960 Speaker 2: Why is that? Because if you try to go up 2321 01:56:00,240 --> 01:56:01,480 Speaker 2: past that, you need a pilot's license. 2322 01:56:01,640 --> 01:56:03,560 Speaker 5: You don't need a pilot's license? 2323 01:56:03,640 --> 01:56:04,120 Speaker 8: I have that. 2324 01:56:04,400 --> 01:56:06,040 Speaker 2: When I was interviewing them, I was like, so I 2325 01:56:06,080 --> 01:56:08,000 Speaker 2: assume there's gonna be some sort of pilot's license for 2326 01:56:08,040 --> 01:56:10,160 Speaker 2: this flying craft. And they're like, no, as long as 2327 01:56:10,200 --> 01:56:11,880 Speaker 2: you stay under two hundred meters you're good. Do you 2328 01:56:12,040 --> 01:56:13,120 Speaker 2: need a driver's... 2329 01:56:13,160 --> 01:56:13,200 Speaker 4: Like? 2330 01:56:13,880 --> 01:56:15,680 Speaker 5: Are you gonna put a license plate on this?
Or 2331 01:56:15,680 --> 01:56:16,800 Speaker 5: is there no space for one? 2332 01:56:16,840 --> 01:56:17,120 Speaker 8: Buddy? 2333 01:56:17,440 --> 01:56:19,040 Speaker 5: Completely unregulated. 2334 01:56:19,040 --> 01:56:21,920 Speaker 2: To be honest, and I don't say this for any problematic reason, 2335 01:56:21,960 --> 01:56:25,120 Speaker 2: but like, these folks are Chinese and did not seem 2336 01:56:25,160 --> 01:56:27,160 Speaker 2: to have a great deal of knowledge about the US laws. 2337 01:56:27,520 --> 01:56:31,880 Speaker 2: Sure. That said, I can't imagine China's less strict about 2338 01:56:31,960 --> 01:56:33,120 Speaker 2: personal aircraft. 2339 01:56:33,240 --> 01:56:35,520 Speaker 5: I would like to take this fucker on the I-5. 2340 01:56:36,080 --> 01:56:36,840 Speaker 2: Just start. 2341 01:56:38,400 --> 01:56:41,240 Speaker 5: Start zooming. Yeah, send it up in the air, because 2342 01:56:41,320 --> 01:56:43,160 Speaker 5: you could probably do like a pretty a pretty good 2343 01:56:43,200 --> 01:56:44,880 Speaker 5: road trip on this, right? You can you can you 2344 01:56:44,880 --> 01:56:45,840 Speaker 5: can move about on that. 2345 01:56:45,920 --> 01:56:48,720 Speaker 2: So it's very small and it's completely electric. So I 2346 01:56:48,760 --> 01:56:50,640 Speaker 2: asked him, how much time do you get in the 2347 01:56:50,680 --> 01:56:53,840 Speaker 2: air with this bad boy on battery? Maybe twenty five minutes. 2348 01:56:56,640 --> 01:57:01,920 Speaker 2: What happens after twenty five minutes?
I did ask this, and 2349 01:57:02,040 --> 01:57:03,760 Speaker 2: I was like, does this just drop out of the sky? 2350 01:57:03,840 --> 01:57:05,840 Speaker 2: And they were like, no, we're working on like a 2351 01:57:06,200 --> 01:57:10,600 Speaker 2: like an intelligent thing that will, like, yeah, which is 2352 01:57:10,640 --> 01:57:14,000 Speaker 2: also very exciting, really looking forward to seeing how they 2353 01:57:14,000 --> 01:57:16,480 Speaker 2: pull that off. The videos that they have show it 2354 01:57:16,560 --> 01:57:19,680 Speaker 2: driving on the highway too. They weren't able to tell 2355 01:57:19,720 --> 01:57:22,320 Speaker 2: me what the top speed was. It has no rear 2356 01:57:22,400 --> 01:57:24,520 Speaker 2: view mirrors and no side view mirrors, but they said 2357 01:57:24,520 --> 01:57:26,720 Speaker 2: there's lots of cameras on the inside, so I'm sure 2358 01:57:26,760 --> 01:57:30,120 Speaker 2: that's fine. It's a death trap. This thing will get 2359 01:57:30,200 --> 01:57:32,920 Speaker 2: everyone who even looks at it wrong killed. They showed 2360 01:57:32,960 --> 01:57:36,320 Speaker 2: me a video of the prototype. It was completely frameless. 2361 01:57:36,320 --> 01:57:38,720 Speaker 2: It was just quadcopter blades and like a chair 2362 01:57:38,840 --> 01:57:41,160 Speaker 2: on a platform lifting a guy into the air. It 2363 01:57:41,200 --> 01:57:44,560 Speaker 2: couldn't go forward or backwards. But they're like, in a year 2364 01:57:44,680 --> 01:57:45,600 Speaker 2: we can have this figured out. 2365 01:57:45,760 --> 01:57:47,160 Speaker 5: It can't. It can't move forward? 2366 01:57:47,320 --> 01:57:49,480 Speaker 2: It only only went up in the videos I saw, 2367 01:57:51,200 --> 01:57:55,840 Speaker 2: so you can't actually travel... absolutely not. By the way, 2368 01:57:56,040 --> 01:57:59,160 Speaker 2: I couldn't fit in this thing. Like, you would be 2369 01:57:59,240 --> 01:58:00,360 Speaker 2: cramped in this fucker.
2370 01:58:00,480 --> 01:58:01,960 Speaker 5: But it's good for vertical travel. 2371 01:58:02,040 --> 01:58:04,200 Speaker 2: It's great if you just need to go up to 2372 01:58:04,400 --> 01:58:09,400 Speaker 2: under two hundred meters, there's no more efficient way. 2373 01:58:09,400 --> 01:58:12,440 Speaker 5: If you're gonna get pulled over by the cops, you just... 2374 01:58:11,920 --> 01:58:15,200 Speaker 2: Just go up above them. I'm in the sky now. 2375 01:58:15,480 --> 01:58:18,280 Speaker 2: You can't do shit to me for twenty five minutes. 2376 01:58:21,040 --> 01:58:23,320 Speaker 2: Oh god, it's like, if you're just driving, you go 2377 01:58:23,360 --> 01:58:25,280 Speaker 2: up to one hundred kilometers, which made me think for 2378 01:58:25,440 --> 01:58:28,680 Speaker 2: a second, that's like sixty. I'm in the air for twenty minutes, 2379 01:58:28,720 --> 01:58:30,560 Speaker 2: then I land. Then my battery is dead. 2380 01:58:30,920 --> 01:58:31,760 Speaker 5: Then you can't go anywhere. 2381 01:58:31,800 --> 01:58:33,160 Speaker 2: You can't go anywhere. You can't get back. 2382 01:58:33,360 --> 01:58:36,160 Speaker 5: The battery issue is gonna is gonna be troubling. 2383 01:58:36,240 --> 01:58:37,560 Speaker 2: But it seems completely useless. 2384 01:58:37,600 --> 01:58:40,960 Speaker 5: But as we've heard nonstop the past two days, this 2385 01:58:41,040 --> 01:58:41,880 Speaker 5: is the worst it's gonna be. 2386 01:58:41,960 --> 01:58:44,360 Speaker 2: This is the worst it's gonna be. Only gonna get better. Things 2387 01:58:44,440 --> 01:58:47,640 Speaker 2: only ever get better. That's that's what everyone was trying 2388 01:58:47,640 --> 01:58:50,640 Speaker 2: to insist upon to me here. What else did you 2389 01:58:50,640 --> 01:58:53,000 Speaker 2: see on the show floor that caught your eye, Garrison?
2390 01:58:53,120 --> 01:58:57,040 Speaker 2: So many magical, wonderful, marvelous things, most of which were 2391 01:58:57,120 --> 01:59:00,200 Speaker 2: just like various different AI connected smart houses. That was 2392 01:59:00,240 --> 01:59:03,040 Speaker 2: what Samsung was showing off. That was what LG was 2393 01:59:03,120 --> 01:59:06,720 Speaker 2: showing off. I believe you saw one as well, right? Yeah, 2394 01:59:06,720 --> 01:59:09,040 Speaker 2: I mean I I walked through the LG booth. 2395 01:59:09,480 --> 01:59:11,600 Speaker 5: It was kind of the same as same as last year. 2396 01:59:11,640 --> 01:59:14,400 Speaker 5: The Samsung booth was too intimidating. But I should check 2397 01:59:14,400 --> 01:59:16,959 Speaker 5: it out, because last year we didn't do the Samsung 2398 01:59:17,000 --> 01:59:22,480 Speaker 5: booth, because we were going to and then either either 2399 01:59:22,520 --> 01:59:24,760 Speaker 5: one of us threw up or spilled something. 2400 01:59:25,360 --> 01:59:29,240 Speaker 2: Hey, okay, okay, yes, right? 2401 01:59:29,400 --> 01:59:31,720 Speaker 5: Did I did I. 2402 01:59:31,640 --> 01:59:35,400 Speaker 2: Pour my creatine into a carbonated beverage 2403 01:59:35,520 --> 01:59:40,360 Speaker 2: that spewed a geyser of blood-red foam into the 2404 01:59:40,400 --> 01:59:41,520 Speaker 2: sky around 2405 01:59:41,560 --> 01:59:43,120 Speaker 5: the white Samsung guard? 2406 01:59:43,200 --> 01:59:46,839 Speaker 2: Did the security guard stare at me as it happened? 2407 01:59:47,200 --> 01:59:49,720 Speaker 2: Did I set the drink down as it continued to 2408 01:59:49,760 --> 01:59:52,400 Speaker 2: spew and say, I'll go get some towels, and then 2409 01:59:52,480 --> 01:59:53,480 Speaker 2: leave forever? 2410 01:59:53,480 --> 01:59:57,200 Speaker 5: Wet towels left. 2411 01:59:57,240 --> 01:59:59,280 Speaker 2: We fucking bounced. So.
2412 01:59:59,160 --> 02:00:02,280 Speaker 5: We couldn't do some booths last year. Maybe I'll 2413 02:00:02,320 --> 02:00:05,600 Speaker 5: try this year. But tell me about these smart houses. 2414 02:00:06,120 --> 02:00:08,480 Speaker 2: Well, Garrison, Samsung has a great idea for a 2415 02:00:08,560 --> 02:00:11,520 Speaker 2: smart house. First of all, you remember that game the Sims. No, 2416 02:00:12,160 --> 02:00:15,800 Speaker 2: well, they're really betting that you do, because their current 2417 02:00:15,880 --> 02:00:19,360 Speaker 2: plan is design your home with the AI-powered map view. Okay, okay, 2418 02:00:19,400 --> 02:00:21,760 Speaker 2: sure. You get like, you feed it like a picture, 2419 02:00:21,880 --> 02:00:24,880 Speaker 2: you like, you lay out the floor plan of your house, 2420 02:00:25,080 --> 02:00:26,680 Speaker 2: and it gives you like a three D model, and 2421 02:00:26,680 --> 02:00:29,360 Speaker 2: you can take pictures of your furniture or pictures of 2422 02:00:29,400 --> 02:00:31,960 Speaker 2: furniture that you want, and then it really places it 2423 02:00:31,960 --> 02:00:34,440 Speaker 2: around and you can place them. Now, a couple of things, 2424 02:00:34,440 --> 02:00:37,280 Speaker 2: one of them is that there's no scaling done by 2425 02:00:37,320 --> 02:00:40,200 Speaker 2: the AI, so it's up to you to figure out 2426 02:00:40,480 --> 02:00:44,160 Speaker 2: how the furniture you might want to buy measures up 2427 02:00:44,200 --> 02:00:45,919 Speaker 2: in comparison to the apartment. 2428 02:00:46,040 --> 02:00:46,280 Speaker 5: Sure. 2429 02:00:46,360 --> 02:00:49,440 Speaker 2: Sure, but it does look like the actual like map 2430 02:00:49,440 --> 02:00:51,200 Speaker 2: that they've got. I'll show you the picture that I took. 2431 02:00:52,120 --> 02:00:54,880 Speaker 2: I'll try to put it up somewhere. Like, it looks 2432 02:00:54,960 --> 02:00:58,880 Speaker 2: like the video game the Sims.
You're populating like a 2433 02:00:58,880 --> 02:01:01,560 Speaker 2: little three D CG house. And I was like, okay, 2434 02:01:01,560 --> 02:01:04,200 Speaker 2: well, there's there's a use there, right. People like planning 2435 02:01:04,200 --> 02:01:06,200 Speaker 2: out, like, you're you're moving into a new apartment. You 2436 02:01:06,200 --> 02:01:07,920 Speaker 2: can like fill it in here, and before you even 2437 02:01:07,960 --> 02:01:10,240 Speaker 2: move in, you can figure out what kind of furniture 2438 02:01:10,280 --> 02:01:12,520 Speaker 2: you need or how your existing furniture will fit in there. 2439 02:01:13,080 --> 02:01:15,760 Speaker 2: I would never have used that. I usually picked up 2440 02:01:15,760 --> 02:01:17,960 Speaker 2: all of my furniture from the trash before I had 2441 02:01:17,960 --> 02:01:20,720 Speaker 2: a house, when I moved into a new place. But 2442 02:01:20,800 --> 02:01:22,800 Speaker 2: I know people who would have used that. Sure, that 2443 02:01:22,840 --> 02:01:25,360 Speaker 2: seems useful. So I asked about security. One thing 2444 02:01:25,400 --> 02:01:27,200 Speaker 2: that concerned me is like the first guy I talked to, 2445 02:01:27,240 --> 02:01:29,960 Speaker 2: he was like, oh, yeah, I think it's all stored locally. 2446 02:01:30,400 --> 02:01:32,200 Speaker 2: And I was like, so Samsung doesn't have any access 2447 02:01:32,240 --> 02:01:34,560 Speaker 2: to any of the data on like my house and 2448 02:01:34,600 --> 02:01:36,800 Speaker 2: its layout. And he was like, let me, let me 2449 02:01:36,920 --> 02:01:38,880 Speaker 2: get you to one of our engineers, because he can 2450 02:01:38,920 --> 02:01:41,920 Speaker 2: answer that question. And the engineer's answer was, and I'm 2451 02:01:41,960 --> 02:01:43,000 Speaker 2: paraphrasing here. 2452 02:01:43,160 --> 02:01:44,760 Speaker 5: Oh okay. 2453 02:01:44,800 --> 02:01:45,960 Speaker 2: So that made me very confident.
2454 02:01:46,040 --> 02:01:48,960 Speaker 5: That does make you feel safe about sharing your personal data? 2455 02:01:49,080 --> 02:01:51,520 Speaker 2: Right, yeah, on the layout of my actual house. 2456 02:01:51,560 --> 02:01:51,720 Speaker 3: Well. 2457 02:01:51,720 --> 02:01:53,520 Speaker 5: And the thing is, I really don't like that at all, 2458 02:01:53,560 --> 02:01:55,720 Speaker 5: because this is, this is something that people were asking 2459 02:01:55,760 --> 02:01:59,000 Speaker 5: Facebook slash Meta when they were doing, like, their, you know, 2460 02:01:59,120 --> 02:02:01,960 Speaker 5: metaverse stuff, because their headsets are recording, you know, 2461 02:02:02,160 --> 02:02:06,200 Speaker 5: very, very extensively, like, your home layout, and the whole point... Well, 2462 02:02:06,320 --> 02:02:08,600 Speaker 5: part of the point was that some of that data 2463 02:02:08,800 --> 02:02:11,320 Speaker 5: could then be used to send you targeted advertisements based 2464 02:02:11,320 --> 02:02:15,960 Speaker 5: on them seeing everything in your home. And I suspect 2465 02:02:16,080 --> 02:02:20,320 Speaker 5: that Samsung might also have some interest in targeted advertisements, 2466 02:02:20,360 --> 02:02:23,800 Speaker 5: being a tech company, but, you know, I could never say. 2467 02:02:24,280 --> 02:02:26,840 Speaker 2: Yeah, and they, that wasn't really... One thing they 2468 02:02:26,840 --> 02:02:29,120 Speaker 2: had, for like their retail segment, they had like 2469 02:02:29,600 --> 02:02:32,840 Speaker 2: a live video grocery store ad showing you prices of 2470 02:02:32,880 --> 02:02:35,840 Speaker 2: different produce, and I think like the insinuation that they didn't 2471 02:02:35,920 --> 02:02:38,240 Speaker 2: lay out is, like, you can change prices on the fly, 2472 02:02:38,720 --> 02:02:41,040 Speaker 2: you know, which kind of made me think about that. 
2473 02:02:41,480 --> 02:02:43,360 Speaker 2: There was some talk last year of, like, okay, we 2474 02:02:43,480 --> 02:02:46,120 Speaker 2: want to be able to, like, face scan customers so 2475 02:02:46,160 --> 02:02:48,640 Speaker 2: we can see if they have money and increase prices 2476 02:02:48,680 --> 02:02:51,640 Speaker 2: for, like, products for certain people, which I'm sure they're 2477 02:02:51,640 --> 02:02:53,840 Speaker 2: going to try. They are too enticed by that idea 2478 02:02:53,960 --> 02:02:57,080 Speaker 2: not to. So I caught a little bit of that. 2479 02:02:57,120 --> 02:02:59,440 Speaker 2: But really, to the extent of how big... 2480 02:02:59,480 --> 02:03:02,520 Speaker 2: And this was interesting: last year, Samsung and LG, 2481 02:03:02,640 --> 02:03:04,280 Speaker 2: their booths were huge and they had a lot of 2482 02:03:04,280 --> 02:03:07,960 Speaker 2: different gadgets. Samsung's booth is big this year; forty 2483 02:03:08,000 --> 02:03:11,800 Speaker 2: percent of it was that scan your furniture, scan your 2484 02:03:11,840 --> 02:03:15,280 Speaker 2: fucking house map app. Not that much, like, very little 2485 02:03:15,520 --> 02:03:17,240 Speaker 2: actual shit going on. 2486 02:03:17,360 --> 02:03:19,920 Speaker 5: People slapped the word AI onto everything there was. 2487 02:03:19,960 --> 02:03:22,960 Speaker 2: Another big thing was all Samsung, because Samsung makes a 2488 02:03:22,960 --> 02:03:25,920 Speaker 2: ton of appliances, they make TVs, all sorts of entertainment products. 
2489 02:03:25,960 --> 02:03:27,320 Speaker 2: All of them have this, I figure, what they called 2490 02:03:27,320 --> 02:03:30,200 Speaker 2: like Samsung Tag or something, that you can, you can 2491 02:03:30,240 --> 02:03:31,640 Speaker 2: map it in your phone, so you can have a 2492 02:03:31,640 --> 02:03:33,960 Speaker 2: whole map of all of the devices and shit that 2493 02:03:34,000 --> 02:03:35,760 Speaker 2: you have in your phone, and you can control them 2494 02:03:35,760 --> 02:03:38,160 Speaker 2: all from a single point. And, right, no one, by 2495 02:03:38,200 --> 02:03:40,760 Speaker 2: the way, had any interest in answering my security questions there. 2496 02:03:40,800 --> 02:03:43,000 Speaker 2: But also, if you're into that, if you want to 2497 02:03:43,080 --> 02:03:46,040 Speaker 2: have all of your appliances and entertainment things linked up 2498 02:03:46,120 --> 02:03:49,320 Speaker 2: and controlled on your phone, and all of them are Samsung, 2499 02:03:49,560 --> 02:03:50,960 Speaker 2: you don't care. You don't care. 2500 02:03:51,320 --> 02:03:53,560 Speaker 5: No, if you're getting a smart home, I don't think 2501 02:03:53,600 --> 02:03:54,840 Speaker 5: you really care about that. 2502 02:03:54,880 --> 02:03:56,760 Speaker 2: But also, none of it was, like, yeah, I can 2503 02:03:56,800 --> 02:03:59,960 Speaker 2: control everything from my phone. You've been promising me that literally 2504 02:04:00,320 --> 02:04:03,720 Speaker 2: since, like, twenty eleven. For decades they were promising me 2505 02:04:03,760 --> 02:04:05,440 Speaker 2: you're gonna be able to control your whole house. 2506 02:04:05,360 --> 02:04:07,680 Speaker 5: Nothing feels new this year. This is the thing. It's 2507 02:04:07,720 --> 02:04:10,120 Speaker 5: like, even walking through the LG booth, which usually has some 2508 02:04:10,240 --> 02:04:14,000 Speaker 5: really cool new thing, this year, nothing new. No, nothing new. 
2509 02:04:14,240 --> 02:04:16,480 Speaker 5: They slapped the word AI on one corner of their 2510 02:04:16,560 --> 02:04:19,640 Speaker 5: television set. Right, I guess LG does have, like, a 2511 02:04:19,720 --> 02:04:22,320 Speaker 5: large language model in, like, one corner of their booth, 2512 02:04:22,360 --> 02:04:24,160 Speaker 5: but, like, so does everyone else. Like, that's not, like, 2513 02:04:24,360 --> 02:04:25,200 Speaker 5: yeah, compelling. 2514 02:04:25,440 --> 02:04:28,760 Speaker 2: There was SK, which is a South Korean company. 2515 02:04:29,160 --> 02:04:33,240 Speaker 2: Their booth, again, made AI a massive, like, big thing, 2516 02:04:33,280 --> 02:04:33,880 Speaker 2: but it's nothing. 2517 02:04:33,880 --> 02:04:35,400 Speaker 5: It's just a big visual. 2518 02:04:35,160 --> 02:04:37,120 Speaker 2: Display that looks cool, that looks like a bunch of 2519 02:04:37,200 --> 02:04:39,680 Speaker 2: server racks, like you're in this huge cube of servers. 2520 02:04:40,040 --> 02:04:43,920 Speaker 2: But they did have different actual products. One of them 2521 02:04:44,000 --> 02:04:48,280 Speaker 2: was real-time CCTVs that use an AI, like an 2522 02:04:48,680 --> 02:04:51,840 Speaker 2: LLM-type thing, to summarize pictures. So I, like, walked 2523 02:04:51,840 --> 02:04:54,160 Speaker 2: through and it did pick me out as a notable person. 2524 02:04:54,560 --> 02:04:56,560 Speaker 2: So I've got, like, this people-of-interest thing where 2525 02:04:56,560 --> 02:05:00,120 Speaker 2: it's, like, a man holding a smartphone standing next to 2526 02:05:00,160 --> 02:05:03,240 Speaker 2: another man. But also I'm like, what does that really 2527 02:05:03,240 --> 02:05:05,680 Speaker 2: get you? 
Like, the fact that you're summarizing, like, 2528 02:05:05,720 --> 02:05:08,080 Speaker 2: these people, who are, like, this person's kneeling and taking 2529 02:05:08,080 --> 02:05:11,800 Speaker 2: a picture, this person's standing. Because I, like, actually tried deliberately. 2530 02:05:11,840 --> 02:05:13,920 Speaker 2: I, like, reached into my bag to try to be suspicious. 2531 02:05:13,920 --> 02:05:16,480 Speaker 2: I, like, did finger guns, and it never marked me out. 2532 02:05:16,560 --> 02:05:18,720 Speaker 2: And, like, I didn't pull a real gun or anything, 2533 02:05:18,760 --> 02:05:21,200 Speaker 2: because I very rarely bring that to the CES floor. 2534 02:05:22,720 --> 02:05:25,200 Speaker 2: But I don't know, like, I can see how there 2535 02:05:25,200 --> 02:05:27,160 Speaker 2: could be a utility there if you're actually able to, 2536 02:05:27,520 --> 02:05:30,280 Speaker 2: say, you're setting up, like, surveillance outside of a residential 2537 02:05:30,320 --> 02:05:33,120 Speaker 2: building and it can alert security that, like, something is 2538 02:05:33,120 --> 02:05:36,680 Speaker 2: happening outside. There's potentially, if it's good enough, 2539 02:05:36,760 --> 02:05:39,440 Speaker 2: utility in that. But they didn't display it at the show. 2540 02:05:39,720 --> 02:05:43,760 Speaker 2: It was literally just describing randos from the audience. And, like, 2541 02:05:43,840 --> 02:05:45,640 Speaker 2: I just don't see how a security guy uses "there's 2542 02:05:45,640 --> 02:05:47,920 Speaker 2: a guy with a phone outside of the building," like. 2543 02:05:48,040 --> 02:05:51,640 Speaker 5: Ah, yeah, no, it doesn't seem very new, it 2544 02:05:51,640 --> 02:05:53,800 Speaker 5: doesn't seem very innovative. Nah. 2545 02:05:54,040 --> 02:05:56,480 Speaker 2: So again, what I'm seeing here, overwhelmingly, for 2546 02:05:56,520 --> 02:05:59,920 Speaker 2: all the talk about, like, there's no resisting it, AI's coming. 
2547 02:06:00,120 --> 02:06:02,680 Speaker 2: It's going to dominate everything, this is the next big thing: 2548 02:06:03,120 --> 02:06:06,160 Speaker 2: a remarkable lack, outside of, I will say, the 2549 02:06:06,200 --> 02:06:09,160 Speaker 2: one thing where there are continuously new products that are 2550 02:06:09,160 --> 02:06:13,440 Speaker 2: better every year: the smart glasses. Yes, they're getting more impressive. 2551 02:06:14,760 --> 02:06:16,920 Speaker 2: I don't think I'll ever be a smart glasses guy. 2552 02:06:17,520 --> 02:06:19,760 Speaker 2: I hated glasses enough that I let them shoot me 2553 02:06:19,800 --> 02:06:22,920 Speaker 2: in the eye with lasers. Shout out to our LASIK sponsors. 2554 02:06:23,400 --> 02:06:27,480 Speaker 2: But I see why people would like it, and there 2555 02:06:27,480 --> 02:06:31,040 Speaker 2: seems to be legitimately substantial utility. 2556 02:06:30,760 --> 02:06:32,960 Speaker 5: If we have high-powered smart glasses, yeah, that look 2557 02:06:33,000 --> 02:06:34,720 Speaker 5: like a regular pair of glasses, I will get a 2558 02:06:34,840 --> 02:06:36,560 Speaker 5: pair eventually, because, yeah, why not. 2559 02:06:36,800 --> 02:06:39,080 Speaker 2: There was a great demo. I'm pulling it up, an 2560 02:06:39,680 --> 02:06:42,800 Speaker 2: LAWK view. They had, like, one pair of glasses that was the 2561 02:06:42,840 --> 02:06:46,280 Speaker 2: world's first smart glasses for TikTok Live. Not particularly excited 2562 02:06:46,280 --> 02:06:48,720 Speaker 2: about that. But they had another set of AR glasses 2563 02:06:49,040 --> 02:06:51,720 Speaker 2: with a twelve-hour battery, where, like, if it works 2564 02:06:51,720 --> 02:06:53,680 Speaker 2: as well as the demo, and that's a big if, 2565 02:06:53,720 --> 02:06:56,560 Speaker 2: it seems to be like your smartwatch. 
So it'll 2566 02:06:56,600 --> 02:06:58,400 Speaker 2: tell you, you can see in a heads-up display 2567 02:06:58,440 --> 02:07:00,640 Speaker 2: as you're cycling, that was the demo. It'll both, like, 2568 02:07:00,680 --> 02:07:03,760 Speaker 2: give you directions, like, in your eyes, and it seemed 2569 02:07:03,760 --> 02:07:05,560 Speaker 2: to be, like, fairly well thought out, so it's not, 2570 02:07:05,640 --> 02:07:09,320 Speaker 2: like, overly corrupting your view. It'll show you your heart rate. 2571 02:07:09,400 --> 02:07:11,480 Speaker 2: You know, it'll show you, like, all that kind of stuff. 2572 02:07:12,280 --> 02:07:15,680 Speaker 2: So you get, like, a useful degree of control and 2573 02:07:15,720 --> 02:07:18,320 Speaker 2: assistance from that kind of thing. And that is, I 2574 02:07:18,360 --> 02:07:21,280 Speaker 2: will say, the last three CESes, the glasses get 2575 02:07:21,320 --> 02:07:24,240 Speaker 2: a little better and a little smaller every year. Smaller, certainly. 2576 02:07:24,280 --> 02:07:27,600 Speaker 2: I would say that's a real product that's probably going 2577 02:07:27,680 --> 02:07:29,000 Speaker 2: to continue to improve. 2578 02:07:29,760 --> 02:07:33,200 Speaker 5: Do you know what else always seeks improvement, Robert? No? 2579 02:07:33,720 --> 02:07:39,360 Speaker 5: The capacity for you to get personalized, possibly AI-powered ads. Well, 2580 02:07:39,400 --> 02:07:42,200 Speaker 5: that is exciting for the consumer choices. Let's all 2581 02:07:42,240 --> 02:07:44,560 Speaker 5: sit down for some AI-powered ads. 2582 02:07:53,040 --> 02:07:56,520 Speaker 2: Wow, I can't believe they put Jay Shetty's voice on the 2583 02:07:56,600 --> 02:07:59,800 Speaker 2: de-aged Harrison Ford in the latest Indiana Jones movie, 2584 02:08:00,200 --> 02:08:02,360 Speaker 2: My Dick's Hard. How are you, Garrison? Oh? 
2585 02:08:02,400 --> 02:08:05,720 Speaker 5: I feel good, because today, as we are recording this, 2586 02:08:05,800 --> 02:08:10,560 Speaker 5: it's late Tuesday night, there was a series of fascinating 2587 02:08:10,600 --> 02:08:14,640 Speaker 5: breaking news articles that happened as we were sitting, or 2588 02:08:14,720 --> 02:08:17,000 Speaker 5: at least as I was sitting, in on these AI panels, 2589 02:08:17,080 --> 02:08:19,920 Speaker 5: which made it hard to not just, like, completely interrupt 2590 02:08:20,000 --> 02:08:23,040 Speaker 5: everything and be like, yeah, hey, hey, any comment on this? 2591 02:08:23,360 --> 02:08:27,040 Speaker 2: Guys, guys, something real happened. Shut your fucking stupid mouths 2592 02:08:27,080 --> 02:08:29,760 Speaker 2: about this AI Hollywood bullshit. 2593 02:08:30,000 --> 02:08:34,000 Speaker 5: So yeah. So, a few weeks ago, if you were unaware, 2594 02:08:34,800 --> 02:08:37,320 Speaker 5: a Green Beret rented a Tesla Cybertruck to 2595 02:08:37,320 --> 02:08:41,920 Speaker 5: feel like Batman and Halo, and drove to first the 2596 02:08:42,000 --> 02:08:46,160 Speaker 5: wrong Las Vegas and then eventually Las Vegas, Nevada, parked 2597 02:08:46,240 --> 02:08:50,400 Speaker 5: outside of the Trump Hotel and Casino, and then blew 2598 02:08:50,440 --> 02:08:53,360 Speaker 5: himself up. And this has been a big news story. 2599 02:08:53,400 --> 02:08:55,879 Speaker 5: It happened during the same day as a pretty horrible 2600 02:08:56,160 --> 02:08:59,280 Speaker 5: terrorist attack in New Orleans, which resulted in about fifteen 2601 02:08:59,320 --> 02:09:02,800 Speaker 5: people dead, done by a guy who was employed by Deloitte, 2602 02:09:02,840 --> 02:09:07,320 Speaker 5: a frequent, frequent CES sponsor. 
So these felt 2603 02:09:07,320 --> 02:09:10,400 Speaker 5: like a very CES style of attacks, you know, one 2604 02:09:10,440 --> 02:09:14,320 Speaker 5: Deloitte guy driving into people, murdering a whole bunch of guys. And 2605 02:09:14,360 --> 02:09:17,120 Speaker 5: then this Cybertruck explosion in Vegas, like, a week 2606 02:09:17,160 --> 02:09:19,640 Speaker 5: before CES, you know, very odd. And then, and then, 2607 02:09:19,760 --> 02:09:22,360 Speaker 5: Robert, some news drops today that I would love to 2608 02:09:22,360 --> 02:09:24,800 Speaker 5: hear you announce. 2609 02:09:24,920 --> 02:09:26,840 Speaker 2: You know, Garrison, I made a comment the other night 2610 02:09:26,920 --> 02:09:30,320 Speaker 2: about how, like, it's pretty well documented that veterans, you know, 2611 02:09:30,920 --> 02:09:33,200 Speaker 2: not that they're more likely to carry out violence, but 2612 02:09:33,240 --> 02:09:35,040 Speaker 2: when they do, they tend to have higher body counts 2613 02:09:35,040 --> 02:09:38,720 Speaker 2: because they have more skills. It turns out I thought 2614 02:09:38,760 --> 02:09:41,320 Speaker 2: we were getting more literal bang for our buck training 2615 02:09:41,320 --> 02:09:44,200 Speaker 2: Green Berets than we are. My assumption is because my 2616 02:09:44,440 --> 02:09:47,200 Speaker 2: uncle was a Green Beret and he did some very scary, 2617 02:09:47,400 --> 02:09:51,440 Speaker 2: probably war crime shit in Vietnam, and I assumed, like... man, 2618 02:09:51,520 --> 02:09:53,600 Speaker 2: I'll tell you one thing about my uncle Jim: that 2619 02:09:53,800 --> 02:09:56,880 Speaker 2: man could make a bomb. That man would not need 2620 02:09:56,920 --> 02:09:59,600 Speaker 2: to ask anyone for advice if he needed to make 2621 02:09:59,640 --> 02:10:02,240 Speaker 2: a bomb. He's not with us anymore, God rest his soul. 
2622 02:10:02,680 --> 02:10:06,520 Speaker 2: But it turns out this Green Beret, who, you know, 2623 02:10:06,720 --> 02:10:10,760 Speaker 2: a fucking dollar-store, TJ Maxx version of the Green 2624 02:10:10,840 --> 02:10:14,000 Speaker 2: Berets is what we're working with now, asked ChatGPT 2625 02:10:14,200 --> 02:10:16,400 Speaker 2: how to build a fucking bomb. And it sounds like 2626 02:10:16,440 --> 02:10:18,840 Speaker 2: he was trying to make it triggered by Tannerite, 2627 02:10:18,840 --> 02:10:21,720 Speaker 2: which is a bipartite explosive compound that you use as, 2628 02:10:21,760 --> 02:10:24,440 Speaker 2: like, an exploding target, so it'll go boom big, but 2629 02:10:24,480 --> 02:10:26,280 Speaker 2: you have to shoot it with something like a rifle 2630 02:10:26,280 --> 02:10:29,400 Speaker 2: that's high velocity, or use, like, a blasting cap. Otherwise 2631 02:10:29,400 --> 02:10:32,240 Speaker 2: it's very stable and very safe, which obviously has uses. 2632 02:10:32,360 --> 02:10:34,920 Speaker 2: You know, it was invented actually to set off avalanches 2633 02:10:34,960 --> 02:10:38,280 Speaker 2: and stuff. Anyway, because that's very available and very high-powered, 2634 02:10:38,320 --> 02:10:40,080 Speaker 2: he was looking to, like, fill his car with that 2635 02:10:40,240 --> 02:10:41,840 Speaker 2: and then shoot it with a rifle while he was 2636 02:10:41,880 --> 02:10:44,720 Speaker 2: in it, and that's what he was asking ChatGPT about. 2637 02:10:45,040 --> 02:10:47,920 Speaker 2: So it's not clear to me, actually. The actual headline 2638 02:10:47,960 --> 02:10:50,560 Speaker 2: is that, like, he used ChatGPT to make his bomb. 
2639 02:10:51,240 --> 02:10:55,280 Speaker 2: It seems, and I'm not privy to what the police 2640 02:10:55,320 --> 02:10:58,080 Speaker 2: know, obviously, but it seems like, based on what I 2641 02:10:58,120 --> 02:11:00,720 Speaker 2: read in the article, we're not sure if he actually 2642 02:11:00,800 --> 02:11:03,200 Speaker 2: used ChatGPT to make a bomb. It's more that 2643 02:11:03,280 --> 02:11:06,919 Speaker 2: he was interested in making a bomb setting off Tannerite 2644 02:11:06,960 --> 02:11:10,120 Speaker 2: by shooting it, but may have ultimately decided not to 2645 02:11:10,160 --> 02:11:13,040 Speaker 2: do that, because he would then be alive for the explosion, 2646 02:11:13,120 --> 02:11:15,920 Speaker 2: which he didn't want to be. Also, the authorities don't 2647 02:11:15,960 --> 02:11:18,520 Speaker 2: seem to fully know how he triggered it. Yeah, so 2648 02:11:18,560 --> 02:11:21,000 Speaker 2: it's still kind of unclear to me. I guess hopefully 2649 02:11:21,040 --> 02:11:24,520 Speaker 2: we'll get more later. But he definitely needed ChatGPT's 2650 02:11:24,560 --> 02:11:27,360 Speaker 2: help to try and figure out how to make 2651 02:11:27,400 --> 02:11:27,760 Speaker 2: the bomb. 2652 02:11:28,160 --> 02:11:32,160 Speaker 5: He certainly used ChatGPT in the planning process of 2653 02:11:32,200 --> 02:11:32,839 Speaker 5: this attack. 2654 02:11:33,000 --> 02:11:34,520 Speaker 2: Yeah, fair to say that. 2655 02:11:34,840 --> 02:11:38,360 Speaker 5: And it's odd, because both me and you spent a 2656 02:11:38,480 --> 02:11:41,760 Speaker 5: number of hours today actually, like, attending, like, demos of 2657 02:11:41,800 --> 02:11:44,760 Speaker 5: these, you know, speech-to-text, text-to-speech 2658 02:11:45,200 --> 02:11:48,320 Speaker 5: AI systems. 
We went to, like, two specific ones 2659 02:11:48,360 --> 02:11:52,600 Speaker 5: that, like, you know, demonstrated the capabilities of their, 2660 02:11:52,680 --> 02:11:55,960 Speaker 5: like, you know, like, AI assistive tech. The first one 2661 02:11:55,960 --> 02:11:59,920 Speaker 5: we went to spent twenty minutes talking about how their 2662 02:12:00,000 --> 02:12:03,480 Speaker 5: biggest inspiration, their quote unquote North Star, was the movie 2663 02:12:03,560 --> 02:12:05,960 Speaker 5: Her with Joaquin Phoenix. 2664 02:12:06,160 --> 02:12:09,600 Speaker 15: They had a whole slide about how that was the 2665 02:12:09,640 --> 02:12:15,280 Speaker 15: gold standard for AI-human communications. The movie Her, in 2666 02:12:15,320 --> 02:12:19,360 Speaker 15: which Joaquin Phoenix falls in love with an AI chatbot 2667 02:12:19,480 --> 02:12:23,800 Speaker 15: voiced by Scarlett Johansson, who hires a prostitute to have 2668 02:12:23,960 --> 02:12:28,120 Speaker 15: sex with them while she participates vocally. And then it 2669 02:12:28,160 --> 02:12:31,000 Speaker 15: turns out the AI is really kind of poly, and 2670 02:12:31,120 --> 02:12:33,760 Speaker 15: Joaquin Phoenix is not okay with that, and then maybe 2671 02:12:33,760 --> 02:12:35,760 Speaker 15: the AIs all go to space. It's kind of unclear 2672 02:12:35,800 --> 02:12:37,640 Speaker 15: at the end. I don't think it was a great movie. 2673 02:12:37,680 --> 02:12:40,160 Speaker 15: A lot of people liked it. I don't see, whether 2674 02:12:40,240 --> 02:12:42,240 Speaker 15: or not you liked it, why this is your 2675 02:12:42,320 --> 02:12:44,160 Speaker 15: vision of how a chatbot should work. 2676 02:12:44,480 --> 02:12:46,600 Speaker 5: The actual chatbot they had was, like, fine. It was... 2677 02:12:46,880 --> 02:12:49,680 Speaker 5: It was, it was actually pretty good at translation, you know, 2678 02:12:50,000 --> 02:12:51,600 Speaker 5: translating from Spanish to English. 
2679 02:12:51,680 --> 02:12:54,360 Speaker 2: It worked quite well. Yeah, the demo was, like, solid. 2680 02:12:54,440 --> 02:12:55,760 Speaker 2: It was pretty accurate. 2681 02:12:55,800 --> 02:12:55,960 Speaker 11: You know. 2682 02:12:56,040 --> 02:12:58,080 Speaker 2: I love coming here and fucking with people. I love, 2683 02:12:58,160 --> 02:13:01,560 Speaker 2: like, being a dick. They asked for a volunteer, and 2684 02:13:01,600 --> 02:13:04,280 Speaker 2: at that point we knew about the ChatGPT news. I 2685 02:13:04,360 --> 02:13:07,680 Speaker 2: wanted to go up and ask, like, live, this robot 2686 02:13:07,760 --> 02:13:11,360 Speaker 2: to, like, help me make a bomb. But the guy, 2687 02:13:11,440 --> 02:13:15,320 Speaker 2: who was pretty handsome, and had, like, an interesting, 2688 02:13:15,360 --> 02:13:18,760 Speaker 2: like, English-Spanish accent, like you specified he was. And I didn't 2689 02:13:18,760 --> 02:13:22,280 Speaker 2: want to be mean to him. He seemed nice. Handsome. 2690 02:13:22,360 --> 02:13:25,440 Speaker 2: He wasn't shitty, like... there were just ten 2691 02:13:25,520 --> 02:13:27,440 Speaker 2: people in this room that was supposed to have two hundred. 2692 02:13:27,480 --> 02:13:29,360 Speaker 2: I'm sure he wasn't the one that talked about Her. 2693 02:13:29,560 --> 02:13:30,560 Speaker 5: That was someone else. 2694 02:13:30,560 --> 02:13:32,360 Speaker 2: That, that was someone else at his company. And, like, 2695 02:13:32,400 --> 02:13:34,880 Speaker 2: he just seemed like he wanted to do well. I didn't 2696 02:13:34,920 --> 02:13:36,280 Speaker 2: want to be a dick to him. No, no, and, 2697 02:13:36,320 --> 02:13:37,680 Speaker 2: like, it wasn't hurting anything. 2698 02:13:37,720 --> 02:13:39,720 Speaker 5: It was fine. 
Like, similarly, we went to this... 2699 02:13:40,120 --> 02:13:42,400 Speaker 5: a nice jawline... we went to this other one about 2700 02:13:42,400 --> 02:13:45,400 Speaker 5: this, like, actually a much more dubious concept in my mind, 2701 02:13:45,400 --> 02:13:48,000 Speaker 5: which is, like, this AI assistant to help, like, 2702 02:13:48,320 --> 02:13:50,960 Speaker 5: elderly people, like, people in, like, their eighties and nineties 2703 02:13:51,120 --> 02:13:53,320 Speaker 5: who don't want to be in assisted living facilities, who 2704 02:13:53,400 --> 02:13:55,320 Speaker 5: have been living on their own but are getting to 2705 02:13:55,320 --> 02:13:57,160 Speaker 5: the point in their life where, like, they need, like, 2706 02:13:57,240 --> 02:13:58,680 Speaker 5: some degree of in-home care. 2707 02:13:58,840 --> 02:14:01,000 Speaker 2: He specified a lot of them are people who have 2708 02:14:01,240 --> 02:14:04,160 Speaker 2: either just lost a spouse, or maybe their spouse is 2709 02:14:04,200 --> 02:14:06,400 Speaker 2: aging faster and worse than them and is no longer 2710 02:14:06,440 --> 02:14:09,400 Speaker 2: really able to be the kind of companion that they 2711 02:14:09,400 --> 02:14:10,000 Speaker 2: were before. 2712 02:14:10,480 --> 02:14:12,920 Speaker 5: Yeah. So it's, like, it's both, like, a conversation tool. 2713 02:14:12,960 --> 02:14:15,800 Speaker 5: It helps, like, memory recall. It kind of, in some 2714 02:14:15,840 --> 02:14:17,800 Speaker 5: ways, has the features that, like, you know, someone in 2715 02:14:17,840 --> 02:14:19,920 Speaker 5: their sixties would just use their smartphone for, to 2716 02:14:19,960 --> 02:14:21,920 Speaker 5: help keep in touch with their family. 
It's kind of 2717 02:14:21,960 --> 02:14:25,160 Speaker 5: simplified and more automated. So, you know, ways to help 2718 02:14:25,240 --> 02:14:27,560 Speaker 5: keep in touch with, like, your family, improve, like, 2719 02:14:27,600 --> 02:14:29,640 Speaker 5: your memory, like, talk about your own life. 2720 02:14:29,680 --> 02:14:32,240 Speaker 2: And the device is weird. It's about the width of, 2721 02:14:32,280 --> 02:14:35,600 Speaker 2: like, a bedside table, maybe six to eight inches deep. 2722 02:14:35,760 --> 02:14:37,960 Speaker 2: So think, like, eighteen inches long to maybe six 2723 02:14:38,000 --> 02:14:40,840 Speaker 2: inches deep, something like that. Half of it is, like, 2724 02:14:40,880 --> 02:14:43,879 Speaker 2: a little tablet, like a seven-inch tablet with a speaker. 2725 02:14:44,920 --> 02:14:48,280 Speaker 2: Half of it is something about the shape and size 2726 02:14:48,280 --> 02:14:50,680 Speaker 2: of a head on, like, a neck, that can pivot 2727 02:14:50,760 --> 02:14:54,080 Speaker 2: and nod on the neck. There's no face, so when 2728 02:14:54,080 --> 02:14:56,640 Speaker 2: it's talking, there's, like, a white light in the center 2729 02:14:56,680 --> 02:14:59,600 Speaker 2: of it that kind of, like, pulses in time with 2730 02:14:59,800 --> 02:15:02,360 Speaker 2: the speaking that it does. So we saw this 2731 02:15:02,400 --> 02:15:04,280 Speaker 2: picture of the device and we saw the description of, 2732 02:15:04,360 --> 02:15:07,720 Speaker 2: like, this is an AI companion for the elderly, and 2733 02:15:07,760 --> 02:15:09,120 Speaker 2: we were both like, number one, these people are 2734 02:15:09,120 --> 02:15:10,840 Speaker 2: gonna be monsters. This is going to be, like, something 2735 02:15:10,880 --> 02:15:12,760 Speaker 2: to shovel your dying dad off with because you don't 2736 02:15:12,760 --> 02:15:14,160 Speaker 2: want to spend it. 
You don't want to spend time with 2737 02:15:14,200 --> 02:15:18,400 Speaker 2: your family, scum. You're too busy AI-generating scam music 2738 02:15:19,680 --> 02:15:22,480 Speaker 2: and trying to sell your shitty robot to Garrison and me. 2739 02:15:22,720 --> 02:15:25,480 Speaker 2: More on that tomorrow. More on that tomorrow. And so 2740 02:15:25,560 --> 02:15:28,120 Speaker 2: that's how we came in prepped to this meeting, like, 2741 02:15:28,160 --> 02:15:29,320 Speaker 2: this is an idea I... 2742 02:15:29,280 --> 02:15:33,200 Speaker 5: Find pretty distasteful in general, which is, like, replacing actual, like, 2743 02:15:33,240 --> 02:15:35,720 Speaker 5: you know, friends or human contact or, like, 2744 02:15:35,800 --> 02:15:39,360 Speaker 5: in-home care with a fucking, like, Alexa machine, essentially. And 2745 02:15:39,400 --> 02:15:39,960 Speaker 5: to be clear. 2746 02:15:40,200 --> 02:15:43,040 Speaker 2: I still think this product might be a bad idea 2747 02:15:43,080 --> 02:15:46,520 Speaker 2: that doesn't work. But the guy behind it, who is 2748 02:15:46,560 --> 02:15:49,040 Speaker 2: the dude that we talked to, cares a lot and 2749 02:15:49,160 --> 02:15:52,520 Speaker 2: is really very clearly trying to do a good thing, 2750 02:15:52,760 --> 02:15:56,600 Speaker 2: and thought through the ethics and the efficacy of what 2751 02:15:56,680 --> 02:15:59,960 Speaker 2: he was doing a lot. And I'm not convinced 2752 02:16:00,200 --> 02:16:03,400 Speaker 2: it will actually do anything, but I, like, wish him 2753 02:16:03,400 --> 02:16:03,840 Speaker 2: the best. 2754 02:16:03,960 --> 02:16:06,000 Speaker 5: No, like, it specifically is designed to not look like 2755 02:16:06,040 --> 02:16:08,360 Speaker 5: a human, so that someone using it, you know, wouldn't, 2756 02:16:08,400 --> 02:16:11,320 Speaker 5: like, start to believe it's, like, human. Like, we don't... 2757 02:16:11,160 --> 02:16:13,360 Speaker 2: Want to trick people. We don't want them to mistake it. 
2758 02:16:14,080 --> 02:16:16,640 Speaker 5: It refers to itself, like, as a robot. 2759 02:16:16,760 --> 02:16:18,480 Speaker 5: Like, it refers to its own, like, you know, 2760 02:16:18,480 --> 02:16:22,160 Speaker 5: like, motors and functionality, like, pretty consistently, to, 2761 02:16:22,360 --> 02:16:24,400 Speaker 5: like, you know, make sure that the person who's talking 2762 02:16:24,400 --> 02:16:26,400 Speaker 5: to it gets, like, reminded of that. And something I 2763 02:16:26,400 --> 02:16:27,720 Speaker 5: talked about is, you know, there's been a lot of 2764 02:16:27,720 --> 02:16:32,120 Speaker 5: news stories this year about people building very unhealthy attachments 2765 02:16:32,120 --> 02:16:35,640 Speaker 5: and relationships to these kinds of AI programs, like 2766 02:16:36,000 --> 02:16:37,760 Speaker 5: Character AI. There's a story, like, a year and a 2767 02:16:37,800 --> 02:16:40,240 Speaker 5: half ago about, like, a journalist who quote unquote, like, 2768 02:16:40,280 --> 02:16:42,160 Speaker 5: you know, like, fell in love with some kind 2769 02:16:42,200 --> 02:16:45,840 Speaker 5: of chat thing, that resulted in him killing himself. You know, 2770 02:16:46,000 --> 02:16:47,320 Speaker 5: but these kind of, these systems, like... 2771 02:16:47,320 --> 02:16:51,200 Speaker 2: He was not a teenager? Was that Character AI? Was that a journalist? 2772 02:16:51,360 --> 02:16:53,760 Speaker 5: Last year there was, there was a journalist who fell 2773 02:16:53,760 --> 02:16:56,280 Speaker 5: in love with an AI chat thing. A few weeks 2774 02:16:56,280 --> 02:16:58,880 Speaker 5: ago there was the kid who, you know, was talking 2775 02:16:58,879 --> 02:16:59,720 Speaker 5: to this, like, Character AI. 2776 02:17:00,680 --> 02:17:03,920 Speaker 2: Also, I just need to reiterate: Her, not a great movie. 
2777 02:17:05,160 --> 02:17:06,920 Speaker 5: But, but, you know, there have been a lot of 2778 02:17:06,920 --> 02:17:08,920 Speaker 5: these stories of these things, like, going wrong, or, you know, 2779 02:17:09,120 --> 02:17:12,360 Speaker 5: encouraging, or, like, not stopping, you know, like, these, like, 2780 02:17:12,440 --> 02:17:16,000 Speaker 5: intense conversations with, like, suicidal ideation or, you know, like, 2781 02:17:16,080 --> 02:17:17,720 Speaker 5: self-harm, all these things. 2782 02:17:18,040 --> 02:17:20,160 Speaker 2: We brought these up kind of thinking he would flinch 2783 02:17:20,200 --> 02:17:22,360 Speaker 2: away and not want to talk about it, and he 2784 02:17:22,520 --> 02:17:24,760 Speaker 2: very much acknowledged that, like, he was aware of this, 2785 02:17:24,800 --> 02:17:27,240 Speaker 2: and this is something that they were attempting to build in. 2786 02:17:27,360 --> 02:17:28,840 Speaker 5: This is, this is, like, this is, you know, built 2787 02:17:28,840 --> 02:17:30,640 Speaker 5: into it. I think this is still, you know, a 2788 02:17:30,640 --> 02:17:33,400 Speaker 5: big problem with this entire industry. I'm sure everyone would 2789 02:17:33,440 --> 02:17:34,959 Speaker 5: say, you know, obviously, that we have, 2790 02:17:35,040 --> 02:17:36,520 Speaker 5: we have guardrails for this, and then it becomes a news 2791 02:17:36,560 --> 02:17:39,760 Speaker 5: story when those guardrails fail. Similarly, to go back 2792 02:17:39,800 --> 02:17:43,120 Speaker 5: to the Tesla bomb: you know, there are supposed to be 2793 02:17:43,200 --> 02:17:45,720 Speaker 5: guardrails in ChatGPT to make sure it doesn't tell 2794 02:17:45,720 --> 02:17:48,840 Speaker 5: you how to build a bomb, and those guardrails can fail. 2795 02:17:49,160 --> 02:17:51,400 Speaker 2: He showed us one, which was, like, he told the robot, 2796 02:17:51,560 --> 02:17:54,320 Speaker 2: I love you. What was it? 
ElliQ? 2797 02:17:54,480 --> 02:17:56,560 Speaker 2: ElliQ, that's E L L I Q. 2798 02:17:57,080 --> 02:17:59,960 Speaker 2: I love you, ElliQ. And the robot, like, responds 2799 02:18:00,280 --> 02:18:02,480 Speaker 2: with, like, oh, that makes, like, my fans 2800 02:18:02,520 --> 02:18:04,520 Speaker 2: all spin, or something like that. Where he's like, we 2801 02:18:04,560 --> 02:18:07,800 Speaker 2: wanted the responsibility of it reminding the person talking to 2802 02:18:07,840 --> 02:18:10,520 Speaker 2: it that it's a machine, that it can't think or 2803 02:18:10,600 --> 02:18:12,840 Speaker 2: love them back. We don't want it to be negative, 2804 02:18:12,840 --> 02:18:14,880 Speaker 2: but, like, we don't want to be feeding 2805 02:18:14,879 --> 02:18:16,280 Speaker 2: into that. And I don't know that that's the best 2806 02:18:16,280 --> 02:18:18,600 Speaker 2: way to do that, but, like, at least they're thinking 2807 02:18:18,640 --> 02:18:21,039 Speaker 2: about that kind of thing. The thing that was 2808 02:18:21,040 --> 02:18:22,520 Speaker 2: interesting to me is that he billed this as the 2809 02:18:22,560 --> 02:18:26,279 Speaker 2: first proactive home AI thing. So unlike an Alexa or whatever, 2810 02:18:26,280 --> 02:18:29,119 Speaker 2: where it's just waiting for you to ask it something, 2811 02:18:29,160 --> 02:18:32,480 Speaker 2: and it does not chime in randomly to talk to you, 2812 02:18:32,720 --> 02:18:33,360 Speaker 2: or it won't, like... 2813 02:18:33,360 --> 02:18:35,720 Speaker 5: Change the subject either, and, like, continue a conversation. 2814 02:18:36,000 --> 02:18:38,200 Speaker 2: This will prompt you out of the blue, be like, hey, 2815 02:18:38,240 --> 02:18:40,280 Speaker 2: how are you doing? How are you feeling today? It's 2816 02:18:40,280 --> 02:18:41,240 Speaker 2: been a while. And specific: 2817 02:18:41,280 --> 02:18:42,600 Speaker 5: Do you want to see pictures of your family?
2818 02:18:42,640 --> 02:18:44,680 Speaker 2: Do you want to see pictures of your family? Do you want to 2819 02:18:44,680 --> 02:18:45,360 Speaker 2: call your son? 2820 02:18:45,640 --> 02:18:45,840 Speaker 5: You know? 2821 02:18:46,320 --> 02:18:47,520 Speaker 2: Or, do you want to play a game? 2822 02:18:47,800 --> 02:18:49,879 Speaker 5: Talk to me about that movie you saw last... Talk to 2823 02:18:49,879 --> 02:18:51,400 Speaker 2: me about that. Hey, remind me, how did you meet 2824 02:18:51,400 --> 02:18:53,320 Speaker 2: your husband? You know? Like, literally, these are all the 2825 02:18:53,320 --> 02:18:56,520 Speaker 2: things that it will do. And it had some side features, 2826 02:18:56,560 --> 02:18:58,400 Speaker 2: like, if it prompts you to start telling a story, 2827 02:18:58,440 --> 02:19:00,840 Speaker 2: it'll save that as, like, a memoirs thing, so that, 2828 02:19:01,000 --> 02:19:03,959 Speaker 2: like, you know, when your elderly mother passes or whatever, 2829 02:19:04,000 --> 02:19:07,480 Speaker 2: it's saved up this, like, collection of stories over the years. 2830 02:19:07,600 --> 02:19:09,760 Speaker 2: And you can, like, show it pictures while you're telling 2831 02:19:09,800 --> 02:19:12,119 Speaker 2: it stories, and it will listen, and it'll have comments, 2832 02:19:12,120 --> 02:19:15,000 Speaker 2: and it'll ask you further questions about it: so, how did 2833 02:19:15,040 --> 02:19:17,320 Speaker 2: you feel, you know, after meeting them this way? Like, 2834 02:19:17,400 --> 02:19:20,120 Speaker 2: that's really interesting, I didn't know that. Explain to me 2835 02:19:20,160 --> 02:19:22,480 Speaker 2: how it worked. And it will also prompt you to 2836 02:19:22,560 --> 02:19:26,280 Speaker 2: send those to your kids. And the big thing: almost 2837 02:19:26,320 --> 02:19:29,520 Speaker 2: every kind of dialogue thing would prompt you to send 2838 02:19:29,520 --> 02:19:31,560 Speaker 2: a message to a friend or your kid.
So a 2839 02:19:31,560 --> 02:19:33,520 Speaker 2: big part of it seemed to be: this is not 2840 02:19:33,600 --> 02:19:36,560 Speaker 2: a replacement. This is a machine that we hope people 2841 02:19:36,600 --> 02:19:39,480 Speaker 2: will get comfortable with, and then it can prompt them 2842 02:19:39,520 --> 02:19:42,480 Speaker 2: to try to engage with the world more. And, yeah, 2843 02:19:42,600 --> 02:19:45,080 Speaker 2: loved ones, because their whole goal is to connect 2844 02:19:45,440 --> 02:19:46,400 Speaker 2: them to people. 2845 02:19:46,720 --> 02:19:48,400 Speaker 5: I asked him, like, you know, part of this 2846 02:19:48,440 --> 02:19:50,360 Speaker 5: product is designed to, like, you know, help solve 2847 02:19:50,800 --> 02:19:53,520 Speaker 5: loneliness in older adults. And, like, how much of this 2848 02:19:53,560 --> 02:19:55,400 Speaker 5: is really just kind of trying to replace 2849 02:19:55,480 --> 02:19:58,200 Speaker 5: actual human contact with this, like, you know, AI contact? 2850 02:19:58,440 --> 02:20:01,480 Speaker 5: Will that really help, you know, loneliness? And he talked 2851 02:20:01,520 --> 02:20:03,720 Speaker 5: about how, like, I think he said, like, 2852 02:20:03,800 --> 02:20:06,040 Speaker 5: ninety percent of the people who use this, 2853 02:20:06,120 --> 02:20:09,840 Speaker 5: it results in actually more communication with their family. 2854 02:20:10,240 --> 02:20:12,920 Speaker 2: They have this in, like, some two thousand homes right now; 2855 02:20:13,320 --> 02:20:14,920 Speaker 2: they have, like, two thousand units. 2856 02:20:15,040 --> 02:20:17,280 Speaker 5: It's like a subscription model.
I think right now 2857 02:20:17,320 --> 02:20:19,360 Speaker 5: it is, like, ninety nine dollars a month. It's gonna be 2858 02:20:19,360 --> 02:20:21,120 Speaker 5: boosted up to, like, one hundred and fifty, with some, 2859 02:20:21,240 --> 02:20:22,720 Speaker 5: like, extra features, in the next year. 2860 02:20:22,879 --> 02:20:24,760 Speaker 2: It's very much still evolving. So one thing he 2861 02:20:24,840 --> 02:20:27,600 Speaker 2: pointed at is that, like, yeah, initially we had the 2862 02:20:27,640 --> 02:20:31,640 Speaker 2: ability to, like, connect people to other elderly folks using this, 2863 02:20:31,800 --> 02:20:33,600 Speaker 2: and so they've kind of formed their own community, had, 2864 02:20:33,640 --> 02:20:35,720 Speaker 2: like, a weekly bingo game, and asked us to build 2865 02:20:35,720 --> 02:20:37,960 Speaker 2: in more chats so they can message each other directly. 2866 02:20:38,000 --> 02:20:40,199 Speaker 2: And so some of them are, like, playing bingo directly 2867 02:20:40,240 --> 02:20:43,480 Speaker 2: now through these machines. And I'm like, well, that seems 2868 02:20:43,680 --> 02:20:44,480 Speaker 2: probably good. 2869 02:20:44,920 --> 02:20:47,680 Speaker 5: Yeah, yeah. Because, like, I still am, like, fundamentally opposed 2870 02:20:47,720 --> 02:20:51,039 Speaker 5: to this premise. Yes. But it's interesting seeing someone still... 2871 02:20:50,760 --> 02:20:54,520 Speaker 2: But aging is sad. Yeah, right, and that's not their fault. 2872 02:20:54,600 --> 02:20:56,960 Speaker 5: And it's interesting to see someone approach this from, 2873 02:20:56,959 --> 02:20:59,600 Speaker 5: like, you know, a very compassionate standpoint, even 2874 02:20:59,640 --> 02:21:01,640 Speaker 5: if I find the actual kind of nature of this 2875 02:21:01,680 --> 02:21:04,520 Speaker 5: thing existing to be, like, deeply uncomfortable.
2876 02:21:03,920 --> 02:21:06,720 Speaker 2: Because, yeah, I can't not find it off-putting, but 2877 02:21:06,840 --> 02:21:10,320 Speaker 2: I think there's a chance that it will help 2878 02:21:10,600 --> 02:21:15,760 Speaker 2: with a real problem. I certainly would prefer if it helped. Yeah. 2879 02:21:16,120 --> 02:21:17,440 Speaker 2: So, I don't know. It was kind of 2880 02:21:17,640 --> 02:21:19,800 Speaker 2: unique in this world. Like, it was 2881 02:21:19,840 --> 02:21:22,080 Speaker 2: a unique kind of product for me, where it's 2882 02:21:22,120 --> 02:21:26,160 Speaker 2: like, I don't know that this application of AI technology 2883 02:21:26,640 --> 02:21:30,280 Speaker 2: will actually do what you're hoping it will. But the 2884 02:21:30,360 --> 02:21:33,000 Speaker 2: vibe I got from that guy was nothing 2885 02:21:33,080 --> 02:21:33,720 Speaker 2: but good will. 2886 02:21:33,920 --> 02:21:34,080 Speaker 3: Yeah. 2887 02:21:34,760 --> 02:21:36,960 Speaker 5: Some of the other people we talked to today, who 2888 02:21:36,959 --> 02:21:38,519 Speaker 5: were completely soulless... 2889 02:21:38,760 --> 02:21:42,080 Speaker 2: Oh, yes, yes, nothing behind their eyes. Dead eyes, 2890 02:21:42,160 --> 02:21:43,840 Speaker 2: black eyes, like a doll's eyes. 2891 02:21:44,040 --> 02:21:45,520 Speaker 5: Even the way this guy was talking, you could tell 2892 02:21:45,520 --> 02:21:47,600 Speaker 5: he had, like, a very empathetic voice, like, 2893 02:21:48,440 --> 02:21:48,879 Speaker 5: much like...
2894 02:21:48,920 --> 02:21:50,680 Speaker 2: One of the things he did is he would 2895 02:21:50,680 --> 02:21:52,360 Speaker 2: tell it, like, I'm in some pain, and then the 2896 02:21:52,440 --> 02:21:55,200 Speaker 2: robot would cycle through to the pain scale and would 2897 02:21:55,200 --> 02:21:56,840 Speaker 2: try to... because one of the things it does is 2898 02:21:56,879 --> 02:21:59,960 Speaker 2: it will take information for care, and it will text proactively. 2899 02:22:00,360 --> 02:22:04,040 Speaker 2: So it's not just communicating with the old person. It 2900 02:22:04,080 --> 02:22:08,040 Speaker 2: will text and message their kids, you know, and whatnot, 2901 02:22:08,760 --> 02:22:10,920 Speaker 2: prompt their kids: hey, your mom's lonely. 2902 02:22:11,120 --> 02:22:13,360 Speaker 5: Yeah, or it'll even say if, you know, someone 2903 02:22:13,400 --> 02:22:14,840 Speaker 5: didn't take their meds today. 2904 02:22:14,760 --> 02:22:19,560 Speaker 2: And again, it's kind of sad. But also, 2905 02:22:19,720 --> 02:22:21,680 Speaker 2: part of this is he was talking a lot about, 2906 02:22:21,680 --> 02:22:24,400 Speaker 2: like, empathy. And I think just because of the kind 2907 02:22:24,400 --> 02:22:25,760 Speaker 2: of brain you have to have to want to do this, 2908 02:22:26,240 --> 02:22:28,760 Speaker 2: he used it in terms of, like, the machine's empathy, 2909 02:22:28,959 --> 02:22:32,959 Speaker 2: which it doesn't have. But with the whole project, it was 2910 02:22:33,000 --> 02:22:36,120 Speaker 2: impossible not to see that he was a deeply empathetic 2911 02:22:36,200 --> 02:22:39,400 Speaker 2: man who was really trying to make the world better. 2912 02:22:39,440 --> 02:22:42,120 Speaker 2: And I can't not respect that. 2913 02:22:43,680 --> 02:22:46,800 Speaker 5: Well, I think that does it for us here at CES. 2914 02:22:47,160 --> 02:22:51,000 Speaker 2: That's right. What a packed day three. No worry, no empathy.
2915 02:22:51,040 --> 02:22:55,320 Speaker 2: Tomorrow's takes: just a real dead-eyed monster. I am 2916 02:22:55,360 --> 02:22:58,400 Speaker 2: a true villain you're gonna hear from in the next episode. 2917 02:22:58,480 --> 02:23:01,240 Speaker 5: I am a scumbag. I am the best that I'm 2918 02:23:01,240 --> 02:23:03,360 Speaker 5: gonna be, because I'm starting this week. I can still 2919 02:23:03,400 --> 02:23:07,680 Speaker 5: feel the CES magic. Yeah, by Friday, I am going 2920 02:23:07,760 --> 02:23:12,920 Speaker 5: to be a different person. I am going to rip 2921 02:23:13,040 --> 02:23:17,360 Speaker 5: some poor PR person to shreds, I swear. But yeah, 2922 02:23:17,360 --> 02:23:20,720 Speaker 5: tune in tomorrow to hear our takes from the CES 2923 02:23:20,920 --> 02:23:24,600 Speaker 5: kind of side show called Showstoppers, and to hear also 2924 02:23:25,080 --> 02:23:29,280 Speaker 5: some exclusive, brand new AI generated ska music. So we'll 2925 02:23:29,280 --> 02:23:32,640 Speaker 5: give you that hint for tomorrow's episode. See you, see 2926 02:23:32,640 --> 02:23:33,199 Speaker 5: you there. 2927 02:23:33,200 --> 02:23:36,000 Speaker 2: Mhm. We'll see you all there. I love you all. 2928 02:23:36,240 --> 02:23:53,520 Speaker 2: Goodbye. Oh man. Welcome to It Could Happen Here, 2929 02:23:53,640 --> 02:23:57,560 Speaker 2: a podcast that's happening here, if here is your ears. 2930 02:23:58,640 --> 02:24:01,840 Speaker 2: If you're deaf and reading this, then it's happening to 2931 02:24:01,920 --> 02:24:04,519 Speaker 2: your eyes. Either way, it's happening. 2932 02:24:04,640 --> 02:24:07,120 Speaker 5: Here also being Las Vegas. 2933 02:24:07,440 --> 02:24:10,720 Speaker 2: Well, yes, also Las Vegas. That's Nevada, not the other one. 2934 02:24:10,800 --> 02:24:17,880 Speaker 2: Nevada. Yeah, uh huh. Podcast number three. How the 2935 02:24:18,000 --> 02:24:19,400 Speaker 2: time does fly.
2936 02:24:19,879 --> 02:24:20,360 Speaker 5: Sure does. 2937 02:24:20,680 --> 02:24:22,640 Speaker 2: By the time you listen to this, Garrison and I 2938 02:24:22,680 --> 02:24:25,119 Speaker 2: will have just had the best meal that we're going 2939 02:24:25,120 --> 02:24:25,360 Speaker 2: to have. 2940 02:24:25,440 --> 02:24:27,840 Speaker 5: Oh my god. Yeah, it's tomorrow for 2941 02:24:27,800 --> 02:24:30,840 Speaker 2: us still, but we're very excited about Morimoto, 2942 02:24:30,879 --> 02:24:34,160 Speaker 2: which is fantastic. Every year we have a very 2943 02:24:34,200 --> 02:24:37,640 Speaker 2: special dinner, just them and me and a couple of 2944 02:24:37,680 --> 02:24:40,800 Speaker 2: friends who will remain anonymous, because people get weird on 2945 02:24:40,840 --> 02:24:41,280 Speaker 2: the internet sometimes. 2946 02:24:41,320 --> 02:24:44,840 Speaker 5: It is literally the highlight of my year sometimes. 2947 02:24:45,400 --> 02:24:47,840 Speaker 5: It does keep me going, actually. Really gives me a 2948 02:24:47,879 --> 02:24:49,680 Speaker 5: lot of power. Some of the best tacos I've ever 2949 02:24:49,680 --> 02:24:52,640 Speaker 5: had in my life. So good. Uh huh. 2950 02:24:52,680 --> 02:24:55,600 Speaker 2: Anyway, ah, we're just thinking about delicious food. Let's talk 2951 02:24:55,600 --> 02:24:57,440 Speaker 2: about the dead-eyed ghoul we met. Oh wait, no, 2952 02:24:57,440 --> 02:25:00,760 Speaker 2: not yet. We met a dead-eyed... I'm gonna 2953 02:25:00,760 --> 02:25:05,600 Speaker 2: spoil it now. Real monster. Like, real, real, real evil vibes. 2954 02:25:05,720 --> 02:25:08,640 Speaker 2: Like, as soon as I met this guy, 2955 02:25:08,680 --> 02:25:10,520 Speaker 2: shook his hand, I was like, oh, if this 2956 02:25:10,600 --> 02:25:13,280 Speaker 2: guy gets power, he's going to be responsible for a 2957 02:25:13,320 --> 02:25:14,520 Speaker 2: lot of death and suffering.
2958 02:25:14,680 --> 02:25:16,400 Speaker 5: I mean, I kind of think he will. 2959 02:25:16,480 --> 02:25:20,440 Speaker 2: He's just not that talented. He wishes. But you never 2960 02:25:20,440 --> 02:25:21,640 Speaker 2: know where these guys are gonna end up. 2961 02:25:21,680 --> 02:25:26,880 Speaker 5: Speaking of sad evil: Twitter, X, the everything app, 2962 02:25:27,920 --> 02:25:30,280 Speaker 5: that's what people are calling it. They gave a keynote, 2963 02:25:30,280 --> 02:25:31,400 Speaker 5: which was very sad. 2964 02:25:31,959 --> 02:25:35,840 Speaker 2: The CEO, Linda, really Yaccarino'd about Twitter for 2965 02:25:35,879 --> 02:25:37,400 Speaker 2: a while. Oh, so bad. 2966 02:25:38,080 --> 02:25:43,520 Speaker 5: So they started by talking about how Facebook, Meta, has 2967 02:25:43,600 --> 02:25:47,320 Speaker 5: copied Twitter's, like, fact checking policy of actually not 2968 02:25:47,480 --> 02:25:51,000 Speaker 5: having real fact checks. Yes, now, media has actually kind 2969 02:25:51,000 --> 02:25:53,959 Speaker 5: of failed as an industry, but, you know, our 2970 02:25:54,040 --> 02:25:57,800 Speaker 5: problems with fact checking are perhaps very different from these people's problems. 2971 02:25:58,240 --> 02:26:01,400 Speaker 5: And the fact now that Facebook is walking away 2972 02:26:01,760 --> 02:26:05,119 Speaker 5: from actual, like, genuine fact checks against disinformation 2973 02:26:05,200 --> 02:26:09,000 Speaker 5: and misinformation, and parting ways with using legacy media 2974 02:26:09,040 --> 02:26:11,640 Speaker 5: outlets to verify information, because those media outlets are too 2975 02:26:11,720 --> 02:26:16,120 Speaker 5: political, quote unquote, and instead is copying the current X 2976 02:26:16,240 --> 02:26:19,480 Speaker 5: model of free speech, and specifically saying, like, there's been 2977 02:26:19,480 --> 02:26:22,359 Speaker 5: way too much censorship on gender issues.
2978 02:26:22,720 --> 02:26:26,240 Speaker 2: Now you can comment that women are a piece of property. Well, I... 2979 02:26:26,160 --> 02:26:28,119 Speaker 5: I mean, I think specifically this is, like, trans, 2980 02:26:28,320 --> 02:26:29,400 Speaker 5: like, no, no, no stuff too. 2981 02:26:29,560 --> 02:26:32,000 Speaker 2: One of the specific exemptions now is 2982 02:26:32,040 --> 02:26:34,039 Speaker 2: that you can now refer to women as if they 2983 02:26:34,040 --> 02:26:36,519 Speaker 2: are property on Facebook. 2984 02:26:36,959 --> 02:26:39,200 Speaker 5: This is the future of communication. 2985 02:26:38,920 --> 02:26:42,199 Speaker 2: Right. Yeah, thank god Linda is really blazing a trail 2986 02:26:42,280 --> 02:26:43,199 Speaker 2: for women everywhere. 2987 02:26:43,360 --> 02:26:48,040 Speaker 5: Linda was very excited about that. And they Yaccarino'd about 2988 02:26:48,040 --> 02:26:51,879 Speaker 5: that for, like, a good ten minutes, about how, you 2989 02:26:51,879 --> 02:26:53,720 Speaker 5: know, this is where we're really entering a 2990 02:26:53,760 --> 02:26:56,959 Speaker 5: new era of free speech and social media. And then 2991 02:26:57,040 --> 02:27:01,119 Speaker 5: she got asked a question about how much X, Twitter, 2992 02:27:01,200 --> 02:27:04,520 Speaker 5: the everything app, will take part in Elon 2993 02:27:04,640 --> 02:27:09,600 Speaker 5: Musk's plans for the Department of Government Efficiency, DOGE, and 2994 02:27:09,600 --> 02:27:12,840 Speaker 5: this got the first applause of the panel. 2995 02:27:12,920 --> 02:27:15,760 Speaker 5: Applause only happened two times. The DOGE section 2996 02:27:16,040 --> 02:27:19,360 Speaker 5: was the first, like, you know, room-starts-clapping moment. 2997 02:27:19,480 --> 02:27:23,640 Speaker 5: Everyone goes crazy. How many minutes in was that? Oh...
2998 02:27:23,680 --> 02:27:26,040 Speaker 5: Maybe it was like twelve, thirteen minutes? 2999 02:27:26,200 --> 02:27:28,720 Speaker 2: People really... yeah, it had to be intentional here. 3000 02:27:28,760 --> 02:27:31,000 Speaker 2: This is not like they were just overdue for applause. 3001 02:27:31,080 --> 02:27:33,920 Speaker 5: No, no, no. They talked about Vivek, talked about, you know, 3002 02:27:33,959 --> 02:27:37,040 Speaker 5: Elon turning to Twitter, X, the everything app, for, like, 3003 02:27:37,080 --> 02:27:40,879 Speaker 5: suggestions on which government agencies to get rid of. 3004 02:27:42,959 --> 02:27:46,160 Speaker 2: I hope we get rid of the ATF, so 3005 02:27:46,280 --> 02:27:49,760 Speaker 2: that... machine guns mandatory? Why not at this point, right? 3006 02:27:49,840 --> 02:27:53,160 Speaker 2: It can only help, it can only help. Look, if 3007 02:27:53,160 --> 02:27:55,640 Speaker 2: we learned anything from a thing I'm not going to 3008 02:27:55,640 --> 02:28:00,000 Speaker 2: specify that happened late last year, more suppressors is always 3009 02:28:00,160 --> 02:28:04,800 Speaker 2: handy.
3010 02:28:05,040 --> 02:28:06,920 Speaker 5: The second thing that got applause was what they talked 3011 02:28:06,959 --> 02:28:11,240 Speaker 5: about next, which was about, you know, everyone's turning 3012 02:28:11,240 --> 02:28:15,360 Speaker 5: to X, Twitter, the everything app, for information now, and 3013 02:28:15,360 --> 02:28:17,959 Speaker 5: Twitter, X, the everything app, played a crucial part 3014 02:28:18,040 --> 02:28:22,120 Speaker 5: in bringing to light the Muslim rape gang story in 3015 02:28:22,200 --> 02:28:25,279 Speaker 5: the UK, and how that was so important for saving 3016 02:28:25,400 --> 02:28:27,920 Speaker 5: children, and we have to post more, 3017 02:28:28,120 --> 02:28:30,800 Speaker 5: not less. And, like, this was the other thing that 3018 02:28:30,879 --> 02:28:35,120 Speaker 5: got massive applause: talking about the rape gangs. 3019 02:28:35,280 --> 02:28:37,879 Speaker 2: People love rape gangs, people love rape gangs. That 3020 02:28:37,959 --> 02:28:40,760 Speaker 2: was a pretty good Star Trek episode. That was Doctor 3021 02:28:40,920 --> 02:28:42,440 Speaker 2: R's Planet with the rape gangs. 3022 02:28:42,360 --> 02:28:44,280 Speaker 5: One of the more black-pilling things... 3023 02:28:45,000 --> 02:28:46,640 Speaker 2: It wasn't a very good Star Trek episode. 3024 02:28:46,800 --> 02:28:49,320 Speaker 5: It's also not a good Trek episode. I was referring 3025 02:28:49,320 --> 02:28:52,600 Speaker 5: to the panel, not the Trek episode. But that's the 3026 02:28:52,600 --> 02:28:54,600 Speaker 5: other thing that got massive applause. It's, like, save 3027 02:28:54,680 --> 02:28:57,720 Speaker 5: the children type rhetoric, and, you know, saying, you know, 3028 02:28:57,760 --> 02:29:00,280 Speaker 5: like, as a mother, it's so important that 3029 02:29:00,400 --> 02:29:04,240 Speaker 5: more people post about this problem.
Those were the two 3030 02:29:04,280 --> 02:29:07,240 Speaker 5: big applause moments. But I think, in general, this 3031 02:29:07,280 --> 02:29:10,640 Speaker 5: whole panel was trying to, like, you know, demonstrate how 3032 02:29:10,720 --> 02:29:14,400 Speaker 5: symbiotic a new Trump presidency and Elon Musk's Twitter are. 3033 02:29:14,240 --> 02:29:17,000 Speaker 2: This is a direct info line, this is a 3034 02:29:17,080 --> 02:29:18,800 Speaker 2: tap from the Trump presidency. 3035 02:29:18,840 --> 02:29:21,080 Speaker 5: This is how you talk to the new government, like, 3036 02:29:21,200 --> 02:29:23,560 Speaker 5: this is how you talk to all of these new people, 3037 02:29:23,640 --> 02:29:25,959 Speaker 5: all these new cabinet members. They're all on Twitter. They're 3038 02:29:26,000 --> 02:29:27,959 Speaker 5: all talking on Twitter. This is how you 3039 02:29:28,000 --> 02:29:30,320 Speaker 5: stay connected to the new government. 3040 02:29:30,480 --> 02:29:33,200 Speaker 2: It's interesting. One thing I'm curious about: so, this 3041 02:29:33,280 --> 02:29:35,800 Speaker 2: is a thing that happened with the last set of Nazis 3042 02:29:35,840 --> 02:29:38,039 Speaker 2: that gained power in a country in a big way, 3043 02:29:38,480 --> 02:29:42,039 Speaker 2: the German ones. There was this common attitude of, like, 3044 02:29:42,080 --> 02:29:45,200 Speaker 2: if only Hitler knew. Because Nazi policies didn't help the 3045 02:29:45,200 --> 02:29:47,000 Speaker 2: people they were supposed to help. They hurt a lot 3046 02:29:47,000 --> 02:29:49,440 Speaker 2: of people, like, they were just bad at everything, like 3047 02:29:49,520 --> 02:29:52,080 Speaker 2: fascists tend to be.
And there was this attitude that, like, well, 3048 02:29:52,120 --> 02:29:55,920 Speaker 2: Hitler can't know. Like, the 3049 02:29:55,959 --> 02:29:58,200 Speaker 2: country's been handed over to gangsters who are continuing to 3050 02:29:58,280 --> 02:30:00,400 Speaker 2: hurt the people Hitler promised to help. He must not 3051 02:30:00,440 --> 02:30:02,560 Speaker 2: be aware; like, if he knew, he would fix this, 3052 02:30:02,680 --> 02:30:05,200 Speaker 2: if only he knew. So I'm wondering how that's going to 3053 02:30:05,400 --> 02:30:08,640 Speaker 2: play in here. As Trump's policies continue to hurt 3054 02:30:08,680 --> 02:30:11,040 Speaker 2: a lot of the people who voted for him, 3055 02:30:11,040 --> 02:30:12,920 Speaker 2: not the rich people who voted for him, but the 3056 02:30:12,959 --> 02:30:15,680 Speaker 2: people who, like, flipped between him and Biden or whatever, 3057 02:30:15,879 --> 02:30:18,480 Speaker 2: like, those folks are going to get fucked like the 3058 02:30:18,520 --> 02:30:22,000 Speaker 2: rest of us. And I kind of wonder 3059 02:30:22,000 --> 02:30:24,879 Speaker 2: when the 3060 02:30:24,920 --> 02:30:28,600 Speaker 2: blowback against X, the everything app, will happen, right? Like, 3061 02:30:29,120 --> 02:30:32,760 Speaker 2: as people are, like, either I'm being ignored or I'm 3062 02:30:32,800 --> 02:30:36,920 Speaker 2: being called, like, a retard by Elon Musk for complaining. 3063 02:30:37,160 --> 02:30:41,560 Speaker 2: Like, Elon Musk tweets that randomly at people when 3064 02:30:41,600 --> 02:30:45,080 Speaker 2: they make very valid critiques of the shit that he's doing. 3065 02:30:45,720 --> 02:30:48,920 Speaker 2: Like, that's literally what he's calling people. He's saying it, like, 3066 02:30:48,959 --> 02:30:51,480 Speaker 2: constantly.
I'm not using it as a slur, that's 3067 02:30:51,520 --> 02:30:54,280 Speaker 2: just the term he's using. If they comment that, like, 3068 02:30:54,360 --> 02:30:57,880 Speaker 2: their fucking Medicaid got cut because Trump put Doctor Oz 3069 02:30:57,959 --> 02:31:00,760 Speaker 2: in charge of it, and Elon Musk calls them, like, 3070 02:31:00,959 --> 02:31:04,000 Speaker 2: you know, a slur, what does that do to you? 3071 02:31:04,080 --> 02:31:05,400 Speaker 2: Like... I don't even know, I don't even 3072 02:31:05,440 --> 02:31:07,520 Speaker 2: have anything more intelligent than, like, yeah, I wonder what 3073 02:31:07,560 --> 02:31:09,359 Speaker 2: that does to Twitter's bottom line. 3074 02:31:09,280 --> 02:31:11,840 Speaker 5: Yeah. I mean, yeah, I'm not sure if they care anymore. 3075 02:31:11,840 --> 02:31:14,360 Speaker 5: I mean, something else Linda talked about is how, you know, 3076 02:31:14,360 --> 02:31:17,840 Speaker 5: Twitter's the only place for independent news to spread. And, 3077 02:31:18,280 --> 02:31:20,240 Speaker 5: as both of us have, you know, worked in 3078 02:31:20,280 --> 02:31:24,600 Speaker 5: independent journalism... mind, nothing spreads on Twitter anymore. 3079 02:31:24,680 --> 02:31:26,920 Speaker 2: Not if it's news, it doesn't. The only thing that 3080 02:31:26,959 --> 02:31:29,920 Speaker 2: spreads is, yeah, like, the shit that makes people very 3081 02:31:29,959 --> 02:31:32,960 Speaker 2: angry but keeps them on the site. Like, articles, videos, 3082 02:31:32,959 --> 02:31:34,600 Speaker 2: if it takes you off site, it doesn't. 3083 02:31:34,680 --> 02:31:36,640 Speaker 5: Yeah, the things that go viral and get spread are, 3084 02:31:36,680 --> 02:31:40,400 Speaker 5: like, encouraging racial bias, yes, pogroms, essentially. 3085 02:31:40,160 --> 02:31:42,400 Speaker 2: Yeah, which is what happened last year in the UK, 3086 02:31:42,480 --> 02:31:43,880 Speaker 2: and they're sure trying to do it again.
3087 02:31:43,959 --> 02:31:45,920 Speaker 5: I mean, I think some of what 3088 02:31:45,959 --> 02:31:47,400 Speaker 5: she's referencing is, you know, there's a lot of, like, 3089 02:31:47,480 --> 02:31:51,039 Speaker 5: intentional throttling of, you know, people of maybe our proclivities, 3090 02:31:51,280 --> 02:31:53,720 Speaker 5: and there is a degree of boosting for more, you know, 3091 02:31:53,800 --> 02:31:57,080 Speaker 5: centrist or right wing journalists. And maybe that's some 3092 02:31:57,160 --> 02:31:59,280 Speaker 5: of what she could be kind of 3093 02:31:59,440 --> 02:32:02,200 Speaker 5: referring to there. But, you know, it 3094 02:32:02,240 --> 02:32:05,440 Speaker 5: was a short keynote, only thirty minutes. Just the two 3095 02:32:05,480 --> 02:32:07,800 Speaker 5: things that got applause are DOGE... 3096 02:32:07,680 --> 02:32:09,720 Speaker 2: Well, Linda doesn't know that many words, so they really 3097 02:32:09,800 --> 02:32:11,400 Speaker 2: need to keep it under thirty minutes. 3098 02:32:11,360 --> 02:32:14,440 Speaker 5: Yeah, and literally Muslim rape gangs. It's, you know, this 3099 02:32:14,760 --> 02:32:18,080 Speaker 5: type of, like, very, very gross racial fear mongering, 3100 02:32:18,440 --> 02:32:20,400 Speaker 5: and those are the things that, like, lit up the room. 3101 02:32:20,879 --> 02:32:22,879 Speaker 2: You know, we all want there to be an after, 3102 02:32:23,200 --> 02:32:27,560 Speaker 2: where there's even the minimal degree of accountability that happened 3103 02:32:27,600 --> 02:32:30,760 Speaker 2: after the Nazis.
But, like, what I try to, in 3104 02:32:30,840 --> 02:32:33,800 Speaker 2: my darker moments, think is, like, well, that's another person 3105 02:32:33,840 --> 02:32:36,640 Speaker 2: who, like, really made the argument of, like, what needs 3106 02:32:36,680 --> 02:32:41,800 Speaker 2: to happen when this ends. Because it's just: I 3107 02:32:41,879 --> 02:32:45,400 Speaker 2: want to hurt people. My business is enabling harm. 3108 02:32:45,480 --> 02:32:48,960 Speaker 2: I want to get mobs in the street beating migrants. 3109 02:32:49,440 --> 02:32:53,560 Speaker 2: Like, that's Linda's business. That's the business she has willfully 3110 02:32:53,600 --> 02:32:58,280 Speaker 2: attached herself to. And we should all see that it's 3111 02:32:58,360 --> 02:33:00,920 Speaker 2: very important to not stop talking about it as 3112 02:33:00,959 --> 02:33:05,320 Speaker 2: what it is. These people are trying to cause racial violence, 3113 02:33:05,600 --> 02:33:08,640 Speaker 2: and they are trying to cause gendered violence, and they 3114 02:33:08,640 --> 02:33:13,039 Speaker 2: are trying to cause harm at scale to communities of 3115 02:33:13,120 --> 02:33:18,959 Speaker 2: people that they see financial profit in damaging. Well, in 3116 02:33:19,040 --> 02:33:22,680 Speaker 2: other uplifting CES news... cool stuff. I love the 3117 02:33:22,680 --> 02:33:24,200 Speaker 2: Consumer Electronics Show. 3118 02:33:26,200 --> 02:33:28,320 Speaker 5: Actually, I think it might be time for an ad 3119 02:33:28,360 --> 02:33:31,440 Speaker 5: break, speaking of damaging communities of people. 3120 02:33:31,600 --> 02:33:33,200 Speaker 2: That's right, there's a chance. 3121 02:33:33,560 --> 02:33:45,400 Speaker 5: Yeah. Ads. Oh, well, we're back. 3122 02:33:45,640 --> 02:33:49,040 Speaker 2: Boy, I'm so glad that those ads told me that 3123 02:33:49,040 --> 02:33:52,800 Speaker 2: Fragaccio Blow is touring with Bono.
I never thought they'd 3124 02:33:52,840 --> 02:33:56,440 Speaker 2: do it, but boy howdy, and they're singing each other's songs, 3125 02:33:56,680 --> 02:34:00,400 Speaker 2: so, you know, that's really exciting. It's like when Barbra did it 3126 02:34:01,000 --> 02:34:03,119 Speaker 2: with Celine. I don't know who Barbra Celine is, but that's, oh 3127 02:34:03,160 --> 02:34:06,240 Speaker 2: my god, that's cool, Robert. Luckily, I do know what 3128 02:34:06,320 --> 02:34:06,960 Speaker 2: ska is. 3129 02:34:07,040 --> 02:34:11,560 Speaker 5: I consider myself a person of culture. And, for tonight, 3130 02:34:12,120 --> 02:34:15,080 Speaker 5: me and Robert attended this kind of, like, side event 3131 02:34:15,200 --> 02:34:19,240 Speaker 5: at CES called Showstoppers. And as you walk around the 3132 02:34:19,280 --> 02:34:22,360 Speaker 5: CES floor, there's a lot of, frankly, garbage. There's a 3133 02:34:22,360 --> 02:34:25,400 Speaker 5: lot of just, like, mostly garbage stuff. 3134 02:34:25,160 --> 02:34:27,280 Speaker 2: Or stuff that, like, you're just not interested 3135 02:34:27,320 --> 02:34:30,760 Speaker 2: in, because it's literally, like, buying screens from a manufacturer 3136 02:34:30,760 --> 02:34:32,680 Speaker 2: in China. Like, that's just not the business 3137 02:34:32,680 --> 02:34:34,360 Speaker 2: you're into, because some of this stuff has to be meant 3138 02:34:34,360 --> 02:34:34,840 Speaker 2: for companies. 3139 02:34:35,080 --> 02:34:38,280 Speaker 5: So much floor space. Like, we walked, 3140 02:34:38,280 --> 02:34:39,439 Speaker 5: what, twenty... 3141 02:34:39,280 --> 02:34:41,240 Speaker 2: The town that I spent the first seven years of my life 3142 02:34:41,280 --> 02:34:43,520 Speaker 2: in is smaller than one of the rooms at CES. 3143 02:34:44,720 --> 02:34:47,720 Speaker 5: It's across, like, three hotels and a massive convention center.
3144 02:34:48,160 --> 02:34:50,400 Speaker 2: Ninety thousand people come into town for this thing. 3145 02:34:50,440 --> 02:34:53,039 Speaker 5: It could be hard to like see everything you want to. Now, 3146 02:34:53,080 --> 02:34:55,800 Speaker 5: what's cool about Showstoppers, this is the side event at 3147 02:34:55,800 --> 02:34:59,440 Speaker 5: the Bellagio, is that basically it's a room full of 3148 02:34:59,520 --> 02:35:01,440 Speaker 5: kind of all all the coolest stuff, a whole bunch 3149 02:35:01,440 --> 02:35:04,720 Speaker 5: of stuff that has won CES innovation awards, all packed 3150 02:35:04,760 --> 02:35:08,680 Speaker 5: into one room with food and alcohol. So oh boy, 3151 02:35:08,840 --> 02:35:11,320 Speaker 5: did I order free food and free alcohol, so many 3152 02:35:11,440 --> 02:35:15,160 Speaker 5: drinks that I then just left on tables. And it 3153 02:35:15,280 --> 02:35:17,280 Speaker 5: was pretty good food, pretty good food. 3154 02:35:17,440 --> 02:35:17,640 Speaker 2: Yeah. 3155 02:35:17,879 --> 02:35:20,600 Speaker 5: So we walked around Showstoppers and there was a 3156 02:35:20,680 --> 02:35:23,800 Speaker 5: number of pretty pretty cool stuff that we saw. Yeah, 3157 02:35:23,959 --> 02:35:26,640 Speaker 5: but I think I think it's maybe time to talk 3158 02:35:26,680 --> 02:35:29,199 Speaker 5: about the saddest, the saddest man. 3159 02:35:29,200 --> 02:35:32,760 Speaker 2: The villain, the villain of the episode and of our 3160 02:35:33,080 --> 02:35:36,000 Speaker 2: this year's CES. I have trouble. Can you bring up 3161 02:35:36,040 --> 02:35:37,680 Speaker 2: their name? Because I'm gonna want to get this right. 3162 02:35:37,920 --> 02:35:40,359 Speaker 5: Oh, so we could be dangerous.
3163 02:35:40,840 --> 02:35:42,560 Speaker 2: We had neither of us had eaten and I had 3164 02:35:42,600 --> 02:35:45,119 Speaker 2: had like a hot dog eight hours ago and walked 3165 02:35:45,200 --> 02:35:48,600 Speaker 2: literally nineteen thousand steps and also done forty minutes of 3166 02:35:48,640 --> 02:35:51,160 Speaker 2: push ups in between. So I was starving. So we 3167 02:35:51,160 --> 02:35:53,760 Speaker 2: we like shovel food into our faces and we turn 3168 02:35:53,840 --> 02:35:56,840 Speaker 2: the first booth we see is called open Droid. 3169 02:35:57,600 --> 02:36:00,400 Speaker 5: Open Droid or open Droids Droids droids. 3170 02:36:00,480 --> 02:36:02,880 Speaker 2: Yes, I did. There is an s, Open Droids, and 3171 02:36:02,920 --> 02:36:05,640 Speaker 2: it's like kind of a Star Wars-y font. It is, 3172 02:36:05,720 --> 02:36:08,760 Speaker 2: and I did ask them if you know they had 3173 02:36:08,800 --> 02:36:12,800 Speaker 2: any issues with Lucasfilm. Apparently not yet. Sue them, Lucasfilm, 3174 02:36:12,959 --> 02:36:14,160 Speaker 2: by the way, sue these kids. 3175 02:36:14,240 --> 02:36:16,360 Speaker 5: I know there's people who work for Lucasfilm who listen 3176 02:36:16,440 --> 02:36:17,519 Speaker 5: to this. Crush 3177 02:36:17,280 --> 02:36:22,000 Speaker 2: them, burn them, like Los Angeles is burning down as 3178 02:36:22,040 --> 02:36:22,480 Speaker 2: we speak. 3179 02:36:22,600 --> 02:36:25,040 Speaker 5: They had a giant sign that said R two D three. 3180 02:36:25,320 --> 02:36:27,120 Speaker 2: Yeah, that's the name of the robot that they're selling. 3181 02:36:27,160 --> 02:36:29,160 Speaker 2: And the robot.
The robot that they're selling is like 3182 02:36:29,240 --> 02:36:35,440 Speaker 2: an AI enabled household helping slash like retail you know, 3183 02:36:35,879 --> 02:36:39,080 Speaker 2: like you know robot where it basically is like a 3184 02:36:39,160 --> 02:36:44,360 Speaker 2: human torso with articulated arms and pincher hands on. And 3185 02:36:44,400 --> 02:36:47,880 Speaker 2: then the base is like a little tank. Basically it's 3186 02:36:47,879 --> 02:36:49,680 Speaker 2: got like treads or wheels and it rolls. 3187 02:36:49,760 --> 02:36:50,640 Speaker 5: It is wheels, yeah, and. 3188 02:36:50,680 --> 02:36:54,080 Speaker 2: Then the torso there's like a tall maybe six foot 3189 02:36:54,120 --> 02:36:58,320 Speaker 2: tall like pillar built into this like rolling base that 3190 02:36:58,360 --> 02:37:01,000 Speaker 2: the torso slides up and down on. And this was 3191 02:37:01,040 --> 02:37:04,400 Speaker 2: their way of not making like what Musk is trying 3192 02:37:04,440 --> 02:37:06,360 Speaker 2: to do, right, a humanoid robot where you have to 3193 02:37:06,360 --> 02:37:08,959 Speaker 2: figure out like knees and balance and stuff. It's like, no, 3194 02:37:09,040 --> 02:37:11,800 Speaker 2: well, like Boston Dynamics, wheels, right, wheels are cheap and 3195 02:37:11,920 --> 02:37:14,520 Speaker 2: they roll, it works in most situations, you know, and 3196 02:37:14,560 --> 02:37:17,080 Speaker 2: then but you still have the ability for it to 3197 02:37:17,440 --> 02:37:20,560 Speaker 2: articulate and go up higher or go down lower like 3198 02:37:20,600 --> 02:37:23,360 Speaker 2: something that can crouch, but it's much simpler. You don't 3199 02:37:23,400 --> 02:37:25,760 Speaker 2: have to deal with nearly as much.
And so I 3200 02:37:25,800 --> 02:37:28,039 Speaker 2: saw that, I'm like, oh, well, that's at least somebody 3201 02:37:28,040 --> 02:37:29,879 Speaker 2: who's thinking about, like how do we make something like 3202 02:37:29,920 --> 02:37:32,680 Speaker 2: this like more affordable and less complicated, less to fuck up? 3203 02:37:33,120 --> 02:37:35,520 Speaker 2: And so I start talking with one of the co 3204 02:37:35,560 --> 02:37:39,039 Speaker 2: founders of the company, who is an Indian guy in 3205 02:37:39,080 --> 02:37:42,240 Speaker 2: his forties something around that he had like gray hair, 3206 02:37:42,280 --> 02:37:44,720 Speaker 2: he'd clearly he said he'd spent twenty years in robotics. 3207 02:37:45,440 --> 02:37:47,800 Speaker 2: Very nice guy, you know. I brought up that I 3208 02:37:47,800 --> 02:37:50,840 Speaker 2: thought the design was interesting, and he was very much specifying, like, 3209 02:37:50,840 --> 02:37:54,240 Speaker 2: here's the things we didn't do because they were too difficult, 3210 02:37:54,280 --> 02:37:57,560 Speaker 2: too inefficient, you know, this is what we're thinking of. 3211 02:37:57,680 --> 02:37:59,680 Speaker 2: This is a machine that can fold laundry. This is 3212 02:37:59,680 --> 02:38:01,920 Speaker 2: a machine that can do dishes. This is a machine. 3213 02:38:01,959 --> 02:38:04,520 Speaker 2: And he was very much specifying and the way he 3214 02:38:04,560 --> 02:38:07,800 Speaker 2: phrases like, these are undesirable tasks people don't want to do, 3215 02:38:08,280 --> 02:38:10,680 Speaker 2: and this is a robot that can handle those for 3216 02:38:10,800 --> 02:38:13,240 Speaker 2: like small businesses or for households. And we do see 3217 02:38:13,280 --> 02:38:16,600 Speaker 2: this as eventually like a you know, something like this 3218 02:38:16,760 --> 02:38:18,720 Speaker 2: we want to have in households. 
But he was more 3219 02:38:18,760 --> 02:38:22,680 Speaker 2: focused on small businesses and he was again, very focused 3220 02:38:22,680 --> 02:38:25,880 Speaker 2: on this is a thing that will do undesirable tasks 3221 02:38:26,640 --> 02:38:31,039 Speaker 2: for people, right, And as I started asking more questions, 3222 02:38:31,040 --> 02:38:34,400 Speaker 2: at a certain point, I got foisted off to the 3223 02:38:34,400 --> 02:38:37,080 Speaker 2: co founder of the company. Is it the co founder 3224 02:38:37,120 --> 02:38:38,840 Speaker 2: or is it just like another one of their reps? 3225 02:38:38,879 --> 02:38:41,160 Speaker 2: You know, I'm assuming co founder because I think it's 3226 02:38:41,240 --> 02:38:43,320 Speaker 2: just a couple of guys, but maybe I'm not gonna sorry. 3227 02:38:43,400 --> 02:38:45,720 Speaker 5: I got foisted over to the other of the two guys. 3228 02:38:45,720 --> 02:38:48,840 Speaker 2: There were two guys there, right, I'm not sure because 3229 02:38:48,879 --> 02:38:51,360 Speaker 2: they don't have listed anywhere what their role in 3230 02:38:51,400 --> 02:38:54,440 Speaker 2: the company is. I got a co founder's vibe from them. 3231 02:38:54,680 --> 02:38:57,240 Speaker 2: That's how it seemed to be to me, at least 3232 02:38:57,280 --> 02:38:59,039 Speaker 2: in terms of like the way these two were talking. 3233 02:38:59,480 --> 02:39:03,199 Speaker 2: But I don't know the scope of the Open Droids company. 3234 02:39:03,240 --> 02:39:06,039 Speaker 2: Maybe there's a lot more there, but these were the 3235 02:39:06,080 --> 02:39:07,840 Speaker 2: two guys who were there talking to us.
So one 3236 02:39:07,840 --> 02:39:11,000 Speaker 2: of them is this very wonky engineer who's been at 3237 02:39:11,000 --> 02:39:13,520 Speaker 2: this a long time and was really focused on the 3238 02:39:13,600 --> 02:39:15,760 Speaker 2: nuts and bolts details and wanted to build a robot 3239 02:39:15,800 --> 02:39:18,600 Speaker 2: that could handle unpleasant tasks for human beings, right, the 3240 02:39:18,640 --> 02:39:20,440 Speaker 2: same thing we've all been wanting to see. So at 3241 02:39:20,440 --> 02:39:22,920 Speaker 2: this point, I'm like, this could work. Maybe this is 3242 02:39:22,959 --> 02:39:23,760 Speaker 2: a viable product. 3243 02:39:23,879 --> 02:39:24,039 Speaker 1: Right. 3244 02:39:24,800 --> 02:39:30,920 Speaker 2: The second guy, Jack j Jessenowski, So he is wearing 3245 02:39:31,520 --> 02:39:36,160 Speaker 2: what Garrison described as a Jordan Peterson suit because it 3246 02:39:36,280 --> 02:39:44,000 Speaker 2: is half purple face suit split down the motherfucking middle. 3247 02:39:44,080 --> 02:39:46,800 Speaker 5: With like like new age hippie like. 3248 02:39:46,520 --> 02:39:50,599 Speaker 2: Necklaces, five necklaces, five necklaces. 3249 02:39:50,800 --> 02:39:54,160 Speaker 5: He had pants with like like embroidered flowers on those 3250 02:39:54,200 --> 02:39:56,440 Speaker 5: and like a nose bridge, like it looked like one 3251 02:39:56,440 --> 02:39:57,959 Speaker 5: of those things you put in your nose. That was 3252 02:39:57,959 --> 02:40:01,080 Speaker 5: one of the other things at Showstoppers. There was a 3253 02:40:01,080 --> 02:40:02,039 Speaker 5: company that was doing that.
3254 02:40:02,160 --> 02:40:05,200 Speaker 2: So yeah, he had wannabe Steve Jobs vibes 3255 02:40:05,200 --> 02:40:09,160 Speaker 2: from his half unbuttoned shirt and like many many spiritual 3256 02:40:09,200 --> 02:40:14,680 Speaker 2: medallions to his like Jordan Peterson suit, and very much 3257 02:40:14,840 --> 02:40:18,560 Speaker 2: just that. Like I am the charismatic founder and what 3258 02:40:18,680 --> 02:40:20,520 Speaker 2: I bring to the table. My partner knows how to 3259 02:40:20,560 --> 02:40:26,960 Speaker 2: build robots. I'm charismatic. I'm Jack j Jessanowski, and Jack 3260 02:40:27,000 --> 02:40:30,800 Speaker 2: and I started talking and boy howdy, we had us 3261 02:40:30,800 --> 02:40:33,720 Speaker 2: a conversation and I think we're just going to play that. 3262 02:40:34,080 --> 02:40:35,560 Speaker 2: What do I need to do to set this up? 3263 02:40:35,640 --> 02:40:38,200 Speaker 5: No, I think you've set it up. We walk up 3264 02:40:38,200 --> 02:40:41,440 Speaker 5: to Jack, I start, I start recording, and we start 3265 02:40:41,480 --> 02:40:44,120 Speaker 5: talking about the robot, and then things spin in some 3266 02:40:44,440 --> 02:40:45,800 Speaker 5: pretty interesting directions. 3267 02:40:45,879 --> 02:40:52,000 Speaker 2: Yeah, all right, So what is this thing useful for? 3268 02:40:53,680 --> 02:40:57,680 Speaker 3: Well, generally capable, just like a human can reach to 3269 02:40:57,720 --> 02:41:00,440 Speaker 3: the floor and reach up high to a cupboard, go 3270 02:41:00,560 --> 02:41:01,039 Speaker 3: up and down. 3271 02:41:01,080 --> 02:41:02,080 Speaker 11: That's what we made this for. 3272 02:41:02,200 --> 02:41:05,160 Speaker 3: Obviously in a little bit of a different fashion because 3273 02:41:05,280 --> 02:41:07,879 Speaker 3: most surfaces are level. 3274 02:41:08,120 --> 02:41:09,480 Speaker 11: We don't need to reinvent the wheel.
3275 02:41:10,400 --> 02:41:15,080 Speaker 3: And the biggest market that we're going after is households. 3276 02:41:15,200 --> 02:41:18,760 Speaker 3: Domestic dishes, laundry, make the bed, clean up around the house, 3277 02:41:19,200 --> 02:41:22,560 Speaker 3: eventually cooking that's more fine tuned, you know, dishes and 3278 02:41:22,600 --> 02:41:26,120 Speaker 3: laundry is really that first task that is gonna be 3279 02:41:26,120 --> 02:41:31,520 Speaker 3: fully autonomous. Obviously from a folding standpoint and cooking standpoint, 3280 02:41:31,560 --> 02:41:35,959 Speaker 3: you can do teleoperation today, so you can use cheaper labor 3281 02:41:36,000 --> 02:41:41,119 Speaker 3: internationally through a robot. But full autonomous is coming very quickly, 3282 02:41:41,200 --> 02:41:42,840 Speaker 3: like Jensen talked about recently. 3283 02:41:43,800 --> 02:41:47,000 Speaker 2: So I see there's a lot of folks in the 3284 02:41:47,080 --> 02:41:50,600 Speaker 2: robot space that are trying robots based on the human form. Right, 3285 02:41:50,840 --> 02:41:53,720 Speaker 2: you guys have not gone that route. Talk to me about. 3286 02:41:53,520 --> 02:41:55,199 Speaker 11: That droid form. 3287 02:41:55,520 --> 02:42:00,000 Speaker 3: Yes, Well, as we know, robots didn't evolve from monkeys, 3288 02:42:00,520 --> 02:42:03,720 Speaker 3: and so we have an ability to reimagine them. All 3289 02:42:03,760 --> 02:42:06,800 Speaker 3: of the existing hardware we use in the world has 3290 02:42:06,840 --> 02:42:10,440 Speaker 3: wheels for a reason. It just works better. It's easier, 3291 02:42:10,480 --> 02:42:13,560 Speaker 3: there's less friction. That means there's less maintenance. That means 3292 02:42:13,600 --> 02:42:17,920 Speaker 3: there's less energy output. It's efficiency. It's also easier for 3293 02:42:18,040 --> 02:42:21,240 Speaker 3: us to manufacture that stuff at scale.
So I think 3294 02:42:21,280 --> 02:42:29,240 Speaker 3: long term, do robots all have legs? Yeah, more or less. 3295 02:42:29,280 --> 02:42:31,520 Speaker 3: The home robot does turn into the legged robot because 3296 02:42:31,520 --> 02:42:33,400 Speaker 3: then it can go with you in the car everything. 3297 02:42:33,760 --> 02:42:37,320 Speaker 3: But I think the early stages the wheels, because of 3298 02:42:37,360 --> 02:42:42,080 Speaker 3: their cheaperness, because of their reliability. I think that will 3299 02:42:42,120 --> 02:42:45,600 Speaker 3: be what wins early stage. That's where we started here. 3300 02:42:45,879 --> 02:42:47,560 Speaker 2: You just said, because the robot can go in the 3301 02:42:47,600 --> 02:42:49,800 Speaker 2: car with you, what do you see people wanting to 3302 02:42:49,840 --> 02:42:51,360 Speaker 2: have a robot in the car with them for? 3303 02:42:51,800 --> 02:42:54,600 Speaker 3: I think it will just become basically the same way 3304 02:42:54,640 --> 02:42:57,560 Speaker 3: if you have enough money, a lot of people afford 3305 02:42:57,720 --> 02:43:01,120 Speaker 3: like an assistant to come with them places. 3306 02:43:01,680 --> 02:43:02,920 Speaker 11: It's a lot of people. 3307 02:43:04,560 --> 02:43:08,000 Speaker 2: Because that seems like a niche market compared to household 3308 02:43:09,320 --> 02:43:12,680 Speaker 2: I think it's the barrier I think is because of 3309 02:43:12,720 --> 02:43:16,400 Speaker 2: the cost and then the humanness, like then you 3310 02:43:16,400 --> 02:43:18,119 Speaker 2: have to care for another human. 3311 02:43:18,400 --> 02:43:21,920 Speaker 3: And whereas in this case it's kind of all positive 3312 02:43:22,000 --> 02:43:27,600 Speaker 3: sum And yeah, I guess it's wrong to try to 3313 02:43:27,640 --> 02:43:30,160 Speaker 3: say majority 3314 02:43:29,840 --> 02:43:35,320 Speaker 11: of people, but anyone who's you know in media.
3315 02:43:35,280 --> 02:43:38,280 Speaker 3: You know the videographer will be something you use a 3316 02:43:38,360 --> 02:43:40,840 Speaker 3: robot for to follow you around and take media and 3317 02:43:40,879 --> 02:43:43,800 Speaker 3: film for you. It won't ever get tired and say go 3318 02:43:43,879 --> 02:43:46,600 Speaker 3: grab me a drink or you know, go figure that 3319 02:43:46,640 --> 02:43:48,000 Speaker 3: thing out. 3320 02:43:48,120 --> 02:43:51,480 Speaker 2: But it also can't decide, oh, that's actually not a 3321 02:43:51,480 --> 02:43:53,760 Speaker 2: good location to film from. It's not going to look 3322 02:43:53,800 --> 02:43:55,720 Speaker 2: as good. We need to get over here, or we 3323 02:43:55,760 --> 02:43:57,840 Speaker 2: need another camera on this side here, we need to 3324 02:43:57,879 --> 02:44:00,920 Speaker 2: get like different angles because we want to edit this 3325 02:44:01,000 --> 02:44:03,920 Speaker 2: together into a thing. And as a videographer, I'm not 3326 02:44:04,000 --> 02:44:07,640 Speaker 2: just a machine. I'm a part of a collaborative creative enterprise. 3327 02:44:08,959 --> 02:44:13,920 Speaker 3: I think we're starting to see just how artistic these 3328 02:44:14,520 --> 02:44:15,400 Speaker 3: AIs can be. 3329 02:44:16,040 --> 02:44:18,080 Speaker 2: What's the best example of that you've seen? 3330 02:44:19,640 --> 02:44:21,840 Speaker 3: Well, I think the most used thing is the 3331 02:44:21,920 --> 02:44:25,680 Speaker 3: GenAI art. And then you have some of the new 3332 02:44:25,720 --> 02:44:30,279 Speaker 3: video models are pretty cool and. 3333 02:44:29,120 --> 02:44:32,680 Speaker 11: They're using certain sort of zoom in shots. 3334 02:44:32,800 --> 02:44:38,400 Speaker 3: Everything I think they'll make just as good of movies 3335 02:44:38,680 --> 02:44:39,359 Speaker 3: as humans. 3336 02:44:39,440 --> 02:44:39,600 Speaker 2: Oh.
3337 02:44:39,640 --> 02:44:42,199 Speaker 11: I think the best reference in order to. 3338 02:44:42,120 --> 02:44:45,960 Speaker 3: actually say that that's possible is music. I don't know 3339 02:44:46,000 --> 02:44:48,400 Speaker 3: if you've played with the most recent AI music. There's 3340 02:44:48,400 --> 02:44:49,760 Speaker 3: song GPT dot com. 3341 02:44:50,000 --> 02:44:52,760 Speaker 2: I've heard some things people call music that are produced 3342 02:44:52,760 --> 02:44:53,000 Speaker 2: by that. 3343 02:44:53,160 --> 02:44:55,880 Speaker 11: Yeah, we can make one live right now. 3344 02:44:55,920 --> 02:44:59,560 Speaker 3: I don't know if you've heard like the latest models. 3345 02:45:00,480 --> 02:45:02,199 Speaker 3: Uh, pick me a genre. 3346 02:45:03,320 --> 02:45:09,320 Speaker 2: Uh, Irish spirituals. Ska. You can try ska too. 3347 02:45:09,480 --> 02:45:14,200 Speaker 3: You love ska. Ska is like definitely probably niche stuff 3348 02:45:14,280 --> 02:45:18,760 Speaker 3: is where it's gonna have a harder time, but ska, 3349 02:45:19,120 --> 02:45:19,960 Speaker 3: s k a. 3350 02:45:20,879 --> 02:45:23,720 Speaker 11: I wonder how much ska data there is out there. 3351 02:45:24,280 --> 02:45:26,039 Speaker 2: There's a lot of ska music out there. 3352 02:45:26,120 --> 02:45:27,320 Speaker 11: What should we make it about? Should we make it 3353 02:45:27,360 --> 02:45:28,280 Speaker 11: about iHeart radio? 3354 02:45:28,959 --> 02:45:29,240 Speaker 2: Sure? 3355 02:45:29,520 --> 02:45:30,280 Speaker 11: iHeart radio? 3356 02:45:30,480 --> 02:45:31,480 Speaker 2: And communication? 3357 02:45:31,680 --> 02:45:36,080 Speaker 3: Robert and Clear Channel Communications? 3358 02:45:36,560 --> 02:45:40,240 Speaker 11: All right, here's a ska song. We're like, oh, 3359 02:45:40,280 --> 02:45:41,960 Speaker 11: it has to load for like thirty seconds.
3360 02:45:41,959 --> 02:45:45,400 Speaker 3: It feels weirdly like I'm upset that I have to 3361 02:45:45,400 --> 02:45:47,840 Speaker 3: wait that long for something to load online. 3362 02:45:48,160 --> 02:45:49,760 Speaker 2: Is that really how it feels to you? Huh? 3363 02:45:49,840 --> 02:45:51,640 Speaker 3: Yeah, I got what They're playing with it a lot, 3364 02:45:52,080 --> 02:45:54,320 Speaker 3: But it's funny to think about how much time and 3365 02:45:54,360 --> 02:45:57,400 Speaker 3: effort it does take to like produce a song typically 3366 02:45:58,360 --> 02:45:59,240 Speaker 3: I am twenty seven. 3367 02:46:00,879 --> 02:46:03,680 Speaker 2: That's interesting, wouldn't have guessed that. 3368 02:46:05,600 --> 02:46:05,680 Speaker 5: What? 3369 02:46:05,879 --> 02:46:09,400 Speaker 2: Uh one thing that's really compelling to me is your partner. 3370 02:46:09,879 --> 02:46:11,959 Speaker 2: When I came in here was very very much talking 3371 02:46:11,959 --> 02:46:14,680 Speaker 2: about the utility of this in terms of replacing human 3372 02:46:14,720 --> 02:46:19,680 Speaker 2: beings in tasks that are generally unpleasant: laundry, doing the dishes, 3373 02:46:19,800 --> 02:46:22,920 Speaker 2: cleaning up trash. You seem a lot more bullish on 3374 02:46:23,920 --> 02:46:27,320 Speaker 2: robots replacing human beings in what are generally considered to 3375 02:46:27,360 --> 02:46:31,360 Speaker 2: be enterprises people want to do with their time. Is 3376 02:46:31,400 --> 02:46:33,920 Speaker 2: that like a discrepancy that that that you guys have 3377 02:46:34,080 --> 02:46:36,240 Speaker 2: kind of talked about, or do you think it's something 3378 02:46:36,480 --> 02:46:38,679 Speaker 2: you guys are more on the same page with stuff.
3379 02:46:38,720 --> 02:46:42,360 Speaker 3: From a business endpoint, we're one hundred percent going after 3380 02:46:43,320 --> 02:46:49,600 Speaker 3: the dishes, laundry, uh nursing practice of just doing vitals, 3381 02:46:49,600 --> 02:46:55,520 Speaker 3: which is the very repetitive task that's the push. I 3382 02:46:55,560 --> 02:46:58,879 Speaker 3: was starting to just talk into the aspect of the 3383 02:46:59,000 --> 02:47:03,440 Speaker 3: legged robots and kind of imagining why a legged version 3384 02:47:03,959 --> 02:47:07,920 Speaker 3: would have better utility or be something someone wants to 3385 02:47:07,959 --> 02:47:11,640 Speaker 3: purchase rather than the wheeled robot. And yeah, stairs 3386 02:47:11,720 --> 02:47:14,160 Speaker 3: is definitely a big one of those. 3387 02:47:14,200 --> 02:47:17,560 Speaker 11: There are wheel types we're working on right now. 3388 02:47:17,600 --> 02:47:21,040 Speaker 3: We have ability to climb like single stairs obviously easiest, 3389 02:47:21,040 --> 02:47:23,360 Speaker 3: and that's what most people have in their home if 3390 02:47:23,400 --> 02:47:25,240 Speaker 3: they do have stairs. 3391 02:47:25,280 --> 02:47:27,160 Speaker 2: Oh, are we gonna listen to some robots? 3392 02:47:27,160 --> 02:47:30,000 Speaker 11: Scott my Heart Listeners System or. 3393 02:47:41,640 --> 02:47:59,760 Speaker 5: Scott Clear Crazy sco Is this Scot? 3394 02:48:01,280 --> 02:48:05,039 Speaker 2: It's a pretty basic melody. I mean there's horns in it, 3395 02:48:05,080 --> 02:48:06,760 Speaker 2: but I feel like it's kind of takes in a 3396 02:48:07,560 --> 02:48:09,720 Speaker 2: I think it's Ophelia is trying to do pot that 3397 02:48:09,800 --> 02:48:12,840 Speaker 2: it's just thrown some horns in on.
This is a 3398 02:48:12,840 --> 02:48:32,560 Speaker 2: little closer to ska, although it's still Yeah, it's not 3399 02:48:32,640 --> 02:48:41,760 Speaker 2: really singing, but I guess that's a matter of taste. 3400 02:48:42,720 --> 02:48:43,520 Speaker 2: What do you listen to? 3401 02:48:43,680 --> 02:48:45,039 Speaker 11: This is the worst it's gonna be. 3402 02:48:45,840 --> 02:48:46,720 Speaker 5: I hear that a lot. 3403 02:48:47,200 --> 02:48:51,240 Speaker 2: It's interesting because GPT four took fifty times as much 3404 02:48:51,280 --> 02:48:54,160 Speaker 2: power as GPT three to train, and there's a lot 3405 02:48:54,160 --> 02:48:57,760 Speaker 2: of mixed reactions on that. And we're entering into a 3406 02:48:57,800 --> 02:49:02,000 Speaker 2: period where we're very likely looking at a recession. Venture 3407 02:49:02,040 --> 02:49:05,039 Speaker 2: capital funding, there's a chance it's not going to be 3408 02:49:05,080 --> 02:49:07,879 Speaker 2: what it has been. Does that concern you at all that, 3409 02:49:08,000 --> 02:49:10,840 Speaker 2: like, this vaunted next level for all of this stuff, 3410 02:49:10,879 --> 02:49:15,000 Speaker 2: the energy cost, the investment cost, is just not going 3411 02:49:15,080 --> 02:49:18,080 Speaker 2: to be borne by a market that uh is not 3412 02:49:18,160 --> 02:49:20,959 Speaker 2: going to be as strong tomorrow as it was today, 3413 02:49:21,560 --> 02:49:23,240 Speaker 2: at least in the immediate term.
3418 02:49:43,280 --> 02:49:46,080 Speaker 3: Like I think Llama three point three, which has matched 3419 02:49:46,160 --> 02:49:51,480 Speaker 3: four oh's capabilities and is I forget how many parameters, 3420 02:49:51,480 --> 02:49:54,520 Speaker 3: but like super like much much much smaller and was 3421 02:49:54,640 --> 02:49:56,879 Speaker 3: much cheaper to train, and like we're continuing to see 3422 02:49:56,920 --> 02:50:02,560 Speaker 3: like smaller models that are just as effective and with 3423 02:50:02,680 --> 02:50:03,840 Speaker 3: much cheaper training runs. 3424 02:50:03,840 --> 02:50:06,480 Speaker 11: I think Deep Seek was one of the newest ones. 3425 02:50:07,120 --> 02:50:09,200 Speaker 2: What I'm concerned about is I'm looking at the 3426 02:50:09,640 --> 02:50:12,280 Speaker 2: P and L, right, I'm looking at OpenAI's P and L. 3427 02:50:12,560 --> 02:50:15,000 Speaker 2: I'm looking at the fact that they lost five or 3428 02:50:15,000 --> 02:50:18,760 Speaker 2: six billion dollars last year and there's a very good chance 3429 02:50:18,760 --> 02:50:20,360 Speaker 2: it's going to be somewhere in the neighborhood of double 3430 02:50:20,400 --> 02:50:23,480 Speaker 2: that this year. And it's not that there's nothing 3431 02:50:23,520 --> 02:50:25,440 Speaker 2: impressive there, it's not that I don't see like, oh, 3432 02:50:25,480 --> 02:50:29,000 Speaker 2: you can generate a song that's got like guitar and 3433 02:50:29,920 --> 02:50:32,560 Speaker 2: trumpets and vocals and stuff in, you know, a minute 3434 02:50:32,640 --> 02:50:34,840 Speaker 2: or so. It's not that that's not impressive, but like 3435 02:50:35,600 --> 02:50:39,400 Speaker 2: a parlor trick isn't a trillion dollar business, And that's 3436 02:50:39,440 --> 02:50:41,360 Speaker 2: the kind of investment they're looking at.
And I do 3437 02:50:41,480 --> 02:50:44,600 Speaker 2: wonder like, is it not much more reasonable to focus 3438 02:50:44,640 --> 02:50:45,760 Speaker 2: on folding laundry? 3439 02:50:46,320 --> 02:50:51,480 Speaker 3: Well, obviously, I personally am in the boathouse of 3440 02:50:51,560 --> 02:50:56,760 Speaker 3: focusing on allowing this intelligence to flourish and doing these 3441 02:50:56,840 --> 02:51:00,440 Speaker 3: laborious tasks and getting them in the households. I do 3442 02:51:00,560 --> 02:51:04,720 Speaker 3: think from an OpenAI standpoint, the reason why VCs 3443 02:51:04,800 --> 02:51:08,680 Speaker 3: and private investors will value them so highly is what's 3444 02:51:08,760 --> 02:51:14,599 Speaker 3: next is white collar work. A lot of the jobs online. 3445 02:51:15,720 --> 02:51:16,959 Speaker 11: That's what they do. 3446 02:51:17,000 --> 02:51:21,280 Speaker 3: Have an internal model which is able to control the computer, 3447 02:51:22,000 --> 02:51:23,760 Speaker 3: you know, the same way you would ask an executive 3448 02:51:23,760 --> 02:51:26,720 Speaker 3: assistant to do certain things online. 3449 02:51:27,360 --> 02:51:30,959 Speaker 2: Now it's just Sdobe's handing along all of their emails 3450 02:51:31,040 --> 02:51:34,640 Speaker 2: now through AIs, which is, you know, we'll see how 3451 02:51:35,000 --> 02:51:36,560 Speaker 2: well that works in the long term. There's been some 3452 02:51:36,640 --> 02:51:41,560 Speaker 2: interesting polling on like the degree to which customers and 3453 02:51:41,879 --> 02:51:46,360 Speaker 2: investors feel trust when somebody's responding to them with an AI.
3454 02:51:46,840 --> 02:51:49,600 Speaker 2: But what interests me more here is the dichotomy 3455 02:51:49,680 --> 02:51:53,360 Speaker 2: between what I see here as a very pragmatic choice, 3456 02:51:53,360 --> 02:51:56,080 Speaker 2: which is, we're not going to try and remake a 3457 02:51:56,160 --> 02:51:59,200 Speaker 2: human being formed robot and deal with like knees and 3458 02:51:59,280 --> 02:52:01,400 Speaker 2: hips and all of that stuff. We don't need that. 3459 02:52:01,440 --> 02:52:04,160 Speaker 2: We can have it run up and down on this 3460 02:52:04,240 --> 02:52:08,600 Speaker 2: platform and reach things the same way, melded to what 3461 02:52:08,640 --> 02:52:10,840 Speaker 2: I consider to be kind of a little more pie 3462 02:52:10,840 --> 02:52:14,160 Speaker 2: in the sky. We're doing this as eventually something that 3463 02:52:14,200 --> 02:52:18,560 Speaker 2: can take creative roles and think independently and make things, 3464 02:52:19,000 --> 02:52:21,520 Speaker 2: which is it's interesting to me to see that in 3465 02:52:21,560 --> 02:52:23,680 Speaker 2: a company's DNA when you guys are eight months 3466 02:52:23,680 --> 02:52:26,879 Speaker 2: out right now, yep. Is that what you're more interested in? 3467 02:52:27,440 --> 02:52:30,400 Speaker 11: I'd say, I tailor my pitch to the person I'm 3468 02:52:30,400 --> 02:52:32,320 Speaker 11: talking to. Uh huh. 3469 02:52:32,840 --> 02:52:36,520 Speaker 3: So some people definitely enjoy thinking about more of the 3470 02:52:36,520 --> 02:52:39,800 Speaker 3: sci fi futures that are coming, for example, the droids 3471 02:52:39,800 --> 02:52:44,840 Speaker 3: building droids moment. It's when you know you are decreasing 3472 02:52:44,920 --> 02:52:49,640 Speaker 3: your own manufacturing costs by using your own hardware to 3473 02:52:49,640 --> 02:52:51,879 Speaker 3: build more of that hardware and.
3474 02:52:51,959 --> 02:52:54,000 Speaker 11: Parts are just being shipped into the factory. 3475 02:52:54,480 --> 02:52:58,480 Speaker 3: Obviously, I think the first fully automated phone factory just 3476 02:52:58,520 --> 02:53:01,600 Speaker 3: came out in China, which is like some cool press 3477 02:53:01,600 --> 02:53:05,480 Speaker 3: news, but the phone is separate from the actual 3478 02:53:05,520 --> 02:53:06,800 Speaker 3: manufacturing process. 3479 02:53:07,480 --> 02:53:10,920 Speaker 11: So there's that interesting component. 3480 02:53:11,480 --> 02:53:16,920 Speaker 3: The exciting part of the idea that how do we 3481 02:53:16,959 --> 02:53:21,080 Speaker 3: reach true abundance as a species of material and resources 3482 02:53:21,160 --> 02:53:27,760 Speaker 3: is, well, because GDP is a calculation of capita times productivity, 3483 02:53:28,200 --> 02:53:34,280 Speaker 3: a robot really represents capita, one unit of creation. And 3484 02:53:35,040 --> 02:53:38,360 Speaker 3: I'd say that's where the sci fi thinking comes into play. 3485 02:53:38,400 --> 02:53:44,560 Speaker 3: And it's not not worth going there when just dreaming 3486 02:53:44,560 --> 02:53:46,880 Speaker 3: about the future of robotics and talking about it and having 3487 02:53:46,920 --> 02:53:51,640 Speaker 3: an interesting, engaging conversation. But definitely when it comes to 3488 02:53:52,000 --> 02:53:54,480 Speaker 3: what are we doing from an engineering standpoint on the 3489 02:53:54,560 --> 02:53:56,280 Speaker 3: day to day and how are we trying to approach 3490 02:53:56,320 --> 02:53:59,920 Speaker 3: the market, those conversations are not being had. 3491 02:54:00,200 --> 02:54:03,000 Speaker 2: Okay, well, I appreciate your time and you gave me 3492 02:54:03,040 --> 02:54:04,600 Speaker 2: a lot. I'm gonna let you get to the other people. 3493 02:54:04,680 --> 02:54:06,120 Speaker 2: Thank you, thank you so much. Nice to meet you.
3494 02:54:06,200 --> 02:54:06,440 Speaker 1: Jack. 3495 02:54:07,600 --> 02:54:10,760 Speaker 2: Oh wow, that's super interesting. I hope you all liked 3496 02:54:10,920 --> 02:54:12,959 Speaker 2: Jack Jay as much as I didn't. 3497 02:54:13,120 --> 02:54:15,760 Speaker 5: Getting to twenty-seven years old and not knowing what 3498 02:54:15,879 --> 02:54:18,160 Speaker 5: ska is. He's that old? I thought he was much younger, 3499 02:54:18,560 --> 02:54:20,720 Speaker 5: like you thought he was like twenty-two. Yes, but 3500 02:54:20,760 --> 02:54:23,360 Speaker 5: the fact that he didn't know what ska was as 3501 02:54:23,400 --> 02:54:25,560 Speaker 5: a genre, he was unaware of it. 3502 02:54:25,600 --> 02:54:27,160 Speaker 2: I don't think he listens to music. 3503 02:54:26,920 --> 02:54:29,959 Speaker 5: Well, he listens to AI-generated, he listens to AI-generated music, 3504 02:54:29,959 --> 02:54:32,440 Speaker 5: he thinks it's just as good. He has the most, he has 3505 02:54:32,480 --> 02:54:36,200 Speaker 5: the most "I listen to AI-generated music" vibes of 3506 02:54:36,240 --> 02:54:38,280 Speaker 5: anyone I've ever seen before. Just. 3507 02:54:38,360 --> 02:54:41,800 Speaker 2: Very clearly does not have a soul, no, like nothing, 3508 02:54:41,959 --> 02:54:45,520 Speaker 2: nothing would leave the universe if he did, right? Like. 3509 02:54:45,480 --> 02:54:48,080 Speaker 5: It's so opposite from the first guy you talked to, 3510 02:54:48,240 --> 02:54:51,160 Speaker 5: who was like, no, I want to 3511 02:54:51,160 --> 02:54:54,840 Speaker 5: automate actual tasks that people don't enjoy. Yeah, I 3512 02:54:54,920 --> 02:54:59,119 Speaker 5: love cinematography, I love, I love filmmaking. First 3513 02:54:59,160 --> 02:55:02,760 Speaker 5: of all, I don't think a robot can replace this. 3514 02:55:03,080 --> 02:55:03,200 Speaker 1: No.
3515 02:55:03,400 --> 02:55:07,240 Speaker 2: I watched five different AI-generated movies yesterday, and they 3516 02:55:07,280 --> 02:55:08,119 Speaker 2: all looked like shit. 3517 02:55:08,000 --> 02:55:10,240 Speaker 5: Even like a robot handling a physical camera to make, 3518 02:55:10,280 --> 02:55:13,280 Speaker 5: like, to make, like, choices on, like, shot framing and composition, 3519 02:55:13,320 --> 02:55:14,160 Speaker 5: and like it's. 3520 02:55:14,040 --> 02:55:15,600 Speaker 2: One thing to be like, we have a 3521 02:55:15,680 --> 02:55:17,760 Speaker 2: race car going, and so we've got this robot on a 3522 02:55:17,840 --> 02:55:19,600 Speaker 2: track so it can go seventy miles an hour and 3523 02:55:19,879 --> 02:55:21,920 Speaker 2: we're just kind of running on a street, follow it, 3524 02:55:21,920 --> 02:55:24,240 Speaker 2: because a human being can't move that fast. Sure. One 3525 02:55:24,280 --> 02:55:26,120 Speaker 2: thing we've left out of this so far: so 3526 02:55:26,160 --> 02:55:29,160 Speaker 2: this machine that I described earlier, this robot that 3527 02:55:29,200 --> 02:55:31,840 Speaker 2: goes up and down on this rolling base, has a floppy 3528 02:55:31,920 --> 02:55:35,280 Speaker 2: Donald Trump mask over its head, which first 3529 02:55:35,280 --> 02:55:37,960 Speaker 2: attracted us to this. Yeah, that's why we showed up 3530 02:55:37,959 --> 02:55:38,400 Speaker 2: there in the first place. 3531 02:55:38,879 --> 02:55:42,720 Speaker 5: A robot moving its arms around wearing a Donald Trump mask. 3532 02:55:42,760 --> 02:55:46,080 Speaker 5: And as Robert was interviewing this guy, the robot was, 3533 02:55:46,080 --> 02:55:48,560 Speaker 5: like, moving around and, like, trying to simulate its washing 3534 02:55:48,640 --> 02:55:52,320 Speaker 5: dishes capability, and it knocked over the same water bottle 3535 02:55:52,360 --> 02:55:55,800 Speaker 5: about five times.
It couldn't, it couldn't pick it up consistently, 3536 02:55:56,160 --> 02:55:58,600 Speaker 5: so I will not trust it with my fine china. 3537 02:55:58,680 --> 02:55:59,240 Speaker 5: I'll say that. 3538 02:55:59,280 --> 02:56:00,760 Speaker 2: As soon as I got up there, I asked, like, 3539 02:56:00,800 --> 02:56:02,600 Speaker 2: I took my jacket off, like, can it fold this? 3540 02:56:02,640 --> 02:56:04,800 Speaker 2: And he's like, well, we'd have to reprogram it. And 3541 02:56:04,879 --> 02:56:06,840 Speaker 2: it was, when I talked to the guy, I 3542 02:56:07,040 --> 02:56:08,960 Speaker 2: was like, because he was like, yeah, we really see 3543 02:56:09,000 --> 02:56:13,000 Speaker 2: this as being, you know, potentially good for elder care. Sure. 3544 02:56:13,040 --> 02:56:14,920 Speaker 2: And, you know, we had just seen the product we 3545 02:56:14,920 --> 02:56:17,320 Speaker 2: talked about in the last episode, which, for all that 3546 02:56:17,360 --> 02:56:19,440 Speaker 2: I don't know that I think it'll work, 3547 02:56:19,879 --> 02:56:22,240 Speaker 2: a lot of thought and care went into it. I 3548 02:56:22,280 --> 02:56:23,959 Speaker 2: was like, okay, so, like, what work have you done 3549 02:56:24,040 --> 02:56:26,680 Speaker 2: to build a machine that can, like, communicate and be 3550 02:56:26,760 --> 02:56:29,560 Speaker 2: helpful to, like, people who are dealing with health issues 3551 02:56:29,600 --> 02:56:31,920 Speaker 2: in their later years? They're like, well, that's why 3552 02:56:32,040 --> 02:56:34,880 Speaker 2: it's open, right? It's open source, someone 3553 02:56:34,920 --> 02:56:36,600 Speaker 2: else can do that part. See, you guys are 3554 02:56:36,600 --> 02:56:38,600 Speaker 2: just, you guys are just saying it can do everything 3555 02:56:38,640 --> 02:56:41,760 Speaker 2: because somebody could potentially code something for it.
3556 02:56:41,879 --> 02:56:44,000 Speaker 5: Yeah, cool, there always could be code. 3557 02:56:44,080 --> 02:56:47,400 Speaker 2: Yeah, there could be code. I mean, again, the other guy, 3558 02:56:48,200 --> 02:56:52,560 Speaker 2: the actual engineer, seemed very interested in the nuts and 3559 02:56:52,600 --> 02:56:58,000 Speaker 2: bolts of making an affordable, reproducible machine that could handle 3560 02:56:58,000 --> 02:57:02,520 Speaker 2: specific tasks, and Jack Jay had absolutely no interest in 3561 02:57:02,560 --> 02:57:04,840 Speaker 2: the actual machine that they were making. It was clear, 3562 02:57:04,879 --> 02:57:07,080 Speaker 2: it could not be clearer, this is just a stepping stone, 3563 02:57:07,280 --> 02:57:09,480 Speaker 2: and he's kind of grossed out by it because it's 3564 02:57:09,520 --> 02:57:12,279 Speaker 2: not replacing all human art with a machine that he owns. 3565 02:57:12,160 --> 02:57:15,800 Speaker 5: He's a man completely fueled by Lex Fridman podcasts, and 3566 02:57:15,879 --> 02:57:18,440 Speaker 5: he doesn't want to actually do any real work. He 3567 02:57:18,520 --> 02:57:21,600 Speaker 5: just wants to talk about how AI is going to 3568 02:57:21,720 --> 02:57:24,879 Speaker 5: take over everything and we have to welcome it in, 3569 02:57:25,280 --> 02:57:26,960 Speaker 5: and here, listen, this is ska. 3570 02:57:27,320 --> 02:57:30,480 Speaker 2: He wants to make money by owning something that does 3571 02:57:30,520 --> 02:57:34,080 Speaker 2: not provide anything and also puts people out of work. Like, 3572 02:57:34,720 --> 02:57:37,720 Speaker 2: at no point did he express a desire to do 3573 02:57:37,879 --> 02:57:42,080 Speaker 2: anything other than replace something people were already doing with 3574 02:57:42,160 --> 02:57:47,280 Speaker 2: something worse that tech guys could profit from. That's all 3575 02:57:47,360 --> 02:57:49,720 Speaker 2: there is to this man. He's not a human.
3576 02:57:49,959 --> 02:57:51,040 Speaker 5: It's so anti-human. 3577 02:57:51,280 --> 02:57:54,920 Speaker 2: Yeah, I cannot overemphasize the degree to which there was 3578 02:57:55,000 --> 02:57:56,680 Speaker 2: nothing behind this boy's eyes. 3579 02:57:57,760 --> 02:58:02,000 Speaker 5: Well, do you know what, there's also nothing super intelligent behind these ads. 3580 02:58:02,240 --> 02:58:04,320 Speaker 2: That's not true. All of our ads are sponsored by 3581 02:58:04,400 --> 02:58:08,040 Speaker 2: real people. Even if they're bad people, they're at least people. 3582 02:58:08,480 --> 02:58:11,920 Speaker 2: They live and they love and they hate and, you know, 3583 02:58:12,040 --> 02:58:25,119 Speaker 2: maybe they have a promo code. Let's see. All. 3584 02:58:25,120 --> 02:58:29,560 Speaker 5: Right. So after our lovely, our lovely robotic Jack 3585 02:58:29,760 --> 02:58:34,280 Speaker 5: Jay Jessanowski ska adventure. Oh god. 3586 02:58:34,680 --> 02:58:38,200 Speaker 2: Also, the ska was shit. Not good, not good. It 3587 02:58:38,240 --> 02:58:40,800 Speaker 2: just kept saying the word ska. 3588 02:58:40,920 --> 02:58:44,280 Speaker 5: He kept saying the words "ska music" and saying the 3589 02:58:44,280 --> 02:58:49,400 Speaker 5: word "Robert," Robert, ska, while just doing random noises. After 3590 02:58:49,440 --> 02:58:51,080 Speaker 5: we had our fill of that, we did walk around 3591 02:58:51,160 --> 02:58:55,320 Speaker 5: the rest of Showstoppers. He was so surprised that I wasn't. 3592 02:58:55,440 --> 02:58:57,760 Speaker 2: I wasn't impressed by any of it. He was like, 3593 02:58:57,760 --> 02:59:00,680 Speaker 2: you must not have heard the latest. Man, I heard them. 3594 02:59:00,720 --> 02:59:04,320 Speaker 2: It's not good. It's like, it's like, I made this 3595 02:59:04,360 --> 02:59:08,160 Speaker 2: comparison a few times.
If somebody, like, walked in while 3596 02:59:08,160 --> 02:59:10,680 Speaker 2: I'm at a house party and was like, hey, man, I 3597 02:59:10,800 --> 02:59:14,360 Speaker 2: taught my dog to masturbate to pornography with its, with 3598 02:59:14,400 --> 02:59:17,800 Speaker 2: its paws, I would be like, well, I mean, that's, like, 3599 02:59:18,200 --> 02:59:21,080 Speaker 2: I guess, impressive. I didn't think a dog could do that. 3600 02:59:21,640 --> 02:59:26,000 Speaker 2: Like, I am kind of impressed, I guess. But I 3601 02:59:26,040 --> 02:59:30,240 Speaker 2: don't want this, like, this. This doesn't do anything for me. 3602 02:59:30,520 --> 02:59:33,280 Speaker 2: It's, like, why did you figure this out? 3603 02:59:33,400 --> 02:59:35,200 Speaker 5: What, what value does this? 3604 02:59:35,600 --> 02:59:37,959 Speaker 2: How does the dog know who Farrah Fawcett is? 3605 02:59:38,000 --> 02:59:41,600 Speaker 2: I have questions, sure, but it doesn't give me anything. 3606 02:59:42,120 --> 02:59:47,160 Speaker 2: Like, you know who Farrah Fawcett was, Garrison? No. God damn it. 3607 02:59:47,320 --> 02:59:48,200 Speaker 5: What do you think I do? 3608 02:59:48,320 --> 02:59:48,440 Speaker 2: What? 3609 02:59:48,800 --> 02:59:53,959 Speaker 5: I, I don't know anymore. Well, what I did is 3610 02:59:54,000 --> 02:59:57,080 Speaker 5: walk around the rest of Showstoppers. I saw this one 3611 02:59:57,120 --> 02:59:59,800 Speaker 5: booth that had, like, an iPhone case with, 3612 02:59:59,840 --> 03:00:02,000 Speaker 5: like, a little, like, keyboard on the bottom that, like, 3613 03:00:02,040 --> 03:00:05,039 Speaker 5: plugs in, and I started messing around with it, and 3614 03:00:05,120 --> 03:00:06,840 Speaker 5: the guy at that booth walked up to me and made 3615 03:00:06,879 --> 03:00:09,959 Speaker 5: fun of me because he's like, you've never, you've never 3616 03:00:10,040 --> 03:00:13,320 Speaker 5: held a phone with a BlackBerry keyboard.
He literally said, like, 3617 03:00:13,640 --> 03:00:15,360 Speaker 5: you've never had a BlackBerry before? 3618 03:00:15,440 --> 03:00:15,720 Speaker 3: Have you? 3619 03:00:15,920 --> 03:00:18,840 Speaker 5: Like, no. Like, yeah, you're typing all wrong on there. 3620 03:00:19,200 --> 03:00:23,680 Speaker 2: There was a solid nine-day news cycle when Barack Obama, 3621 03:00:23,760 --> 03:00:26,520 Speaker 2: newly the president, revealed that he had a BlackBerry. 3622 03:00:26,640 --> 03:00:30,600 Speaker 5: I remember that, which sounds like a lifetime ago. 3623 03:00:30,760 --> 03:00:33,560 Speaker 2: There was a company called RIM once, and 3624 03:00:33,600 --> 03:00:36,640 Speaker 2: they made a tablet that was pretty good, and we 3625 03:00:36,720 --> 03:00:39,160 Speaker 2: only made a couple of rim job jokes about it, 3626 03:00:39,560 --> 03:00:41,720 Speaker 2: but it didn't do very well, and so I gave 3627 03:00:41,760 --> 03:00:44,080 Speaker 2: it to my dad, and accidentally there was still a 3628 03:00:44,120 --> 03:00:46,720 Speaker 2: picture of my dick on it. Anyway, that's a story 3629 03:00:46,760 --> 03:00:49,400 Speaker 2: for another day. 3630 03:00:49,440 --> 03:00:49,760 Speaker 5: Cool. 3631 03:00:51,840 --> 03:00:53,640 Speaker 2: These are the kind of things you get recording at 3632 03:00:53,640 --> 03:00:56,039 Speaker 2: eleven fifty-six p.m. 3633 03:00:56,480 --> 03:00:59,880 Speaker 5: On a Tuesday night. Got to get to bed, but no, 3634 03:01:00,160 --> 03:01:01,480 Speaker 5: keep making fun of me for not knowing how to 3635 03:01:01,560 --> 03:01:03,359 Speaker 5: use a smartphone keyboard. 3636 03:01:03,520 --> 03:01:04,280 Speaker 2: He did the right thing. 3637 03:01:04,280 --> 03:01:05,720 Speaker 5: I don't need to use that because I have a 3638 03:01:05,800 --> 03:01:08,440 Speaker 5: keyboard on my phone built in already. It's much faster.
3639 03:01:08,680 --> 03:01:12,120 Speaker 5: So anyway, we stopped at this company that makes, 3640 03:01:12,400 --> 03:01:16,279 Speaker 5: well, now just makes, the software to use in conjunction 3641 03:01:16,360 --> 03:01:20,119 Speaker 5: with their augmented reality glasses and any, like, high-powered laptop. 3642 03:01:20,240 --> 03:01:22,600 Speaker 5: Specifically the laptops that have, like, built-in, like, you know, 3643 03:01:22,680 --> 03:01:25,840 Speaker 5: like, Copilot, because they require, like, higher processing power; 3644 03:01:26,000 --> 03:01:28,840 Speaker 5: they have an NPU or something like that, like a, 3645 03:01:28,959 --> 03:01:31,920 Speaker 5: like a neural processing unit is what they're calling, like, 3646 03:01:32,000 --> 03:01:36,600 Speaker 5: the AI-dedicated GPU thing. Effectively, it allows you to 3647 03:01:36,600 --> 03:01:40,640 Speaker 5: hook up these glasses and run, you know, a possibly infinite 3648 03:01:40,920 --> 03:01:43,959 Speaker 5: amount of monitors using AR. And we talked about this 3649 03:01:44,000 --> 03:01:47,080 Speaker 5: company last year because we saw them at Showstoppers. 3650 03:01:47,080 --> 03:01:48,760 Speaker 2: You put on the glasses and it's like you've got 3651 03:01:48,879 --> 03:01:51,560 Speaker 2: six monitors or whatever that are all full size. 3652 03:01:51,360 --> 03:01:55,080 Speaker 5: And it's actually really easy to use. It works very well, seamlessly. 3653 03:01:55,120 --> 03:01:58,120 Speaker 5: It's nice, it's good quality, easy to use, you 3654 03:01:58,160 --> 03:01:59,240 Speaker 5: can move the monitors around. 3655 03:01:59,320 --> 03:02:00,800 Speaker 2: It's an excellent, an excellent game. 3656 03:02:00,879 --> 03:02:02,560 Speaker 5: We talked to them last year, and the main thing 3657 03:02:02,600 --> 03:02:05,000 Speaker 5: that was holding this, like, holding us back on it 3658 03:02:05,040 --> 03:02:07,199 Speaker 5: is that you needed to use their own proprietary laptop.
3659 03:02:07,200 --> 03:02:09,160 Speaker 5: It was their own laptop, and it wasn't a great one. 3660 03:02:09,240 --> 03:02:11,400 Speaker 5: It was just, like, a Linux laptop. It didn't 3661 03:02:11,440 --> 03:02:13,400 Speaker 5: have everything I, like, want out of my own 3662 03:02:13,440 --> 03:02:14,160 Speaker 5: personal laptop. 3663 03:02:14,320 --> 03:02:15,760 Speaker 2: And we were still impressed with it. 3664 03:02:15,800 --> 03:02:17,600 Speaker 5: Then it was still, it was still good. Yeah, and 3665 03:02:17,640 --> 03:02:19,600 Speaker 5: now you can just use any high-powered laptop with 3666 03:02:19,640 --> 03:02:23,080 Speaker 5: it, essentially. So it's lovely to see that improved. We 3667 03:02:23,120 --> 03:02:28,160 Speaker 5: saw this lovely, like, very small foldable projector. Oh yeah, 3668 03:02:28,200 --> 03:02:28,840 Speaker 5: that was cool. 3669 03:02:28,879 --> 03:02:30,800 Speaker 2: What's the company? That company name, because we should be, 3670 03:02:30,840 --> 03:02:32,840 Speaker 2: we should be giving out the names of these. Yes. 3671 03:02:32,760 --> 03:02:35,760 Speaker 5: The AR glasses and the software system is called Spacetop, 3672 03:02:36,080 --> 03:02:40,320 Speaker 5: very good, by a company called Sightful. It works, works great. 3673 03:02:40,480 --> 03:02:43,960 Speaker 5: But yeah, this, this little folding, folding projector currently has 3674 03:02:43,959 --> 03:02:47,840 Speaker 5: a Kickstarter. The company is called Aurzen, yeah, Aurzen 3675 03:02:47,920 --> 03:02:53,160 Speaker 5: specifically, and it's the ZIP trifold projector. Right now 3676 03:02:53,160 --> 03:02:55,960 Speaker 5: it's a seven twenty p very small foldable projector. 3677 03:02:56,360 --> 03:02:58,000 Speaker 5: It has a whole, it has, like, an auto 3678 03:02:58,040 --> 03:03:00,840 Speaker 5: focusing, auto keystone.
They said they could get it up to 3679 03:03:00,840 --> 03:03:04,080 Speaker 5: ten eighty p, but they're running a Kickstarter right now to 3680 03:03:04,200 --> 03:03:07,800 Speaker 5: ship in about three months. Super good quality stuff. 3681 03:03:07,840 --> 03:03:09,840 Speaker 2: If you're a gadget person, like, it felt like a 3682 03:03:10,080 --> 03:03:13,200 Speaker 2: quality piece of electronics in my hands, like the way 3683 03:03:13,240 --> 03:03:17,200 Speaker 2: it, like, snapped when it closed just felt good. I'm, 3684 03:03:17,280 --> 03:03:20,240 Speaker 2: I think I'm gonna buy one. Like, it's exactly 3685 03:03:20,280 --> 03:03:22,360 Speaker 2: what I want for traveling, which is the ability to, 3686 03:03:22,440 --> 03:03:24,360 Speaker 2: it goes up to like eighty inches of screen at 3687 03:03:24,400 --> 03:03:27,000 Speaker 2: like very good resolution, the ability to just have that 3688 03:03:27,600 --> 03:03:30,400 Speaker 2: plugged in to a battery or the wall and my 3689 03:03:30,480 --> 03:03:33,680 Speaker 2: laptop, and, like, wherever I happen to be, I've got 3690 03:03:33,720 --> 03:03:35,600 Speaker 2: a movie screen and I don't have to worry about, 3691 03:03:35,600 --> 03:03:37,560 Speaker 2: like, fucking hooking up a TV to my laptop or 3692 03:03:37,600 --> 03:03:37,920 Speaker 2: some shit. 3693 03:03:38,080 --> 03:03:40,400 Speaker 5: It doesn't need Wi-Fi to work. It can just 3694 03:03:40,480 --> 03:03:42,400 Speaker 5: cast from your phone. 3695 03:03:42,320 --> 03:03:46,280 Speaker 2: A-U-R-Z-E-N, the ZIP trifold projector, Aurzen. 3696 03:03:46,400 --> 03:03:48,240 Speaker 2: Yep, yep. I think they're selling them for two fifty 3697 03:03:48,320 --> 03:03:51,360 Speaker 2: right now. That's for the, for the Kickstarter. 3698 03:03:51,400 --> 03:03:53,400 Speaker 2: It will go up a little when it's a product. 3699 03:03:53,400 --> 03:03:56,160 Speaker 2: But we saw it. It works.
They had a lot 3700 03:03:56,200 --> 03:03:59,400 Speaker 2: of, they had tracking and stuff, so it, like, automatically 3701 03:03:59,440 --> 03:04:00,320 Speaker 2: would focus. 3702 03:04:00,440 --> 03:04:04,000 Speaker 5: Auto-focuses, and, like, it scales correctly for what it's projecting, 3703 03:04:04,640 --> 03:04:07,320 Speaker 5: it automatically, like, adjusts, like, the tilt of it so 3704 03:04:07,440 --> 03:04:08,000 Speaker 5: that it, you. 3705 03:04:07,920 --> 03:04:10,200 Speaker 2: Know, yeah, obviously this isn't the full review because we 3706 03:04:10,200 --> 03:04:12,520 Speaker 2: don't own one, but everything we could tell by looking 3707 03:04:12,520 --> 03:04:13,400 Speaker 2: at it in the moment. 3708 03:04:13,360 --> 03:04:15,120 Speaker 5: We tried it out. I hooked up my phone to it. 3709 03:04:15,160 --> 03:04:17,280 Speaker 5: As it went to my phone screen, I realized I 3710 03:04:17,360 --> 03:04:22,600 Speaker 5: have a slightly, I would say, artful nude image of 3711 03:04:22,640 --> 03:04:26,080 Speaker 5: an angel. So I quickly swiped away from, we shouldn't 3712 03:04:26,080 --> 03:04:28,720 Speaker 5: show your dick to your dad, my home screen of 3713 03:04:28,879 --> 03:04:31,440 Speaker 5: my phone. You know, things can always be worse. 3714 03:04:31,480 --> 03:04:35,039 Speaker 2: Things could always be worse. But I think where we'll 3715 03:04:35,160 --> 03:04:37,800 Speaker 2: end is, and this actually is not entirely in order, 3716 03:04:37,840 --> 03:04:40,480 Speaker 2: because this is next, after we had that conversation 3717 03:04:40,560 --> 03:04:45,039 Speaker 2: with our friend Jack Jay, which just left me thinking about, like, 3718 03:04:45,600 --> 03:04:48,120 Speaker 2: some people aren't really people, right? That's what I, the 3719 03:04:48,120 --> 03:04:48,720 Speaker 2: soul thing. 3720 03:04:48,640 --> 03:04:51,400 Speaker 5: Is a sham. It's all. It's all for rooms.
3721 03:04:51,560 --> 03:04:52,439 Speaker 5: It's soulless. 3722 03:04:52,760 --> 03:04:54,800 Speaker 2: We immediately walk over and we just kind of, like, 3723 03:04:54,879 --> 03:04:58,360 Speaker 2: randomly turn a corner, and there's, like, a human shin, 3724 03:04:58,959 --> 03:05:03,760 Speaker 2: like, tibia and fibula basically, with, like, a carbon fiber, you know, 3725 03:05:04,120 --> 03:05:06,680 Speaker 2: frame around it that's roughly the shape of, like, a 3726 03:05:06,879 --> 03:05:12,160 Speaker 2: person's lower leg, lower leg, and it's called Bio Leg. 3727 03:05:12,240 --> 03:05:16,520 Speaker 2: It's a powered microprocessor knee made in Japan. What it 3728 03:05:16,600 --> 03:05:20,200 Speaker 2: is is a prosthetic, but unlike most prosthetics, it is powered 3729 03:05:20,280 --> 03:05:22,640 Speaker 2: and has a muscle built into it, so, like, when 3730 03:05:22,680 --> 03:05:25,120 Speaker 2: you lift up your prosthetic, it doesn't hang, it doesn't lock. 3731 03:05:25,480 --> 03:05:27,920 Speaker 2: It actually has a degree of motion, and it feels 3732 03:05:28,000 --> 03:05:30,080 Speaker 2: what lifts the rest of the leg, your 3733 03:05:30,160 --> 03:05:34,160 Speaker 2: remaining muscles, like, it can 3734 03:05:34,200 --> 03:05:37,360 Speaker 2: take measurements from them, and it can act intelligently 3735 03:05:38,040 --> 03:05:41,200 Speaker 2: based on that. And I know that it works because 3736 03:05:41,200 --> 03:05:43,600 Speaker 2: the inventor was there, and he was a man who 3737 03:05:43,640 --> 03:05:46,440 Speaker 2: was missing his leg below the knee and had built 3738 03:05:46,440 --> 03:05:47,600 Speaker 2: this for himself. 3739 03:05:48,240 --> 03:05:49,680 Speaker 5: He spent like ten years working on this. 3740 03:05:49,840 --> 03:05:53,000 Speaker 2: Yeah, eight years, he said, eight years.
And that's, like, 3741 03:05:53,120 --> 03:05:56,400 Speaker 2: really the thing that is, like, so both, like, addictive 3742 03:05:56,520 --> 03:05:59,160 Speaker 2: and also, like, this, like, very tonal whiplash you get 3743 03:05:59,160 --> 03:06:01,320 Speaker 2: at CES, as you will go from, like, this dead 3744 03:06:01,400 --> 03:06:04,160 Speaker 2: eyed con man trying to scam the world so he 3745 03:06:04,200 --> 03:06:06,360 Speaker 2: can do god knows what kinds of other harms with 3746 03:06:06,480 --> 03:06:10,400 Speaker 2: absolutely nothing, nothing inside him at all, and then, I 3747 03:06:10,520 --> 03:06:13,360 Speaker 2: lost my leg and I built a better prosthetic to 3748 03:06:13,440 --> 03:06:17,360 Speaker 2: help the entire world. And that's, like, thirty seconds between 3749 03:06:17,400 --> 03:06:19,360 Speaker 2: those two experiences. 3750 03:06:18,760 --> 03:06:21,560 Speaker 5: And, like, that's, like, that's, like, the dark magic of CES. 3751 03:06:21,760 --> 03:06:25,000 Speaker 5: And, like, I don't, like, I'm not, like, anti-tech, 3752 03:06:25,160 --> 03:06:28,160 Speaker 5: like, I think, I think technology can really improve 3753 03:06:28,200 --> 03:06:31,280 Speaker 5: people's lives if used well. And sometimes I get kind 3754 03:06:31,280 --> 03:06:34,440 Speaker 5: of blackpilled walking around CES. But then we'll stumble 3755 03:06:34,480 --> 03:06:37,800 Speaker 5: across this, like, you know, someone who, like, literally lost 3756 03:06:37,800 --> 03:06:41,240 Speaker 5: a leg and made themselves their own better leg. 3757 03:06:41,360 --> 03:06:43,520 Speaker 2: Eight years figuring out how to do this. 3758 03:06:43,680 --> 03:06:46,440 Speaker 5: Yeah, it's winning awards, it's, like, for, it's award-winning, 3759 03:06:46,560 --> 03:06:47,720 Speaker 5: like, tech innovations.
3760 03:06:47,760 --> 03:06:50,200 Speaker 2: It's changing, for a person who has lost your 3761 03:06:50,200 --> 03:06:52,560 Speaker 2: lower leg, like, changing being able to, like, have a normal 3762 03:06:52,640 --> 03:06:57,320 Speaker 2: gait and balance again. Like, massive potential to improve people's 3763 03:06:57,400 --> 03:06:58,640 Speaker 2: lives as a result of this. 3764 03:06:59,520 --> 03:07:03,760 Speaker 5: Yeah, just steps away from AI ska and AI 3765 03:07:04,080 --> 03:07:09,560 Speaker 5: Donald Trump Man. The company is, 3766 03:07:09,600 --> 03:07:14,000 Speaker 5: again, BionicM, and the Bio Leg is the product. Yeah, 3767 03:07:14,000 --> 03:07:16,039 Speaker 5: the Bio Leg is the product by BionicM. 3768 03:07:16,200 --> 03:07:17,760 Speaker 2: I'm going to try to check it out more 3769 03:07:17,640 --> 03:07:20,200 Speaker 5: tomorrow at Eureka Park, which at this point, you know, 3770 03:07:20,320 --> 03:07:24,080 Speaker 5: that'll be, like, maybe future episodes come next week. But 3771 03:07:24,280 --> 03:07:28,720 Speaker 5: I guess this closes our actual, like, coverage. Let's go 3772 03:07:28,840 --> 03:07:32,440 Speaker 5: get fucked up and eat Japanese food. Oh, I'm down. Yeah, 3773 03:07:32,480 --> 03:07:32,920 Speaker 5: I'm down. 3774 03:07:33,040 --> 03:07:33,640 Speaker 2: Let's do it. 3775 03:07:35,720 --> 03:07:35,959 Speaker 5: Hey. 3776 03:07:36,040 --> 03:07:39,119 Speaker 2: We'll be back Monday with more episodes every week from 3777 03:07:39,160 --> 03:07:40,840 Speaker 2: now until the heat death of the universe. 3778 03:07:41,520 --> 03:07:44,000 Speaker 9: It Could Happen Here is a production of cool Zone Media.
3779 03:07:44,200 --> 03:07:47,200 Speaker 9: For more podcasts from cool Zone Media, visit our website 3780 03:07:47,320 --> 03:07:49,879 Speaker 9: cool Zonemedia dot com, or check us out on the 3781 03:07:49,920 --> 03:07:53,960 Speaker 9: iHeartRadio app, Apple Podcasts, or wherever you listen to podcasts. 3782 03:07:54,280 --> 03:07:56,240 Speaker 1: You can now find sources for It Could Happen Here, 3783 03:07:56,240 --> 03:07:59,240 Speaker 1: listed directly in episode descriptions. Thanks for listening.