Speaker 1: Reed, Nitasha, Kyle, good to see you. Two things struck me this week in the world of tech: one, the NASA launch, and two, the SpaceX IPO, the largest of all time. Reed, I know you're a bit of a space head.
Speaker 2: Wait, what do you mean by that?
Speaker 3: Exactly.
Speaker 4: No, I was excited to see the Artemis blast off. Did you watch? I didn't watch it live.
Speaker 3: No. Shouldn't the astronauts be livestreaming now?
Speaker 2: I think Starlink has to get better for that to happen.
Speaker 1: Well, they should be. They need to be applying to become astronaut influencers if they want to get on the future billionaire-led space missions, like Katy Perry.
Speaker 3: The first spaceship for creators only.
Speaker 1: Exactly. Nitasha, what is it with billionaires in space?
Speaker 5: Oh?
Speaker 6: You know, it's like sleeping under their desk: it's just a core part of the mythology.
Speaker 1: Okay, let's get into it. Welcome to Tech Stuff. I'm Oz Woloshyn, and this is The Week in Tech, where I'm joined by three of the most plugged-in tech reporters in the world to break down the biggest tech news, decode emerging trends, and debate what actually matters for us. We're joined by Reed Albergotti, who is the technology editor at Semafor.
Speaker 2: Hi, good to be here.
Speaker 1: Kyle Chayka, who writes the Infinite Scroll newsletter for The New Yorker.
Speaker 3: Hello, nice to see you all.
Speaker 1: And Nitasha Tiku from the Washington Post.
Speaker 6: Hey, guys.
Speaker 1: So we'll start with you, Kyle. This week, you were particularly taken by the slopaganda circulating about the Iran war. I want to hear all about that. But before we get to it: The New Yorker magazine, where you work, arguably the most hallowed journalistic institution in America, puts out these sacred best-of lists at the end of each year, like the best of television, the best of theater, the best of opera.
Speaker 1: In December, you published the best of slop. So congratulations. What is it with you?
Speaker 3: Inside man?
Speaker 7: It was "The Year in Slop," and it was a retrospective of all the AI-generated garbage that we had to wade through as people on the Internet over the past year. And I think the scariest part of putting that together was realizing how much better everything got. Like, the video quality of baseline AI-generated stuff was vastly, vastly higher at the end of last year as opposed to the start of last year. So I really got a firsthand view of that, unfortunately. And thus we're in twenty twenty six, when it's better than ever.
Speaker 1: Do you remember that scene in Harry Potter where there's this kind of boiling cauldron of, like, something, and Dumbledore and Harry are there, and Dumbledore puts in his wand, and then Harry can see all of Voldemort's memories because of this bucket of churning whatever-it-is? I feel like AI slop is that for the collective unconscious.
Speaker 7: I think so. I mean, we're all left wading through it, trying to sort the past decades of stuff, and it's open to anyone. Like, whatever anyone can imagine can now become a high-production-value animation, which, you know, might sound utopian, but in practice it's actually horrible.
Speaker 1: So I think what's interesting, though, this week, is that we kind of definitively crossed the Rubicon from slop into slopaganda. I want to play a clip from the X account of the Iranian Embassy in The Hague, which features a cartoon animated Donald Trump at a press conference, and what I think are his internal demons.
Speaker 6: Why did you attack the Monov school? Go on, lie. Lie. Lie.
Speaker 8: A week.
Speaker 2: We didn't hit the Monov school.
Speaker 5: America doesn't have Tomahawk missiles at all, and we care deeply about the Iranian people.
Speaker 1: They did his voice much worse than SNL. What was going on here, Kyle?
Speaker 7: Well, over the past week or two, or a few weeks possibly, we've seen this trickle of AI-generated propaganda from various Iranian sources. And so one popular format that has emerged is Lego videos, which take on the aesthetic of the Lego movies, including Batman and the Pharrell biopic, which I never quite understood. But anyway, they use Lego toys to animate the Iranian war: imagine the White House burning down, and Donald Trump with his pants on fire, and hordes of armies marching on Tel Aviv and Tehran, and just, like, a panoply of horrifying things, but made of children's toys. And so I was watching these, and they're, like, high production value, they're going super viral, they have a lot of American fans, and I was wondering who was actually making them. So I actually got in touch with the collective of Iranian students and graduates who were putting them together. How? I mean, in the classic social-media internet-reporter move of scrolling way far back on their social accounts before they got banned and, you know, contacting them that way. And then promptly everything was erased over the weekend, because the platforms caught on to what was happening. And yeah, I mean, they decided to use this Lego aesthetic because it was cute and simple and elicited sympathy for the Iranian people, and kind of cast this, like, moral quality over their whole concept. And they told me that they could go from a highly worked-out script, and these are very written-through videos, very complicated in their plotting, to a full animation in twenty-four hours. So that's the Iranian Lego propaganda machine.
Speaker 1: Nitasha, Reed, have you been experiencing these videos, or the slopaganda coming our way?
Speaker 2: I have seen the Lego videos.
Speaker 4: I'm glad that you did the reporting, because I was wondering the same thing.
Speaker 2: It's like, wait, who is making this?
Speaker 4: And clearly it's coming from the Iranian government in some capacity, but, like, the Ayatollah is definitely not going, "Are Legos sacrilegious?" or something. You know, they're speaking the language that we all speak, right? Like, there's a lot of cognitive dissonance there watching the videos.
Speaker 6: That's so interesting that you got them to talk to you about the scripts, because that is what I've been wondering with both, like, the propaganda Legos and then also that low-grade fruit corn that is also populating my feed. There's that, and then there's something else, I don't know. I keep sending them to my coworkers, so I might be part of the problem. But, like, every one has millions and millions of views, and I was just like, oh my gosh, what if, instead of watermarking in some way, you were mandated to put your prompt in any, like, AI-generated thing? I really, really want to know how involved the human was. So it's, like, super interesting that they wrote out these scripts, which are so appealing to Americans and so Americanized that I was like, is it Iranians? But it looks like you answered that question.
Speaker 1: Well, I want to ask that question about who this is really for. But before that, I'm not sure if any of you saw the Chinese one, which is like a mini animated doc with the Great White Eagle. Basically, it was the best primer on the Strait of Hormuz and the underlying geopolitical forces governing what's going on. I mean, it was an amazing distillation of everything I've read in, you know, smart places in the last couple of weeks, as a six-minute animated video, with Donald Trump as a great white eagle and the Ayatollah as a black household cat with red eyes.
Speaker 1: But, like, it was very tightly scripted. I have to say, it was impressive. The information density was very compelling.
Speaker 4: Let's take the war out of it, though. I know you said this sounds utopian but it's not, but it kind of is. Like, imagine unlocking all this creativity. I'm imagining them as, like, kids, you know, young people in Iran making these videos, who never in their wildest dreams thought they'd be able to make something like this, and now they can. And of course this is under terrible circumstances, but in another world, they're just making entertaining videos, and hopefully we have ways to bring, you know, the good stuff to the top and keep out the slop, you know.
Speaker 3: And this is the good stuff.
Speaker 7: Like, now I'm imagining the equivalent of Lonely Island or CollegeHumor or whatever, but in AI slop videos. But that's what was kind of striking about the Lego videos, I think: they look so good. Like, they're really...
Speaker 1: They're not really AI slop. Like, when I think about AI slop, I think of, like, an octopus with a cat coming out of it and then going on fire and then living under the sea.
Speaker 2: So the new slop is good? It isn't sloppy?
Speaker 7: Yeah. Well, it just struck me that, I mean, almost anyone on the Internet now has access to similar AI models, whether it's open source, or a Claude Pro subscription, or Seedance, the Chinese video-generation model. Anyone can access these, and they're proliferating faster every single day, and now anyone can make the Lego movie in one day. And I think we're only starting to imagine the consequences of that. As Reed was saying, it can unlock tons of creativity, but there's also this cognitive dissonance of: is this piece of media valuable or meaningful or not?
Speaker 6: Yeah, when you said utopian, my mind went straight to: what about the, like, click-farm equivalents? Like, you know, people trapped in an Internet cafe in the Philippines, kind of making individualized propaganda for all of us all day.
Speaker 1: Actually, just before we come out of this story, I want to ask you about some of your previous reporting on deepfakes and the harm they cause. I mean, there was obviously the deepfake non-consensual pornography, which is a horrible story and a horrible issue. There was a fear that deepfakes would also become kind of tools of, like, political chaos-sowing. What seems to have happened is quite the opposite, where in fact no one really cares about deepfakes in the political realm; people actually want, like, the most absurdist animations, which are clearly not trying to be real. I wonder what's going on there.
Speaker 2: Yeah.
Speaker 6: I feel like we started to notice this with elections in the Global South, in Southeast Asia, you know. I know my editors in DC were extremely concerned about some kind of election manipulation, but instead people ended up using it sort of like, oh, you have access to Photoshop, and all of a sudden a bunch of humanities majors and, I don't know, political operatives can do whatever they want. So in India they were splicing candidates into Bollywood videos, and they were making, like, sweet cartoony images for a strongman candidate. So, yeah, I thought it was interesting. Kyle, I'd love to know what you thought about this: that it almost seemed to, like, defuse the tension. Like, there were, of course, those videos where you see the bombs heading at the Statue of Liberty, but it felt like we're all in this together with this one crazy...
Speaker 7: I don't know, sort of specifying that the Statue of Liberty in this case is, like, an evil goat demon, Baal. Yes, the Baal statue as the Statue of Liberty. But yeah, I think the cartoonishness is the appeal. And I think what brings so many people around the world together with these beautiful videos is their hatred of Donald Trump, and everyone's sheer incomprehension of what is even going on in this war in the Strait of Hormuz. Is it happening, is it not happening, is it over, is it just beginning? Literally no one knows, so it might as well be Legos.
Speaker 6: I mean, the way that they worked in the, you know, Epstein files as the reason that he started this war. Just imagine, imagine something like that around the Iraq War.
Speaker 1: In fact, on the Iran video I played at the beginning, which we only heard the audio of, the ending title card actually says "Epstein's client." But Nitasha, changing gears, while staying in the world of film and TV: you were recently featured in a documentary that's got a lot of press, the AI doc, or How I Became an Apocaloptimist, and I want to play a part of Oprah's interview with one of the central voices in that documentary, Aza Raskin from the Center for Humane Technology.
Speaker 6: How could AI physically eliminate the human race?
Speaker 5: It's actually hard to imagine all the ways AIs could wipe humans out. AI is already better than almost all humans at doing cyber hacking, and so you could imagine one of the things that an AI could do is take out all electricity, water, hospitals, transportation, across every country in the world, all at once. Now, that doesn't wipe us all out, but you could imagine the amount of damage that would do.
Speaker 1: It would sow chaos and craziness, exactly.
Speaker 5: And we're only, you know, five missed meals away from anarchy.
Speaker 3: Did you say we're only five missed meals away from anarchy?
Speaker 8: Yeah.
Speaker 5: I think about what happens in New York City if you can't get food.
Speaker 2: Yeah, I don't think a lot of people have thought about that.
Speaker 1: Yeah, the soundtrack is particularly dystopian without any video. But Nitasha, tell us about this film and your role in it.
Speaker 6: My role was minuscule. I actually finished watching it till the end yesterday, and I have two sentences, so I would not say I was featured in the piece, but yeah.
Speaker 1: You're a talking head, right? You're an expert in the film.
Speaker 6: Yeah, but, like, for two seconds. There are many, many... like, basically, Reed, you probably have interviewed half the people in it, at least, or more. A lot of familiar faces. And it hasn't been doing that well at the box office, it looks like. However, there have been millions of views for the trailer, and the press around it: they've had people who are in it on the Daily Show this week. This Oprah clip that you just played, actually, in the clip, when she says, like, we're five missed meals away from anarchy, they show the text across the screen, which I just thought was, like, wow. And that's the little clip that starts this hour-long program, too. She talks about, like, how could AI physically eliminate us all? And I'm just kind of transported back to twenty twenty three. It's just really interesting to have these conversations about some of these doomsday scenarios that feel very divorced from, you know, some of the very real ways that we're watching chaos unfold. You know, there's nothing about, like, what if two egomaniacal men are having a contract dispute, or what if your president is trying to distract from the Epstein files?
Speaker 6: It just feels like we're kind of back to that, you know what I mean? Like, back to that post-ChatGPT era of introducing the public to existential risks.
Speaker 1: It's also interesting because Oprah did an ABC special last year with a lot of the same big names, you know, with Sam Altman and Dario and all the big names in this industry. And when that came out, I was struck by this kind of huge, not dissonance, that's the wrong word, but this huge ocean between how people, like you all, who cover this industry, and how people in Silicon Valley, talk and think about what's going on. And, I mean, Oprah is a relatively good gauge for where the wider conversation is, maybe. And so this does feel, frankly, quite dated, and yet at the same time, to your point, it's gained millions of views on YouTube and it's all over the Daily Show and the morning shows. So who's the movie for, actually? And what does the marketing of the movie say about it?
Speaker 6: Who's the movie for? I think that the director was trying to make it for the everyperson who is feeling this, you know, kind of amorphous anxiety, and the point is to try to figure out how you're supposed to emotionally approach this technology. Like, you understand that something strange is coming, you don't know what it is, and you don't know what it means for you and your family. And, you know, I was trying to think about, like, okay, net-net, is the documentary better for AI literacy?
Speaker 6: And I think you do get a better sense of, like, the initial training phases. You know, they talk a lot about how much of the Internet is put into it. But then there's, like, no discussion of the amount of work that the companies do, the amount of product-making decisions, the amount of data that they have to commission in order for it to get better and better. And it just kind of removes agency from the companies in this way, you know, where it seems inevitable. And it's just really interesting, having spoken to a lot of the people that were featured in the documentary. Like, you know, it's not misrepresenting them, but if you give these people a chance to zoom in on something more concrete and not take the, you know, five-thousand-foot view, they're very chatty and would have given you an earful. So, you know, I'm excited for that movie to come out.
Speaker 4: I guess, how much do you think it actually helps people with these emotions, versus just sort of feeding that anxiety and fear that people have, which I think is actually largely irrational, to be honest.
Speaker 2: You do? What do you mean? Yeah?
Speaker 1: The doom, the "this will kill us all," all that, the "this will make labor, you know, tremendously strained": which part of it do you think is irrational?
Speaker 4: I think it's all irrational, to be honest. I mean, we don't know. The thing is, it's like we were just talking about now: we thought, you know, deepfakes were gonna be one thing, and it turned out they're another thing. You know, there's gonna be downsides to AI, or let's just call it software automation, right? Like, that's what it is.
Speaker 4: There's gonna be downsides to this massive build-out of data centers and, like, reorienting the economy around this thing, these tokens. But we don't really know what that is yet, and I think sitting back and worrying about it is sort of, you know, it's like the old cliche: it's paying interest on a debt you may never owe. So that's kind of my view on all this stuff. But I don't know about this documentary. I haven't seen it, so I'm just curious: does it help with that, does it help clarify the situation, or does it just sort of feed the anxiety more?
Speaker 6: I mean, I think there is, I guess, room to be concerned and to worry. But it's just so industry-voiced; it's basically only asking tech people what could potentially happen. And I feel like, if you want to think about how human civilization has adjusted to massive change, you talk to historians. Like, you basically talk to almost anybody besides the people who are building the technology. The director obviously does it from a first-person perspective, and it's like, you know what, I got to talk to the CEOs. And it's like, there's no shortage of footage of the CEOs talking about this; it's hard to get them to stop yapping, in some ways. Like, a lot of these guys are on a constant press tour. And I just feel like I would love to hear from people who study labor markets: you know, how does change traditionally unfold, what are different scenarios in that way of thinking. I just feel like, the binary that is proposed between, like, utopia and dystopia, I think we already know who benefits from that. You know, it makes the technology sound inevitable and powerful in a certain way.
Speaker 6: And we just have a lot of data now. It's twenty twenty six; we can look at how successful this worldview is, how successful this approach has been in predicting the harms we've already seen.
Speaker 1: Did they ask you those kinds of questions when you were interviewed for the film? Like, what was the tenor of what you were being asked to contribute?
Speaker 6: I think I might have been pretty early, because I did suggest a lot of people. They did make me explain, like, effective altruism, and, like... I remember using my hands a lot for the doomer landscape. And then they had to call me back after Sam was blipped out for a second, to ask what was really going on. Is it because, you know, like, Q-star, they had something? And I was like, no, it's very interpersonal.
Speaker 1: Out of OpenAI for, like, twenty-four hours, right?
Speaker 6: Yeah, yeah, a week, or, I don't know, Reed, you probably remember it better than I do. A few days.
Speaker 2: I just remember it ruined my Thanksgiving.
Speaker 1: Kyle, have you bought your tickets for the movie this weekend?
Speaker 3: I have not bought a ticket yet.
Speaker 7: I have seen it all over social media; this seems to be the media artifact of AI right now. And it struck me as very vintage: it's this kind of hyped-up, doomerish reel of people saying that this is going to take over the world and destroy things. And, I don't know, I feel like the current concerns are less about an AI god killing us all and more about the economy redirecting toward data centers and accelerating climate change, and then, you know, destroying the Earth that way as a kind of secondary effect. So yeah, I don't know.
Speaker 7: It felt like a bit of, I don't know, super-sincere hyping up. And as Nitasha was talking about, I think the CEOs are incentivized to make this technology appear as powerful and world-conquering as possible, and they're not necessarily going to be the best sources for you on how this will affect a normal person.
Speaker 4: And even on the doomsday stuff, it didn't seem like they had much of an imagination there, either. Like, have you read, you know, If Anyone Builds It, Everyone Dies? You guys read that book? They come out with much better scenarios for how AI kills us all. Like, you know, Yudkowsky. Yeah, exactly. They're like, yeah, it's just going to co-opt humans, and eventually the humans will help it build some bioweapon that we won't even realize we have for, like, a generation, and eventually we all die. I feel like, if you're going to go doomsday, go all the way, you know.
Speaker 1: This is why I said that you were a space head at the beginning of the episode, Reed, because I remember the sci-fi you referred to, where corporations not controlled by governments can create an extractive economy in space and take people there and then keep them enslaved in space.
Speaker 2: That's right, that's right, like the company from Aliens, right?
Speaker 3: What if they make Alien, you know? Like, that's so much better.
Speaker 4: I mean, I asked one of the authors of that book, Nate Soares, on stage once. This was before I broke that whole Pentagon-Anthropic story, but I was interested in the topic of autonomous weapons, and I'm like, what do you think of autonomous weapons?
Speaker 4: And it was like a layup. I'm like, he's gonna just say, this is so crazy, I can't believe we would ever do this. And he's like, no, I'm pretty much for autonomous weapons; AI doesn't need autonomous weapons to kill us, and they could actually help reduce human deaths on the battlefield. And I was like, wow, that came full circle.
Speaker 1: I read Sebastian Amountabie's biography of Demis Hassabis this week, because Sebastian was on the show a couple of days ago. And yeah, apparently the way he impressed Elon early on was saying to Elon: by the way, going to space won't save us from superintelligent AIs, because they'll be able to build rockets too and kill us while we're up there. But just to come out of this story, I want to pick up on a word you used, Kyle, which is "sincere," and certainly the documentary comes across as very sincere. However, you know, there is this odd thing where these AI doomers and AI boosters are kind of two sides of the same coin. And the doomers, in a sense, okay, maybe they're not becoming billionaires by building AI companies, but they're certainly becoming amongst the most sought-after media talking heads in the world. And so, like, is it truly in good faith?
Speaker 3: I mean, from my perspective, it's not.
Speaker 7: I think there is a hype cycle, and people are cashing in on different sides of the hype cycle. Meanwhile, Anthropic and OpenAI are cruising toward IPOs that will make many millionaires and billionaires, and that will change things. You know, there are, like, more imminent consequences, I think, that we can talk about instead of total doomer apocalypse.
Speaker 6: It's interesting, because the whole documentary is about incentives, and yet there's very little examination of the incentives of the people that got the most airtime.
Speaker 6: And I think there is this sense that it's just hard to operate in AI, like, writing about AI, without kind of knowing where everybody is coming from. You kind of have to have this map in your head, because it looks like there are many different voices saying the same thing, and so there is consensus. But, you know, many of those people are from policy groups that are trying to influence legislators towards a specific view; they share a lot of similarity. So, you know, I just wish we would move away from looking at a person's kind of personal politics and just look at, A, the CEOs' actions, and then, for these kinds of advocates, look at, you know, where they're coming from. If they're giving you a blackmail scenario from the Anthropic paper, to what end? And how does that reframe how you think about what policies to put forward? Because this is aimed at policymakers, I think we should think about it as its own kind of propaganda.
Speaker 1: That's a really interesting point you make, because obviously, in the Trump era, it's sort of tempting to think, well, the train has left the station, there'll never be regulation; we've just essentially decided that there's going to be unlimited, unregulated technology. But of course, you know, there are midterms coming up this year, and in two years' time there's another presidential election. And so are you suggesting, in some sense, that this is kind of building towards a kind of policy-driven techlash post-election, or...?
Speaker 6: Yes. I mean, you know, some of the people who are represented there are affiliated with groups who are arguing that there is massive bipartisan support for a long list of legislation.
Speaker 6: I wouldn't say it's, like, a crisp kind of policy proposal, but, you know, they're meeting together with faith groups, with children's advocacy groups, with unions, and saying we all have common cause in regulating AI. So I just think that this is the tip of the spear, and we should pay attention to where these narratives are pointing government, and where these narratives might end up affecting all of us.
Speaker 1: We're going to take a short break now before we come back to celebrate Apple's fiftieth birthday by asking what comes next for the second-largest company in America. Welcome back to Tech Stuff. Reed, it's kind of incredible: Apple's been around for fifty years. I mean, it started just after the original Moon landings, which is kind of crazy to think about. On Wednesday, April first, Apple celebrated this milestone by putting out a statement titled "Fifty Years of Thinking Different," and amidst the praise of misfits, rebels, and troublemakers, there is a telling quote: "At Apple, we're more focused on building tomorrow than remembering yesterday." What does that mean?
Speaker 4: Yeah, I mean, usually when you turn fifty in Silicon Valley, you don't tell anybody; like, you try to keep it a secret.
Speaker 2: But that was looks mixing. It's kind of interesting.
Speaker 4: No, I think there's a fascinating thing happening with Apple right now, where they basically missed AI.
Speaker 2: Right.
Speaker 4: They were sort of, they were caught. You know, I don't know how that happened, but they didn't see this revolution happening. I think it had something to do with being in Cupertino, inside this, like, spaceship building, and not going outside into the future, right? But I think that, like, the defense of Apple from the Apple fanboy crowd, which, you know, there are very many of, and I'll probably get hate mail for even saying this, but it's like, you know, look, Apple's great at this.
They just 558 00:30:36,920 --> 00:30:40,440 Speaker 4: watch technology develop and then they just jump on top 559 00:30:40,480 --> 00:30:42,880 Speaker 4: of it and like and and you know, and benefit 560 00:30:42,920 --> 00:30:45,560 Speaker 4: from it. And I think that's like a very lazy 561 00:30:45,600 --> 00:30:48,160 Speaker 4: defense of of what's happening, and it's and the other 562 00:30:48,200 --> 00:30:50,680 Speaker 4: thing is like, well, you've got this phone, and you 563 00:30:50,720 --> 00:30:52,760 Speaker 4: know they're gonna just like that, They're gonna be able 564 00:30:52,760 --> 00:30:54,959 Speaker 4: to run all these AI models on the device, and 565 00:30:55,040 --> 00:30:56,800 Speaker 4: so that's going to be a huge advantage for Apple. 566 00:30:56,840 --> 00:30:59,200 Speaker 4: And it's like, I don't think that's really how it's 567 00:30:59,200 --> 00:31:03,040 Speaker 4: gonna work. I mean, I can't, I don't. It never 568 00:31:03,160 --> 00:31:06,040 Speaker 4: helps in the tech industry to like miss a technology 569 00:31:06,080 --> 00:31:08,200 Speaker 4: wave like we've seen companies do it. I'm not saying 570 00:31:08,200 --> 00:31:12,640 Speaker 4: Apple's gonna disappear overnight or anything like that, but you know, 571 00:31:12,720 --> 00:31:15,280 Speaker 4: like the reason people buy iPhones is not because it 572 00:31:15,320 --> 00:31:18,680 Speaker 4: has some great hardware or is a piece of great hardware. 573 00:31:18,760 --> 00:31:22,080 Speaker 2: It's because they're locked into the ecosystem, right. It's it's they. 574 00:31:21,960 --> 00:31:24,000 Speaker 4: Don't want to be a green bubble or name your 575 00:31:24,080 --> 00:31:27,080 Speaker 4: you know, the photo sharing product or something like that. 576 00:31:27,240 --> 00:31:29,840 Speaker 4: But you know, Apple doesn't know what the like, we 577 00:31:29,920 --> 00:31:32,160 Speaker 4: don't know what the on device like, what the edge 578 00:31:32,200 --> 00:31:34,960 Speaker 4: compute's going to look like. That's necessary to make you know, 579 00:31:35,000 --> 00:31:39,800 Speaker 4: whatever this future on device AI is. And Apple can't 580 00:31:39,960 --> 00:31:42,280 Speaker 4: move very quickly with its phones because it sells so 581 00:31:42,360 --> 00:31:44,560 Speaker 4: many of them, so like when it comes to like 582 00:31:44,640 --> 00:31:47,320 Speaker 4: new hardware, that's why they're always like many years behind 583 00:31:47,360 --> 00:31:49,960 Speaker 4: Samsung on the hardware because Samsung doesn't have as big 584 00:31:50,000 --> 00:31:52,440 Speaker 4: of an install based so they don't need to sell 585 00:31:52,760 --> 00:31:55,680 Speaker 4: you know, millions and millions of these things immediately when 586 00:31:55,680 --> 00:31:57,800 Speaker 4: they come out with a new product. So it's gonna 587 00:31:57,800 --> 00:32:01,239 Speaker 4: be really, really hard. I think for Apple too, you know, 588 00:32:01,400 --> 00:32:04,080 Speaker 4: the next fifty years, I think, I think might be tough. 589 00:32:04,360 --> 00:32:06,520 Speaker 1: I can see Kyle. I can see Kyle smiling, and 590 00:32:06,680 --> 00:32:08,720 Speaker 1: I think he might be drafting some hate mail. Yeah. 
591 00:32:08,760 --> 00:32:11,480 Speaker 7: I was just remembering the recent marketing campaigns for the 592 00:32:11,560 --> 00:32:15,400 Speaker 7: latest iPhone models, and they all promised various AI 593 00:32:15,440 --> 00:32:18,520 Speaker 7: functionality and just did not deliver on it at all. 594 00:32:19,040 --> 00:32:20,680 Speaker 3: Like, when I bought 595 00:32:20,440 --> 00:32:23,400 Speaker 7: my most recent iPhone, I kind of knew the AI 596 00:32:23,600 --> 00:32:25,360 Speaker 7: was not going to come through. Like, that was the 597 00:32:26,040 --> 00:32:29,400 Speaker 7: messaging in the tech community at least, but the public 598 00:32:29,480 --> 00:32:32,200 Speaker 7: didn't know that. And it just seems kind of embarrassing 599 00:32:32,280 --> 00:32:35,360 Speaker 7: for the company that they are missing their own goals, 600 00:32:35,480 --> 00:32:39,600 Speaker 7: like they're missing their stated aims with their AI functionality. So, 601 00:32:40,000 --> 00:32:42,720 Speaker 7: I don't know, it doesn't look like strategy from the outside. 602 00:32:42,800 --> 00:32:46,040 Speaker 1: You don't think there's anything there that consumers like? For example, 603 00:32:46,080 --> 00:32:48,120 Speaker 1: we just talked about that movie, right, and the kind 604 00:32:48,160 --> 00:32:50,840 Speaker 1: of AI doomerism. There's an argument that, perhaps from a 605 00:32:50,840 --> 00:32:53,920 Speaker 1: consumer point of view, having a technology company that 606 00:32:54,080 --> 00:32:58,120 Speaker 1: is not emotionally associated with AI and doom may be an 607 00:32:58,160 --> 00:32:59,959 Speaker 1: interesting thing. I mean, it was interesting, too: 608 00:33:00,120 --> 00:33:03,040 Speaker 1: Cook, in that letter, didn't mention the words AI or artificial 609 00:33:03,040 --> 00:33:05,360 Speaker 1: intelligence once. I mean, this is a tech CEO 610 00:33:05,440 --> 00:33:07,760 Speaker 1: letter in twenty twenty six. That can't have been an oversight. 611 00:33:07,800 --> 00:33:09,000 Speaker 1: I mean, that must have been a decision. 612 00:33:09,520 --> 00:33:12,600 Speaker 4: Well, look, AI, the thing is, we should stop calling 613 00:33:12,640 --> 00:33:15,000 Speaker 4: it AI. I know that's not gonna happen. But, you know, 614 00:33:15,120 --> 00:33:19,080 Speaker 4: it's software, okay? Like, it's like saying I hate software. 615 00:33:19,200 --> 00:33:21,239 Speaker 4: Like, okay, fine. And there could be, like, an 616 00:33:21,280 --> 00:33:25,560 Speaker 4: anti-software lobby, and everybody lobbies. But, you know, 617 00:33:25,640 --> 00:33:27,640 Speaker 4: when you have a product that can just, like, do 618 00:33:27,680 --> 00:33:29,800 Speaker 4: stuff for you and save you a bunch of time 619 00:33:29,840 --> 00:33:32,160 Speaker 4: and is, like, very useful, you're just going to 620 00:33:32,280 --> 00:33:34,520 Speaker 4: use it, right? That's just how the world works. Like, 621 00:33:34,560 --> 00:33:37,120 Speaker 4: I'm sorry, nobody's going to be like, I'm morally opposed 622 00:33:37,120 --> 00:33:39,640 Speaker 4: to this software that's, you know, allowing me to 623 00:33:39,680 --> 00:33:42,040 Speaker 4: spend more time with my kids. Like, it's just, you know, 624 00:33:42,280 --> 00:33:43,640 Speaker 4: it's just kind of a silly concept. 625 00:33:43,920 --> 00:33:46,120 Speaker 7: I found it kind of interesting, just to jump in: 626 00:33:46,360 --> 00:33:49,800 Speaker 7: the wave of new devices coming out for AI.
The 627 00:33:49,840 --> 00:33:52,960 Speaker 7: little gadgets and pendants and stuff. They all look like 628 00:33:53,040 --> 00:33:57,640 Speaker 7: Apple products. Like, the vocabulary is so Apple-y, the graphic 629 00:33:57,680 --> 00:34:02,000 Speaker 7: design is Apple-y. The whole vibe comes from Apple. But 630 00:34:02,080 --> 00:34:03,880 Speaker 7: Apple is, like, not anywhere. 631 00:34:04,200 --> 00:34:06,720 Speaker 1: And Jony Ive, who designed the iPhone, is working on 632 00:34:07,960 --> 00:34:11,680 Speaker 4: AI. So, like, why are those products coming out? Right, 633 00:34:11,719 --> 00:34:14,920 Speaker 4: it's because the iPhone can't. You can't do what you 634 00:34:15,000 --> 00:34:17,440 Speaker 4: want to do on the iPhone. They won't let you. 635 00:34:17,520 --> 00:34:22,280 Speaker 7: I'm just waiting for the real AI Tamagotchi. That's my dream. 636 00:34:22,480 --> 00:34:24,960 Speaker 7: I can't wait. It needs to happen. 637 00:34:25,239 --> 00:34:27,680 Speaker 6: Wait, Reed. But if we're locked in because it's a 638 00:34:27,760 --> 00:34:31,560 Speaker 6: closed ecosystem, doesn't that give them... at least, like, 639 00:34:31,600 --> 00:34:34,680 Speaker 6: it buys them time to just come in when, you 640 00:34:34,719 --> 00:34:37,480 Speaker 6: know, it's not AI AI, it's, here's your 641 00:34:37,640 --> 00:34:41,120 Speaker 6: useful assistant. Like, I don't know, some 642 00:34:41,200 --> 00:34:44,040 Speaker 6: of these companies, right? Like, fifty years, they've 643 00:34:44,040 --> 00:34:48,560 Speaker 6: been able to parlay their, like, consumer lock-in 644 00:34:48,920 --> 00:34:49,960 Speaker 6: across a generation. 645 00:34:50,360 --> 00:34:54,480 Speaker 4: So, yeah, like Xfinity, for instance, right? Like, they will 646 00:34:54,520 --> 00:34:57,520 Speaker 4: continue to exist. I'm not arguing against that at all. 647 00:34:57,560 --> 00:35:00,319 Speaker 4: Like, I think Apple will go on for a long time. 648 00:35:00,360 --> 00:35:02,920 Speaker 4: This isn't an argument that, like, Apple's 649 00:35:02,920 --> 00:35:05,680 Speaker 4: going to implode and disappear. But it's like 650 00:35:05,760 --> 00:35:08,719 Speaker 4: Microsoft when they missed, you know, 651 00:35:08,719 --> 00:35:12,359 Speaker 4: search and then mobile. Like, Microsoft's still there. 652 00:35:12,360 --> 00:35:15,320 Speaker 4: It's still a very valuable company, but not as 653 00:35:15,480 --> 00:35:18,719 Speaker 4: relevant, and it went through some dark times. Maybe, 654 00:35:18,880 --> 00:35:21,120 Speaker 4: you know, maybe they can make a recovery or something 655 00:35:21,160 --> 00:35:23,560 Speaker 4: like that. But I think it will be dark times 656 00:35:23,760 --> 00:35:26,400 Speaker 4: for Apple. It's not going to have the same, you know, 657 00:35:26,760 --> 00:35:29,400 Speaker 4: the same relevance as it used to once this 658 00:35:29,440 --> 00:35:31,640 Speaker 4: stuff starts to take off. And I'm not an expert 659 00:35:31,640 --> 00:35:34,120 Speaker 4: on the timelines; this could take 660 00:35:34,160 --> 00:35:37,359 Speaker 4: several years. But the writing's on the wall. 661 00:35:37,480 --> 00:35:38,720 Speaker 2: That's my argument.
662 00:35:39,160 --> 00:35:41,160 Speaker 1: I also, you know, take a little bit 663 00:35:41,200 --> 00:35:43,560 Speaker 1: of a, I disagree slightly with you, Reed, that 664 00:35:43,560 --> 00:35:45,760 Speaker 1: people don't buy Apple because of the product. Like, I agree 665 00:35:45,800 --> 00:35:48,600 Speaker 1: the ecosystem lock-in is very powerful, but also, I 666 00:35:48,680 --> 00:35:50,399 Speaker 1: was an Android user for many years, and I bought 667 00:35:50,400 --> 00:35:51,920 Speaker 1: an iPhone. I was like, wow, this is just a 668 00:35:51,920 --> 00:35:53,719 Speaker 1: lot better, I'm not going to get another Android. And 669 00:35:54,040 --> 00:35:56,360 Speaker 1: I recently lost my AirPods when I was traveling, and 670 00:35:56,400 --> 00:35:57,560 Speaker 1: I was like, I don't know about one hundred and 671 00:35:57,560 --> 00:35:59,840 Speaker 1: fifty dollars for AirPods, I'm going to get a fake copied 672 00:35:59,880 --> 00:36:01,719 Speaker 1: pair on Amazon for thirty bucks. I put 673 00:36:01,760 --> 00:36:04,719 Speaker 1: them in and I was like... 674 00:36:04,640 --> 00:36:07,120 Speaker 4: Look, I have AirPods in my ear. 675 00:36:07,200 --> 00:36:09,759 Speaker 4: I have an iPhone sitting, actually I have like three 676 00:36:09,880 --> 00:36:10,960 Speaker 4: iPhones sitting on my 677 00:36:11,040 --> 00:36:13,760 Speaker 2: desk. Like, I'm on a Mac computer. 678 00:36:14,040 --> 00:36:16,239 Speaker 4: Like, I totally get that. It's a 679 00:36:16,239 --> 00:36:19,239 Speaker 4: great consumer experience. This is 680 00:36:19,280 --> 00:36:21,319 Speaker 4: not me being like, I hate Apple and I won't 681 00:36:21,320 --> 00:36:24,160 Speaker 4: buy anything by Apple. No, it's a really 682 00:36:24,239 --> 00:36:27,319 Speaker 4: nice consumer experience; it all works really 683 00:36:27,320 --> 00:36:29,840 Speaker 4: well together. But I'm just looking into the future, and I'm like, 684 00:36:30,320 --> 00:36:32,920 Speaker 4: this idea of, like, opening up 685 00:36:32,920 --> 00:36:35,759 Speaker 4: an app and using it, which we all 686 00:36:35,800 --> 00:36:38,560 Speaker 4: do today, is not the future. Like, that is not 687 00:36:38,719 --> 00:36:41,040 Speaker 4: how... I'm not opening up a banking app and, 688 00:36:41,120 --> 00:36:44,359 Speaker 4: like, transferring funds and stuff. I'm just gonna tell an 689 00:36:44,400 --> 00:36:45,920 Speaker 4: AI agent to do that, and you can do that 690 00:36:46,000 --> 00:36:50,239 Speaker 4: today with, you know, OpenClaw. That's very early-adopter technology, 691 00:36:50,239 --> 00:36:53,360 Speaker 4: but, like, a glimpse into the future again. And I 692 00:36:53,400 --> 00:36:56,560 Speaker 4: can tell a chatbot, like, go, you know, transfer some 693 00:36:56,600 --> 00:36:59,120 Speaker 4: funds at a bank, go book me a flight, right? 694 00:36:59,200 --> 00:37:01,880 Speaker 4: And without ever opening an app. And all that's happening 695 00:37:01,920 --> 00:37:04,799 Speaker 4: outside the mobile ecosystem. And that's kind of where 696 00:37:04,800 --> 00:37:07,440 Speaker 4: it's heading. And it just makes the thing that 697 00:37:07,480 --> 00:37:11,360 Speaker 4: Apple built, that very nice consumer experience, 698 00:37:11,680 --> 00:37:14,239 Speaker 4: not as important anymore.
It's going to be 699 00:37:14,280 --> 00:37:17,840 Speaker 4: sort of, like... I don't think there's a walled garden, 700 00:37:18,440 --> 00:37:20,239 Speaker 4: at least I hope not, because I think it would 701 00:37:20,239 --> 00:37:22,000 Speaker 4: be bad for consumers. Like, I don't think there's, like, 702 00:37:22,040 --> 00:37:23,560 Speaker 4: a walled-garden approach to that. 703 00:37:23,640 --> 00:37:27,000 Speaker 7: It feels like everyone's jostling for control of the new interface, 704 00:37:27,280 --> 00:37:29,759 Speaker 7: like, what are we going to use to interface with 705 00:37:29,800 --> 00:37:33,439 Speaker 7: our AI agents and our OpenClaws and whatever. And, 706 00:37:34,000 --> 00:37:36,960 Speaker 7: you know, Jony Ive and OpenAI are trying to 707 00:37:37,000 --> 00:37:39,759 Speaker 7: do that with some pendant thing that sits on your 708 00:37:39,760 --> 00:37:42,640 Speaker 7: desk, plus the thing that goes in your pocket. There's 709 00:37:42,640 --> 00:37:46,239 Speaker 7: been all sorts of little experiments and different trials. I 710 00:37:46,239 --> 00:37:49,080 Speaker 7: feel like there's more opportunity right now for some other 711 00:37:49,160 --> 00:37:51,759 Speaker 7: company to take this away from Apple and to come 712 00:37:51,840 --> 00:37:54,560 Speaker 7: up with something else. And I was just recalling how 713 00:37:54,600 --> 00:37:58,440 Speaker 7: Apple has kind of bailed on its other innovative hardware projects, 714 00:37:58,480 --> 00:38:01,600 Speaker 7: like, there is no TV, there is no car. And 715 00:38:01,640 --> 00:38:06,440 Speaker 7: the new possible CEO who people talk about, John Ternus, 716 00:38:06,880 --> 00:38:11,080 Speaker 7: he's a hardware engineering guy. Like, it's not clear where 717 00:38:11,080 --> 00:38:11,560 Speaker 7: it's going. 718 00:38:11,920 --> 00:38:14,839 Speaker 6: Did you guys see that picture of Joe Gebbia from 719 00:38:15,000 --> 00:38:18,279 Speaker 6: Airbnb with, like, what looked like the new device? I 720 00:38:18,280 --> 00:38:20,400 Speaker 6: mean, I still want my... If that's the new device, 721 00:38:20,440 --> 00:38:23,399 Speaker 6: I will keep my phone. You know, like you were saying, Kyle, 722 00:38:23,440 --> 00:38:30,000 Speaker 6: it's so derivative of early Apple. So, yeah, I see 723 00:38:30,000 --> 00:38:32,040 Speaker 6: what you're saying, I guess. And I don't even 724 00:38:32,080 --> 00:38:34,680 Speaker 6: know why I'm defending Apple's hardware. I guess I like 725 00:38:34,760 --> 00:38:37,440 Speaker 6: that Tim Cook didn't put AI in his letter, because 726 00:38:37,440 --> 00:38:41,080 Speaker 6: it is meaningless. Like, if you're thinking about, like, 727 00:38:41,120 --> 00:38:44,960 Speaker 6: how to give people real utility, you don't need to... 728 00:38:45,040 --> 00:38:47,200 Speaker 6: like, there is a chance that you could just 729 00:38:47,280 --> 00:38:49,799 Speaker 6: come in on the back end. I mean, fifty-year-old 730 00:38:49,840 --> 00:38:51,960 Speaker 6: companies have a very hard time innovating. 731 00:38:52,160 --> 00:38:54,160 Speaker 4: I think the new interface that you're talking about is 732 00:38:54,239 --> 00:38:56,879 Speaker 4: voice, right? Like, I mean, it won't 733 00:38:56,880 --> 00:38:59,920 Speaker 4: be all voice, but, you know, that's how we're going 734 00:39:00,080 --> 00:39:00,840 Speaker 4: to control this stuff.
735 00:39:00,840 --> 00:39:03,400 Speaker 6: Don't you think? Yeah, but what is this big pebble, 736 00:39:04,520 --> 00:39:06,360 Speaker 6: this big pebble? I'm not going to carry around a 737 00:39:06,400 --> 00:39:07,440 Speaker 6: big pebble. 738 00:39:07,160 --> 00:39:09,080 Speaker 2: Like, you're gonna have to. 739 00:39:09,200 --> 00:39:11,040 Speaker 4: We're gonna have to have some device, right? Because you're 740 00:39:11,040 --> 00:39:14,400 Speaker 4: gonna want some screen occasionally, to look at something visual, or, 741 00:39:15,040 --> 00:39:17,040 Speaker 4: you know, maybe it's in glasses as well. I 742 00:39:17,040 --> 00:39:20,200 Speaker 4: think that technology is, like, really far away. 743 00:39:20,160 --> 00:39:24,759 Speaker 6: I hope that they'll consult more women as they're doing this, because they 744 00:39:24,760 --> 00:39:26,200 Speaker 6: did not for VR. 745 00:39:27,760 --> 00:39:31,840 Speaker 7: AI Tamagotchi all the way. So, we've been here before: 746 00:39:31,920 --> 00:39:35,160 Speaker 7: there's the Pokémon version, there's the Digimon little thing. 747 00:39:35,320 --> 00:39:36,200 Speaker 3: It's gonna be great. 748 00:39:39,480 --> 00:39:41,440 Speaker 6: Yeah, I was just gonna say, because I can't picture 749 00:39:41,480 --> 00:39:43,560 Speaker 6: how to carry it around, like, that big pebble. I 750 00:39:43,600 --> 00:39:44,919 Speaker 6: was like, where am I going to put it? It's 751 00:39:44,920 --> 00:39:46,160 Speaker 6: going to be at the bottom of my bag. 752 00:39:46,400 --> 00:39:49,120 Speaker 1: Who has the inside scoop on the CEO 753 00:39:49,200 --> 00:39:50,880 Speaker 1: succession? Is Tim Cook going to go 754 00:39:50,960 --> 00:39:51,439 Speaker 1: this year? 755 00:39:51,680 --> 00:39:51,879 Speaker 6: Oh? 756 00:39:51,960 --> 00:39:52,360 Speaker 2: I don't know. 757 00:39:52,440 --> 00:39:55,360 Speaker 4: I mean, I just, I'm not, like, deep in 758 00:39:55,480 --> 00:39:58,120 Speaker 4: the company anymore, so I would just be guessing. 759 00:39:58,160 --> 00:40:00,040 Speaker 4: But, I mean, for sure, it's the end of 760 00:40:00,080 --> 00:40:03,960 Speaker 4: Tim Cook's, you know, career, and that's apparent, right? 761 00:40:04,120 --> 00:40:07,839 Speaker 4: Like, when exactly that happens, I don't know. But yeah, 762 00:40:07,920 --> 00:40:11,600 Speaker 4: I think 763 00:40:11,680 --> 00:40:13,959 Speaker 4: it seems like every six months there's, like, a new 764 00:40:14,040 --> 00:40:17,399 Speaker 4: successor, a new name that comes out. And it's like, yeah, 765 00:40:17,400 --> 00:40:20,279 Speaker 4: none of it's, like, super exciting. You're like, could 766 00:40:20,320 --> 00:40:22,400 Speaker 4: we get Steve Jobs? Like, is he available? 767 00:40:22,400 --> 00:40:25,399 Speaker 1: Adam Neumann might be looking for a job. 768 00:40:25,440 --> 00:40:28,320 Speaker 4: Maybe Adam Neumann, I don't know, something 769 00:40:28,400 --> 00:40:31,800 Speaker 4: exciting. But, like, maybe that's not good. Maybe Apple doesn't 770 00:40:31,800 --> 00:40:33,920 Speaker 4: want someone exciting. Maybe the best thing is to be 771 00:40:34,080 --> 00:40:37,120 Speaker 4: like Comcast, or, you know, whatever, and just 772 00:40:37,320 --> 00:40:38,680 Speaker 4: ride it out as long as possible.
773 00:40:38,719 --> 00:40:40,480 Speaker 1: When they did the transition from 774 00:40:40,560 --> 00:40:42,359 Speaker 1: Jobs to Cook, it was from the visionary to 775 00:40:42,440 --> 00:40:44,280 Speaker 1: the technocrat. It's hard to go from the technocrat 776 00:40:44,320 --> 00:40:46,319 Speaker 1: to a new visionary, right? You're kind of 777 00:40:46,320 --> 00:40:47,520 Speaker 1: locked in at that point a little bit. 778 00:40:47,760 --> 00:40:50,880 Speaker 3: Yeah, for sure. We need Jensen Huang and his leather jacket, 779 00:40:51,040 --> 00:40:51,239 Speaker 3: you know. 780 00:40:54,640 --> 00:40:56,200 Speaker 1: Well, that's all we have time for today. But I 781 00:40:56,239 --> 00:40:58,960 Speaker 1: want to end with a quick round robin: who had 782 00:40:59,000 --> 00:41:02,000 Speaker 1: the best and worst week in tech? I'm gonna go first. 783 00:41:02,040 --> 00:41:04,520 Speaker 1: I think Anthropic always wins either the best 784 00:41:04,520 --> 00:41:06,360 Speaker 1: of the week or the worst. I think it's the 785 00:41:06,400 --> 00:41:09,400 Speaker 1: worst this week, with the leak of that code. Interesting. 786 00:41:09,640 --> 00:41:10,399 Speaker 1: What do you think, Reed? 787 00:41:10,760 --> 00:41:11,080 Speaker 3: Well? 788 00:41:11,640 --> 00:41:14,799 Speaker 4: I thought, I mean, the word that Oracle is laying off, 789 00:41:15,719 --> 00:41:18,880 Speaker 4: you know, a huge chunk of their whole company, 790 00:41:19,000 --> 00:41:21,480 Speaker 4: has got to be a really bad, bad week for Oracle. 791 00:41:21,520 --> 00:41:25,319 Speaker 4: And it's all because, well, again, people will say, like, oh, 792 00:41:25,400 --> 00:41:27,719 Speaker 4: the AI, it's taking our jobs. And it's like, well, 793 00:41:27,760 --> 00:41:31,000 Speaker 4: actually, it looks a little different, right? It's like, well, 794 00:41:31,120 --> 00:41:33,439 Speaker 4: they need to invest a huge amount of money in AI, 795 00:41:33,600 --> 00:41:36,480 Speaker 4: they don't really have the cash, and so they're just 796 00:41:36,560 --> 00:41:39,759 Speaker 4: looking in the couch cushions, and it's like, yeah, 797 00:41:39,880 --> 00:41:42,480 Speaker 4: laying off a bunch of employees, okay, that's going 798 00:41:42,560 --> 00:41:44,480 Speaker 4: to get us a little bit closer to what 799 00:41:44,520 --> 00:41:46,080 Speaker 4: we need to invest in this whole thing. 800 00:41:46,600 --> 00:41:49,319 Speaker 1: And it's like just a general sixty minutes. 801 00:41:50,239 --> 00:41:55,960 Speaker 4: Right, exactly. So it's like, I mean, I 802 00:41:56,040 --> 00:41:57,960 Speaker 4: don't know, this is how we don't really know 803 00:41:58,000 --> 00:41:59,759 Speaker 4: how AI is going to affect the world. It's 804 00:41:59,800 --> 00:42:02,160 Speaker 4: like, all these people just lost their jobs because of AI, 805 00:42:02,360 --> 00:42:05,799 Speaker 4: but not because AI took their jobs, right? And I 806 00:42:05,840 --> 00:42:08,880 Speaker 4: think it's a bad week for Oracle. But also, 807 00:42:08,960 --> 00:42:11,440 Speaker 4: that is, to me, just a good example of what 808 00:42:11,480 --> 00:42:12,600 Speaker 4: we were talking about earlier.
809 00:42:13,080 --> 00:42:15,160 Speaker 7: On that note, I was going to say the worst 810 00:42:15,160 --> 00:42:19,280 Speaker 7: week in tech was for journalists, because of this whole 811 00:42:19,360 --> 00:42:25,200 Speaker 7: AI journalism discourse debacle, whatever. Should we use AI 812 00:42:25,280 --> 00:42:28,200 Speaker 7: in writing? Like, I think the answer is that it's 813 00:42:28,239 --> 00:42:30,520 Speaker 7: going to happen, and we're just kind of adapting to 814 00:42:30,560 --> 00:42:34,480 Speaker 7: this new situation. And at the same time, one of 815 00:42:34,480 --> 00:42:37,800 Speaker 7: the controversies was that this New York Times book reviewer, 816 00:42:37,840 --> 00:42:42,919 Speaker 7: who's a British novelist, used AI, he said, to expand 817 00:42:43,000 --> 00:42:46,960 Speaker 7: a book review draft. He couldn't hit his thousand-word mark, 818 00:42:47,080 --> 00:42:51,080 Speaker 7: which is crazy, and so he used AI, and he 819 00:42:51,440 --> 00:42:54,640 Speaker 7: was caught kind of plagiarizing via AI. And my question 820 00:42:54,800 --> 00:42:57,680 Speaker 7: is just, who are you, as a British novelist, to 821 00:42:57,800 --> 00:43:01,359 Speaker 7: not even finish a novel review of a thousand words? 822 00:43:01,719 --> 00:43:05,239 Speaker 1: And this kicked off a bigger journalism debate. And Nitasha, I 823 00:43:05,239 --> 00:43:07,320 Speaker 1: want to know your thoughts on Claude Code. But also, 824 00:43:07,520 --> 00:43:08,759 Speaker 1: I mean, we've 825 00:43:08,760 --> 00:43:10,560 Speaker 1: got three worsts of the week, so I think hopefully you 826 00:43:10,560 --> 00:43:11,960 Speaker 1: can bring us the best of the week. 827 00:43:12,960 --> 00:43:15,920 Speaker 6: Oh, I have another worst of the week. I was 828 00:43:15,960 --> 00:43:20,640 Speaker 6: gonna say Mercor. Mercor is the company that, 829 00:43:21,360 --> 00:43:24,000 Speaker 6: you know, it hires those, like, PhDs 830 00:43:24,040 --> 00:43:28,560 Speaker 6: in biochemistry to create data that goes to the big 831 00:43:28,600 --> 00:43:33,520 Speaker 6: frontier labs, to put the PhDs in biochemistry out 832 00:43:33,560 --> 00:43:37,279 Speaker 6: of business. And they had this massive, massive leak. They 833 00:43:37,320 --> 00:43:42,400 Speaker 6: had video interviews with individuals. They have so many 834 00:43:42,480 --> 00:43:45,720 Speaker 6: clients, and I feel like we're just starting to get 835 00:43:45,760 --> 00:43:48,399 Speaker 6: a sense of, where is that data going to go? 836 00:43:49,440 --> 00:43:50,960 Speaker 6: You know, does this mean it's, like, out on the 837 00:43:51,000 --> 00:43:54,719 Speaker 6: internet and everybody will have access to this, like, top-tier 838 00:43:54,840 --> 00:43:57,640 Speaker 6: post-training data? That one is, like, 839 00:43:57,840 --> 00:44:00,560 Speaker 6: I'm not usually that interested in cybersecurity, but I really 840 00:44:00,600 --> 00:44:02,040 Speaker 6: want to see where this one goes. 841 00:44:01,920 --> 00:44:03,520 Speaker 1: And how does that relate? Is it a separate 842 00:44:03,560 --> 00:44:05,440 Speaker 1: story to the Claude leak? Just tell us a 843 00:44:05,440 --> 00:44:06,680 Speaker 1: little bit about that before we go on.
844 00:44:07,760 --> 00:44:11,960 Speaker 6: Yeah, the Claude leak, I think, is spiritually very 845 00:44:11,960 --> 00:44:15,040 Speaker 6: different, because you already have a lot of that 846 00:44:15,120 --> 00:44:18,480 Speaker 6: code on your computer, so it wasn't as, I mean, 847 00:44:18,520 --> 00:44:21,920 Speaker 6: it was, like, illuminating for people. But this is, like, 848 00:44:22,040 --> 00:44:26,960 Speaker 6: proprietary information that they owed to their customers, and, you know, 849 00:44:27,160 --> 00:44:30,760 Speaker 6: there was no way that they wanted even a smidge 850 00:44:30,760 --> 00:44:31,520 Speaker 6: of this to get out. 851 00:44:31,680 --> 00:44:33,640 Speaker 1: Okay, I have a best of the week: the guy 852 00:44:34,040 --> 00:44:38,919 Speaker 1: from Medv, or Medvi, who created the first unicorn of two, 853 00:44:39,040 --> 00:44:41,879 Speaker 1: him and his brother, the first billion-dollar-run-rate 854 00:44:41,960 --> 00:44:45,520 Speaker 1: company selling GLP-1s and Viagra on the internet. 855 00:44:45,520 --> 00:44:48,440 Speaker 7: It's a thing. Like, they're just doing the marketing and 856 00:44:48,920 --> 00:44:50,879 Speaker 7: talking to patients. 857 00:44:51,320 --> 00:44:53,640 Speaker 1: And I think that they're somehow connecting 858 00:44:53,680 --> 00:44:55,880 Speaker 1: patients and doctors in a quicker way than is 859 00:44:55,880 --> 00:44:59,560 Speaker 1: otherwise possible, to bypass the old-fashioned prescription process. 860 00:45:01,840 --> 00:45:05,759 Speaker 6: I mean, it is the first billion-dollar one-person, two 861 00:45:05,760 --> 00:45:06,680 Speaker 6: person company. 862 00:45:06,960 --> 00:45:09,040 Speaker 2: It wasn't born, you know. 863 00:45:09,480 --> 00:45:10,400 Speaker 6: I mean, there's 864 00:45:11,719 --> 00:45:14,400 Speaker 2: something... no, but it's just boring. 865 00:45:14,480 --> 00:45:16,200 Speaker 4: I'm just saying, that was, like, 866 00:45:16,239 --> 00:45:18,360 Speaker 4: web one point oh. Like, let's do something different. 867 00:45:18,400 --> 00:45:20,440 Speaker 1: You know, he has a digital avatar of himself to 868 00:45:20,440 --> 00:45:23,280 Speaker 1: deal with his day-to-day life as well. The founder, 869 00:45:23,320 --> 00:45:26,080 Speaker 1: he's already going all in on 870 00:45:26,120 --> 00:45:26,960 Speaker 1: the AI moment. 871 00:45:26,960 --> 00:45:29,840 Speaker 2: He should be a... go ahead, Reed. Can I throw one in? 872 00:45:29,960 --> 00:45:31,080 Speaker 2: I mean, we said it earlier. 873 00:45:31,120 --> 00:45:33,400 Speaker 4: I mean, I think SpaceX, right? It's gonna be the 874 00:45:33,480 --> 00:45:37,480 Speaker 4: largest IPO ever, right? They filed confidentially 875 00:45:37,520 --> 00:45:40,200 Speaker 4: this week, right? I mean, you know, you 876 00:45:40,280 --> 00:45:42,680 Speaker 4: gotta hand it to them. That's pretty... And, you know, 877 00:45:42,840 --> 00:45:45,600 Speaker 4: also, people are just talking about space a lot, right? 878 00:45:45,640 --> 00:45:49,719 Speaker 4: You've got the Artemis launch, you've got the Project Hail 879 00:45:49,800 --> 00:45:50,960 Speaker 4: Mary movie. 880 00:45:51,760 --> 00:45:55,480 Speaker 1: You know, I feel like Nitasha's not such a space stan. 881 00:45:56,760 --> 00:45:59,040 Speaker 3: Everyone who can go to space is going to space. 882 00:45:59,840 --> 00:46:03,440 Speaker 2: I just wish... I
883 00:46:03,400 --> 00:46:06,280 Speaker 6: wish people would read more Octavia Butler if they're thinking 884 00:46:06,280 --> 00:46:09,680 Speaker 6: about space, rather than, you know, what is he, 885 00:46:09,880 --> 00:46:15,279 Speaker 6: like, Robert Heinlein. Uh, I'm very curious to look at 886 00:46:15,320 --> 00:46:17,359 Speaker 6: the SpaceX IPO, because I think we're 887 00:46:17,360 --> 00:46:22,120 Speaker 6: going to have some, like, Oracle-type moving stuff around 888 00:46:22,160 --> 00:46:26,160 Speaker 6: in order to pay for those data centers. And, you know, 889 00:46:26,200 --> 00:46:29,480 Speaker 6: it's just been rolling up and rolling up companies, 890 00:46:30,040 --> 00:46:34,400 Speaker 6: right? Like, X into xAI, xAI into, is it 891 00:46:34,440 --> 00:46:35,399 Speaker 6: into SpaceX now? 892 00:46:35,400 --> 00:46:37,759 Speaker 4: And I'm, like, blanking. Well, 893 00:46:38,239 --> 00:46:41,920 Speaker 4: it's, yeah, it is SpaceX, 894 00:46:41,960 --> 00:46:43,000 Speaker 4: but it's going to be Tesla. 895 00:46:43,120 --> 00:46:46,479 Speaker 2: It's going to be Elon KRK, right? That's the plan. 896 00:46:46,560 --> 00:46:49,520 Speaker 4: They go public, and then they'll join with Tesla 897 00:46:50,040 --> 00:46:52,600 Speaker 4: in some capacity, and you're just... 898 00:46:52,600 --> 00:46:54,480 Speaker 6: You're just... I mean, do we think there's a chance 899 00:46:54,560 --> 00:46:59,080 Speaker 6: that if people look at the numbers... like, 900 00:46:59,520 --> 00:47:03,360 Speaker 6: he's an incredible showman for stockholders, right? So do 901 00:47:03,400 --> 00:47:05,040 Speaker 6: you think some of that will change if we see 902 00:47:05,040 --> 00:47:05,600 Speaker 6: the numbers? 903 00:47:06,280 --> 00:47:09,280 Speaker 4: I mean, I think you just have to ask retail 904 00:47:09,360 --> 00:47:12,520 Speaker 4: investors what they think, and I think the 905 00:47:12,560 --> 00:47:14,480 Speaker 4: answer is, like, we love the Elon, right? I mean, 906 00:47:14,480 --> 00:47:17,719 Speaker 4: people just... he is, I mean, he is 907 00:47:17,760 --> 00:47:20,279 Speaker 4: the Steve Jobs of our day, right? 908 00:47:20,320 --> 00:47:23,799 Speaker 4: He can convince people of anything. You know, 909 00:47:23,840 --> 00:47:26,759 Speaker 4: you can tell a story about space data centers and 910 00:47:26,880 --> 00:47:29,920 Speaker 4: humanoid robots and all that, and I think, you know, 911 00:47:30,280 --> 00:47:34,240 Speaker 4: that formula still works, right? I mean. 912 00:47:34,800 --> 00:47:37,520 Speaker 1: Kyle, is Elon the Steve Jobs of our age? 913 00:47:38,360 --> 00:47:41,759 Speaker 7: Oh man, I mean, Steve Jobs was not sending us 914 00:47:41,800 --> 00:47:47,360 Speaker 7: to space. He could be, like, the Thomas Edison or whatever, 915 00:47:47,560 --> 00:47:50,200 Speaker 7: the, if not the pure inventor, then at least 916 00:47:50,200 --> 00:47:53,400 Speaker 7: the showman who can dress all of this stuff up 917 00:47:53,520 --> 00:47:57,120 Speaker 7: for public consumption. Speaking of space, though, I also want 918 00:47:57,200 --> 00:47:59,600 Speaker 7: to add my best of the week, which is 919 00:47:59,640 --> 00:48:05,520 Speaker 7: Nintendo and Super Mario Galaxy, Odyssey, whatever, which seems 920 00:48:05,560 --> 00:48:09,239 Speaker 7: to be a rousing success for children everywhere.
921 00:48:10,600 --> 00:48:12,880 Speaker 4: You know, my kids are going to the movie later 922 00:48:13,120 --> 00:48:15,000 Speaker 4: today, so, yeah. 923 00:48:15,160 --> 00:48:18,920 Speaker 1: It's a hard choice between the AI Apocalypse Optimist and 924 00:48:19,200 --> 00:48:22,040 Speaker 1: the Mario-in-space movie, I guess, for your kids. 925 00:48:23,000 --> 00:48:25,399 Speaker 4: I haven't even seen Project Hail Mary yet, so 926 00:48:25,600 --> 00:48:29,120 Speaker 4: the Apocalypse Optimist is gonna have to wait 927 00:48:29,160 --> 00:48:29,440 Speaker 4: for me. 928 00:48:33,320 --> 00:48:33,520 Speaker 3: In it. 929 00:48:33,600 --> 00:48:37,120 Speaker 1: That's it. Thank you. 930 00:48:37,120 --> 00:48:39,680 Speaker 3: Aw, thanks for having us. 931 00:48:39,760 --> 00:48:40,640 Speaker 2: That was fun. 932 00:48:54,560 --> 00:48:57,319 Speaker 1: For TechStuff, I'm Oz Woloshyn. This episode was 933 00:48:57,320 --> 00:49:00,600 Speaker 1: produced by Eliza Dennis and Melissa Slaughter. Executive produced by 934 00:49:00,680 --> 00:49:03,600 Speaker 1: me, Julian Nutter, and Kate Osborne for Kaleidoscope, and Katrina 935 00:49:03,680 --> 00:49:07,279 Speaker 1: Norvell for iHeart Podcasts. Jack Insley mixed this episode, and 936 00:49:07,360 --> 00:49:10,360 Speaker 1: Kyle Murdoch wrote our theme song. Special thank you to 937 00:49:10,440 --> 00:49:13,480 Speaker 1: Reed, Kyle, and Nitasha. Please check out all three 938 00:49:13,520 --> 00:49:16,160 Speaker 1: of these incredible journalists' work; we're very lucky to 939 00:49:16,200 --> 00:49:17,359 Speaker 1: call them friends of the pod.