Speaker 1: Welcome to Tech Stuff.
Speaker 2: I'm Oz Woloshyn, and I'm thrilled to announce that the Week in Tech is back, and it's growing. Instead of Karah and I recounting the essential news, today and every Friday from now on I'm going to be joined by three of the best writers covering Silicon Valley. Reed Albergotti, technology editor for Semafor. I read your newsletter religiously, and we had a lot of fun on Tech Stuff a few months ago with a story you wrote comparing different AI companies to characters in The Wizard of Oz.
Speaker 3: Welcome back. Thanks, it's good to be here.
Speaker 2: Kyle Chayka writes the Infinite Scroll newsletter for The New Yorker. Kyle, last time you were on Tech Stuff, you talked about your holiday tech gift guide, which leaned quite surrealist. Happy to have you back.
Speaker 1: It was very fun.
Speaker 2: Thank you. And Taylor Lorenz, who joined a great discussion on last week's panel and is back this week. Taylor writes the User Mag newsletter and hosts the Power User podcast.
Speaker 1: Taylor, thanks for joining us again.
Speaker 4: Thanks for having me.
Speaker 2: So, I've been following each of you for a long time, and many of you have also known each other for a long time, or at least you all know Taylor.
Speaker 5: Yes, I don't think Reed and I have ever met, but I follow his work, of course. Likewise, likewise.
Speaker 2: So just imagine we're all friends sitting around a table in a living room with huge microphones in our faces, just shooting the breeze about technology. And let's really take our listeners on a journey, because each of you lives and breathes your beat and has, as far as I'm concerned, better insights and access than anyone else in the world. Okay, let's get into it. And Reed, let's start with you. You have what could be described as an insiderish story this week, a company conference, except not just any company conference: the massive developer conference held by the multi-trillion-dollar company Nvidia.
Speaker 1: Yeah.
Speaker 3: Yeah, it's funny you say insidery, because I was in this press conference with Nvidia CEO Jensen Huang yesterday, and I think there were three hundred reporters in the room, if you can imagine that. I was thinking, like, I wonder how many reporters are in the White House right now. It was an insane amount of interest in and coverage of this thing. But he said something that I thought was interesting; I think this made the rounds yesterday. He said, you know, his life philosophy is three things. Essentially, he said, don't get fired, don't get bored, and don't get killed. Words to live by, literally, right? Exactly. And he said living in that triangle is higher risk than you think. And I sort of thought about that, and I'm like, that actually does explain a lot of what you're seeing from Nvidia, where it's just a company, like all the Mag Seven, all these big tech companies, that ends up in this place where you get to a certain size and success and you end up just competing in everything, on every front. And the fronts are just expanding for Nvidia into all these new areas. And the big externality, I think, that has happened since last year's big Nvidia conference here, although they seem to do something like this a few times a year now, is that OpenClaw happened. And that's been sort of, I think of it as, like, the ChatGPT moment that everyone's ignoring, because the world is on fire, and it's also not something that anybody can just fire up and do like ChatGPT. But it's like a glimpse into the future, where if you talk to people out here who are using this, you're like, oh, of course, of course this is where things are headed. But it's also quantum computing, it's video games still, it's like everything: space.
Speaker 2: Yeah. So Kyle, I saw you smiling when Reed was talking. Can you give us a little bit of the backstory, like what are the essentials we need to know about OpenClaw, and then what were you grinning about?
Speaker 5: Man, well, I was grinning about the fact that I don't think anyone outside of tech understands it at all yet. I think there are these memes coming out of China of, like, AI anxiety after this OpenClaw moment, from people who are now scrambling to let agents take over their inboxes and their text messages and their computing lives. And I think there was one incident in which an OpenClaw agent just deleted someone's entire email inbox or something like this. So it's like, I don't know, it's like vaping or something. It's like, let's just inhale it straight into our bodies and see what happens. But I do, as Reed said, think it's this tidal wave that everyone's ignoring or just can't quite understand.
Speaker 2: This orchestra of AI agents who basically have full access to your computer and all your emails and everything you've ever done, and then make decisions on your behalf.
Speaker 3: This guy I was talking to who was doing this, one of these early adopters, was telling me just how he has completely, blindly turned his life over to his OpenClaw implementation. And I'm like, aren't you worried it's gonna delete your inbox or do something really bad? And he's like, oh, I think it will. He's like, but I'll just learn from it. It'll teach me a lesson that will probably give me some idea for my next company.
Speaker 2: And I'm just like, okay. But the OpenClaw thing was happening before Nvidia's conference, like, why was it the topic du jour of GTC?
Speaker 3: Yeah, and by the way, I don't even know if it was the topic du jour. It's just, to me, the best example of how this thing happens and now Nvidia has to scramble and figure out how to play in that space.
Speaker 2: And the way they're stepping into the OpenClaw space is essentially building their own product around OpenClaw called Nemo Cure, is that right, Reed?
Speaker 3: That's right. And right now it's so expensive and so risky from a security standpoint to do this, to just turn your life over to many hundreds of agents and different AI models. But the prices are coming down so much, and the infrastructure is being built around it so fast, that that just means a lot more tokens, a lot more revenue for companies like Nvidia and all the hyperscalers, but also a new field, a new battlefield, to try to gain market share or potentially lose market share. And so Nvidia is doing this stuff around that.
Speaker 2: And they're trying to create a more secure architecture around this open-source thing, right, to basically harness the power of OpenClaw but make it somewhat safer.
Speaker 1: Right, that's kind of the goal.
Speaker 2: Taylor, what are you thinking about OpenClaw? I saw you smiling as well. And we'll come back to you afterwards.
Speaker 4: Yeah, I have FOMO, and I want a Claw. I want one of these little bots. I was debating buying a Mac mini; I just can't justify the, like, five hundred dollars or whatever it costs these days. I kind of feel like that person who said that they might learn from it. Like, now, I wouldn't give it unmitigated access to my email, but I do think that Reed is right that we're probably moving towards this sort of agentic future or whatever you want to call it. And I don't know, you guys know how time-consuming it can be to be a freelancer. Even if this thing could just invoice companies for me, it would be amazing.
Speaker 5: Yeah, it feels like it's proving out that AI will be used a lot by normal people. Also, like, we'll all be using these large amounts of tokens. It's going to be embedded much more deeply in our workflows than just using a chatbot.
Speaker 2: Yeah. On Nvidia specifically, I mean, it's interesting, right. Taylor and Kyle, you both focus on kind of the intersection of technology and culture. I was thinking this morning about how to relate Nvidia to both of your beats, and I had the rather depressing thought that maybe all culture now sits downstream of Nvidia.
Speaker 5: Well, I was thinking about that, because I do feel like this is one of the biggest moments for Jensen Huang as, like, a meme, right? Suddenly his image was everywhere, and he's been wearing the same leather jacket for years or decades or whatever, and there's crowds of people around him. And so suddenly it does feel like a cultural movement as much as a technological one, for sure.
Speaker 2: Reed, coming out of this story, what's the, you know... One of the interesting things was that there were all these incredible announcements and more and more revenue, and yet Nvidia's stock price was, I think, flat or slightly down. How did people react to this? Did that correspond to the mood in the room? Or is it just that they've had so many wins that the market has kind of priced them at this place, there's nowhere more for them to grow? And what are the competitive threats?
Speaker 3: It's funny, the thing about Nvidia is it's almost like too simple of a story, right? It's like, there's these things called tokens that we're all using, right, that's what we're using when we query ChatGPT, and people can't get enough of them, and they're willing to pay a lot of money for them.
And you could see with this OpenClaw stuff, the more tokens you use, the better it gets, and people are paying for it. And it's like, okay, so these companies are making a product that is pretty profitable, has good margins, that everybody wants, and they can't make enough of it. That's just a good business. But it's like too simple for tech. Normally tech is like, we're gonna get all these people, these eyeballs, and it's gonna be free, and then we'll figure out how to monetize it, and we'll offer all these free services. And it's like, wait, you mean you're just gonna charge people a lot of money for a product and make a bunch of money on it? That's so weird. It's like a normal business.
Speaker 2: Taylor and Kyle, I'm curious, before we move to the next story, what are your burning questions about Nvidia? I mean, the biggest company on the US stock market.
Speaker 4: I think, you know, that's a really good question. I don't know. I mean, I kind of come to it from the same angle as Kyle, maybe, of, like, I'm very interested to see how they navigate this AI backlash, how Jensen kind of ascends into this role of a very public CEO. I mean, he's been CEO for so long, and I think most of his time as a tech leader has been kind of behind the scenes, unlike a Steve Jobs.
Speaker 2: He didn't rebrand; he's worn that leather jacket for a really long time.
Speaker 1: He already, he already...
Speaker 4: But I also think it's interesting, you know, we had Sam Altman, the OpenAI CEO, at the Vanity Fair Oscar party, where a famous playwright, I guess, called him, like, a Nazi. And there is this kind of burgeoning anti-tech sentiment, especially from more left-leaning spaces. And so I don't know that we'll see Jensen Huang out at, you know, Hollywood parties, but I do think that he's going to have to step into his public image and curate a public image for himself.
Speaker 3: That is right on.
Speaker 5: It's like, Nvidia is a brand, but what is it the brand of? Like, a chip, or tokens? It's a utility rather than a particular product. And I think I saw, at a press conference or just the scrum afterward, Jensen just said, everyone, the Nvidia ecosystem is rich. It's this kind of Midas thing, like everything I touch turns to gold. But that also feels kind of cursed, like you're flying too close to the sun. So I'm curious how much higher it can even go. Like, are we just getting started, or is this a kind of threshold moment?
Speaker 1: Taylor, moving right along.
Speaker 2: You mentioned the techlash, and you brought to our attention a story about a Senate committee hearing this week with the title Liability or Deniability: Platform Power as Section 230 Turns Thirty. Now, before we get into that hearing: you've been covering Section 230 for a decade. You put out a six-part YouTube series devoted to it, you spoke about it at South by Southwest last week. What drives your fascination with this law? What do we need to understand about it, and what might change based on all of these legislative efforts?
Speaker 4: Yeah, so Section 230 is a law that frankly should be very noncontroversial. It is a law that effectively says that you, as a speaker of content online, should be held liable for your own speech.
It's a foundational law. It's what allows the Internet to kind of exist, because the Internet is not like a public space, right? It's not like the real world, where there is a physical world out there. So you need people or companies to kind of go in and set up spaces in this world, whether that's a website, whether that's a forum, whether that's a social media platform, et cetera. And, you know, there was this really, really bad article written about a decade ago on Section 230 that was claiming that, I guess, Facebook was using Section 230 to kind of not police hate speech. Ironically, Section 230 is what allows platforms to do moderation. But that kind of set off this decade-long discussion where pretty much everyone in power wants to transform the Internet from this open place where people can freely discuss ideas to something more like the mainstream media, traditional media, where it's a top-down sort of corporate control over speech, and every single platform or forum admin or person online, you know, is responsible for the speech that they host.
Speaker 2: Reed, I noticed you nodding along to Taylor. I'm always curious, when people are nodding, what they're nodding about.
Speaker 3: No, I mean, I just think what she's saying makes sense, and you're, I think, explaining it in a good way, for sure. I don't think there's anything controversial in that. It was sort of laying out the facts.
Speaker 2: Now to the controversy. Taylor, take us away.
Speaker 4: Well, so, you know, as the Internet has grown and these platforms have become more powerful, a lot of people in power want to repeal Section 230. This would devastate the Internet.
It would transform the Internet from a place, again, where there is user-generated content to a place where the whole Internet is Netflix, right, where you have to apply to a platform to get your content on there, and nobody can kind of reshare content. You know, Section 230 is what allows you to retweet a tweet or forward an email. Removing Section 230 would not allow you to engage in that way. It would just be sort of a Netflix-style Internet. Obviously, this is something people in power really want. They don't like that average people can go on the Internet and sort of connect with each other. So there's this aggressive effort to repeal Section 230 that, ironically, the big platforms are also now behind, because with the rise of AI they feel comfortable pre-screening every single piece of content via AI to kind of ensure that any speech on the platform aligns with what the government wants. This is why Meta has been sort of integral to these efforts to chip away at Section 230.
Speaker 2: This is so interesting, because I think the way Section 230 is commonly understood, and you mentioned that Wired article from a few years ago, I haven't read it, but it's essentially that the battle over Section 230 is the battle where regular people are fighting back against the tech companies and their ability to harm people with content and not be held responsible.
Speaker 4: Well, Section 230 protects those regular people. So Section 230 ensures, again, that you as a user without a lot of power have the ability to speak to power. It allows for user-generated content. If we didn't have that, it would consolidate power among big tech. Now, you know, I've covered tech for a long time, especially in light of the techlash over the past decade.
About a decade ago, these far-right religious fundamentalist groups and organizations that have always wanted a censored Internet, groups like, you know, the Heritage Foundation, realized that by claiming that repealing Section 230 was cracking down on big tech, they've gotten a lot of, I think, leftists and liberals to kind of fall for their ruse and go along with this. Now, it's important to note they did have a big victory. Back in twenty eighteen we saw FOSTA-SESTA pass. FOSTA-SESTA was the first major carve-out to Section 230 that was framed as cracking down on big tech. I would ask people, you know, do you think Facebook and Google are more or less powerful today than in twenty eighteen? I think we know the answer to that.
Speaker 2: And the hearing this week with Senator Ted Cruz, what's going on here? What can we expect, and what's different from the bipartisan legislation introduced by Lindsey Graham and Amy Klobuchar and others late last year on Section 230?
Speaker 4: It's essentially the same thing. There is this Sunset Section 230 Act that, again, has been brought by the Democrats that want to seize control over online speech. I have to say, I know this is bipartisan. A lot of issues related to mass surveillance and censorship are bipartisan, unfortunately.
Speaker 2: You told us last week, I think you said, that mass surveillance is always a bipartisan issue.
Speaker 4: Exactly. And so, you know, I think it's kind of terrifying that we're seeing this happen under the Trump administration. I mean, I certainly couldn't do my job, I wouldn't have a job as an independent journalist, if Section 230 didn't exist. But yeah, there's this big hearing to kind of discuss all of this on the Hill.
Speaker 1: Go ahead, Kyle.
Speaker 5: I have some of the same worries as Taylor here, that Section 230 does protect a lot of speech on the Internet.
It's meant to kind of separate, a little bit, the platform from the content on it, and it has allowed Facebook to exist, and Instagram and YouTube to exist. But I think if we look forward into the future, these tech giants are going to find ways around whatever regulations exist. They will find ways to profit from this situation if it's repealed, and the smaller platforms, the more independent spaces, spaces that don't have access to millions of AI tokens as we were just discussing, will kind of fall by the wayside, and we will be left with that narrowed, more homogenized Internet. I think there are good and bad parts of Section 230, it's not just one thing, but it does really protect our ability to put things out on the Internet, and for Internet hosts to support the publishing of things online without being responsible for every single thing.
Speaker 4: I just want to say, I actually don't think there are bad parts of Section 230. I think it's such a short, uncontroversial law. Meta and Google and all these big tech companies have been using so many other laws to evade responsibility and to consolidate corporate power, and we could pass so many laws. We could reform things like the Computer Fraud and Abuse Act, which Meta uses to crush competitors. We could, you know, prosecute them for anticompetitive behavior. There are so many legislative and sort of broader political things that we could do to curb the power of these big tech companies. But chipping away at Section 230 only chips away at the power of users, you know, to speak truth to power, and I think that's really dangerous, especially right now.
Speaker 1: Kyle.
Speaker 2: I want to come back to you, because you wrote a book called Filterworld, and you wrote about Section 230 in that book, and here's what you said.
"The problem with Section 230 is that, ultimately and bizarrely, the law makes it so no one is currently responsible for the effects of algorithmic recommendations."
Speaker 4: That's not true. How would you say no one is responsible?
Speaker 5: Well, the law that Section 230 was based on, or one of the precedents, was this bookstore ruling, where a bookstore cannot be held responsible for the contents of the books that it's providing. And I think the platforms and the way the Internet works have evolved since the time that Section 230 was put into place, and algorithmic recommendations and kind of top-down censorship or filtering have become more aggressive. And so I think Section 230 has protected a lot of that algorithmic filtering action over the years.
Speaker 4: Kyle, I just did an episode on my channel last week, which I hope you'll watch, about this idea that Section 230 wasn't written to protect algorithms. The two cases that led to Section 230, the Prodigy and CompuServe cases: both of those platforms used algorithms to recommend content. Algorithms are speech as defined by the First Amendment. I think what people need to realize, and I'm not saying this is you, Kyle, is that a lot of people have problems with speech law. And what Section 230 does is effectively just make it cheaper to win cases on First Amendment grounds. So the same cases that a lot of these smaller platforms win on Section 230, it only costs them fifty to one hundred thousand dollars to fight that case and win, but if they were to fight it on First Amendment grounds, it would cost five million to ten million dollars. So, you know, algorithmic editorial choices are speech in themselves.
And the idea has always been to give these platforms the power of publishers, to make them publishers, but without the sort of liability of treating them as newspapers. As Kyle said, it's much more treating them like a bookstore. But a bookstore does put books in the front window, right? A bookstore does make recommendations to readers. And so I think there are ways to hold these platforms accountable for, you know, a lot of the bad effects that they've had on society. But again, I don't think Section 230 is the law to do that through.
Speaker 3: Isn't that what these groundbreaking social media lawsuits that are happening right now are sort of based on, that the algorithm is, you know, the thing that these companies should be held liable for?
Speaker 4: Well, yeah, it's just very funny, because once you look at it, there is no problem with the algorithm; the problem is the content, right? What people have problems with is the content. If everything on Instagram, for instance, was just Wikipedia articles, and that's the only thing you could see was Wikipedia articles, but it had the exact same recommendation algorithm, would you have a problem with it? The answer is no, these people would not, because we have platforms that are like that, online learning platforms that operate with the same sort of engagement-based recommendation algorithms, and people don't have a problem with it. Or something like Spotify, right, which has an enormous amount of data on you and will feed you, you know, if you're depressed, you get more depressing songs, and maybe those depressing songs make you want to kill yourself.
Speaker 5: This is one of the cases, a British woman who was unfortunately pushed a lot of depression content from Pinterest and then ended up dying by suicide.
And, you know, there are problems with algorithmic recommendations and particular types of content, but, as Taylor was saying, there are also better legal frameworks and better law packages being discussed now that can address them. Some EU laws, like the Digital Services Act, let you modify your algorithmic feed, things that give you more rights to your data and your information. So we can use other laws to target these issues, kind of on the downstream end, for sure. But...
Speaker 3: Ultimately this is about who can be sued for what, right? I mean, that's what it really comes down to. Can you win a lawsuit for harm caused by something you saw online, or content you saw online?
Speaker 4: Right. Well, I think we've seen this moral panic forever, right? And I would argue, yes, you can right now, you can win a lawsuit. I mean, you can sue the person that made that content that is, you know, allegedly harmful. You can sue them, and they can be held responsible. And that's a really good system. What we don't want to do is make it so that every platform is legally liable for every piece of content on their website, or there will be no such thing as journalism on the Internet, because all of these platforms will simply, you know, not want to host any sort of legally questionable content.
Speaker 3: And a lot of content is anonymous, or it's not in the US, so you could sue, but you're not going to get much out of it. So the only, you know, deep-pocketed entity you can really go after is the platform, right, the tech company.
Speaker 4: Well, I think a lot of these sort of critics of these issues need to get their story straight. Do you have a problem with online anonymity? Okay, there are better ways to address that.
Do you have a problem with the fact that there's harmful content abroad that's being brought to Americans? Okay, let's address that through, again, lots of different things. Or do you have a problem with the content on social media? And if you have a problem with the content on social media, what I would posit is that a lot of this stuff is sort of the same moral panic that we see in a lot of content-based freakouts, where it's like, you know, you don't want your child seeing XYZ. And that gets into speech law, and we should be very careful about how we ask the government to regulate content.
Speaker 2: It's interesting, Senator Ron Wyden, who drafted this law, said a couple of things recently. One, he said, without Section 230, anyone who merely shared a story or allegation about Epstein and his associates on their social media could be sued by Epstein's deep-pocketed pals, along with the site that hosted those posts. But he also said he was open to reform, not full-on repeal, but some tweaks of Section 230 to address some of the concerns people have.
Speaker 4: I think with that he's talking about AI primarily, because I think the next big question is, as these AI platforms sort of generate their own forms of speech, how should they be regulated? How should large language models be regulated? And I think we're seeing them use Section 230 to sort of claim, well, this is just sort of equivalent to user-generated content. And I think that's sort of the question there, yeah.
Speaker 3: I would love to hear your view. I don't know if Section 230 is really going to be fully repealed ever, I mean, that seems unlikely, but just as a thought experiment, what would the Internet...
What would actually happen on these tech platforms? Just imagine a world where it does get completely repealed. What would change?
Speaker 4: Yeah, I think we don't have to imagine. We see what happened when we chipped away at Section 230. So again, FOSTA-SESTA is the first major amendment, and it's the biggest carve-out to Section 230 that we've ever had. FOSTA-SESTA is a law that passed back in twenty eighteen that was supposed to hold platforms liable for content that would incentivize sex trafficking. All of the same arguments they're making today about Section 230, we went through all of this a decade ago, and we actually can see exactly what happened. What happened is, first of all, it didn't reduce sex trafficking even in the slightest. What it did is mass-censor LGBTQ content and abortion content from the Internet. We can also look at other countries that have more authoritarian laws, like China, and I know people have a lot of complicated thoughts on China, but China does have a very restrictive Internet where you cannot say things that are not tacitly approved by the government, and the government effectively deputizes their big tech companies as state censors. And that is the world that we're moving towards here in America, and I think that should concern anybody that cares about civil liberties or free speech online. So I think we should protect Section 230. But I'm all for cracking down on these big tech companies and regulating them a lot, or heavily, you know, in many other ways.
Speaker 2: We're going to take a quick break now, but when we come back, we'll hear from Kyle about why taste has become such a buzzword in Silicon Valley. Stay with us.
Speaker 2: Welcome back to Tech Stuff. Kyle, you're up, and you wrote a column this week in The New Yorker about why tech bros are now obsessed with taste.
Speaker 1: I actually lolled reading it. Here was my favorite...
Speaker 2: Here's my favorite quote: "The entrepreneur and former ByteDance engineer," is it Coney Wang or Connie Wang? Song? Oh, I'm saying it completely wrong, "echoed a new Silicon Valley axiom in a blog post, writing, quote, in the AI era, personal taste is the moat." Startups apparently need taste, this is you, like AI needs data centers.
Speaker 5: Just trying to have fun, you know. I feel like this was kind of my way into the AI culture beat, in a way that's not just talking about, like, humanoid robots killing people or something like that. It's this kind of mania that Silicon Valley, and tech bros broadly, on X in particular, have with the word taste. And they're all trying to make their taste better and have taste in AI models and, you know, train themselves essentially to desire better things.
Speaker 2: And I was just so sick of it. Like, guys. How did you become aware of the phenomenon?
Speaker 1: When did taste start ringing like a fire alarm?
Speaker 4: How can you escape it? If you're on tech Twitter, it's all they've been posting.
Speaker 1: Yeah.
Speaker 5: I mean, I'm kind of a canary in the coal mine here, because I've been writing about taste and algorithms for, like, ten years now, so I'm very sensitive to the use of that word. And I just started seeing it crop up more and more and more. And I think the threshold moment, or the climax of this, was a few weeks ago, two weeks ago or something: Paul Graham, the technologist, the great entrepreneur, tweeted, like, in the AI age we all need better personal taste. And I was just like, oh my god. Personal taste was the realm of, like, hipsters and DJs and, like, you know...
Speaker 2: Where I worked in the mid-two-thousands. I feel like it's crazy that this word has been claimed by the technology industry, like...
596 00:29:41,800 --> 00:29:45,280 Speaker 5: The industry that is widely seen as the most tasteless 597 00:29:45,320 --> 00:29:49,520 Speaker 5: thing on earth, like no shade, like I cover tech, 598 00:29:49,600 --> 00:29:52,520 Speaker 5: I love tech, I love the Internet. Tasteful is not it. 599 00:29:52,800 --> 00:29:57,560 Speaker 2: Like, does taste mean, like, cigars and whiskey to these folks? 600 00:29:57,880 --> 00:30:01,560 Speaker 2: Does it mean, what's the, like, what, what's the, what's beneath 601 00:30:01,600 --> 00:30:01,880 Speaker 2: the saying? 602 00:30:02,080 --> 00:30:05,360 Speaker 5: Man, I mean, the definition that I read out of, 603 00:30:05,600 --> 00:30:08,720 Speaker 5: like, interpreted from all of their tweets, is like taste 604 00:30:08,840 --> 00:30:12,680 Speaker 5: is basically what is profitable, like, to them. Like, having 605 00:30:12,760 --> 00:30:15,280 Speaker 5: good taste means knowing how to choose the right thing 606 00:30:15,360 --> 00:30:18,000 Speaker 5: that makes money. It means knowing how to attract more 607 00:30:18,080 --> 00:30:22,000 Speaker 5: customers to use your, like, AI dating coach. It's maybe 608 00:30:22,200 --> 00:30:24,720 Speaker 5: the choice of like a logo or an Instagram ad, 609 00:30:24,840 --> 00:30:27,600 Speaker 5: like, that's on the more literally tasteful end of it. 610 00:30:28,080 --> 00:30:31,640 Speaker 5: But I took issue with this like mechanistic or 611 00:30:31,760 --> 00:30:34,600 Speaker 5: utilitarian use of the word when really, like, to me, 612 00:30:34,760 --> 00:30:38,280 Speaker 5: taste is about enjoying art and culture as a human 613 00:30:38,400 --> 00:30:39,040 Speaker 5: with feelings. 614 00:30:39,720 --> 00:30:41,920 Speaker 4: Well, I think it's so funny because do you guys 615 00:30:41,960 --> 00:30:45,160 Speaker 4: remember when Joyce Carol Oates kind of like dunked on 616 00:30:45,360 --> 00:30:47,920 Speaker 4: Elon Musk and she was just like, you don't enjoy 617 00:30:48,560 --> 00:30:51,120 Speaker 4: like art and literature. Like there was all this discussion 618 00:30:51,160 --> 00:30:52,920 Speaker 4: of like what does Elon Musk do? And then Elon 619 00:30:53,000 --> 00:30:55,080 Speaker 4: Musk sort of attempts to clap back and he starts 620 00:30:55,160 --> 00:30:57,600 Speaker 4: replying to these like movie review accounts and then he 621 00:30:57,640 --> 00:31:00,800 Speaker 4: tweets, like, this is a great, or like, Homer's Iliad 622 00:31:01,120 --> 00:31:03,320 Speaker 4: is a great book, and then he like linked to 623 00:31:03,480 --> 00:31:06,840 Speaker 4: like some other book that was like the Odyssey or something, 624 00:31:08,040 --> 00:31:10,120 Speaker 4: and it just, like, to me, like when I think 625 00:31:10,160 --> 00:31:13,160 Speaker 4: of tasteless tech people, I think of Elon Musk and 626 00:31:13,360 --> 00:31:17,480 Speaker 4: like this desperation to like be seen as like culturally 627 00:31:17,720 --> 00:31:21,720 Speaker 4: kind of, I guess, insightful. And yeah, they, none of 628 00:31:21,760 --> 00:31:23,920 Speaker 4: these people have any taste. They're making slop, but. 629 00:31:23,920 --> 00:31:26,920 Speaker 3: They're also living, they're so far in the future, 630 00:31:27,000 --> 00:31:29,960 Speaker 3: right, that they're imagining a world where like the robots 631 00:31:30,000 --> 00:31:32,880 Speaker 3: are doing everything, and it's like, well, what, like, there's 632 00:31:32,920 --> 00:31:36,280 Speaker 3: this like existential crisis. It's like, what am I at now?
633 00:31:36,480 --> 00:31:38,240 Speaker 3: Like I used to write the code and I don't 634 00:31:38,280 --> 00:31:40,720 Speaker 3: do that. I don't do any, you know, I guess 635 00:31:40,800 --> 00:31:42,920 Speaker 3: I have taste. I mean I think that's. 636 00:31:43,680 --> 00:31:47,400 Speaker 5: Telling the robot what to do. Like now, I'm, now 637 00:31:47,400 --> 00:31:51,240 Speaker 5: I'm exerting myself or expressing myself by, by bossing around 638 00:31:51,840 --> 00:31:53,040 Speaker 5: and exactly. 639 00:31:52,880 --> 00:31:54,080 Speaker 1: Which is a bit depressing. 640 00:31:54,440 --> 00:31:58,240 Speaker 5: And there was another quite viral Marc Andreessen moment like 641 00:31:58,480 --> 00:32:00,400 Speaker 5: in the last few days, I think, where he just 642 00:32:00,560 --> 00:32:06,720 Speaker 5: says that he has no introspection. It's just like there's 643 00:32:06,800 --> 00:32:09,680 Speaker 5: nothing going on up here in the head. I 644 00:32:09,720 --> 00:32:12,080 Speaker 5: have no thoughts, I have no sense of self. And 645 00:32:12,240 --> 00:32:14,600 Speaker 5: that was, I mean, that was just in a nutshell, 646 00:32:14,760 --> 00:32:17,400 Speaker 5: like how can you have taste if you have no 647 00:32:17,600 --> 00:32:21,200 Speaker 5: thoughts about anything? If like you're only moving forward. 648 00:32:22,200 --> 00:32:24,680 Speaker 2: I want to, I want to ask you both, 649 00:32:24,920 --> 00:32:28,320 Speaker 2: Reed and Taylor, about this as West Coasters. There 650 00:32:28,400 --> 00:32:30,200 Speaker 2: was a story in Bloomberg last year that I just 651 00:32:30,280 --> 00:32:32,560 Speaker 2: absolutely adored about how all the tech bros now want 652 00:32:32,600 --> 00:32:37,000 Speaker 2: to build these like extremely tall and ugly monuments, too. 653 00:32:37,160 --> 00:32:39,640 Speaker 2: And I'm just wondering, like, what are you seeing on 654 00:32:39,720 --> 00:32:42,840 Speaker 2: the, like, are you seeing the physical architecture, I mean 655 00:32:43,160 --> 00:32:46,760 Speaker 2: East Wing style, be remodeled in this, in this era 656 00:32:46,920 --> 00:32:48,240 Speaker 2: of a new, new taste? 657 00:32:48,160 --> 00:32:51,240 Speaker 3: Oh, no one's building anything out here. So I don't think. 658 00:32:51,560 --> 00:32:56,160 Speaker 4: They said it's time to build, there's no. 659 00:32:57,840 --> 00:32:58,040 Speaker 1: Yeah. 660 00:32:58,160 --> 00:33:00,720 Speaker 4: I think it's interesting kind of how they seek to 661 00:33:00,880 --> 00:33:03,280 Speaker 4: exert their influence on the physical world. I mean, Kyle 662 00:33:03,600 --> 00:33:06,160 Speaker 4: has written so much smart sort of stuff about this, 663 00:33:06,400 --> 00:33:09,200 Speaker 4: but it is, you know, I, and I think of 664 00:33:09,280 --> 00:33:11,160 Speaker 4: Kyle's work a lot when I think of like this, 665 00:33:11,640 --> 00:33:15,360 Speaker 4: you know, the sort of liminal, sterile spaces that were 666 00:33:15,440 --> 00:33:17,240 Speaker 4: sort of associated with the tech 667 00:33:17,280 --> 00:33:21,080 Speaker 4: world in the twenty tens.
I do wonder what kind 668 00:33:21,080 --> 00:33:25,360 Speaker 4: of tech-dominant aesthetic is emerging in the twenty twenty, 669 00:33:26,240 --> 00:33:28,800 Speaker 4: twenty twenties. Like, is it the slop, is it the, 670 00:33:28,960 --> 00:33:31,240 Speaker 4: like, what, what is kind of like the dominant, like, 671 00:33:31,440 --> 00:33:34,520 Speaker 4: tech aesthetic? I'm curious, Kyle, like, what you're seeing. Have 672 00:33:34,640 --> 00:33:36,200 Speaker 4: you, have you been. 673 00:33:36,120 --> 00:33:37,120 Speaker 3: To Cursor's office? 674 00:33:37,680 --> 00:33:37,720 Speaker 1: No. 675 00:33:38,600 --> 00:33:42,960 Speaker 3: There it's, like, very, like, warm, like, living room chic. 676 00:33:43,040 --> 00:33:46,640 Speaker 3: You come in and like there's just a pile of shoes, 677 00:33:46,840 --> 00:33:50,600 Speaker 3: like hundreds of shoes because everyone just goes barefoot, and 678 00:33:50,720 --> 00:33:52,960 Speaker 3: it's like you just feel like you're in this like, 679 00:33:53,600 --> 00:33:56,520 Speaker 3: it's like everything's wood and warm, and you're just like, oh, 680 00:33:56,600 --> 00:33:58,800 Speaker 3: I'm like, I'm not at work, I'm like in my 681 00:33:59,400 --> 00:34:02,880 Speaker 3: like cozy living room, which is different from the Google, 682 00:34:03,800 --> 00:34:06,040 Speaker 3: you know, like Google was like very bright, like 683 00:34:06,240 --> 00:34:08,880 Speaker 3: bouncy balls, sit at your desk and, like, you know, 684 00:34:09,239 --> 00:34:12,840 Speaker 3: it was just like this fun playground atmosphere. Now 685 00:34:12,880 --> 00:34:14,960 Speaker 3: it's like, I don't know, I don't even know if 686 00:34:15,000 --> 00:34:16,839 Speaker 3: that's the question you're asking, but it just popped into 687 00:34:16,880 --> 00:34:19,399 Speaker 3: my head, like you should, you should visit Cursor's office. 688 00:34:19,640 --> 00:34:21,520 Speaker 3: I kept my shoes on. I was told I could, 689 00:34:21,640 --> 00:34:23,399 Speaker 3: I could leave my shoes on, but I wasn't sure 690 00:34:23,400 --> 00:34:26,240 Speaker 3: if I was being judged. Actually I was definitely being judged. 691 00:34:26,320 --> 00:34:29,920 Speaker 5: But this is like the shaman cult vibes, I think, like, 692 00:34:30,200 --> 00:34:32,799 Speaker 5: like what goes with the quest to invent AI 693 00:34:32,960 --> 00:34:33,200 Speaker 1: God. 694 00:34:33,760 --> 00:34:35,600 Speaker 5: You have to like have your cult in like a 695 00:34:35,760 --> 00:34:39,920 Speaker 5: shag carpeted room, everyone's wearing robes or whatever. You take 696 00:34:39,960 --> 00:34:44,680 Speaker 5: your shoes off, you ensconce yourself surrounded by AI agents 697 00:34:44,880 --> 00:34:47,280 Speaker 5: and you, you know, transcend reality. 698 00:34:48,200 --> 00:34:51,040 Speaker 2: Like, Kyle, reading your story, I got to thinking of 699 00:34:51,120 --> 00:34:53,520 Speaker 2: a quote I last thought about when I was studying 700 00:34:53,600 --> 00:34:58,719 Speaker 2: literature at university fifteen years ago, from William Wordsworth, which was 701 00:34:58,920 --> 00:35:02,360 Speaker 2: every great and original writer, in proportion as he is 702 00:35:02,440 --> 00:35:06,680 Speaker 2: great and original, must himself create the taste by which 703 00:35:06,719 --> 00:35:07,480 Speaker 2: he is relished. 704 00:35:08,680 --> 00:35:09,680 Speaker 5: What a great quote. 705 00:35:09,960 --> 00:35:10,800 Speaker 1: What a great quote.
706 00:35:10,840 --> 00:35:12,719 Speaker 5: I mean to write that one down or put it 707 00:35:12,840 --> 00:35:17,480 Speaker 5: on the wall of my startup office. Yeah. Like, it's, 708 00:35:17,640 --> 00:35:20,080 Speaker 5: it's the job of an artist or a creator or 709 00:35:20,120 --> 00:35:23,920 Speaker 5: whatever to create a new paradigm, right, to create a 710 00:35:24,000 --> 00:35:27,160 Speaker 5: new sensibility and project it into the world and like 711 00:35:27,719 --> 00:35:30,920 Speaker 5: teach an audience in some ways to appreciate it or 712 00:35:31,040 --> 00:35:32,320 Speaker 5: understand what they're trying to do. 713 00:35:33,160 --> 00:35:34,120 Speaker 1: But if you, if you. 714 00:35:34,200 --> 00:35:38,480 Speaker 5: Put that task onto an AI startup, like, I'm not 715 00:35:38,600 --> 00:35:40,600 Speaker 5: really resonating with what you want me to do here, 716 00:35:40,719 --> 00:35:42,919 Speaker 5: because what you want me to do is like wear 717 00:35:42,960 --> 00:35:45,919 Speaker 5: a pendant on my neck that insults me via text 718 00:35:46,040 --> 00:35:51,080 Speaker 5: message or, you know, tells me to do things I 719 00:35:51,160 --> 00:35:54,040 Speaker 5: don't, like, want to do. There's that one startup that's 720 00:35:54,120 --> 00:35:57,040 Speaker 5: like AI can rent a human to do labor for it. 721 00:35:57,480 --> 00:35:59,759 Speaker 5: Like, that, that seems to be the taste they have 722 00:35:59,840 --> 00:36:03,120 Speaker 5: in mind. So I guess I just don't, like, I 723 00:36:03,200 --> 00:36:06,520 Speaker 5: don't so far enjoy the sensibility that they're projecting, and 724 00:36:06,600 --> 00:36:08,879 Speaker 5: I wish they could come up with something a little better. 725 00:36:09,800 --> 00:36:12,879 Speaker 3: There was an interesting, uh, well, sorry Taylor, there's 726 00:36:13,040 --> 00:36:15,080 Speaker 3: just this other thing that happened with Nvidia. 727 00:36:15,239 --> 00:36:18,439 Speaker 3: Did you see this DLSS 5 meme 728 00:36:18,600 --> 00:36:21,920 Speaker 3: happening online? They have this new thing where they use 729 00:36:22,000 --> 00:36:25,040 Speaker 3: generative AI to like upscale the graphics of video games, 730 00:36:25,800 --> 00:36:28,400 Speaker 3: and so they showed like before and after pictures of 731 00:36:28,480 --> 00:36:31,600 Speaker 3: the characters, and people like really took issue with it 732 00:36:31,640 --> 00:36:34,400 Speaker 3: because they're like, well, that's not the artist. I mean, 733 00:36:34,440 --> 00:36:39,359 Speaker 3: it wasn't really explained, like, it's still like a FaceTune. Yeah, 734 00:36:39,440 --> 00:36:43,920 Speaker 3: like these hilarious memes about it, with like, it was 735 00:36:43,960 --> 00:36:46,600 Speaker 3: like Jensen Huang before and after and he was like 736 00:36:46,680 --> 00:36:50,120 Speaker 3: a busty woman after, you know, and it's like, you know, 737 00:36:50,600 --> 00:36:53,000 Speaker 3: it gets to this question though of, I mean, even 738 00:36:53,120 --> 00:36:55,120 Speaker 3: just the announcement of it.
It's like there wasn't a 739 00:36:55,200 --> 00:36:57,319 Speaker 3: thought of like, are people going to be, are people 740 00:36:57,400 --> 00:36:59,160 Speaker 3: going to be confused by this and think that 741 00:36:59,640 --> 00:37:03,080 Speaker 3: we're actually, like, changing the artist's vision here, and 742 00:37:03,960 --> 00:37:06,560 Speaker 3: you know, and the reaction to it, and it is, 743 00:37:07,040 --> 00:37:09,319 Speaker 3: you know, I mean, it's not exactly what you're talking about, 744 00:37:09,360 --> 00:37:10,480 Speaker 3: but it just popped into my head. 745 00:37:10,800 --> 00:37:12,759 Speaker 4: But it is kind of similar because it's like how 746 00:37:13,040 --> 00:37:15,759 Speaker 4: is it altering art? And how is it altering kind 747 00:37:15,800 --> 00:37:18,839 Speaker 4: of like the media that we've consumed. There was also 748 00:37:18,920 --> 00:37:23,600 Speaker 4: a viral controversy over the Pretty Little Liars book, I 749 00:37:23,640 --> 00:37:26,920 Speaker 4: guess, was updated on Amazon Kindle. Like, it was like 750 00:37:26,960 --> 00:37:29,760 Speaker 4: the digital versions of it were updated to include modern 751 00:37:29,840 --> 00:37:33,359 Speaker 4: references to things like TikTok and stuff, which really kind 752 00:37:33,360 --> 00:37:36,400 Speaker 4: of like changes things in the story a little bit. 753 00:37:36,480 --> 00:37:39,160 Speaker 4: And so people were tweeting screenshots like, what the heck, 754 00:37:39,280 --> 00:37:42,560 Speaker 4: like, why does a book need like an update, or 755 00:37:42,640 --> 00:37:44,560 Speaker 4: like why do we have to kind of like alter 756 00:37:44,840 --> 00:37:48,040 Speaker 4: our media to like make it conform with today's like 757 00:37:48,200 --> 00:37:50,920 Speaker 4: visual standards or, like, you know, cultural standards. And I 758 00:37:51,000 --> 00:37:52,920 Speaker 4: think there's a lot of rejection of that, whereas like 759 00:37:52,960 --> 00:37:55,359 Speaker 4: a lot of these tech people are like, wait, but look, 760 00:37:55,440 --> 00:37:57,520 Speaker 4: we have this shiny new version, you know, and it 761 00:37:57,600 --> 00:37:59,120 Speaker 4: can keep up with the times. And it's like, but 762 00:37:59,239 --> 00:38:01,040 Speaker 4: we appreciate that it didn't keep up. 763 00:38:00,920 --> 00:38:01,359 Speaker 3: With the times. 764 00:38:01,840 --> 00:38:05,840 Speaker 5: Yes, that is like the taste of AI. Like, all culture is 765 00:38:05,880 --> 00:38:10,960 Speaker 5: fan fiction, everything should conform to my personal preferences. Everything 766 00:38:11,000 --> 00:38:14,720 Speaker 5: should be like glossy and upscaled, like the video game filter. 767 00:38:15,200 --> 00:38:19,000 Speaker 5: It's just kind of like optimized, I don't know, optimized sheen. 768 00:38:19,520 --> 00:38:21,320 Speaker 5: That isn't very tasteful. Like, if you go back to 769 00:38:21,360 --> 00:38:24,400 Speaker 5: the Wordsworth quote, like, that's timeless. It was from a 770 00:38:24,480 --> 00:38:26,560 Speaker 5: long time ago. We don't need to update it with 771 00:38:26,760 --> 00:38:32,000 Speaker 5: like rizz or whatever. Wordsworth doesn't need to say 772 00:38:32,120 --> 00:38:33,360 Speaker 5: six seven or something. 773 00:38:35,320 --> 00:38:36,960 Speaker 2: But I do think, Kyle, you put your finger on something.
774 00:38:37,000 --> 00:38:40,520 Speaker 2: That's really interesting to me, which is these tech overlords 775 00:38:40,560 --> 00:38:43,480 Speaker 2: imagining themselves into a future where their creations have been 776 00:38:43,520 --> 00:38:45,600 Speaker 2: so successful there is nothing left for them to do, 777 00:38:46,280 --> 00:38:48,920 Speaker 2: and the idea of taste as this kind of last 778 00:38:49,440 --> 00:38:53,160 Speaker 2: bastion of, like, what makes us human or gives us value. 779 00:38:53,200 --> 00:38:56,000 Speaker 2: I mean, thinking about the hipster movement earlier, it was 780 00:38:56,080 --> 00:38:58,400 Speaker 2: kind of like, I don't want to summarize the 781 00:38:58,440 --> 00:38:59,920 Speaker 2: hipster movement, but in a sense it was like, well, 782 00:39:00,200 --> 00:39:02,800 Speaker 2: we may not play by the rules of, like, you know, 783 00:39:03,000 --> 00:39:05,720 Speaker 2: finance culture, but like we have our thing which is special, 784 00:39:05,800 --> 00:39:08,120 Speaker 2: and we have our taste. And so the idea that, 785 00:39:08,280 --> 00:39:10,399 Speaker 2: that tech bros may, on the other side of their 786 00:39:10,440 --> 00:39:14,680 Speaker 2: great achievement, join the hipsters of yesteryear in feeling slightly 787 00:39:14,719 --> 00:39:17,359 Speaker 2: alienated by the world in which they live is kind 788 00:39:17,360 --> 00:39:17,960 Speaker 2: of delightful. 789 00:39:18,080 --> 00:39:20,960 Speaker 5: Oh, I think that's so accurate. Like, what else is 790 00:39:21,120 --> 00:39:25,600 Speaker 5: left but to take up carpentry and like make a 791 00:39:25,719 --> 00:39:29,240 Speaker 5: ceramic mug for yourself and like tinker with your espresso machine. 792 00:39:29,920 --> 00:39:33,080 Speaker 5: I think that this is extremely pretentious, probably, but I've 793 00:39:33,120 --> 00:39:36,440 Speaker 5: been reading a book of Basho poems, the like Japanese 794 00:39:36,520 --> 00:39:40,000 Speaker 5: haiku poet from the eighteenth century, and like all he 795 00:39:40,200 --> 00:39:43,960 Speaker 5: was doing was wandering around observing nature and writing very 796 00:39:44,040 --> 00:39:46,840 Speaker 5: short poems and hanging out with his friends. And I 797 00:39:46,880 --> 00:39:49,279 Speaker 5: feel like that's all these tech bros actually want to 798 00:39:49,320 --> 00:39:51,680 Speaker 5: do in the end, and they could just do that. 799 00:39:52,280 --> 00:39:54,759 Speaker 5: Like, they could skip all of this and just go 800 00:39:54,920 --> 00:39:55,880 Speaker 5: live in a cabin. 801 00:39:56,600 --> 00:39:59,839 Speaker 4: They have, I don't think they have the, like, introspection 802 00:40:00,080 --> 00:40:01,560 Speaker 4: to do that, like, I think. 803 00:40:01,440 --> 00:40:03,520 Speaker 3: They, like... So that's what one of the Anthropic people 804 00:40:03,640 --> 00:40:05,480 Speaker 3: said when they, you know, if, when you leave an 805 00:40:05,480 --> 00:40:07,799 Speaker 3: AI company, you have to write a whole diatribe about 806 00:40:07,800 --> 00:40:09,440 Speaker 3: why you left. And he was like, I'm gonna go 807 00:40:09,520 --> 00:40:13,040 Speaker 3: write poetry. That's what he decided to do. So, you 808 00:40:13,120 --> 00:40:15,000 Speaker 3: know, they are doing that. Nice. 809 00:40:16,120 --> 00:40:19,799 Speaker 4: Yeah, let's see how that poetry turns out.
810 00:40:20,239 --> 00:40:23,960 Speaker 3: Shelley Banjo wrote this great column last week, who's, who's 811 00:40:24,200 --> 00:40:24,800 Speaker 3: my editor 812 00:40:24,920 --> 00:40:25,040 Speaker 4: now? 813 00:40:25,120 --> 00:40:27,640 Speaker 3: Who's new at Semafor. And she was like, she 814 00:40:27,760 --> 00:40:30,160 Speaker 3: asked, she talked about asking this friend of hers who 815 00:40:30,239 --> 00:40:32,560 Speaker 3: has great taste in books, he reads a lot of books, 816 00:40:33,080 --> 00:40:35,200 Speaker 3: like, what business books should I be reading right now? 817 00:40:35,840 --> 00:40:38,640 Speaker 3: And he just Gemini'd it and sent her the Gemini 818 00:40:38,760 --> 00:40:41,400 Speaker 3: list, and it was like the number one recommendation was 819 00:40:41,440 --> 00:40:44,960 Speaker 3: like The World Is Flat by Tom Friedman. She's like, 820 00:40:45,120 --> 00:40:46,200 Speaker 3: what are you talking about? 821 00:40:46,200 --> 00:40:46,239 Speaker 4: Like? 822 00:40:46,400 --> 00:40:50,080 Speaker 3: This is, like, horrible. So, you know, I think you're 823 00:40:50,120 --> 00:40:52,640 Speaker 3: gonna have to have... That's, that's going to be the 824 00:40:52,760 --> 00:40:55,359 Speaker 3: new thing. It's gonna be like, I have book recommendations 825 00:40:55,400 --> 00:40:58,200 Speaker 3: that were not generated by AI. You know, that's, it's 826 00:40:58,280 --> 00:40:58,880 Speaker 3: real taste. 827 00:40:59,719 --> 00:41:01,080 Speaker 1: That's all we have time for today. 828 00:41:01,680 --> 00:41:04,239 Speaker 2: But I want to end our discussion with a simple question. 829 00:41:04,400 --> 00:41:06,800 Speaker 2: Taylor had this one last week, but for Kyle and, 830 00:41:07,280 --> 00:41:09,759 Speaker 2: and Reed it's fresh. Who had the best week in tech? 831 00:41:09,840 --> 00:41:10,600 Speaker 2: And who had the worst? 832 00:41:10,640 --> 00:41:16,360 Speaker 1: We'll start with you, Taylor. No, let's say Kyle. 833 00:41:16,640 --> 00:41:19,680 Speaker 5: So I think the best week in tech was Nintendo 834 00:41:20,120 --> 00:41:24,760 Speaker 5: with Pocopia, the, like, new Pokemon slash Animal Crossing slash 835 00:41:24,840 --> 00:41:27,680 Speaker 5: Minecraft game, and it seems to have just been like 836 00:41:27,800 --> 00:41:30,719 Speaker 5: a huge hit, even though people were cynical about it 837 00:41:31,080 --> 00:41:33,680 Speaker 5: and kind of thought it was a, you know, hack job, 838 00:41:33,800 --> 00:41:38,600 Speaker 5: sellout move. It's delightful. Everyone loves Pokemon and hanging out 839 00:41:38,640 --> 00:41:41,360 Speaker 5: and building little gardens for their Pokemon to hang out in. 840 00:41:41,560 --> 00:41:43,200 Speaker 5: So I think Nintendo had. 841 00:41:43,040 --> 00:41:43,560 Speaker 3: A great week. 842 00:41:44,080 --> 00:41:44,839 Speaker 1: Who had a bad one? 843 00:41:45,280 --> 00:41:47,600 Speaker 5: I mean, I was gonna say Sam Altman getting called 844 00:41:47,640 --> 00:41:53,000 Speaker 5: a Nazi by, as it came up before. That's pretty, 845 00:41:53,160 --> 00:41:56,600 Speaker 5: pretty bad. That is pretty bad. Reed? 846 00:41:56,840 --> 00:41:59,120 Speaker 3: I mean, I don't know. I think I probably, just, 847 00:41:59,239 --> 00:42:00,800 Speaker 3: just because it's like fresh in my mind and I 848 00:42:00,880 --> 00:42:03,239 Speaker 3: was talking about it, but I did, I did.
I 849 00:42:03,360 --> 00:42:06,719 Speaker 3: do think that Nvidia had kind of a good, a 850 00:42:06,800 --> 00:42:09,080 Speaker 3: good week this week, even though their stock, I mean, 851 00:42:09,200 --> 00:42:11,920 Speaker 3: I'm not looking at their stock price, like, it kind of... 852 00:42:12,160 --> 00:42:15,320 Speaker 3: But maybe it's just me crystallizing in my head, like, 853 00:42:15,400 --> 00:42:18,279 Speaker 3: where they're at in terms of their, their grand ambitions. 854 00:42:19,800 --> 00:42:23,160 Speaker 3: But I don't know. I mean, I think it just 855 00:42:23,280 --> 00:42:26,880 Speaker 3: continues to be like this, this Anthropic thing with the Pentagon. 856 00:42:27,200 --> 00:42:29,920 Speaker 3: You know, Anthropic has had this big fight with the 857 00:42:30,000 --> 00:42:33,839 Speaker 3: Pentagon over exactly how their AI models should be used 858 00:42:34,000 --> 00:42:38,120 Speaker 3: in warfare and also in surveillance of Americans, and the 859 00:42:38,160 --> 00:42:41,279 Speaker 3: Pentagon is saying, you know, we don't want any restrictions 860 00:42:41,320 --> 00:42:44,200 Speaker 3: on these models, and in fact, now we're designating you 861 00:42:44,320 --> 00:42:47,600 Speaker 3: a supply chain risk. I just can't imagine that this 862 00:42:47,800 --> 00:42:49,680 Speaker 3: is like great for, for Anthropic. 863 00:42:49,760 --> 00:42:52,440 Speaker 2: I mean, that's very interesting because last week's takeaway was 864 00:42:52,480 --> 00:42:54,560 Speaker 2: that Anthropic had the best week because 865 00:42:54,560 --> 00:42:56,600 Speaker 2: of all the user growth and love that they had 866 00:42:56,640 --> 00:42:57,480 Speaker 2: gotten from their. 867 00:42:57,600 --> 00:42:58,640 Speaker 1: From their stance on this. 868 00:42:59,280 --> 00:43:03,120 Speaker 3: I know, I know, and I, I, it's like, it's okay, 869 00:43:04,560 --> 00:43:05,960 Speaker 3: you can look at that and you're like, yeah, I 870 00:43:06,040 --> 00:43:08,640 Speaker 3: mean sure, if it, it could work out for them 871 00:43:08,719 --> 00:43:10,920 Speaker 3: really well, right, like this sort of like Apple versus 872 00:43:11,040 --> 00:43:15,120 Speaker 3: the, versus the, the FBI thing that sort of helped, 873 00:43:15,160 --> 00:43:19,080 Speaker 3: that helped like solidify their privacy-centric marketing. But I 874 00:43:19,239 --> 00:43:21,640 Speaker 3: just think in the end, it's not, they're not a, 875 00:43:21,800 --> 00:43:24,400 Speaker 3: they're not a consumer company really, so it doesn't, like, 876 00:43:25,200 --> 00:43:28,640 Speaker 3: like their app store numbers don't really ultimately matter that much. 877 00:43:28,719 --> 00:43:31,560 Speaker 3: It's like they have to win in enterprise, and I 878 00:43:31,680 --> 00:43:35,160 Speaker 3: think like it's just not ultimately, in the long run, 879 00:43:35,400 --> 00:43:37,279 Speaker 3: that, like, I don't think companies are going to look, 880 00:43:37,280 --> 00:43:40,160 Speaker 3: are going to view this as like, you know, a 881 00:43:40,239 --> 00:43:43,200 Speaker 3: big bonus. It's just like, it's just a draw. The 882 00:43:43,280 --> 00:43:45,560 Speaker 3: more, the more this fight gets like drawn out, I 883 00:43:45,680 --> 00:43:47,600 Speaker 3: just have a feeling it's not that great for them. 884 00:43:47,719 --> 00:43:51,000 Speaker 3: Like their competitors are.
Like Google has somehow totally avoided 885 00:43:51,080 --> 00:43:54,880 Speaker 3: this controversy, which I think is like fascinating, and in 886 00:43:54,920 --> 00:43:56,879 Speaker 3: a lot of ways, they're like the real, the one 887 00:43:56,960 --> 00:43:59,480 Speaker 3: everybody should be competing with, and like OpenAI is kind 888 00:43:59,480 --> 00:44:01,880 Speaker 3: of getting dragged into it too. So I don't know, 889 00:44:02,080 --> 00:44:04,040 Speaker 3: maybe Google had the best week, actually, I don't know, 890 00:44:04,960 --> 00:44:08,040 Speaker 3: just by not being, like, no one's talking about them 891 00:44:08,080 --> 00:44:08,880 Speaker 3: in this context. 892 00:44:09,400 --> 00:44:11,239 Speaker 4: I know, when you're a tech company, sometimes you could 893 00:44:11,320 --> 00:44:13,560 Speaker 4: just have the best week by no one speaking about you. 894 00:44:14,080 --> 00:44:16,120 Speaker 3: Right, right, It's true, It's true. 895 00:44:16,400 --> 00:44:18,960 Speaker 4: I think Jensen, I'm gonna, I'm gonna kind of like piggyback. 896 00:44:18,960 --> 00:44:21,279 Speaker 4: I mean, I do think Jensen Huang had a good week. 897 00:44:21,800 --> 00:44:24,839 Speaker 4: I think, like, I mean, correct me if I'm wrong. 898 00:44:24,920 --> 00:44:27,280 Speaker 4: But I didn't see like a huge amount of backlash 899 00:44:27,520 --> 00:44:29,840 Speaker 4: considering like kind of the stuff that he was talking about. 900 00:44:30,120 --> 00:44:31,719 Speaker 4: I mean, maybe there was like a little drama over 901 00:44:31,800 --> 00:44:35,000 Speaker 4: at Nvidia, but also like he is ascending, to use 902 00:44:35,040 --> 00:44:38,800 Speaker 4: the clavicular term, you know, into like CEO status. Like, 903 00:44:38,880 --> 00:44:40,960 Speaker 4: I mean, even at the Nvidia, you know, conference, I 904 00:44:41,000 --> 00:44:43,800 Speaker 4: think they were selling like sweaters with Jensen Huang's like 905 00:44:43,920 --> 00:44:46,279 Speaker 4: face on it. So I think he's kind of like, 906 00:44:46,800 --> 00:44:48,879 Speaker 4: you know, when you look at the clout index, maybe 907 00:44:49,000 --> 00:44:51,960 Speaker 4: like his is going up and Sam Altman is going down. 908 00:44:52,080 --> 00:44:55,640 Speaker 4: I would say Sam Altman probably had the worst week. Also, 909 00:44:56,920 --> 00:45:00,560 Speaker 4: Uber's former head of self driving wrote this great piece 910 00:45:00,640 --> 00:45:04,600 Speaker 4: about actually how his self driving Tesla crashed and taught 911 00:45:04,680 --> 00:45:07,319 Speaker 4: him a lot about like AI risks, and I thought, 912 00:45:07,800 --> 00:45:10,000 Speaker 4: I don't know if that's a bad week, but I thought, like, 913 00:45:10,320 --> 00:45:13,359 Speaker 4: you know, that was a good lesson for that man 914 00:45:13,440 --> 00:45:14,799 Speaker 4: to learn, like. 915 00:45:14,880 --> 00:45:17,759 Speaker 5: Your email inbox getting deleted, like, you learned something. 916 00:45:19,239 --> 00:45:20,160 Speaker 1: People need to learn. 917 00:45:20,080 --> 00:45:21,680 Speaker 4: These lessons for themselves. 918 00:45:21,760 --> 00:45:24,320 Speaker 1: There you go, forced reflection. Thank you so much. That 919 00:45:24,360 --> 00:45:25,680 Speaker 1: was really fun. That was fun. 920 00:45:25,960 --> 00:45:26,319 Speaker 4: Thank you. 921 00:45:51,840 --> 00:45:52,600 Speaker 1: For Tech Stuff. 922 00:45:52,760 --> 00:45:55,480 Speaker 2: I'm Oz Woloshyn.
This episode was produced by Eliza Dennis 923 00:45:55,520 --> 00:45:58,800 Speaker 2: and Melissa Slaughter. It was executive produced by me, Julian 924 00:45:58,880 --> 00:46:02,960 Speaker 2: Nutter, and Kate Osborne for Kaleidoscope, and Katrina Norvel for iHeart Podcasts. 925 00:46:03,520 --> 00:46:07,040 Speaker 2: The engineer is Charles de Montebello for CDM Studios. Jack 926 00:46:07,120 --> 00:46:10,000 Speaker 2: Insley mixed this episode, and Kyle Murdock wrote our theme song. 927 00:46:10,680 --> 00:46:13,080 Speaker 2: A special thank you to Reed Albergotti, Kyle Chayka, and 928 00:46:13,120 --> 00:46:15,920 Speaker 2: Taylor Lorenz. Please check out all the work they put 929 00:46:15,920 --> 00:46:18,000 Speaker 2: out into the world. We're very lucky to call them 930 00:46:18,080 --> 00:46:20,880 Speaker 2: friends of the pod, and please do rate and review 931 00:46:20,960 --> 00:46:23,000 Speaker 2: this show wherever you listen to your podcasts.