1 00:00:17,280 --> 00:00:19,080 Speaker 1: Hey, and welcome to What Future.
2 00:00:19,360 --> 00:00:23,040 Speaker 2: I'm your host, Joshua Topolsky, and we have a crackerjack
3 00:00:23,079 --> 00:00:26,400 Speaker 2: of a show. Do people still say that, crackerjack? Does
4 00:00:26,400 --> 00:00:27,440 Speaker 2: that even exist anymore?
5 00:00:28,480 --> 00:00:28,720 Speaker 1: Do you know?
6 00:00:28,800 --> 00:00:31,680 Speaker 2: Cracker Jack, it's like a candy. It's like a box. It's
7 00:00:31,720 --> 00:00:34,120 Speaker 2: a candy. It's a candy corn. No, no, candy corn
8 00:00:34,200 --> 00:00:37,920 Speaker 2: is a candy. It's like caramel corn, or caramel corn
9 00:00:37,920 --> 00:00:40,720 Speaker 2: depending on who you talk to, which I guess is
10 00:00:40,760 --> 00:00:43,800 Speaker 2: corn covered in a sugary coating. And then in the
11 00:00:43,840 --> 00:00:45,600 Speaker 2: box, it was like a red stripe box. Does this
12 00:00:45,680 --> 00:00:47,240 Speaker 2: even... does anybody know what I'm talking about?
13 00:00:47,320 --> 00:00:49,280 Speaker 1: Lyra or Jenna? Do you know what I'm talking about?
14 00:00:49,400 --> 00:00:50,920 Speaker 3: Of course I know what Cracker Jack is.
15 00:00:51,240 --> 00:00:52,880 Speaker 1: You know what Cracker Jack is? Okay.
16 00:00:52,920 --> 00:00:54,720 Speaker 3: And it had a prize in the box, and...
17 00:00:54,960 --> 00:00:57,800 Speaker 2: At the bottom of the fucking box there'd be a prize,
18 00:00:57,800 --> 00:00:58,920 Speaker 2: like a toy or something.
19 00:00:59,280 --> 00:01:02,240 Speaker 1: That's great. They don't do that anymore. They don't do it.
20 00:01:02,320 --> 00:01:02,840 Speaker 1: Maybe they do.
21 00:01:02,880 --> 00:01:05,600 Speaker 2: They probably still sell it. It's probably very readily available.
22 00:01:05,640 --> 00:01:08,039 Speaker 2: It's probably one of the best selling products at grocery stores.
23 00:01:08,080 --> 00:01:09,520 Speaker 3: Hey, Cracker Barrel, I bet you.
24 00:01:09,680 --> 00:01:12,760 Speaker 2: Cracker Barrel is a right wing establishment dedicated to the
25 00:01:13,080 --> 00:01:16,880 Speaker 2: suppression and oppression of many groups, and I can't stand it.
26 00:01:17,319 --> 00:01:19,000 Speaker 2: I won't stand for it. Although they do have really
27 00:01:19,040 --> 00:01:20,920 Speaker 2: good biscuits, in my recollection.
28 00:01:21,120 --> 00:01:23,240 Speaker 3: But I would bet you good money that they sell
29 00:01:23,280 --> 00:01:24,280 Speaker 3: Cracker Jacks in the
30 00:01:24,240 --> 00:01:26,760 Speaker 1: general store. Cracker Jack at the Cracker Barrel.
31 00:01:27,000 --> 00:01:29,520 Speaker 2: Probably. You're probably right. Ah, yeah, we have a crackerjack
32 00:01:29,760 --> 00:01:32,839 Speaker 2: show for you. We've got a wonderful man, Max Read,
33 00:01:33,240 --> 00:01:37,440 Speaker 2: a writer, an editor, a screenwriter, and also a friend,
34 00:01:37,600 --> 00:01:40,440 Speaker 2: I should say, a buddy, and he's got a great
35 00:01:40,440 --> 00:01:43,160 Speaker 2: newsletter called Read Max that I love. And we're going
36 00:01:43,200 --> 00:01:45,280 Speaker 2: to talk about all sorts of stuff. I don't want
37 00:01:45,280 --> 00:01:47,920 Speaker 2: to waste one minute, so let's get into this conversation.
38 00:02:05,080 --> 00:02:07,680 Speaker 2: You were editor in chief of Gawker when it was
39 00:02:07,800 --> 00:02:08,760 Speaker 2: sued out of existence?
40 00:02:08,800 --> 00:02:09,280 Speaker 1: Is that right?
41 00:02:09,760 --> 00:02:13,040 Speaker 4: The lawsuit was ongoing when I was editor. I left
42 00:02:13,080 --> 00:02:16,519 Speaker 4: in twenty fifteen. The following year... spring, I want to
43 00:02:16,520 --> 00:02:18,760 Speaker 4: say, of the following year was when the judgment came down
44 00:02:19,240 --> 00:02:20,680 Speaker 4: and Gawker declared bankruptcy.
45 00:02:20,720 --> 00:02:22,640 Speaker 1: Who was the editor of Gawker at that time?
46 00:02:22,760 --> 00:02:26,160 Speaker 4: I think Alex Pareene was the editor in chief. I thought, wow, yeah,
47 00:02:26,160 --> 00:02:28,040 Speaker 4: the last and final. Yeah.
48 00:02:27,840 --> 00:02:30,840 Speaker 2: No, not the last and final of that Gawker. There
49 00:02:30,880 --> 00:02:32,600 Speaker 2: were many other iterations of Gawker.
50 00:02:33,000 --> 00:02:36,200 Speaker 4: They just transitioned. They became, uh, what was it called?
51 00:02:36,040 --> 00:02:40,520 Speaker 1: Splinter. Splinter, Splinter, Splinter. But Splinter was... Splinter was owned
52 00:02:40,560 --> 00:02:44,760 Speaker 1: by... that was... when... was it owned by Fusion? Is
53 00:02:44,760 --> 00:02:45,360 Speaker 1: that... was it?
54 00:02:45,560 --> 00:02:45,640 Speaker 3: Yeah.
55 00:02:45,960 --> 00:02:48,880 Speaker 2: Fusion became... I'm sorry, this is crazy shit to think about,
56 00:02:48,880 --> 00:02:51,880 Speaker 2: but, like, I know Fusion was a thing that was...
57 00:02:52,160 --> 00:02:55,519 Speaker 2: that existed. It was a new media startup. Correct me
58 00:02:55,560 --> 00:02:56,079 Speaker 2: if I'm wrong.
59 00:02:56,160 --> 00:02:58,440 Speaker 4: It was new media within Telemundo.
60 00:02:58,520 --> 00:03:02,800 Speaker 2: It was like Telemundo's new... yes, startup.
61 00:03:02,960 --> 00:03:07,120 Speaker 2: Telemundo started a thing called Fusion dot net, which was
62 00:03:07,320 --> 00:03:11,400 Speaker 2: a diverse millennial news operation, right, that was sort
63 00:03:11,400 --> 00:03:15,440 Speaker 2: of like the overarching... which, like... I get it,
64 00:03:15,520 --> 00:03:18,360 Speaker 2: like, very of the moment, like, very like Mashable, like,
65 00:03:18,400 --> 00:03:21,040 Speaker 2: in the spectrum of, like, a Mashable, and, uh, yeah,
66 00:03:21,080 --> 00:03:22,400 Speaker 2: I mean, who else would it have been?
67 00:03:22,440 --> 00:03:25,679 Speaker 2: I guess BuzzFeed News, Mic, whatever. This is not... Mic, yeah,
68 00:03:25,760 --> 00:03:28,760 Speaker 2: Mic's the other perfect example. Actually, I would say, in a
69 00:03:28,800 --> 00:03:31,440 Speaker 2: way, Fusion was like a Mic competitor. Probably is the
70 00:03:31,560 --> 00:03:34,079 Speaker 2: most accurate. Anyhow, sorry, this is not... we don't
71 00:03:34,080 --> 00:03:35,480 Speaker 2: have to talk about this at all. In fact, I wasn't really
72 00:03:35,520 --> 00:03:37,800 Speaker 2: planning on talking about it, although I do want
73 00:03:37,800 --> 00:03:40,480 Speaker 2: to talk about media, because you are a member of
74 00:03:40,520 --> 00:03:43,680 Speaker 2: the media elite, as we know. I don't know...
75 00:03:43,680 --> 00:03:45,360 Speaker 2: what is the word for a person who just has
76 00:03:45,400 --> 00:03:45,960 Speaker 2: a newsletter?
77 00:03:45,960 --> 00:03:46,560 Speaker 1: What do you call that?
78 00:03:47,040 --> 00:03:50,160 Speaker 4: I call myself an owner operator, a small business... small
79 00:03:50,200 --> 00:03:50,840 Speaker 4: business owner.
80 00:03:50,880 --> 00:03:52,200 Speaker 1: You're a small business owner.
81 00:03:52,360 --> 00:03:55,440 Speaker 2: Okay, so you get some of Biden's tax breaks...
82 00:03:56,000 --> 00:03:59,000 Speaker 2: tax breaks for the one percent, for the elite. So
83 00:03:59,200 --> 00:04:01,600 Speaker 2: you're in the media, right? You'd say you're in the media,
84 00:04:01,640 --> 00:04:04,480 Speaker 2: like, you continue to publish content that is
85 00:04:04,520 --> 00:04:06,680 Speaker 2: in the sphere of news media.
86 00:04:06,760 --> 00:04:09,000 Speaker 4: Would you say? Yeah, I mean, I write twice
87 00:04:09,040 --> 00:04:12,360 Speaker 4: a week on my Substack, do some freelance journalism still too.
88 00:04:13,000 --> 00:04:15,360 Speaker 4: I mean, it's funny, the fact that you sort of...
89 00:04:15,560 --> 00:04:17,440 Speaker 4: we sort of started talking about this and then immediately
90 00:04:17,440 --> 00:04:20,919 Speaker 4: started remembering some websites. It's very hard not to. I started my
91 00:04:20,920 --> 00:04:23,839 Speaker 4: Substack not really intending to write very much about media.
92 00:04:23,920 --> 00:04:26,840 Speaker 4: But if you have been in the business for long enough,
93 00:04:26,880 --> 00:04:29,880 Speaker 4: and especially if you lived through the period that we're
94 00:04:29,920 --> 00:04:32,960 Speaker 4: talking about, the twenty tens, the crazy twenty tens, it's
95 00:04:33,000 --> 00:04:35,920 Speaker 4: hard not to just keep writing about it. Because it
96 00:04:36,040 --> 00:04:38,760 Speaker 4: still sticks in your mind. I don't want to say trauma,
97 00:04:39,000 --> 00:04:39,680 Speaker 4: but, you know, it's...
98 00:04:39,680 --> 00:04:42,440 Speaker 1: No, no, no, it's trauma. Trauma's the right word.
99 00:04:42,480 --> 00:04:42,880 Speaker 1: I don't know.
100 00:04:42,920 --> 00:04:44,760 Speaker 2: I think the word trauma's overused, and I don't want
101 00:04:44,800 --> 00:04:47,000 Speaker 2: to say that people who use it don't have trauma.
102 00:04:47,080 --> 00:04:49,240 Speaker 2: But I do feel like people say trauma about things
103 00:04:49,240 --> 00:04:51,240 Speaker 2: that are, like... they, like, you know, I don't know,
104 00:04:51,279 --> 00:04:53,320 Speaker 2: they couldn't find a pair of shoes that fit,
105 00:04:53,640 --> 00:04:55,640 Speaker 2: and they're like, I have trauma from that. I'm like,
106 00:04:55,680 --> 00:04:57,640 Speaker 2: I don't think that's what the word means. Like, I
107 00:04:57,640 --> 00:05:00,480 Speaker 2: feel like it might get a little overused. But, because
108 00:05:00,480 --> 00:05:02,880 Speaker 2: I had a call earlier today, I was talking to
109 00:05:03,600 --> 00:05:11,039 Speaker 2: a reporter and we got into a conversation about the Messenger. Yeah,
110 00:05:11,080 --> 00:05:13,480 Speaker 2: which is... the Messenger is, like... so you actually...
111 00:05:13,480 --> 00:05:15,000 Speaker 2: I said the boom period. When you're talking about
112 00:05:15,080 --> 00:05:17,279 Speaker 2: Fusion and stuff, actually, what Fusion
113 00:05:17,440 --> 00:05:21,080 Speaker 2: is in that era, which is, like, the late twenty tens...
114 00:05:21,080 --> 00:05:23,440 Speaker 2: I want to say... right, I don't know when
115 00:05:23,480 --> 00:05:26,320 Speaker 2: Fusion started, or, like, what the... but I want
116 00:05:26,320 --> 00:05:31,560 Speaker 2: to say, like, twenty ten. That's probably later than twenty ten. Fourteen,
117 00:05:31,800 --> 00:05:34,920 Speaker 2: around thirteen, fourteen. It's actually the beginning of the end
118 00:05:35,080 --> 00:05:38,960 Speaker 2: for the new media boom. It's like, Vice had matured
119 00:05:38,960 --> 00:05:42,600 Speaker 2: into this monolithic thing, and BuzzFeed News was at least
120 00:05:42,720 --> 00:05:45,880 Speaker 2: rolling, probably pretty successful at that point. It was sort
121 00:05:45,880 --> 00:05:47,720 Speaker 2: of like the beginning of the end for, like, the
122 00:05:47,760 --> 00:05:50,760 Speaker 2: boom era of, like, blogging, which would have been
123 00:05:50,760 --> 00:05:53,960 Speaker 2: in the, you know, two thousand and
124 00:05:54,800 --> 00:05:57,280 Speaker 2: four to... I mean, obviously stuff happened... let's say
125 00:05:57,279 --> 00:06:00,960 Speaker 2: two thousand and four to two thousand and ten, something
126 00:06:01,000 --> 00:06:02,960 Speaker 2: like that is, like, yeah, the biggest, like, boom for,
127 00:06:03,040 --> 00:06:05,560 Speaker 2: like, blogging. Blogging, not, like, corporate-owned blogging.
128 00:06:06,360 --> 00:06:07,760 Speaker 4: Well, there was... I mean, the way I think about
129 00:06:07,760 --> 00:06:10,400 Speaker 4: it is always, like, there was that early period that was
130 00:06:10,600 --> 00:06:12,920 Speaker 4: very... people would still go to websites. You know, you
131 00:06:12,960 --> 00:06:15,960 Speaker 4: would actually go to Engadget dot com or Gawker dot
132 00:06:16,040 --> 00:06:18,720 Speaker 4: com and you would refresh it to see the feed.
133 00:06:19,040 --> 00:06:21,200 Speaker 4: And then SEO came along, and
134 00:06:21,360 --> 00:06:24,800 Speaker 4: it allowed you to maintain a relatively similar format. But
135 00:06:24,880 --> 00:06:27,360 Speaker 4: the thing that really sort of turned it was when
136 00:06:27,480 --> 00:06:30,240 Speaker 4: Facebook arrived as, like, the thing that would give you
137 00:06:30,279 --> 00:06:34,159 Speaker 4: traffic, around twenty twelve. And all of a sudden, everything
138 00:06:34,160 --> 00:06:36,400 Speaker 4: you wrote... it didn't matter what your website looked like, didn't
139 00:06:36,400 --> 00:06:38,240 Speaker 4: matter what your website was. Everything you wrote had to
140 00:06:38,320 --> 00:06:41,440 Speaker 4: sound like it could be shared on a social feed
141 00:06:41,520 --> 00:06:41,920 Speaker 4: or whatever.
142 00:06:42,040 --> 00:06:42,479 Speaker 1: That's right.
143 00:06:42,600 --> 00:06:44,440 Speaker 4: And that's when Fusion and all these... that's when all
144 00:06:44,440 --> 00:06:46,839 Speaker 4: the VC money started really coming in, and the big
145 00:06:46,920 --> 00:06:47,599 Speaker 4: corporate money.
146 00:06:48,160 --> 00:06:50,440 Speaker 2: Yeah, a very dark period. I mean, I'm just trying
147 00:06:50,440 --> 00:06:52,239 Speaker 2: to think... I'm trying to think of it in personal
148 00:06:52,320 --> 00:06:54,919 Speaker 2: terms, only because that's easier for me. My memory is
149 00:06:55,000 --> 00:06:56,440 Speaker 2: very bad, and so I have to go, like, well,
150 00:06:56,440 --> 00:06:59,240 Speaker 2: what was I doing? What was I doing at that point?
151 00:06:59,360 --> 00:06:59,480 Speaker 1: Right?
152 00:06:59,520 --> 00:07:01,840 Speaker 2: So, like, yeah, the Engadget crew left to start The Verge
153 00:07:01,880 --> 00:07:05,880 Speaker 2: in, like, twenty ten, I think, maybe early twenty eleven.
154 00:07:05,920 --> 00:07:08,320 Speaker 2: We launched The Verge in November of twenty eleven. I'm
155 00:07:08,320 --> 00:07:11,440 Speaker 2: pretty sure... that sounds wrong, but I think it's actually
156 00:07:11,480 --> 00:07:13,800 Speaker 2: right. Anyhow, whatever. But that to me is, like,
157 00:07:13,960 --> 00:07:15,360 Speaker 2: sort of... that's about when the
158 00:07:15,440 --> 00:07:18,080 Speaker 2: kickoff of the end was. Not to... not to
159 00:07:18,080 --> 00:07:20,520 Speaker 2: be, you know, self-centric or whatever. But, like, I
160 00:07:20,560 --> 00:07:23,840 Speaker 2: think we were legitimately doing the blogging thing, where we're like,
161 00:07:23,880 --> 00:07:25,640 Speaker 2: we're going to take our team and go do a thing.
162 00:07:25,680 --> 00:07:27,160 Speaker 2: And we went to a company that wasn't... it was
163 00:07:27,280 --> 00:07:30,160 Speaker 2: VC-backed, but it wasn't, like, an NBC
164 00:07:30,360 --> 00:07:33,640 Speaker 2: or whatever. It wasn't like Telemundo was backing it
165 00:07:33,720 --> 00:07:37,200 Speaker 2: or whatever. You're writing... you have a Substack? Am
166 00:07:37,240 --> 00:07:38,000 Speaker 2: I allowed to say that?
167 00:07:38,320 --> 00:07:38,760 Speaker 1: Yeah?
168 00:07:38,800 --> 00:07:40,960 Speaker 2: A lot of times on the Internet, somebody says Substack,
169 00:07:41,000 --> 00:07:43,440 Speaker 2: they might put an asterisk in between the S and
170 00:07:43,480 --> 00:07:45,400 Speaker 2: the B, because they don't know if Elon Musk is
171 00:07:45,400 --> 00:07:46,040 Speaker 2: going to block
172 00:07:45,880 --> 00:07:47,560 Speaker 4: it for all of Twitter. Yeah, well, just don't put it in
173 00:07:47,560 --> 00:07:49,120 Speaker 4: the title of this podcast.
174 00:07:49,200 --> 00:07:50,600 Speaker 1: Yeah, let's talk. We're gonna have to end.
175 00:07:50,560 --> 00:07:53,280 Speaker 2: As a note for Lyra and Jenna, please let's make
176 00:07:53,320 --> 00:07:56,320 Speaker 2: sure we don't add the word Substack fully spelled out
177 00:07:56,320 --> 00:07:56,920 Speaker 2: in the title.
178 00:07:57,240 --> 00:07:59,240 Speaker 1: But you have a Substack, which is a...
179 00:08:00,680 --> 00:08:03,559 Speaker 2: Is a company that has created a platform for people
180 00:08:03,600 --> 00:08:08,320 Speaker 2: to publish newsletters which look a lot like blog posts. Yeah,
181 00:08:08,360 --> 00:08:09,960 Speaker 2: and just stop me at any point if you...
182 00:08:10,000 --> 00:08:11,679 Speaker 2: if this sounds like this is wrong.
183 00:08:12,040 --> 00:08:15,680 Speaker 2: And also, you can charge people a subscription fee for
184 00:08:15,840 --> 00:08:19,720 Speaker 2: your newsletter, and in doing so create, like, a
185 00:08:19,760 --> 00:08:24,400 Speaker 2: personal media company, like a single-person media brand.
186 00:08:24,840 --> 00:08:25,680 Speaker 1: And that's what you've done.
187 00:08:25,760 --> 00:08:28,240 Speaker 2: Now you have a thing called Read Max, which, I
188 00:08:28,240 --> 00:08:32,200 Speaker 2: have to say, as far as titles go, is very unoriginal
189 00:08:32,280 --> 00:08:37,160 Speaker 2: for you, but I guess pretty clever as well. Explain
190 00:08:37,200 --> 00:08:39,720 Speaker 2: to the listeners, who I'm sure are all subscribers of
191 00:08:39,760 --> 00:08:41,600 Speaker 2: Read Max... by the way, I think the
192 00:08:41,640 --> 00:08:44,080 Speaker 2: crossover is extremely high. But just explain to them, like,
193 00:08:44,120 --> 00:08:45,400 Speaker 2: what your newsletter does.
194 00:08:46,679 --> 00:08:49,160 Speaker 4: The tagline is that it's a newsletter about the future.
195 00:08:49,840 --> 00:08:53,000 Speaker 1: Really? Yeah, is that the tagline? Yeah? It is? Wow.
196 00:08:53,160 --> 00:08:56,280 Speaker 4: Okay, interesting. You know, I started it... it's funny, I
197 00:08:56,280 --> 00:08:57,800 Speaker 4: was thinking about this the other day. I started it
198 00:08:57,800 --> 00:09:00,240 Speaker 4: with this sort of manifesto post that people
199 00:09:00,240 --> 00:09:02,719 Speaker 4: can still go back and look at, where I said
200 00:09:02,760 --> 00:09:05,640 Speaker 4: I wanted to write a lot about the way life
201 00:09:06,000 --> 00:09:09,320 Speaker 4: in the twenty-first century had been shaped by the mega
202 00:09:09,320 --> 00:09:12,720 Speaker 4: platforms of the Internet, the way that not just sort
203 00:09:12,760 --> 00:09:18,080 Speaker 4: of economics but also culture and social formations get worked
204 00:09:18,120 --> 00:09:21,880 Speaker 4: through your Facebooks and your Instagrams and your TikToks. But,
205 00:09:22,280 --> 00:09:25,800 Speaker 4: as it happens, you know, my editorial philosophy has always
206 00:09:25,840 --> 00:09:29,040 Speaker 4: sort of been that readers like to feel passion
207 00:09:29,320 --> 00:09:31,319 Speaker 4: and joy. I sound like I'm giving, like, an upfronts
208 00:09:31,320 --> 00:09:32,520 Speaker 4: presentation to sell the advertisers.
209 00:09:32,720 --> 00:09:34,960 Speaker 1: I love that. I wish you had, like, a clicker
210 00:09:35,000 --> 00:09:35,640 Speaker 1: and some slides.
211 00:09:35,760 --> 00:09:38,000 Speaker 2: Exactly, right? If you really could, you'd have a big,
212 00:09:38,120 --> 00:09:40,400 Speaker 2: like, a pie chart with some percentage of readers who
213 00:09:40,520 --> 00:09:40,720 Speaker 2: like the...
214 00:09:40,720 --> 00:09:43,040 Speaker 4: Readers love... imagine I'm in, like, a black turtleneck.
215 00:09:43,320 --> 00:09:45,720 Speaker 2: There's, like... yeah, it's like seventy-eight percent
216 00:09:45,760 --> 00:09:47,080 Speaker 2: of readers love to feel passion.
217 00:09:47,160 --> 00:09:49,800 Speaker 1: There's a small sliver that don't like passion. Anyhow.
218 00:09:49,720 --> 00:09:51,160 Speaker 4: So this is what happens when you spend any time
219 00:09:51,200 --> 00:09:53,240 Speaker 4: in editorial management, is you learn to talk like this.
220 00:09:53,760 --> 00:09:57,560 Speaker 4: That being said, cliched as it sounds, it's, like,
221 00:09:57,640 --> 00:10:01,160 Speaker 4: not worth me finding something to say about tech every
222 00:10:01,200 --> 00:10:03,000 Speaker 4: week if it means that once a week is going
223 00:10:03,040 --> 00:10:06,640 Speaker 4: to be boring. So the tagline of the newsletter
224 00:10:06,679 --> 00:10:08,520 Speaker 4: is "a newsletter about the future." In practice, it's a
225 00:10:08,559 --> 00:10:12,880 Speaker 4: newsletter about things that I think are interesting, which includes AI,
226 00:10:13,240 --> 00:10:18,319 Speaker 4: includes crypto, includes, you know, platforms like Facebook, includes media stuff.
227 00:10:18,320 --> 00:10:22,240 Speaker 4: Also includes action movies, sci-fi movies. Yes, sort of
228 00:10:22,240 --> 00:10:25,760 Speaker 4: whatever is interesting me at any given moment. So
229 00:10:25,800 --> 00:10:28,559 Speaker 4: it's hard to give the elevator pitch. But people who
230 00:10:28,800 --> 00:10:30,640 Speaker 4: are on the same wavelength that I am, I
231 00:10:30,640 --> 00:10:33,600 Speaker 4: think, like it, because it gives them a
232 00:10:33,640 --> 00:10:37,040 Speaker 4: sort of weekly dose of interesting thinking about, you know,
233 00:10:37,080 --> 00:10:40,440 Speaker 4: whatever is going on, in whatever things are interesting me
234 00:10:40,480 --> 00:10:41,000 Speaker 4: at that moment.
235 00:10:41,440 --> 00:10:44,840 Speaker 2: Yeah, I think the most successful publications, whether
236 00:10:44,880 --> 00:10:48,839 Speaker 2: they're a person doing a newsletter or, you know, lots
237 00:10:48,840 --> 00:10:51,120 Speaker 2: of people working on something, tend to be the ones
238 00:10:51,200 --> 00:10:55,520 Speaker 2: that center not necessarily around a very hard line of
239 00:10:55,559 --> 00:10:59,559 Speaker 2: specific topics, but a kind of philosophical through line. And,
240 00:11:00,160 --> 00:11:02,000 Speaker 2: I mean, I agree with you, it would be boring to
241 00:11:02,080 --> 00:11:06,040 Speaker 2: simply write about, like, whatever that description is supposed to
242 00:11:06,160 --> 00:11:08,440 Speaker 2: encapsulate, or could encapsulate. It probably would be very boring
243 00:11:08,480 --> 00:11:10,640 Speaker 2: every week to hear about that. But you do
244 00:11:10,640 --> 00:11:13,719 Speaker 2: go pretty far afield. Yeah, like, I downloaded and
245 00:11:13,760 --> 00:11:15,719 Speaker 2: started watching this film Nemesis the other day, which I
246 00:11:15,800 --> 00:11:18,199 Speaker 2: definitely saw in my youth. I mean, you might have
247 00:11:18,240 --> 00:11:21,200 Speaker 2: written on more than one occasion about the movie, but it's
248 00:11:21,240 --> 00:11:22,840 Speaker 2: not a good movie. I mean, to me... just for
249 00:11:22,920 --> 00:11:26,120 Speaker 2: starters, I want to be very clear, it's not,
250 00:11:26,200 --> 00:11:28,760 Speaker 2: like, high art or anything. It is... like, it was
251 00:11:28,800 --> 00:11:32,360 Speaker 2: an early nineties, yeah, like, cyber science fiction kind
252 00:11:32,400 --> 00:11:34,160 Speaker 2: of movie. It's like... there were a lot
253 00:11:34,200 --> 00:11:35,840 Speaker 2: of movies in the early nineties. Another one of them
254 00:11:35,920 --> 00:11:38,600 Speaker 2: is Hardware, which I'm sure you've seen, I hope you've seen,
255 00:11:39,480 --> 00:11:42,760 Speaker 2: which originally was rated X... very, like, for real,
256 00:11:42,920 --> 00:11:45,880 Speaker 2: like, actually it was rated X, or triple X, or
257 00:11:45,920 --> 00:11:47,719 Speaker 2: whatever they called it when it was, you know, like...
258 00:11:48,360 --> 00:11:50,720 Speaker 2: but there's this kind of genre of film: people who,
259 00:11:50,760 --> 00:11:53,720 Speaker 2: like, read Neuromancer and were like, that's cool, but I
260 00:11:53,720 --> 00:11:55,959 Speaker 2: don't really have much of a budget. What could I do?
261 00:11:56,120 --> 00:11:59,000 Speaker 2: Like, I like this, like, cyberpunk idea, but
262 00:11:59,040 --> 00:12:01,000 Speaker 2: I don't really have the budget to go all the way.
263 00:12:01,840 --> 00:12:03,560 Speaker 2: And so Nemesis is one of those. Because you wrote
264 00:12:03,559 --> 00:12:05,120 Speaker 2: about... well, you actually did a post, I want to
265 00:12:05,160 --> 00:12:09,480 Speaker 2: say it was, like, a ranking of the sunglasses from Nemesis,
266 00:12:09,600 --> 00:12:11,559 Speaker 2: the cool sunglasses from Nemesis.
267 00:12:11,800 --> 00:12:14,440 Speaker 1: Yeah. I felt so inspired after seeing that, that...
268 00:12:14,520 --> 00:12:16,240 Speaker 2: I was like, I've got to watch this movie again,
269 00:12:16,240 --> 00:12:18,840 Speaker 2: and so, uh, of course I went on the
270 00:12:18,880 --> 00:12:22,280 Speaker 2: Pirate Bay, which is still in existence, even though you
271 00:12:22,320 --> 00:12:25,160 Speaker 2: have to definitely get a computer
272 00:12:24,840 --> 00:12:27,480 Speaker 1: virus to access it. You absolutely have
273 00:12:27,480 --> 00:12:29,360 Speaker 2: to get a computer virus to go to the Pirate
274 00:12:29,360 --> 00:12:33,520 Speaker 2: Bay. And downloaded, you know, an HD, like,
275 00:12:33,679 --> 00:12:35,520 Speaker 2: Blu-ray rip of Nemesis.
276 00:12:35,320 --> 00:12:36,920 Speaker 4: Pristine. Twenty one sixty.
277 00:12:37,200 --> 00:12:38,760 Speaker 1: You know... no, no, no, I can't be...
278 00:12:39,080 --> 00:12:41,840 Speaker 2: I can't be watching Nemesis in some seven twenty shit.
279 00:12:42,080 --> 00:12:44,240 Speaker 2: Like, I need to see... I need to see every
280 00:12:44,320 --> 00:12:48,720 Speaker 2: grain of sand in the extremely sandy atmosphere of the
281 00:12:48,760 --> 00:12:50,880 Speaker 2: film. Anyhow. But so you're writing about... like, you
282 00:12:50,920 --> 00:12:54,320 Speaker 2: write about stuff that is definitely not the future. But
283 00:12:54,400 --> 00:12:57,719 Speaker 2: I am curious, what is, like, in your mind... like,
284 00:12:57,760 --> 00:12:59,640 Speaker 2: when you say the future, what are things you've written
285 00:12:59,640 --> 00:13:01,680 Speaker 2: and talked about and are thinking about recently that would
286 00:13:01,720 --> 00:13:02,880 Speaker 2: fall into that category?
287 00:13:03,400 --> 00:13:05,600 Speaker 4: I mean, I'm not any different than anybody else who's
288 00:13:05,640 --> 00:13:09,320 Speaker 4: paying attention right now in thinking that, like, AI
289 00:13:09,360 --> 00:13:12,319 Speaker 4: is kind of the thing that everybody is
290 00:13:12,360 --> 00:13:16,040 Speaker 4: talking about, and therefore becomes interesting just because it is
291 00:13:16,080 --> 00:13:19,200 Speaker 4: a thing that everybody's talking about. You know,
292 00:13:19,679 --> 00:13:21,360 Speaker 4: over the last couple of years, I would describe myself
293 00:13:21,360 --> 00:13:25,319 Speaker 4: as a crypto skeptic, but I still found crypto a
294 00:13:25,440 --> 00:13:29,120 Speaker 4: kind of, like, object of interest worth writing about and
295 00:13:29,200 --> 00:13:32,560 Speaker 4: thinking about, both as a reason to criticize it, but
296 00:13:32,600 --> 00:13:35,760 Speaker 4: also, you know, the communities that arise around something like crypto,
297 00:13:35,920 --> 00:13:39,400 Speaker 4: the kind of, like, the artwork, I suppose, or
298 00:13:39,440 --> 00:13:42,600 Speaker 4: the cultural formation, if you want to call it that.
299 00:13:42,840 --> 00:13:45,320 Speaker 4: I mean, between Nemesis and Bored Apes, clearly I have a
300 00:13:45,360 --> 00:13:47,679 Speaker 4: real fascination with bad art, or something.
301 00:13:47,840 --> 00:13:48,880 Speaker 1: That's interesting to me here.
302 00:13:49,120 --> 00:13:51,720 Speaker 2: I've often described the NFT boom as the Olympics of
303 00:13:51,760 --> 00:13:54,560 Speaker 2: bad art, which, I would... I think it's just, like,
304 00:13:54,600 --> 00:13:58,240 Speaker 2: an astounding quantity of... just, like, the people competing for
305 00:13:58,280 --> 00:14:00,679 Speaker 2: the top slot in, like, the bad art. Yeah, I
306 00:14:00,720 --> 00:14:02,320 Speaker 2: mean, just some of the worst shit ever put
307 00:14:02,320 --> 00:14:03,400 Speaker 2: into humanity.
308 00:14:03,600 --> 00:14:05,360 Speaker 4: I mean, totally terrible. But isn't that... to me,
309 00:14:05,440 --> 00:14:07,880 Speaker 4: that's, like... it's sort of fascinating. Like, there's something fascinating
310 00:14:07,880 --> 00:14:12,400 Speaker 4: about this, like, weird grifter culture, like, tacky grifter culture
311 00:14:12,400 --> 00:14:14,760 Speaker 4: emerging, like, billionaires.
312 00:14:15,320 --> 00:14:18,000 Speaker 2: Yeah, what happens when you find out that, like, something
313 00:14:18,080 --> 00:14:20,160 Speaker 2: that you could scam people out of money with is art,
314 00:14:20,640 --> 00:14:22,560 Speaker 2: and, like... and it's, like, not art, like,
315 00:14:22,680 --> 00:14:24,800 Speaker 2: "I have a Picasso," right? It's not...
316 00:14:25,240 --> 00:14:27,120 Speaker 2: not that that's a scam. That's a whole different
317 00:14:27,240 --> 00:14:32,440 Speaker 2: type of scam, actually, but one that is real. But...
318 00:14:32,440 --> 00:14:34,480 Speaker 2: but yeah, like, what happens... I mean, that's
319 00:14:34,520 --> 00:14:36,680 Speaker 2: an interesting place to explore it, right? Yeah, I hear
320 00:14:36,720 --> 00:14:38,160 Speaker 2: what you're saying. Like, and...
321 00:14:38,320 --> 00:14:40,400 Speaker 4: You know, like, right now, all of
322 00:14:40,440 --> 00:14:43,000 Speaker 4: that energy, all of that kind of, like, frenetic, like,
323 00:14:43,080 --> 00:14:46,400 Speaker 4: what's next, what's big, what's interesting energy is plunged
324 00:14:46,440 --> 00:14:49,800 Speaker 4: into large language models and the AI scene in general.
325 00:14:50,080 --> 00:14:52,240 Speaker 4: And to me this is even more kind of fruitful
326 00:14:52,240 --> 00:14:55,560 Speaker 4: and interesting than crypto, because the technology is much more
327 00:14:55,560 --> 00:14:59,760 Speaker 4: obviously impressive and, like, has applications you can think of
328 00:14:59,800 --> 00:15:02,479 Speaker 4: without having to, like, you know, get a lobotomy
329 00:15:02,640 --> 00:15:04,960 Speaker 4: and start talking like Marc Andreessen or whatever
330 00:15:05,160 --> 00:15:09,040 Speaker 2: about slurp juice to understand, like, what AI could do
331 00:15:09,120 --> 00:15:09,360 Speaker 2: for you?
332 00:15:09,560 --> 00:15:11,720 Speaker 4: Yeah. And, you know, it's also interesting to me
333 00:15:11,760 --> 00:15:14,360 Speaker 4: for the obvious reason that, like, as a writer,
334 00:15:14,760 --> 00:15:18,280 Speaker 4: like, this is technology that directly overlaps
335 00:15:18,360 --> 00:15:20,920 Speaker 4: with what I am trained to do and paid to do,
336 00:15:22,320 --> 00:15:24,160 Speaker 4: you know. And I try to... in general, my
337 00:15:24,200 --> 00:15:27,280 Speaker 4: approach to this stuff is to try and figure out
338 00:15:27,360 --> 00:15:30,760 Speaker 4: what excites... what excites other people about
339 00:15:30,760 --> 00:15:32,720 Speaker 4: something that doesn't necessarily excite me, or, like, what is...
340 00:15:32,960 --> 00:15:36,240 Speaker 4: what do people find so captivating about it? It's a
341 00:15:36,240 --> 00:15:38,120 Speaker 4: little easier to do that because, like I said, it's
342 00:15:38,160 --> 00:15:40,640 Speaker 4: sort of obviously impressive, and I try to approach it
343 00:15:40,680 --> 00:15:45,360 Speaker 4: not from, like, a purely critical, purely kind of negative standpoint.
344 00:15:45,480 --> 00:15:48,080 Speaker 4: Even if I think, you know, Sam Altman is not
345 00:15:48,280 --> 00:15:51,120 Speaker 4: a great guy or somebody who's got my best interests
346 00:15:51,160 --> 00:15:51,480 Speaker 4: in mind.
347 00:15:51,560 --> 00:15:53,760 Speaker 1: Frankly, are they ever a great guy? Are they? When
348 00:16:04,840 --> 00:16:06,400 Speaker 2: was the last time there was, like, a huge new
349 00:16:06,440 --> 00:16:09,240 Speaker 2: technology started by somebody that you're like, that is a
350 00:16:09,320 --> 00:16:10,000 Speaker 2: great person?
351 00:16:10,080 --> 00:16:12,160 Speaker 1: I really like him. They seem cool.
352 00:16:12,200 --> 00:16:15,120 Speaker 2: I mean, it is unusual, right? Yeah. I mean, people
353 00:16:15,200 --> 00:16:18,600 Speaker 2: used to love, like, Steve Jobs, but even Steve Jobs,
354 00:16:18,640 --> 00:16:22,720 Speaker 2: you know, upon further reflection, by the way, always seemed
355 00:16:22,720 --> 00:16:24,440 Speaker 2: like kind of not a great guy, like he was
356 00:16:24,560 --> 00:16:25,240 Speaker 2: sort of a dick.
357 00:16:25,800 --> 00:16:27,440 Speaker 4: I mean, you must remember... I feel like it's
358 00:16:27,760 --> 00:16:30,040 Speaker 4: just a cultural difference between, like, the nineties
359 00:16:30,080 --> 00:16:31,840 Speaker 4: and two thousands and now, is people used to sort
360 00:16:31,840 --> 00:16:34,800 Speaker 4: of laugh about Steve Jobs as an abusive boss,
361 00:16:34,800 --> 00:16:37,680 Speaker 4: which he really obviously was. Every story you'd hear, these
362 00:16:37,680 --> 00:16:40,920 Speaker 4: sort of semi-heroic stories about him just completely bitching
363 00:16:40,960 --> 00:16:44,240 Speaker 4: people out, screaming at them, whatever. Like, in
364 00:16:44,320 --> 00:16:46,680 Speaker 4: the Mac press, in MacAddict or whatever, it was always
365 00:16:46,920 --> 00:16:49,400 Speaker 4: treated as this funny little thing. And now... like,
366 00:16:49,640 --> 00:16:51,800 Speaker 4: if that stuff had come out now,
367 00:16:51,960 --> 00:16:53,920 Speaker 4: people would have been up in arms. I mean, because
368 00:16:53,920 --> 00:16:55,880 Speaker 4: he sounds like he was a horrible person to work
369 00:16:55,920 --> 00:16:57,640 Speaker 4: for in basically every way.
370 00:16:57,480 --> 00:16:59,880 Speaker 2: He does, yes. But then you also have to
371 00:17:00,120 --> 00:17:04,760 Speaker 2: wonder... could a Steve Jobs level creator exist?
372 00:17:05,560 --> 00:17:07,800 Speaker 2: I mean, this is the ultimate question: if the man
373 00:17:07,960 --> 00:17:10,359 Speaker 2: wasn't allowed to be a complete, like, rageaholic or whatever.
374 00:17:10,359 --> 00:17:12,959 Speaker 2: He also cried. He also cried a lot, right?
375 00:17:13,000 --> 00:17:14,199 Speaker 2: That was the other thing. Like, in the
376 00:17:14,200 --> 00:17:16,440 Speaker 2: Isaacson book, they talked about him crying, which
377 00:17:16,440 --> 00:17:18,919 Speaker 2: I find just, like, incongruous with him. I mean,
378 00:17:18,960 --> 00:17:20,920 Speaker 2: I can see it, like, in my mind's eye.
379 00:17:20,960 --> 00:17:23,320 Speaker 2: I'm like, okay, I get it. But, like, imagine being
380 00:17:23,359 --> 00:17:26,360 Speaker 2: in any work scenario. Okay, you're a guy.
381 00:17:26,440 --> 00:17:29,560 Speaker 2: You go into an office, your boss is berating you
382 00:17:29,720 --> 00:17:32,000 Speaker 2: or something, or, like, you have to deliver bad news,
383 00:17:32,119 --> 00:17:36,520 Speaker 2: and his reaction is he starts crying in the room. Forget
384 00:17:36,560 --> 00:17:39,760 Speaker 2: about... forget about abuse for a second. Let's take the
385 00:17:39,800 --> 00:17:42,399 Speaker 2: abuse stuff out of it, just for a moment, just
386 00:17:42,400 --> 00:17:44,359 Speaker 2: put it to the side. I can't think of a
387 00:17:44,480 --> 00:17:47,199 Speaker 2: more uncomfortable situation for a person to be in. I mean,
388 00:17:47,240 --> 00:17:50,359 Speaker 2: obviously there are more uncomfortable situations, but it's very unusual, and
389 00:17:50,440 --> 00:17:52,760 Speaker 2: apparently it happened on a regular basis that he would break
390 00:17:52,800 --> 00:17:55,520 Speaker 2: down in tears during a meeting, or, like, during
391 00:17:55,520 --> 00:17:58,960 Speaker 2: an argument or something. Which is, like... it's just very...
392 00:17:59,080 --> 00:18:00,480 Speaker 2: like, he had a lot of volatile emotions.
393 00:18:00,480 --> 00:18:02,800 Speaker 4: But that's how we got the iPhone. From crying, lots
394 00:18:02,840 --> 00:18:03,679 Speaker 4: of it.
395 00:18:03,760 --> 00:18:06,000 Speaker 2: It fucking raises the question, because you know what happened. I
396 00:18:06,040 --> 00:18:08,159 Speaker 2: guarantee you. I guarantee you, some people brought him a
397 00:18:08,160 --> 00:18:11,040 Speaker 2: shitty fucking touch screen, like the original Android. I mean,
398 00:18:11,080 --> 00:18:12,840 Speaker 2: I don't know if you remember how Android phones were,
399 00:18:12,880 --> 00:18:15,440 Speaker 2: the first version of them... they had, like... the touch
400 00:18:15,480 --> 00:18:18,119 Speaker 2: screen sensitivity was all over the fucking map. They sucked
401 00:18:18,160 --> 00:18:21,480 Speaker 2: to use. Yeah, somebody brought him that shit, and
402 00:18:21,520 --> 00:18:23,760 Speaker 2: he had a meltdown on them. He threw the fucking
403 00:18:23,800 --> 00:18:26,119 Speaker 2: phone in their face. He did a David Pogue. He
404 00:18:26,200 --> 00:18:28,520 Speaker 2: chucked the phone directly at their face. Remember when David
405 00:18:28,520 --> 00:18:30,320 Speaker 2: Pogue threw a phone at his wife? This is... oh, yeah,
406 00:18:30,320 --> 00:18:32,239 Speaker 2: sorry, it's just... I think about it all
407 00:18:32,280 --> 00:18:34,720 Speaker 2: the time. Whenever I notice people talk about the iPhone,
408 00:18:34,720 --> 00:18:36,520 Speaker 2: I'd be like, good ol' Dave. David Pogue throwing an
409 00:18:36,520 --> 00:18:39,120 Speaker 2: iPhone at his wife. Oh, also horrible, by the way,
410 00:18:39,119 --> 00:18:41,840 Speaker 2: horrible abusive thing to do. But yeah, Steve Jobs... they
411 00:18:41,840 --> 00:18:44,600 Speaker 2: gave him the shitty touch screen. He threw it at somebody.
412 00:18:44,600 --> 00:18:46,479 Speaker 2: He was like, this is not it, and eventually started crying, I
413 00:18:46,520 --> 00:18:49,960 Speaker 2: assume, at some point. And then eventually, out of fear, right,
414 00:18:50,320 --> 00:18:53,480 Speaker 2: they brought him a really good touch screen, like, just out
415 00:18:53,480 --> 00:18:55,760 Speaker 2: of pure fear. They were like, all right, like, we
416 00:18:55,920 --> 00:18:58,560 Speaker 2: have to do this, or, like, he's gonna be really upset.
417 00:18:58,600 --> 00:19:01,040 Speaker 2: We don't, like, want to make Steve upset. You know,
418 00:19:01,080 --> 00:19:03,239 Speaker 2: would we have gotten the iPhone, or would we have...
419 00:19:03,320 --> 00:19:05,400 Speaker 2: would it be a whole different future that we're
420 00:19:05,440 --> 00:19:07,600 Speaker 2: living in right now? Maybe there would be no social
421 00:19:07,680 --> 00:19:10,440 Speaker 2: media, because, when you think about it, without a great
422 00:19:10,440 --> 00:19:13,199 Speaker 2: touch screen, the iPhone's probably not a successful product. And
423 00:19:13,240 --> 00:19:15,600 Speaker 2: if it's not successful, then the whole social media boom
424 00:19:15,600 --> 00:19:16,760 Speaker 2: and all that shit really doesn't happen.
425 00:19:16,880 --> 00:19:18,479 Speaker 4: Maybe. Yeah, I mean, when you put it like that,
426 00:19:18,520 --> 00:19:20,120 Speaker 4: you're not making an argument that it was a good
427 00:19:20,160 --> 00:19:22,919 Speaker 4: thing. I'm not sure... if it's...
428 00:19:23,400 --> 00:19:25,840 Speaker 4: without bad bosses, we wouldn't have social media. Wouldn't that
429 00:19:25,880 --> 00:19:26,280 Speaker 4: be great?
430 00:19:26,440 --> 00:19:26,600 Speaker 2: Right?
431 00:19:26,840 --> 00:19:28,800 Speaker 4: Better workplaces and no Facebook.
432 00:19:28,920 --> 00:19:31,000 Speaker 2: But I've come full circle. I mean, when
433 00:19:31,000 --> 00:19:32,600 Speaker 2: I was twelve years old... I got on the Internet
434 00:19:32,640 --> 00:19:35,440 Speaker 2: when I was, like, twelve, right, and it was early internet.
435 00:19:36,000 --> 00:19:38,159 Speaker 2: But now I'm like, yeah, we've got to shut it down. Like,
436 00:19:38,280 --> 00:19:40,159 Speaker 2: why did we ever invent any of this stuff? This
437 00:19:40,200 --> 00:19:42,359 Speaker 2: seems bad. Like, we made a huge mistake.
438 00:19:43,680 --> 00:19:44,359 Speaker 1: Well, it's funny.
439 00:19:44,760 --> 00:19:47,080 Speaker 4: One thing about AI that I've noticed is I think
440 00:19:47,080 --> 00:19:50,600 Speaker 4: we've gone from a position... I mean, the sort of
441 00:19:50,600 --> 00:19:53,520 Speaker 4: capsule history, which is always more complicated than this,
442 00:19:53,560 --> 00:19:55,880 Speaker 4: but the capsule history I think that people tell about themselves,
443 00:19:56,160 --> 00:19:58,280 Speaker 4: journalists tell about themselves and the tech boom, is that
444 00:19:58,680 --> 00:20:01,960 Speaker 4: we were all a little bit too positive towards the
445 00:20:02,000 --> 00:20:04,200 Speaker 4: tech industry, that we were too excited, that we gave
446 00:20:04,240 --> 00:20:06,040 Speaker 4: them a pass through the rise.
447 00:20:06,240 --> 00:20:09,359 Speaker 2: Yeah, we're like, the touch screen on this iPhone's fucking amazing.
448 00:20:09,400 --> 00:20:11,200 Speaker 2: I mean, as a guy who reviewed most of them,
449 00:20:11,280 --> 00:20:13,200 Speaker 2: I was like, oh my god, the touch screen is
450 00:20:13,240 --> 00:20:13,560 Speaker 2: so good.
451 00:20:13,600 --> 00:20:15,879 Speaker 1: How did they do it? Steve Jobs has done it again!
452 00:20:16,320 --> 00:20:18,200 Speaker 4: And then when everything kind of fell apart
453 00:20:18,240 --> 00:20:21,000 Speaker 4: and it became clear how bad for all of us
454 00:20:21,080 --> 00:20:22,080 Speaker 4: all of this stuff
455 00:20:21,880 --> 00:20:22,399 Speaker 1: kind of was.
456 00:20:22,840 --> 00:20:24,960 Speaker 4: You know, everybody hunched over and was like, we can't
457 00:20:25,040 --> 00:20:26,520 Speaker 4: let that happen again. And I think a lot of
458 00:20:26,560 --> 00:20:30,200 Speaker 4: the, like, media critical reaction to a lot of AI
459 00:20:30,320 --> 00:20:32,719 Speaker 4: stuff is born out of that fear that we're going
460 00:20:32,760 --> 00:20:35,560 Speaker 4: to be too... you know, that we really
461 00:20:35,640 --> 00:20:38,240 Speaker 4: need to give these products, like this, and technologies like
462 00:20:38,280 --> 00:20:42,440 Speaker 4: this, really rigorous and clear kinds of investigations, goings-over,
463 00:20:42,520 --> 00:20:44,960 Speaker 4: like, make them as good as they possibly can be.
464 00:20:45,400 --> 00:20:45,560 Speaker 1: Right.
465 00:20:45,600 --> 00:20:48,560 Speaker 4: Obviously you want journalists who are oppositional and aggressive
466 00:20:48,600 --> 00:20:51,080 Speaker 4: and critical in all these ways. But it creates
467 00:20:51,080 --> 00:20:53,560 Speaker 4: a very funny and different kind of relationship
468 00:20:53,560 --> 00:20:56,119 Speaker 4: with the tech industry than existed in, say, like, two
469 00:20:56,160 --> 00:20:57,760 Speaker 4: thousand and five or two thousand and six.
470 00:20:58,280 --> 00:21:00,000 Speaker 1: Oh no, totally, totally.
471 00:21:00,119 --> 00:21:02,080 Speaker 2: I hear, by the way, the sounds of New York
472 00:21:02,080 --> 00:21:04,840 Speaker 2: in the background there. It's like, you got a siren
473 00:21:04,880 --> 00:21:06,800 Speaker 2: just creeping up. It's been so long since
474 00:21:06,840 --> 00:21:10,239 Speaker 2: I've heard that, now that I live out in the country. No,
475 00:21:10,320 --> 00:21:12,639 Speaker 2: you're right. I mean, I think there is probably, in
476 00:21:12,640 --> 00:21:14,080 Speaker 2: some way, an overadjustment.
477 00:21:14,240 --> 00:21:16,640 Speaker 1: I mean, actually... and I've probably told this story before.
478 00:21:16,400 --> 00:21:21,520 Speaker 2: But I had a meeting with... actually, it was
479 00:21:21,520 --> 00:21:23,719 Speaker 2: a fund... I was fundraising for The Outline, and we
480 00:21:23,760 --> 00:21:27,560 Speaker 2: met with Marc Andreessen, and it turned into an argument
481 00:21:27,600 --> 00:21:29,720 Speaker 2: between the two of us. It was an interesting meeting,
482 00:21:29,840 --> 00:21:32,080 Speaker 2: because there were, like, seven people there and it just
483 00:21:32,200 --> 00:21:35,080 Speaker 2: ended up with us arguing with each other. But one
484 00:21:35,160 --> 00:21:38,760 Speaker 2: of his arguments, generally about, like, the tech press...
485 00:21:38,760 --> 00:21:40,560 Speaker 2: I mean, The Outline wasn't pure tech, but it was,
486 00:21:40,600 --> 00:21:42,920 Speaker 2: like... he was like, you guys are... you're
487 00:21:42,960 --> 00:21:45,240 Speaker 2: all trashing us, and you're taking shots at us, and
488 00:21:45,280 --> 00:21:47,320 Speaker 2: you're trying to, like, tear down all the work that
489 00:21:47,359 --> 00:21:49,760 Speaker 2: we're doing. And one of my
490 00:21:49,920 --> 00:21:52,639 Speaker 2: things that I remember, and I think maybe struck a
491 00:21:52,680 --> 00:21:55,760 Speaker 2: chord with him... I was like, we, like, actually have
492 00:21:55,880 --> 00:21:59,720 Speaker 2: been, like, nothing but, like, wonderful to the tech industry,
493 00:21:59,800 --> 00:22:02,280 Speaker 2: and we expected, like, people in tech to
494 00:22:02,320 --> 00:22:04,720 Speaker 2: be better. Yeah. And you guys have ended up being
495 00:22:04,760 --> 00:22:08,000 Speaker 2: exactly like the fucking robber barons of yesteryear. Like,
496 00:22:08,280 --> 00:22:10,760 Speaker 2: you're supposed to be the next generation of, like, these
497 00:22:10,840 --> 00:22:14,119 Speaker 2: leaders who hold themselves to a higher standard,
498 00:22:14,200 --> 00:22:16,119 Speaker 2: or are more aware of what's going on in the
499 00:22:16,160 --> 00:22:19,520 Speaker 2: world, and, like, you know, more sensitive to, like, the
500 00:22:19,600 --> 00:22:24,120 Speaker 2: users and who they are and what they represent. Anyhow.
501 00:22:24,119 --> 00:22:26,199 Speaker 2: But, like, I think that is... they've reacted very
502 00:22:26,240 --> 00:22:28,960 Speaker 2: poorly to getting bad press, because they have gotten
503 00:22:29,000 --> 00:22:32,920 Speaker 2: so much good press, and so there's a hugely defensive stance.
504 00:22:33,280 --> 00:22:34,800 Speaker 4: I also think... I mean, I think that the other
505 00:22:34,880 --> 00:22:37,119 Speaker 4: aspect of this that is sort of unexpected, and
506 00:22:37,160 --> 00:22:38,800 Speaker 4: you wouldn't have been able to predict in two thousand
507 00:22:38,800 --> 00:22:42,359 Speaker 4: and six, is these are, like, the two main power
508 00:22:42,440 --> 00:22:45,760 Speaker 4: user groups on Twitter. So, like, suddenly all of the
509 00:22:45,800 --> 00:22:49,200 Speaker 4: media kind of got shuttled through this one single social
510 00:22:49,200 --> 00:22:52,280 Speaker 4: media feed, and it was the same feed that all
511 00:22:52,440 --> 00:22:55,359 Speaker 4: of tech was also on, and it just
512 00:22:55,480 --> 00:22:58,720 Speaker 4: was, like, just sort of a bad neighbors situation. Like,
513 00:22:58,800 --> 00:23:00,719 Speaker 4: maybe... like, all of a sudden, you went from, like...
514 00:23:01,000 --> 00:23:03,080 Speaker 4: maybe you get the Times delivered and you see a
515 00:23:03,119 --> 00:23:05,320 Speaker 4: negative article, but you also see the positive article, and
516 00:23:05,359 --> 00:23:06,800 Speaker 4: you also get the Journal, and there's a bunch of
517 00:23:06,840 --> 00:23:09,320 Speaker 4: different things. Now, you're like... you see every single... even
518 00:23:09,359 --> 00:23:12,760 Speaker 4: the lowest-level, you know, copy editors talking shit about
519 00:23:12,800 --> 00:23:15,560 Speaker 4: you, or, like, replying to you and calling you egghead
520 00:23:15,640 --> 00:23:18,360 Speaker 4: or whatever, and you're freaking out. Like, it's too much
521 00:23:18,359 --> 00:23:18,720 Speaker 4: for you.
522 00:23:19,200 --> 00:23:21,199 Speaker 2: That is mean. You should never... I don't think... I
523 00:23:21,200 --> 00:23:23,159 Speaker 2: think you should never go after people's looks.
524 00:23:23,200 --> 00:23:25,680 Speaker 1: I think... you know what I mean? I
525 00:23:25,920 --> 00:23:26,679 Speaker 1: have always felt this.
526 00:23:26,720 --> 00:23:28,919 Speaker 2: I mean, especially during the Trump years, people would be
527 00:23:28,920 --> 00:23:31,200 Speaker 2: like, ooh, the orange whatever. I'm like, that's not productive.
528 00:23:31,320 --> 00:23:33,200 Speaker 2: Like, it just isn't... like, it's not a good...
529 00:23:33,400 --> 00:23:35,640 Speaker 4: Clearly... I mean, it clearly got under Marc's skin. Well, clearly
530 00:23:35,680 --> 00:23:38,359 Speaker 4: got under a bunch of people's skins, various people.
531 00:23:38,560 --> 00:23:39,360 Speaker 4: I mean, it doesn't...
532 00:23:39,520 --> 00:23:39,719 Speaker 1: I mean...
533 00:23:39,760 --> 00:23:42,200 Speaker 4: The funny thing is that Andreessen then started his own
534 00:23:42,240 --> 00:23:46,600 Speaker 4: publication, to much fanfare, I think called Future.
535 00:23:46,640 --> 00:23:49,720 Speaker 2: He's also... he's also very interested in covering the future,
536 00:23:49,840 --> 00:23:50,800 Speaker 2: like, what future, and...
537 00:23:50,920 --> 00:23:51,800 Speaker 1: your publication is...
538 00:23:51,920 --> 00:23:53,400 Speaker 4: Then they shut it down, because... because if...
539 00:23:53,280 --> 00:23:55,000 Speaker 2: Who wants to read a fucking white paper from a
540 00:23:55,119 --> 00:23:57,520 Speaker 2: sixteen z, or from Andreessen Horowitz? Like, nobody wants
541 00:23:57,560 --> 00:24:00,760 Speaker 2: to read it, nobody cares, nobody wants to read PR. First off,
542 00:24:00,840 --> 00:24:02,480 Speaker 2: I wish I could tell all the PR people that.
543 00:24:02,520 --> 00:24:03,440 Speaker 2: I wish they could learn.
544 00:24:03,520 --> 00:24:05,200 Speaker 4: But, I mean, the other thing is, like, running
545 00:24:05,200 --> 00:24:07,840 Speaker 4: an actual, like, publication that publishes, with any kind of
546 00:24:07,840 --> 00:24:10,080 Speaker 4: regularity, good things that people want to read, with any
547 00:24:10,119 --> 00:24:12,399 Speaker 4: kind of regularity, is much more difficult than I think
548 00:24:12,840 --> 00:24:15,639 Speaker 4: people appreciate. I have this thing about... I think
549 00:24:15,640 --> 00:24:18,520 Speaker 4: there's something real about the way people, certain kinds of
550 00:24:18,520 --> 00:24:21,280 Speaker 4: people, relate to journalists and the media. Because
551 00:24:21,400 --> 00:24:24,840 Speaker 4: what we do is just write and just sort of
552 00:24:24,880 --> 00:24:27,439 Speaker 4: tell people what's happening in the world, there's
553 00:24:27,480 --> 00:24:29,800 Speaker 4: a sense that, like, anybody could kind of do it,
554 00:24:29,840 --> 00:24:32,200 Speaker 4: and so there's a resentment from
555 00:24:32,240 --> 00:24:34,560 Speaker 4: people like Andreessen or whomever at the
556 00:24:34,640 --> 00:24:37,000 Speaker 4: suggestion that, like, you know, I could be doing what
557 00:24:37,040 --> 00:24:39,480 Speaker 4: you're doing, I could do it better. And when you
558 00:24:39,520 --> 00:24:41,600 Speaker 4: actually try to do it, like, run the whole thing
559 00:24:41,640 --> 00:24:44,000 Speaker 4: as a business, like, hit everything that you need
560 00:24:44,000 --> 00:24:46,800 Speaker 4: to hit to be a good journalist and a good
561 00:24:46,880 --> 00:24:49,639 Speaker 4: editor and a good publisher, you recognize that, especially in
562 00:24:49,680 --> 00:24:51,800 Speaker 4: the current environment, especially in the environment that people like
563 00:24:51,840 --> 00:24:54,600 Speaker 4: Andreessen created, it's a lot of work. It takes a
564 00:24:54,640 --> 00:24:56,960 Speaker 4: lot of work and a lot of, like, accumulated skill
565 00:24:57,040 --> 00:24:59,719 Speaker 4: and knowledge. It's not something you can just replicate because
566 00:25:00,040 --> 00:25:02,639 Speaker 4: you've decided that today's journalists don't know how to do
567 00:25:02,680 --> 00:25:03,600 Speaker 4: it right or whatever.
568 00:25:04,040 --> 00:25:05,960 Speaker 2: I agree with everything you just said one hundred percent,
569 00:25:06,000 --> 00:25:09,520 Speaker 2: but I will also say, we have so devalued what
570 00:25:09,880 --> 00:25:10,440 Speaker 2: looks like
571 00:25:10,560 --> 00:25:13,119 Speaker 1: journalism and news.
572 00:25:14,080 --> 00:25:17,240 Speaker 2: It's been so greatly, terrifically devalued, and there are so
573 00:25:17,359 --> 00:25:20,440 Speaker 2: many people who truly suck doing it, like, really bad.
574 00:25:20,680 --> 00:25:23,639 Speaker 2: There are, like, bad actors at places like the Daily Wire
575 00:25:23,760 --> 00:25:27,000 Speaker 2: or whatever. There are people who are bad at it,
576 00:25:27,080 --> 00:25:28,800 Speaker 2: like, that just aren't good at their job, because...
577 00:25:28,800 --> 00:25:31,679 Speaker 2: there was probably a period where it was
578 00:25:31,760 --> 00:25:34,919 Speaker 2: easier than ever to go, hey, I'm gonna write, I'm
579 00:25:34,960 --> 00:25:38,200 Speaker 2: gonna make content. Like, journalism became this thing called content,
580 00:25:38,320 --> 00:25:41,080 Speaker 2: not the journalism with the capital J that you're probably
581 00:25:41,160 --> 00:25:42,600 Speaker 2: thinking about or we're talking about.
582 00:25:42,640 --> 00:25:42,920 Speaker 1: Really.
583 00:25:43,680 --> 00:25:46,440 Speaker 2: And also, it's all free, yeah, right? It's free to everybody,
584 00:25:46,440 --> 00:25:48,600 Speaker 2: and everybody thinks it should be free. And so I
585 00:25:48,600 --> 00:25:52,080 Speaker 2: think it's that combination of, like... you've got, like, legitimately
586 00:25:52,119 --> 00:25:54,320 Speaker 2: bad actors, you've got, legitimately, like, kind of people who
587 00:25:54,320 --> 00:25:57,159 Speaker 2: should never have gotten into the craft to begin with.
588 00:25:57,359 --> 00:25:59,360 Speaker 2: Like, not to throw it back to the NFT thing,
589 00:25:59,359 --> 00:26:01,600 Speaker 2: but it's kind of an interesting parallel,
590 00:26:01,600 --> 00:26:04,760 Speaker 2: where it's like, you really aren't an artist. You didn't, like...
591 00:26:04,960 --> 00:26:07,240 Speaker 2: you didn't train, you didn't really want to be an artist,
592 00:26:07,280 --> 00:26:08,959 Speaker 2: you kind of don't care about it that much, but,
593 00:26:09,000 --> 00:26:11,439 Speaker 2: like, you could get a job doing it, right? You
594 00:26:11,440 --> 00:26:13,720 Speaker 2: can get a job, like, making an NFT and, like,
595 00:26:13,760 --> 00:26:16,199 Speaker 2: maybe sell some and make a quick buck. I think
596 00:26:16,200 --> 00:26:17,560 Speaker 2: there's a lot of people who kind of found their
597 00:26:17,560 --> 00:26:20,640 Speaker 2: way into, like, content creation. That was like a machine.
598 00:26:20,680 --> 00:26:22,000 Speaker 1: It was, like... it kind of, like... you
599 00:26:22,000 --> 00:26:23,960 Speaker 2: know why we're all talking about AI a lot lately
600 00:26:24,119 --> 00:26:27,320 Speaker 2: in this venue? It's like, it became a
601 00:26:27,400 --> 00:26:30,480 Speaker 2: very... almost like a machine-generated game, where it's
602 00:26:30,520 --> 00:26:32,320 Speaker 2: like, you write a headline that will get clicks,
603 00:26:32,359 --> 00:26:34,640 Speaker 2: and you put some content below it that
604 00:26:34,680 --> 00:26:37,359 Speaker 2: feels like enough information to call it a story,
605 00:26:37,560 --> 00:26:40,840 Speaker 2: and that's, like, what journalism is. There's actually a fair
606 00:26:41,000 --> 00:26:44,240 Speaker 2: argument to say that there is a lot of bad journalism.
607 00:26:44,440 --> 00:26:48,119 Speaker 2: It's just that what they're talking about isn't, like...
608 00:26:48,160 --> 00:26:51,320 Speaker 2: it isn't the Daily Wire, and it isn't, like, the
609 00:26:51,359 --> 00:26:56,440 Speaker 2: user-generated content on BuzzFeed dot com. It's people who say
610 00:26:56,480 --> 00:26:58,080 Speaker 2: things that are true that they don't want to hear
611 00:26:58,200 --> 00:26:59,520 Speaker 2: or don't want anybody else to see.
612 00:27:00,000 --> 00:27:00,240 Speaker 1: Basically.
613 00:27:00,359 --> 00:27:01,919 Speaker 4: The other thing I would want to say is that
614 00:27:01,960 --> 00:27:03,639 Speaker 4: it's not as though... I mean, the thing I still
615 00:27:03,640 --> 00:27:07,320 Speaker 4: love about the Internet is that you can find incredible
616 00:27:07,359 --> 00:27:11,760 Speaker 4: geniuses doing work that would otherwise never have been covered.
617 00:27:11,880 --> 00:27:14,560 Speaker 4: It's not like citizen... like, do I believe in 618 00:27:14,600 --> 00:27:19,800 Speaker 4: citizen journalism, like, conceptually, circa twenty twelve, Jeff Jarvis talking 619 00:27:19,880 --> 00:27:22,280 Speaker 4: about, like, citizen journalism? No. But do I believe 620 00:27:22,280 --> 00:27:25,760 Speaker 4: that there are incredibly talented writers, reporters, 621 00:27:26,119 --> 00:27:29,480 Speaker 4: journalists out there who would not have access to audiences 622 00:27:29,480 --> 00:27:30,399 Speaker 4: if it wasn't for the Internet? 623 00:27:30,480 --> 00:27:30,640 Speaker 1: Yeah. 624 00:27:30,640 --> 00:27:32,600 Speaker 4: Absolutely. I mean, those people are those 625 00:27:32,480 --> 00:27:35,439 Speaker 2: people, one hundred percent. I mean, to be clear, we 626 00:27:35,480 --> 00:27:37,400 Speaker 2: would not be having this conversation, I would have done 627 00:27:37,400 --> 00:27:39,920 Speaker 2: nothing, if it weren't for the Internet. I mean, I 628 00:27:40,200 --> 00:27:42,480 Speaker 2: didn't go to... as I think you know, 629 00:27:42,560 --> 00:27:44,679 Speaker 2: I barely even went to high school, but I definitely 630 00:27:44,680 --> 00:27:48,040 Speaker 2: didn't go to college for journalism. Like, I didn't 631 00:27:48,080 --> 00:27:50,439 Speaker 2: go to J-school and then leave and go intern 632 00:27:50,520 --> 00:27:52,359 Speaker 2: at the New York Times. Like, there used to be 633 00:27:52,400 --> 00:27:54,520 Speaker 2: a way that people did this that was very linear, right, 634 00:27:54,560 --> 00:27:56,200 Speaker 2: and it was a very closed circuit. 635 00:27:56,880 --> 00:27:57,359 Speaker 1: For sure. 636 00:27:57,440 --> 00:27:59,720 Speaker 2: The Internet has... I know that everybody talks about 637 00:27:59,720 --> 00:28:01,639 Speaker 2: this like it's kind of bullshit 638 00:28:01,760 --> 00:28:04,719 Speaker 2: these days, but it had leveled the playing 639 00:28:04,720 --> 00:28:09,840 Speaker 2: field. If a person was eager and excited and good, yeah, 640 00:28:09,920 --> 00:28:12,439 Speaker 2: and had any bit of talent, like, there was a 641 00:28:12,440 --> 00:28:14,520 Speaker 2: place to go and do it and, 642 00:28:14,840 --> 00:28:17,720 Speaker 2: like, hone it, you know, and that's real 643 00:28:17,800 --> 00:28:20,720 Speaker 2: and awesome. Yeah. And a lot of our best journalists, 644 00:28:20,720 --> 00:28:24,040 Speaker 2: a lot of the best journalists working today, came 645 00:28:24,080 --> 00:28:27,200 Speaker 2: from those backgrounds, like, not 646 00:28:27,280 --> 00:28:28,320 Speaker 2: straight from J-school. 647 00:28:28,359 --> 00:28:32,600 Speaker 4: You know, something that I believe, like, incredibly strongly, like, 648 00:28:32,680 --> 00:28:36,840 Speaker 4: on a political level, is that there's all this incredible, unused, 649 00:28:36,920 --> 00:28:42,040 Speaker 4: untapped talent, creative talent, creative genius, intelligence out there in 650 00:28:42,080 --> 00:28:44,680 Speaker 4: the world, in the US but around the world, that 651 00:28:44,920 --> 00:28:49,040 Speaker 4: is just ill-served by the political-economic systems that 652 00:28:49,040 --> 00:28:51,040 Speaker 4: we have working right now.
And the Internet at its 653 00:28:51,080 --> 00:28:53,720 Speaker 4: best is a way to level that playing field and 654 00:28:53,760 --> 00:28:56,320 Speaker 4: to, like, find outlets for those people, find ways for 655 00:28:56,360 --> 00:28:59,720 Speaker 4: them to make use of their incredible genius that otherwise 656 00:28:59,760 --> 00:29:02,080 Speaker 4: wouldn't exist. And then the Internet, at its worst, is 657 00:29:02,120 --> 00:29:05,200 Speaker 4: also, like, a way for very rich guys to, like, 658 00:29:05,280 --> 00:29:07,920 Speaker 4: find those talents and then just exploit the hell out 659 00:29:07,920 --> 00:29:09,400 Speaker 4: of the stuff they're creating. I mean, I think this 660 00:29:09,520 --> 00:29:12,800 Speaker 4: about social media for real. It's like, what makes Facebook 661 00:29:12,880 --> 00:29:16,040 Speaker 4: valuable, or Twitter or 662 00:29:16,080 --> 00:29:18,560 Speaker 4: TikTok or anything, is not the technology. It's the people who are 663 00:29:18,600 --> 00:29:22,520 Speaker 4: creating stuff that is engaging and entertaining and funny and 664 00:29:22,560 --> 00:29:25,200 Speaker 4: weird for free for these social networks. 665 00:29:25,280 --> 00:29:25,360 Speaker 2: Right. 666 00:29:25,640 --> 00:29:28,640 Speaker 4: The drils, yeah, exactly right. And you know, 667 00:29:28,680 --> 00:29:30,440 Speaker 4: this is the thing that I think is 668 00:29:30,440 --> 00:29:33,200 Speaker 4: worth thinking about as AI comes into being, the 669 00:29:33,240 --> 00:29:36,160 Speaker 4: sort of sense of, like, you know, what is AI 670 00:29:36,240 --> 00:29:38,680 Speaker 4: going to be used for? Because I think it has 671 00:29:38,720 --> 00:29:42,160 Speaker 4: this possibility as a sort of production tool, in the 672 00:29:42,200 --> 00:29:46,080 Speaker 4: same way that, you know, electronic music production 673 00:29:46,200 --> 00:29:48,520 Speaker 4: tools gave all these kids in their bedrooms 674 00:29:48,560 --> 00:29:51,160 Speaker 4: the ability to make beats and to make music and 675 00:29:51,360 --> 00:29:54,280 Speaker 4: to create stuff themselves without having to, you know, 676 00:29:54,440 --> 00:29:56,680 Speaker 4: pay one thousand dollars an hour for a studio or 677 00:29:56,720 --> 00:29:59,480 Speaker 4: whatever and get that stuff together. Like, I see a 678 00:29:59,480 --> 00:30:02,000 Speaker 4: lot of generative AI that feels like, maybe not 679 00:30:02,080 --> 00:30:03,840 Speaker 4: right now, but pretty soon, this is going to help 680 00:30:03,840 --> 00:30:06,360 Speaker 4: people create things themselves, even if they don't really have 681 00:30:06,400 --> 00:30:09,200 Speaker 4: access to these huge resources outside of it, right? And 682 00:30:09,200 --> 00:30:11,760 Speaker 4: what frustrates me is that it sure seems like the 683 00:30:11,840 --> 00:30:14,680 Speaker 4: business model that places like OpenAI are gunning for 684 00:30:14,840 --> 00:30:17,560 Speaker 4: is, instead of, like, let's enable people who otherwise wouldn't 685 00:30:17,560 --> 00:30:21,520 Speaker 4: be able to, you know, do these things, let's 686 00:30:21,560 --> 00:30:23,720 Speaker 4: find ways to replace the people who are already doing 687 00:30:23,760 --> 00:30:26,320 Speaker 4: these things with a shittier version that 688 00:30:26,400 --> 00:30:28,080 Speaker 4: the bosses can control or whatever.
689 00:30:28,640 --> 00:30:31,440 Speaker 2: Right, there's a lot there that I agree with and 690 00:30:31,480 --> 00:30:33,640 Speaker 2: also to unpack. But I will say that it is 691 00:30:33,680 --> 00:30:36,560 Speaker 2: interesting. I mean, I don't want to get into 692 00:30:36,640 --> 00:30:40,280 Speaker 2: a, you know, whether capitalism is good or not. 693 00:30:41,240 --> 00:30:44,320 Speaker 2: There's obviously some problems with it. But the 694 00:30:44,360 --> 00:30:48,160 Speaker 2: history of capitalism is, like, these 695 00:30:48,160 --> 00:30:52,320 Speaker 2: incredible innovations that are then manipulated into, like, 696 00:30:52,400 --> 00:30:55,360 Speaker 2: a massive business that then needs a bunch of 697 00:30:55,400 --> 00:30:58,040 Speaker 2: worker bees to, like, go and do it, and it's 698 00:30:58,080 --> 00:31:00,720 Speaker 2: owned by a very small segment. Like, the thing 699 00:31:00,720 --> 00:31:02,960 Speaker 2: itself is owned by a very small segment of the population, 700 00:31:03,320 --> 00:31:06,240 Speaker 2: but the actual work to make the thing is done 701 00:31:06,280 --> 00:31:07,840 Speaker 2: by a much larger segment 702 00:31:07,600 --> 00:31:08,200 Speaker 1: of the population. 703 00:31:08,240 --> 00:31:11,160 Speaker 2: And it's usually grueling and shitty, and people are underpaid 704 00:31:11,160 --> 00:31:12,680 Speaker 2: for it. And that is, like, 705 00:31:12,720 --> 00:31:15,760 Speaker 2: in some way... like, a Facebook, like you said, 706 00:31:15,840 --> 00:31:17,720 Speaker 2: Facebook or Twitter or whatever, 707 00:31:17,840 --> 00:31:20,080 Speaker 2: is like these amazing people creating for it, like, that's 708 00:31:20,120 --> 00:31:22,840 Speaker 2: the engine of it or whatever. But it has, you know, 709 00:31:23,520 --> 00:31:27,120 Speaker 2: over time turned into kind of, like, 710 00:31:27,160 --> 00:31:29,280 Speaker 2: a part of a machine. Like, those people 711 00:31:29,320 --> 00:31:32,600 Speaker 2: are a part of a machine that drives commerce. Right? Yeah, 712 00:31:32,880 --> 00:31:34,480 Speaker 2: I think that would be fine if the system of 713 00:31:34,520 --> 00:31:37,920 Speaker 2: commerce that it was built on actually made any 714 00:31:37,920 --> 00:31:41,120 Speaker 2: fucking sense. But the system of commerce it's built on 715 00:31:41,240 --> 00:31:44,120 Speaker 2: is based on a mistake, like, essentially a mistake about 716 00:31:44,120 --> 00:31:57,160 Speaker 2: the value of people on the Internet. I think there's, 717 00:31:57,160 --> 00:32:00,520 Speaker 2: like, a foundational, fundamental flaw in, like, how we monetize 718 00:32:00,560 --> 00:32:02,520 Speaker 2: content on the Internet. I've probably said this a million 719 00:32:02,600 --> 00:32:04,360 Speaker 2: times and I don't have to go into my spiel here, 720 00:32:04,560 --> 00:32:08,440 Speaker 2: but the way that Google came up with monetizing 721 00:32:08,560 --> 00:32:11,200 Speaker 2: search is the way that we basically monetize 722 00:32:10,680 --> 00:32:14,800 Speaker 1: everything, and it's wrong. It just was wrong.
723 00:32:14,880 --> 00:32:16,360 Speaker 2: It just was like, they could do that at the 724 00:32:16,360 --> 00:32:18,280 Speaker 2: time because that was all that was available to them. 725 00:32:18,320 --> 00:32:20,640 Speaker 2: Like, they had one notion of, like, how could we make money, 726 00:32:20,640 --> 00:32:22,840 Speaker 2: and like, well, this is a way, like a billboard 727 00:32:22,880 --> 00:32:25,400 Speaker 2: on a road. Like, that's one way to make money. 728 00:32:25,440 --> 00:32:27,280 Speaker 2: Like, you put a billboard up and some people drive 729 00:32:27,320 --> 00:32:29,880 Speaker 2: past it, and you can sell space on the billboard. 730 00:32:29,880 --> 00:32:32,720 Speaker 2: And sometimes people who drive past it will be like, hey, 731 00:32:33,320 --> 00:32:35,360 Speaker 2: yeah, I need a new car, like, I should 732 00:32:35,400 --> 00:32:38,040 Speaker 2: go check out the fucking Audi, you know, 733 00:32:38,120 --> 00:32:40,120 Speaker 2: the sign I drove past. But out of the 734 00:32:40,160 --> 00:32:41,960 Speaker 2: people who drive past it, most of them never go 735 00:32:42,080 --> 00:32:45,480 Speaker 2: check out the Audi or whatever. But over time, there's 736 00:32:45,480 --> 00:32:48,280 Speaker 2: some small value that can be extracted from that billboard. 737 00:32:48,480 --> 00:32:51,160 Speaker 2: But that's the entire business model of the Internet, 738 00:32:51,440 --> 00:32:54,480 Speaker 2: that every single thing on it is, like, as 739 00:32:54,520 --> 00:32:55,520 Speaker 2: devalued as 740 00:32:55,400 --> 00:32:56,680 Speaker 1: like a billboard or whatever. 741 00:32:56,920 --> 00:32:59,480 Speaker 2: Like, anyhow, I mean, it's not the perfect analogy, but 742 00:32:59,480 --> 00:33:00,760 Speaker 2: you know what I'm saying. So. 743 00:33:01,320 --> 00:33:01,840 Speaker 1: But it doesn't. 744 00:33:01,880 --> 00:33:03,280 Speaker 4: I mean, part of it too is that it means 745 00:33:03,280 --> 00:33:05,960 Speaker 4: that when you're doing creative work, you are also 746 00:33:06,360 --> 00:33:08,960 Speaker 4: having to think about how you yourself are also 747 00:33:09,000 --> 00:33:11,479 Speaker 4: a billboard. And so, like, I like my job, I 748 00:33:11,560 --> 00:33:13,280 Speaker 4: like my life. I like what I do on Substack. 749 00:33:13,360 --> 00:33:16,160 Speaker 4: But part 750 00:33:16,160 --> 00:33:17,840 Speaker 4: of what my Substack is, and part of what 751 00:33:17,880 --> 00:33:20,480 Speaker 4: I have to do as a writer who doesn't currently 752 00:33:20,600 --> 00:33:23,720 Speaker 4: have W-2 employment, is sort of be my own 753 00:33:23,800 --> 00:33:27,040 Speaker 4: marketing team across a bunch of different platforms so that 754 00:33:27,080 --> 00:33:28,920 Speaker 4: people know they can hire me, people know they can 755 00:33:28,920 --> 00:33:29,280 Speaker 4: find me. 756 00:33:29,600 --> 00:33:29,800 Speaker 1: Right. 757 00:33:29,840 --> 00:33:32,080 Speaker 4: It's the hustle, yeah, and it's not a great 758 00:33:32,120 --> 00:33:33,840 Speaker 4: way to live.
But the thing 759 00:33:33,840 --> 00:33:35,360 Speaker 4: that really, I think, is a problem with it is 760 00:33:35,400 --> 00:33:37,200 Speaker 4: that we know for a fact there are a lot 761 00:33:37,200 --> 00:33:39,200 Speaker 4: of writers who are more talented than me, who are 762 00:33:39,200 --> 00:33:43,640 Speaker 4: better than me at all the things I do. For instance... well, 763 00:33:43,640 --> 00:33:46,640 Speaker 4: without naming names, who don't 764 00:33:46,640 --> 00:33:48,840 Speaker 4: want to be part of that hustle, who are 765 00:33:48,840 --> 00:33:53,080 Speaker 4: not interested in that. Like, no, I'm talking about the opposite. 766 00:33:53,080 --> 00:33:55,000 Speaker 4: I'm talking about somebody who is not good at this 767 00:33:55,080 --> 00:33:57,120 Speaker 4: kind of thing, who doesn't want to 768 00:33:57,160 --> 00:34:00,520 Speaker 4: do it. Yes, there are very few ways for that person... yeah, 769 00:34:00,520 --> 00:34:02,600 Speaker 4: there are very few ways for that person to be compensated 770 00:34:02,640 --> 00:34:05,440 Speaker 4: for work they do. Whereas you have people who 771 00:34:05,600 --> 00:34:07,720 Speaker 4: are incredibly good at the hustle part of it, 772 00:34:08,080 --> 00:34:10,640 Speaker 4: again not naming names, incredibly good at the hustle part of it, 773 00:34:11,000 --> 00:34:13,360 Speaker 4: not particularly intelligent or whatever. 774 00:34:13,360 --> 00:34:15,760 Speaker 1: Writers. I think that's who we're talking about. 775 00:34:15,880 --> 00:34:18,360 Speaker 4: I mean, we can all imagine in our heads somebody 776 00:34:18,400 --> 00:34:23,240 Speaker 4: who fits these categories, for example. 777 00:34:23,320 --> 00:34:25,040 Speaker 4: They're ubiquitous. 778 00:34:24,560 --> 00:34:26,920 Speaker 1: Not intelligent, but good at the hustle. I'm thinking, actually... 779 00:34:26,920 --> 00:34:27,480 Speaker 1: they shake 780 00:34:27,320 --> 00:34:29,360 Speaker 4: the right hands, they put themselves in the right rooms. 781 00:34:29,719 --> 00:34:32,360 Speaker 4: It's one reason why the internet can make you so 782 00:34:32,480 --> 00:34:35,560 Speaker 4: mad when you go online these days. Like, the reward 783 00:34:35,640 --> 00:34:37,880 Speaker 4: is marketing. You get much better rewarded for marketing 784 00:34:37,920 --> 00:34:39,240 Speaker 4: than you do for quality. 785 00:34:39,160 --> 00:34:41,840 Speaker 2: Right, of course, no, yes, I mean... and 786 00:34:41,920 --> 00:34:44,279 Speaker 2: if you're bad at it, or don't like it, or feel like, 787 00:34:44,320 --> 00:34:47,000 Speaker 2: you know... some people, like, if you're from Generation X 788 00:34:47,040 --> 00:34:49,080 Speaker 2: like myself... you're not gen X, 789 00:34:49,040 --> 00:34:50,000 Speaker 1: are you? You're a millennial? 790 00:34:50,040 --> 00:34:50,760 Speaker 4: No, I'm a millennial. 791 00:34:50,840 --> 00:34:52,640 Speaker 2: Yeah, if you're, like, a gen X person, 792 00:34:52,640 --> 00:34:55,840 Speaker 2: you feel a kind of physical impulse to reject 793 00:34:56,239 --> 00:34:58,839 Speaker 2: self-promotion. And now, I don't know, maybe people would 794 00:34:58,840 --> 00:35:00,960 Speaker 2: say that I haven't rejected self-promotion. I'm 795 00:35:00,960 --> 00:35:02,520 Speaker 2: not really sure, but I would tell you...
796 00:35:02,560 --> 00:35:04,160 Speaker 2: I think if you ask Lyra and Jenna if I 797 00:35:04,239 --> 00:35:06,680 Speaker 2: am promoting this podcast enough, I think that they would 798 00:35:06,719 --> 00:35:08,000 Speaker 2: tell you that 799 00:35:08,000 --> 00:35:10,640 Speaker 1: I am not. And they wish I was hustling. They 800 00:35:10,640 --> 00:35:12,440 Speaker 1: wish I was hustling more. I don't know. I don't 801 00:35:12,440 --> 00:35:13,319 Speaker 1: want to speak for them. 802 00:35:13,840 --> 00:35:17,440 Speaker 2: All that said, you know, if you even are close 803 00:35:17,520 --> 00:35:21,560 Speaker 2: to making a living at doing what you're doing, it's 804 00:35:21,600 --> 00:35:25,960 Speaker 2: also fucking wild, because, you know, your parents and your 805 00:35:26,040 --> 00:35:29,240 Speaker 2: grandparents and certainly every generation, that is, almost every generation 806 00:35:29,280 --> 00:35:34,040 Speaker 2: that's come before us... this abstract thing that we are doing, 807 00:35:34,360 --> 00:35:37,040 Speaker 2: I mean, it is fucking weird. Like, it 808 00:35:37,160 --> 00:35:40,120 Speaker 2: is weird. Like, we go onto this thing, 809 00:35:40,280 --> 00:35:43,000 Speaker 2: the screen, we go into the screen, and then we 810 00:35:43,120 --> 00:35:46,000 Speaker 2: do something that, frankly, we like. Probably for the most part, 811 00:35:46,040 --> 00:35:48,120 Speaker 2: you probably like the writing that you're doing, right? You're 812 00:35:48,120 --> 00:35:50,120 Speaker 2: not, like... you don't wake up every day and you're like... 813 00:35:50,160 --> 00:35:51,239 Speaker 2: I mean, I don't know, maybe you got to do 814 00:35:51,239 --> 00:35:52,480 Speaker 2: the hustle, but you don't wake up every... 815 00:35:52,360 --> 00:35:54,200 Speaker 4: I mean, some weeks. Some weeks you're like, ah, do 816 00:35:54,239 --> 00:35:56,040 Speaker 4: I really have to do this? But for the most part, yeah, 817 00:35:56,400 --> 00:35:57,120 Speaker 4: it's good, right. 818 00:35:57,320 --> 00:36:00,000 Speaker 2: And I mean, I'm sure there are many struggling YouTubers 819 00:36:00,000 --> 00:36:01,600 Speaker 2: who feel the same way. But like, if you can 820 00:36:01,680 --> 00:36:03,359 Speaker 2: even make... I'm sorry, if you can even 821 00:36:03,400 --> 00:36:06,799 Speaker 2: make one dime just shooting a video of yourself and 822 00:36:06,800 --> 00:36:08,960 Speaker 2: putting it on the internet, like, it's kind of amazing, 823 00:36:09,040 --> 00:36:13,200 Speaker 2: because there was no period in history before this where 824 00:36:13,239 --> 00:36:15,560 Speaker 2: such a thing was possible in really any venue. I mean, 825 00:36:15,680 --> 00:36:18,120 Speaker 2: you couldn't just put a show on and have people 826 00:36:18,120 --> 00:36:20,920 Speaker 2: come to your play or whatever. You couldn't just make 827 00:36:20,960 --> 00:36:23,360 Speaker 2: a movie and put it in a theater. You couldn't 828 00:36:23,719 --> 00:36:26,360 Speaker 2: just, like, write a newspaper and have it exist in 829 00:36:26,400 --> 00:36:28,839 Speaker 2: front of people. So there is this amazing flip side, 830 00:36:28,880 --> 00:36:31,399 Speaker 2: which is, like, all of this seems like fantasy. 831 00:36:31,560 --> 00:36:33,960 Speaker 2: Like, I think if you were to go back fifty 832 00:36:34,040 --> 00:36:35,400 Speaker 2: years and describe 833 00:36:34,960 --> 00:36:36,879 Speaker 1: this to somebody, they'd be like... it would just...
834 00:36:36,840 --> 00:36:40,560 Speaker 2: sound like such a fantastic story, like such a 835 00:36:40,640 --> 00:36:43,400 Speaker 2: total bit of fantasy. So there's a flip side to 836 00:36:43,440 --> 00:36:44,839 Speaker 2: all of it, which is, like, the internet's a 837 00:36:44,880 --> 00:36:47,200 Speaker 2: cesspool full of horrible people, and the people who own 838 00:36:47,239 --> 00:36:50,000 Speaker 2: and control most parts of the Internet are tyrants who 839 00:36:50,080 --> 00:36:52,080 Speaker 2: are working everybody to the bone and don't give us 840 00:36:52,120 --> 00:36:55,000 Speaker 2: our fair shake. And the model, the monetization model the 841 00:36:55,000 --> 00:36:57,200 Speaker 2: Internet is built on, is a complete shit show that sucks, 842 00:36:57,760 --> 00:37:00,719 Speaker 2: and it's broken at its most fundamental level. 843 00:37:00,840 --> 00:37:07,160 Speaker 2: And yet there are more people creating interesting works of fiction, 844 00:37:07,320 --> 00:37:10,399 Speaker 2: nonfiction, art, non-art, like journalism, whatever, than we've 845 00:37:10,400 --> 00:37:12,759 Speaker 2: probably ever had in the history of the world. There's 846 00:37:12,760 --> 00:37:16,919 Speaker 2: probably more, like, journalism being done than ever before. Yeah, 847 00:37:17,160 --> 00:37:18,560 Speaker 2: am I crazy for saying that? 848 00:37:18,760 --> 00:37:20,120 Speaker 4: No, I don't think so. I mean, I think 849 00:37:20,120 --> 00:37:22,680 Speaker 4: for me, the question is, 850 00:37:22,680 --> 00:37:25,880 Speaker 4: how can we have that flowering of creativity and talent 851 00:37:26,000 --> 00:37:28,560 Speaker 4: and intelligence, and how can we, you know, make 852 00:37:28,560 --> 00:37:32,240 Speaker 4: sure that people are reaching their potentials, without the layer 853 00:37:32,320 --> 00:37:35,880 Speaker 4: of exploitation that right now seems integral 854 00:37:35,960 --> 00:37:39,080 Speaker 4: to all this stuff. And I'd like to... this 855 00:37:39,160 --> 00:37:40,959 Speaker 4: is something that I've been thinking about a lot lately, 856 00:37:41,000 --> 00:37:43,960 Speaker 4: because I'm in the Writers Guild, the TV writers' 857 00:37:43,960 --> 00:37:46,319 Speaker 4: guild, and we're on strike right now, or the TV 858 00:37:46,360 --> 00:37:48,399 Speaker 4: and film writers' guild, and we're on strike right now. 859 00:37:48,440 --> 00:37:52,120 Speaker 4: And something that is a huge contrast between being a journalist, 860 00:37:52,160 --> 00:37:56,080 Speaker 4: where you might have a workplace union but oftentimes in 861 00:37:56,120 --> 00:37:59,960 Speaker 4: general you don't, versus being in a heavily unionized industry, 862 00:38:00,080 --> 00:38:05,040 Speaker 4: part of a pretty powerful union, is seeing what can happen, 863 00:38:05,320 --> 00:38:08,920 Speaker 4: how much better a job can be, when you have 864 00:38:09,040 --> 00:38:11,160 Speaker 4: that kind of... at least you have a little bit 865 00:38:11,200 --> 00:38:13,880 Speaker 4: of leverage against the exploitation, you're not completely at 866 00:38:13,880 --> 00:38:16,240 Speaker 4: the whims of what the bosses want at any given moment. 867 00:38:16,239 --> 00:38:18,640 Speaker 4: You're not completely, like, am I going to have a 868 00:38:18,719 --> 00:38:20,200 Speaker 4: job today? Am I going to have a job tomorrow?
869 00:38:20,200 --> 00:38:21,680 Speaker 4: Am I going to make money from this thing I'm 870 00:38:21,680 --> 00:38:23,200 Speaker 4: working on on spec? Am I not going to make 871 00:38:23,239 --> 00:38:23,799 Speaker 4: money from it? 872 00:38:24,239 --> 00:38:24,520 Speaker 1: Right? 873 00:38:25,320 --> 00:38:28,200 Speaker 4: And I think Hollywood is a really interesting example of 874 00:38:28,200 --> 00:38:31,040 Speaker 4: an industry that is a creative industry that produces a 875 00:38:31,160 --> 00:38:35,359 Speaker 4: wide range of things, from total shit to, like, you know, 876 00:38:35,680 --> 00:38:40,839 Speaker 4: instances of surpassing genius, that manages nonetheless to, 877 00:38:41,080 --> 00:38:44,480 Speaker 4: up until recently, adequately reward the creative people who work 878 00:38:44,520 --> 00:38:47,040 Speaker 4: on it. And sometimes more than just reward, 879 00:38:47,040 --> 00:38:49,600 Speaker 4: sometimes make them extremely rich, because they did a 880 00:38:49,600 --> 00:38:51,520 Speaker 4: really good job at whatever it is they're supposed to 881 00:38:51,520 --> 00:38:53,440 Speaker 4: do. And I think that, like... you know, 882 00:38:53,600 --> 00:38:55,840 Speaker 4: I'm not saying... I don't know how content creators would do it. 883 00:38:55,880 --> 00:38:58,120 Speaker 4: I'm not suggesting that content creators unionize, though I would 884 00:38:58,160 --> 00:38:59,479 Speaker 4: love it. I think it would be great. 885 00:38:59,480 --> 00:39:03,040 Speaker 2: Imagine if, like, everybody on YouTube unionized, like, you 886 00:39:02,960 --> 00:39:06,040 Speaker 4: know, the German vloggers have been trying 887 00:39:06,080 --> 00:39:08,959 Speaker 4: to do this for years now, and in fact, they're 888 00:39:09,400 --> 00:39:12,200 Speaker 4: unionizing under the steelworkers union in Germany, which I guess 889 00:39:12,280 --> 00:39:14,920 Speaker 4: is extremely powerful. It's maybe the biggest union there, and 890 00:39:14,960 --> 00:39:19,080 Speaker 4: so they've got a special, like, YouTube content creators organizing 891 00:39:19,360 --> 00:39:20,480 Speaker 4: group that they're 892 00:39:20,320 --> 00:39:22,160 Speaker 2: trying to... I like, I really like the idea of, 893 00:39:22,200 --> 00:39:25,640 Speaker 2: like, steelworkers and YouTubers, like, you know, together. That's 894 00:39:25,719 --> 00:39:28,520 Speaker 2: like very... I don't know, that sounds 895 00:39:28,239 --> 00:39:30,319 Speaker 4: right. I might be remembering this wrong, but I'm 896 00:39:30,320 --> 00:39:32,279 Speaker 4: pretty sure the guy behind it is this German guy 897 00:39:32,280 --> 00:39:34,040 Speaker 4: who does videos where he, like, builds his own 898 00:39:34,120 --> 00:39:36,400 Speaker 4: catapults, and just, like, it's the most YouTube thing. But 899 00:39:36,440 --> 00:39:39,080 Speaker 4: also, I think it's, like, right at the intersection of, 900 00:39:39,120 --> 00:39:41,839 Speaker 4: like, steelworkers, YouTube, and German. It's like the guy... 901 00:39:41,880 --> 00:39:43,359 Speaker 2: It's like the guy who's like, I'm going to build 902 00:39:43,360 --> 00:39:45,960 Speaker 2: a house from, like, just, like, this mud plain, like, 903 00:39:46,000 --> 00:39:47,520 Speaker 2: come here, and I'm going to build, like, a mud house 904 00:39:47,600 --> 00:39:49,920 Speaker 2: out of, like, lumber I chop down. And yeah, something 905 00:39:49,960 --> 00:39:52,439 Speaker 2: like that.
So, right, there is this huge strike going 906 00:39:52,480 --> 00:39:54,319 Speaker 2: on right now. Of course, Hollywood is an industry that's 907 00:39:54,320 --> 00:39:57,000 Speaker 2: been around a lot longer than the Internet, right, and 908 00:39:57,160 --> 00:40:01,120 Speaker 2: has gone through lots of situations where there's tremendous 909 00:40:01,160 --> 00:40:03,000 Speaker 2: abuse of power by the people who are in charge 910 00:40:03,000 --> 00:40:06,000 Speaker 2: of, like, the studios. I imagine there's still a tremendous 911 00:40:06,000 --> 00:40:08,080 Speaker 2: abuse of power by the people who... yeah, the studios. 912 00:40:08,120 --> 00:40:10,839 Speaker 2: But yeah, of course people eventually were like, hey, 913 00:40:10,880 --> 00:40:13,480 Speaker 2: we need to unionize. Of course. It's interesting, the modern 914 00:40:13,800 --> 00:40:17,200 Speaker 2: narrative about unions is very confused. I think there's 915 00:40:17,239 --> 00:40:20,279 Speaker 2: so much misinformation about what unions are and what 916 00:40:20,320 --> 00:40:23,160 Speaker 2: they do that it has really done its job of 917 00:40:23,200 --> 00:40:25,520 Speaker 2: making a lot of people very skeptical about the 918 00:40:25,560 --> 00:40:26,560 Speaker 2: power of a union. 919 00:40:26,719 --> 00:40:29,200 Speaker 4: Yeah. I mean, Hollywood unions all formed in the forties 920 00:40:29,239 --> 00:40:31,520 Speaker 4: and fifties, when union density in the US was like 921 00:40:31,600 --> 00:40:33,400 Speaker 4: three or four times what it is now, when, you know, 922 00:40:33,440 --> 00:40:35,640 Speaker 4: it was pretty common to be in a union. 923 00:40:36,120 --> 00:40:38,800 Speaker 4: There were, like, institutions that people knew. Well, everybody knew 924 00:40:38,840 --> 00:40:40,640 Speaker 4: at least one person or a family member who was 925 00:40:40,680 --> 00:40:43,040 Speaker 4: in a union. So it wasn't crazy to be like, oh yeah, 926 00:40:43,040 --> 00:40:44,040 Speaker 4: we're going to form a writers' union. 927 00:40:43,920 --> 00:40:46,319 Speaker 1: It wasn't a political thing then, like, you know... I 928 00:40:46,320 --> 00:40:47,000 Speaker 1: mean, there was a... wow. 929 00:40:47,080 --> 00:40:48,719 Speaker 4: I mean, it depends on the union. 930 00:40:48,960 --> 00:40:51,120 Speaker 4: Like, some of those... you know, the writers in particular 931 00:40:51,160 --> 00:40:54,080 Speaker 4: were and still are relatively left-wing compared to the 932 00:40:54,080 --> 00:40:54,440 Speaker 4: others, right. 933 00:40:54,520 --> 00:40:57,680 Speaker 2: But I mean, Hollywood is, generally speaking, not... 934 00:40:57,880 --> 00:41:00,880 Speaker 2: I mean, I'm not saying there are no conservatives 935 00:41:00,920 --> 00:41:03,600 Speaker 2: in Hollywood, but obviously the owners probably 936 00:41:03,760 --> 00:41:06,399 Speaker 2: are, there's no question. But I mean, generally speaking, people 937 00:41:06,440 --> 00:41:09,960 Speaker 2: in the creative arts tend to be more left-leaning, 938 00:41:10,000 --> 00:41:12,879 Speaker 2: more liberal than people in the non-creative arts.
I think, yeah, 939 00:41:12,880 --> 00:41:15,879 Speaker 2: I think that's fair, and so, unsurprisingly, the union 940 00:41:15,880 --> 00:41:18,480 Speaker 2: would be, you know, obviously very, very liberal. Or, I 941 00:41:18,480 --> 00:41:22,759 Speaker 2: mean, socialism, you know, is directly tied to lots of 942 00:41:22,920 --> 00:41:25,480 Speaker 2: unions in their creation, and, like, the concept of a 943 00:41:25,600 --> 00:41:26,960 Speaker 2: union in itself 944 00:41:26,640 --> 00:41:29,600 Speaker 4: is, like, a very socialistic idea. Although, you know, that 945 00:41:29,719 --> 00:41:31,560 Speaker 4: being said, like, if you go and you talk to, 946 00:41:31,680 --> 00:41:34,359 Speaker 4: like, the camera operators who work shows in New York 947 00:41:34,400 --> 00:41:37,280 Speaker 4: City, who are all members of IATSE, and the Teamsters, 948 00:41:37,280 --> 00:41:39,279 Speaker 4: who are part of the stagehands union, which is 949 00:41:39,280 --> 00:41:41,799 Speaker 4: one of the other big powerful Hollywood unions... yeah, yeah, 950 00:41:41,800 --> 00:41:43,279 Speaker 4: I'm not saying they even vote for Trump, but you're 951 00:41:43,280 --> 00:41:45,359 Speaker 4: going to talk to guys who have a very different 952 00:41:45,440 --> 00:41:47,600 Speaker 4: set of politics than I do. Union guys, guys 953 00:41:47,640 --> 00:41:49,959 Speaker 4: who are showing us solidarity on the strike. But there 954 00:41:50,000 --> 00:41:52,839 Speaker 4: is a wide range of politics. 955 00:41:52,480 --> 00:41:54,560 Speaker 1: You know, but you see how those cut across. 956 00:41:54,640 --> 00:41:58,279 Speaker 2: There's a very blue-collar, white-collar intersection there, where 957 00:41:58,280 --> 00:42:00,400 Speaker 2: you've got, like, a camera operator, or 958 00:42:00,480 --> 00:42:03,400 Speaker 2: somebody who's doing set design, or, 959 00:42:03,440 --> 00:42:05,040 Speaker 2: like, literally some of the labor. Like, I remember when 960 00:42:05,080 --> 00:42:07,279 Speaker 2: I used to go on Fallon, and I 961 00:42:07,280 --> 00:42:09,080 Speaker 2: remember once I tried to move a box, like, we 962 00:42:09,080 --> 00:42:11,040 Speaker 2: were, like, setting up some, like, gadget or whatever, and 963 00:42:11,040 --> 00:42:13,080 Speaker 2: they're like, you can't touch that. And I was like, okay. 964 00:42:13,080 --> 00:42:16,200 Speaker 2: Only the union people who moved the boxes could 965 00:42:16,280 --> 00:42:18,480 Speaker 2: touch the boxes. Like, it was literally illegal 966 00:42:18,520 --> 00:42:20,799 Speaker 2: for me to touch the box. But it's very blue-collar 967 00:42:20,800 --> 00:42:22,919 Speaker 2: work, you know. It's not like sitting down 968 00:42:22,920 --> 00:42:25,919 Speaker 2: at your ThinkPad and writing, you know, the next 969 00:42:25,960 --> 00:42:26,760 Speaker 2: episode of Lost. 970 00:42:26,760 --> 00:42:27,040 Speaker 1: It's like.
971 00:42:27,320 --> 00:42:28,600 Speaker 4: One of the things that, I mean, one of the 972 00:42:28,600 --> 00:42:31,560 Speaker 4: things that has been very inspiring and has made me 973 00:42:31,600 --> 00:42:34,080 Speaker 4: feel really optimistic and positive about the strike we're on 974 00:42:34,200 --> 00:42:37,200 Speaker 4: right now is the extent of the solidarity shown by 975 00:42:37,239 --> 00:42:39,600 Speaker 4: the Teamsters to the Writers Guild. That's 976 00:42:39,640 --> 00:42:41,600 Speaker 4: how it's supposed to work. Yeah, I mean, the idea 977 00:42:41,680 --> 00:42:44,520 Speaker 4: is that there's solidarity between unions, 978 00:42:44,560 --> 00:42:47,920 Speaker 4: because everybody understands the basic dynamic. And that hasn't always 979 00:42:47,920 --> 00:42:50,040 Speaker 4: been the case. You know, the last strike writers went 980 00:42:50,080 --> 00:42:51,799 Speaker 4: on was in two thousand and eight. A lot of this 981 00:42:51,920 --> 00:42:55,040 Speaker 4: is sort of anecdotal, but people said, you know, 982 00:42:55,280 --> 00:42:57,520 Speaker 4: back then, people didn't think the writers needed to go 983 00:42:57,560 --> 00:43:00,560 Speaker 4: on strike. You know, Teamsters were worried about missing their hours. 984 00:43:00,600 --> 00:43:03,000 Speaker 4: You know, when productions are getting shut down, in 985 00:43:03,000 --> 00:43:05,000 Speaker 4: a lot of cases they're not getting paid. So they 986 00:43:05,000 --> 00:43:07,120 Speaker 4: would break picket lines, they would walk through, 987 00:43:07,120 --> 00:43:08,880 Speaker 4: they would try to keep working, try to get paid. 988 00:43:09,040 --> 00:43:11,600 Speaker 4: And it's been a total change from 989 00:43:11,640 --> 00:43:14,480 Speaker 4: that for this most recent one, which you can attribute 990 00:43:14,480 --> 00:43:16,040 Speaker 4: to a million things. You know, one of them 991 00:43:16,080 --> 00:43:18,200 Speaker 4: is just, I think, like you were saying, we've 992 00:43:18,200 --> 00:43:20,279 Speaker 4: gone from a real low ebb of unions to, like, 993 00:43:20,360 --> 00:43:22,920 Speaker 4: sparks. We're not yet in a place where unions 994 00:43:22,960 --> 00:43:24,759 Speaker 4: have the same kind of density that they 995 00:43:24,840 --> 00:43:27,120 Speaker 4: used to, but we've gone from a low ebb to... 996 00:43:27,400 --> 00:43:29,759 Speaker 4: there's, you know, there's changes coming, I think. And 997 00:43:29,800 --> 00:43:32,400 Speaker 4: so I think people are more positive in general, and 998 00:43:32,520 --> 00:43:33,560 Speaker 4: we're learning more. 999 00:43:33,760 --> 00:43:34,400 Speaker 1: I agree. 1000 00:43:34,560 --> 00:43:38,479 Speaker 2: I think people maybe are starting to realize, and maybe 1001 00:43:38,480 --> 00:43:40,759 Speaker 2: the internet... we can thank the internet for some 1002 00:43:40,800 --> 00:43:44,040 Speaker 2: of this, just being able to visualize what happens 1003 00:43:44,440 --> 00:43:48,160 Speaker 2: in non-union environments and, like, going, hey, wait a second, yeah, 1004 00:43:48,160 --> 00:43:50,799 Speaker 2: it makes sense that we have some leverage, because if 1005 00:43:50,800 --> 00:43:52,959 Speaker 2: you're not in a union, your leverage with the business is basically nil. 1006 00:43:53,440 --> 00:43:55,600 Speaker 2: And by the way, have you been picketing?
1007 00:43:55,640 --> 00:43:57,400 Speaker 2: Are you out there? Like, have you been out there 1008 00:43:57,400 --> 00:43:57,560 Speaker 2: with this? 1009 00:43:57,719 --> 00:43:57,919 Speaker 1: Yeah? 1010 00:43:58,160 --> 00:43:59,880 Speaker 4: I haven't picketed this week, but I was 1011 00:44:00,000 --> 00:44:01,240 Speaker 4: picketing last week and the week before. 1012 00:44:01,400 --> 00:44:03,320 Speaker 1: Are there shifts? Are people doing shifts on picketing? 1013 00:44:03,760 --> 00:44:05,880 Speaker 4: Yeah, so they've got big shifts that they announce the 1014 00:44:05,960 --> 00:44:08,439 Speaker 4: day before. Like, right now, upfronts, which are the big 1015 00:44:08,480 --> 00:44:10,719 Speaker 4: ad sales events that the networks all put on, are 1016 00:44:10,719 --> 00:44:13,120 Speaker 4: going on. So there's pickets outside all the upfronts. You know, 1017 00:44:13,600 --> 00:44:15,399 Speaker 4: usually they bring on the actors for the big fall 1018 00:44:15,480 --> 00:44:18,480 Speaker 4: shows or whatever. Actors, like the Teamsters, have been showing 1019 00:44:18,480 --> 00:44:20,359 Speaker 4: a ton of solidarity with the writers. Most of them 1020 00:44:20,360 --> 00:44:22,799 Speaker 4: have refused to cross. So it's like, you know, a 1021 00:44:22,880 --> 00:44:25,880 Speaker 4: couple of nightly news anchors, though... even some of 1022 00:44:25,920 --> 00:44:28,719 Speaker 4: the news anchors refused to cross too, I hear. So... 1023 00:44:28,960 --> 00:44:31,520 Speaker 4: oh really? Yeah, so there's... that's interesting. Yeah, Lester 1024 00:44:31,560 --> 00:44:34,200 Speaker 4: Holt, he won't cross, or he isn't crossing. Both refused 1025 00:44:34,200 --> 00:44:34,560 Speaker 4: to cross. 1026 00:44:34,640 --> 00:44:35,000 Speaker 1: That's cool. 1027 00:44:35,440 --> 00:44:38,400 Speaker 4: Andrew Ross Sorkin of The Times and CNBC did cross. 1028 00:44:38,480 --> 00:44:40,600 Speaker 1: Oh wow, Andrew, not cool. 1029 00:44:40,600 --> 00:44:41,920 Speaker 4: Stephanie Ruhle of MSNBC. 1030 00:44:42,000 --> 00:44:45,640 Speaker 2: Stephanie Ruhle crossed? She definitely crossed. Yeah, okay, who 1031 00:44:45,640 --> 00:44:47,280 Speaker 2: else, anybody else? This is the juiciest. 1032 00:44:47,360 --> 00:44:48,320 Speaker 4: Seacrest crossed. 1033 00:44:48,520 --> 00:44:50,440 Speaker 1: Seacrest crossed. Yeah. What's going on with Fallon? 1034 00:44:50,480 --> 00:44:52,520 Speaker 2: I saw there's some... he came on Bluesky, he 1035 00:44:52,600 --> 00:44:55,279 Speaker 2: joined Bluesky, and people were, like, fucking hazing him. 1036 00:44:55,640 --> 00:44:57,719 Speaker 2: I felt bad for the guy, to be honest. It's like... 1037 00:44:57,840 --> 00:44:59,400 Speaker 2: I mean, I know you shouldn't feel too bad, but I 1038 00:44:59,400 --> 00:45:01,880 Speaker 4: mean, the thing with the shows is, it's always complicated, because, 1039 00:45:02,360 --> 00:45:04,080 Speaker 4: you know, you don't want to cross the picket line. 1040 00:45:04,080 --> 00:45:05,960 Speaker 4: But if you're doing a nightly... if you're doing a 1041 00:45:06,040 --> 00:45:08,040 Speaker 4: nightly talk show, you've got a huge number of 1042 00:45:08,080 --> 00:45:10,520 Speaker 4: people for whom that's their living. You can shut 1043 00:45:10,520 --> 00:45:11,920 Speaker 4: it down for two weeks, but at some point you 1044 00:45:11,960 --> 00:45:25,279 Speaker 4: have to go back to work.
Yeah, I think this 1045 00:45:25,320 --> 00:45:26,600 Speaker 4: one's going to last a long time, and so I 1046 00:45:26,640 --> 00:45:28,319 Speaker 4: would expect most of the talk show hosts to go 1047 00:45:28,360 --> 00:45:31,600 Speaker 4: back on without writers and have to do something similar, though. 1048 00:45:32,239 --> 00:45:33,440 Speaker 4: I mean, we'll see. Like, I don't... 1049 00:45:33,640 --> 00:45:36,200 Speaker 2: They've got AI, they can just spin up ChatGPT to 1050 00:45:36,239 --> 00:45:37,960 Speaker 2: do some bits. I mean, how hard can it be? 1051 00:45:38,200 --> 00:45:38,359 Speaker 1: Right? 1052 00:45:38,840 --> 00:45:40,719 Speaker 2: This is... maybe this would be the first big test 1053 00:45:40,760 --> 00:45:43,960 Speaker 2: of whether ChatGPT can replace human beings in creativity 1054 00:45:43,960 --> 00:45:44,239 Speaker 2: and depth. 1055 00:45:44,320 --> 00:45:47,120 Speaker 4: I mean, this is one of the items in our negotiations. 1056 00:45:47,160 --> 00:45:49,359 Speaker 4: It's very important to us, because I think 1057 00:45:49,400 --> 00:45:51,160 Speaker 4: one way to think about all this, 1058 00:45:51,560 --> 00:45:53,279 Speaker 4: you know, just to connect it to us talking about 1059 00:45:53,360 --> 00:45:56,239 Speaker 4: media before, is, like, we lived through, as we were 1060 00:45:56,239 --> 00:45:59,040 Speaker 4: talking about earlier on, a huge sea 1061 00:45:59,160 --> 00:46:02,920 Speaker 4: change in the way journalism is created and distributed, 1062 00:46:03,160 --> 00:46:07,160 Speaker 4: going from print to online, that just decimated the industry. 1063 00:46:07,440 --> 00:46:09,840 Speaker 4: That's not necessarily the same thing as what's happening with 1064 00:46:09,880 --> 00:46:13,759 Speaker 4: streaming versus studios, but there's a lot of similarities. And 1065 00:46:13,800 --> 00:46:17,000 Speaker 4: one of the similarities is just having these huge cash-rich 1066 00:46:17,080 --> 00:46:20,400 Speaker 4: companies, tech companies like Netflix and Amazon and Apple, 1067 00:46:20,719 --> 00:46:23,360 Speaker 4: come in and just kind of throw money around in 1068 00:46:23,400 --> 00:46:25,279 Speaker 4: a way that is very hard for people to say 1069 00:46:25,280 --> 00:46:27,600 Speaker 4: no to, that kind of money, but that often also means 1070 00:46:27,880 --> 00:46:31,520 Speaker 4: diminishing work protections, accepting deals that you wouldn't have otherwise accepted, 1071 00:46:31,560 --> 00:46:33,040 Speaker 4: of course. And now I think a lot of people 1072 00:46:33,080 --> 00:46:35,279 Speaker 4: are sort of looking around and saying, hold on, like, 1073 00:46:35,360 --> 00:46:38,520 Speaker 4: what direction is this going in? And having lived through 1074 00:46:38,520 --> 00:46:40,880 Speaker 4: that as a journalist and seeing sort of what happened 1075 00:46:40,880 --> 00:46:43,399 Speaker 4: to journalism, I think it's very clear what happens when 1076 00:46:43,440 --> 00:46:46,600 Speaker 4: you allow yourself to sort of buy the line that, like, oh, 1077 00:46:46,680 --> 00:46:49,319 Speaker 4: the technology has changed, so you have to accept that 1078 00:46:49,360 --> 00:46:51,880 Speaker 4: you just can't get paid as much anymore.
And you 1079 00:46:51,880 --> 00:46:55,360 Speaker 4: know, the other thing about that is that Hollywood actually 1080 00:46:55,360 --> 00:46:57,239 Speaker 4: is a great example of an industry that's gone two 1081 00:46:57,280 --> 00:46:59,680 Speaker 4: or three times now through major technological shifts in how 1082 00:46:59,680 --> 00:47:02,560 Speaker 4: stuff is created and distributed. These unions have lasted since 1083 00:47:02,600 --> 00:47:06,759 Speaker 4: before TV, they lasted before the VCR, before DVDs. And 1084 00:47:06,840 --> 00:47:08,719 Speaker 4: every time there's been one of these shifts, there has 1085 00:47:08,800 --> 00:47:11,879 Speaker 4: been, you could almost track it by the year, 1086 00:47:11,960 --> 00:47:15,760 Speaker 4: there's been a strike, because every single time the studios 1087 00:47:15,800 --> 00:47:17,879 Speaker 4: try to say, oh, well, stuff's changed, we just can't 1088 00:47:17,880 --> 00:47:20,480 Speaker 4: pay as much as we used to, and writers and 1089 00:47:20,640 --> 00:47:24,239 Speaker 4: actors and directors and stagehands tend to stand up 1090 00:47:24,280 --> 00:47:26,720 Speaker 4: and say, hold on, that's not what's going to happen. 1091 00:47:26,840 --> 00:47:28,319 Speaker 4: We're going to figure out a new system so that 1092 00:47:28,360 --> 00:47:30,680 Speaker 4: we get paid what we deserve to create the content 1093 00:47:30,719 --> 00:47:33,360 Speaker 4: that people want to see. And so at the end 1094 00:47:33,400 --> 00:47:35,640 Speaker 4: of the tunnel right now you can see a place 1095 00:47:35,640 --> 00:47:40,080 Speaker 4: where studios want to use AI to create content as 1096 00:47:40,080 --> 00:47:41,960 Speaker 4: a way to pay writers less, as a way to 1097 00:47:42,000 --> 00:47:45,640 Speaker 4: employ fewer writers. And it's really important to us to 1098 00:47:45,680 --> 00:47:48,440 Speaker 4: stand up and say, hold on. Like, for example, if 1099 00:47:48,440 --> 00:47:52,160 Speaker 4: you come up with an idea for a story via 1100 00:47:52,200 --> 00:47:56,480 Speaker 4: ChatGPT or whatever sophisticated Hollywood AI system I'm sure Netflix 1101 00:47:56,520 --> 00:47:59,680 Speaker 4: is developing somewhere in a dark room in their huge complex, 1102 00:48:00,239 --> 00:48:02,480 Speaker 4: you still have to pay the writer who takes that 1103 00:48:02,560 --> 00:48:04,759 Speaker 4: idea and turns it into a script the same rate 1104 00:48:04,960 --> 00:48:07,200 Speaker 4: that you would pay a writer to write a script from 1105 00:48:07,200 --> 00:48:09,080 Speaker 4: a Wikipedia article or whatever. 1106 00:48:09,200 --> 00:48:11,919 Speaker 2: Right, I mean, that just seems like a no-brainer. Yeah, 1107 00:48:11,960 --> 00:48:14,000 Speaker 2: the work is the same, exactly right. 1108 00:48:14,120 --> 00:48:14,279 Speaker 1: Yeah. 1109 00:48:14,320 --> 00:48:16,160 Speaker 4: And you know, there's another thing where it's like, most 1110 00:48:16,200 --> 00:48:18,440 Speaker 4: TV shows are written with a writers' room, so you get 1111 00:48:18,480 --> 00:48:21,120 Speaker 4: ten smart, funny writers. I mean, this is like one 1112 00:48:21,120 --> 00:48:22,799 Speaker 4: of those things... you know, because I started as 1113 00:48:22,800 --> 00:48:24,960 Speaker 4: a journalist and moved into TV writing, I'd never really 1114 00:48:25,000 --> 00:48:27,160 Speaker 4: experienced the TV writers' room.
And now it's like, I 1115 00:48:27,200 --> 00:48:29,279 Speaker 4: wish every single thing I write I could write with 1116 00:48:29,280 --> 00:48:30,880 Speaker 4: a writers' room, to be able to sit down and 1117 00:48:30,960 --> 00:48:33,279 Speaker 4: say, all right, here's what we're trying 1118 00:48:33,280 --> 00:48:36,560 Speaker 4: to do. Let's get ten, like, extremely funny, smart people 1119 00:48:36,600 --> 00:48:39,479 Speaker 4: to, like, just really work on it for five 1120 00:48:39,560 --> 00:48:41,400 Speaker 4: months and make it really good. And I think the 1121 00:48:41,480 --> 00:48:43,840 Speaker 4: studios want to cut down the length of the 1122 00:48:43,880 --> 00:48:45,480 Speaker 4: writers' rooms. They want to cut down the number of 1123 00:48:45,480 --> 00:48:47,759 Speaker 4: writers that you have to hire, because they have this 1124 00:48:47,840 --> 00:48:50,040 Speaker 4: idea in their heads. And you know, I don't think 1125 00:48:50,040 --> 00:48:51,799 Speaker 4: any of them have announced this specifically, but this is 1126 00:48:51,800 --> 00:48:54,920 Speaker 4: what everybody's sort of hearing. This is the chatter: that 1127 00:48:55,040 --> 00:48:58,080 Speaker 4: you have one guy, you have your Damon Lindelof or 1128 00:48:58,160 --> 00:49:02,360 Speaker 4: J.J. Abrams or Shonda Rhimes. They come up with the idea, 1129 00:49:02,440 --> 00:49:06,240 Speaker 4: and then your writers' room is ChatGPT or AI 1130 00:49:06,560 --> 00:49:08,920 Speaker 4: or whatever else. And then not only does that make 1131 00:49:09,000 --> 00:49:11,680 Speaker 4: for worse content, but it also means that writers 1132 00:49:11,680 --> 00:49:13,399 Speaker 4: in general, when you want to have just one writer 1133 00:49:13,560 --> 00:49:15,560 Speaker 4: come in, you know, a human writer coming in to punch 1134 00:49:15,600 --> 00:49:18,719 Speaker 4: it up, they're now not protected by the contract structures 1135 00:49:18,760 --> 00:49:20,080 Speaker 4: that previously existed. 1136 00:49:20,360 --> 00:49:20,799 Speaker 1: So all of a 1137 00:49:20,800 --> 00:49:22,960 Speaker 4: sudden, it's like, you know, there's all these things that 1138 00:49:23,040 --> 00:49:25,400 Speaker 4: the studios... there's ways to make it sound reasonable, 1139 00:49:25,440 --> 00:49:27,040 Speaker 4: like, oh yeah, sure, why do you need to 1140 00:49:27,080 --> 00:49:28,840 Speaker 4: have ten writers or whatever? And the answer is, 1141 00:49:29,000 --> 00:49:31,279 Speaker 4: not every show needs to have ten writers. But if 1142 00:49:31,280 --> 00:49:34,640 Speaker 4: you don't guarantee that every show has ten writers, then 1143 00:49:34,640 --> 00:49:36,120 Speaker 4: all of a sudden, all the shows that really do 1144 00:49:36,200 --> 00:49:38,640 Speaker 4: need ten writers suddenly won't get them, because the studios 1145 00:49:38,640 --> 00:49:39,879 Speaker 4: don't want to pay for it. They're going to point 1146 00:49:39,880 --> 00:49:42,080 Speaker 4: to the guy who only needed himself, like Mike White, 1147 00:49:42,080 --> 00:49:43,840 Speaker 4: who wrote all of The White Lotus by himself, and say, well, 1148 00:49:43,880 --> 00:49:44,400 Speaker 4: Mike White did it. 1149 00:49:44,440 --> 00:49:46,720 Speaker 2: Why can't you? That's a good point. That's a great point, actually. 1150 00:49:46,719 --> 00:49:48,120 Speaker 2: Come on, why not? Why can't you just come in? 1151 00:49:48,239 --> 00:49:51,000 Speaker 4: Right, exactly.
And then you have guys who really 1152 00:49:51,040 --> 00:49:53,000 Speaker 4: can't, doing ten-episode seasons, and you're going to get a 1153 00:49:53,040 --> 00:49:54,439 Speaker 4: lot more, a lot worse TV. 1154 00:49:54,960 --> 00:49:55,280 Speaker 1: Well, 1155 00:49:55,360 --> 00:49:57,640 Speaker 2: I mean, it's interesting how it mirrors a lot of 1156 00:49:57,680 --> 00:50:00,520 Speaker 2: the rest of what's going on in reality. Like, there 1157 00:50:00,560 --> 00:50:04,319 Speaker 2: is this pursuit of, like, massive growth and production, and 1158 00:50:04,400 --> 00:50:06,200 Speaker 2: in some way, I feel like we're starting to 1159 00:50:06,239 --> 00:50:09,160 Speaker 2: see the limits of, like, the content mill. Like, I 1160 00:50:09,200 --> 00:50:11,960 Speaker 2: feel like with the streaming services, there's just too 1161 00:50:12,000 --> 00:50:15,000 Speaker 2: fucking much. Like, we know that people are starting 1162 00:50:15,000 --> 00:50:17,040 Speaker 2: to watch less. We know that there's, like, an attention 1163 00:50:17,239 --> 00:50:19,799 Speaker 2: sort of drift happening. I mean, I don't know about you, 1164 00:50:19,880 --> 00:50:22,520 Speaker 2: but I feel like the Sunday night appointment viewing thing 1165 00:50:22,520 --> 00:50:24,520 Speaker 2: has come back into focus in a big way around 1166 00:50:24,560 --> 00:50:27,000 Speaker 2: the stuff that HBO is doing and some other shows, 1167 00:50:27,040 --> 00:50:29,520 Speaker 2: stuff like Yellowjackets, where it really 1168 00:50:29,560 --> 00:50:31,440 Speaker 2: is week to week, like, people talking, there's a 1169 00:50:31,480 --> 00:50:34,719 Speaker 2: discourse about a show, not about the million things that 1170 00:50:34,760 --> 00:50:36,799 Speaker 2: you could possibly be watching. I mean, I think 1171 00:50:36,840 --> 00:50:40,120 Speaker 2: that pursuit of scale is very 1172 00:50:40,200 --> 00:50:44,719 Speaker 2: tech-market-based fucking thinking, like, all of 1173 00:50:44,760 --> 00:50:46,480 Speaker 2: this can be bigger and there can be more of it, 1174 00:50:46,520 --> 00:50:48,080 Speaker 2: and we can just pump this out, and AI 1175 00:50:48,280 --> 00:50:50,440 Speaker 2: is a solve. I mean, yes, a solve for, like, 1176 00:50:50,480 --> 00:50:51,600 Speaker 2: oh, I don't want to deal with 1177 00:50:51,600 --> 00:50:53,120 Speaker 2: these fucking creatives, or I don't want a room with 1178 00:50:53,160 --> 00:50:53,560 Speaker 2: ten people. 1179 00:50:53,600 --> 00:50:54,239 Speaker 1: I want one guy. 1180 00:50:54,520 --> 00:50:57,040 Speaker 2: But it's also, like, a solve for scale, right? You're like, oh, 1181 00:50:57,080 --> 00:50:59,120 Speaker 2: I can just pump this shit out and I can 1182 00:50:59,160 --> 00:51:01,200 Speaker 2: hone it and refine it to what the audience wants, 1183 00:51:01,239 --> 00:51:04,000 Speaker 2: and it'll be this perfect marriage of, like, no-cost 1184 00:51:04,040 --> 00:51:07,600 Speaker 2: production and content that people love, and we'll just 1185 00:51:07,640 --> 00:51:10,400 Speaker 2: make all the money for ourselves. I want to believe 1186 00:51:11,120 --> 00:51:14,360 Speaker 2: there's some impossibility there. I want to believe that there's something.
1187 00:51:14,440 --> 00:51:18,319 Speaker 2: I think we're all convinced there's some fear lingering out there, 1188 00:51:18,840 --> 00:51:21,520 Speaker 2: this AI fear, you know. And listen, I see it 1189 00:51:21,520 --> 00:51:23,360 Speaker 2: in art. Like, I mean, the stuff that Midjourney 1190 00:51:23,400 --> 00:51:24,160 Speaker 2: does is crazy. 1191 00:51:24,440 --> 00:51:25,520 Speaker 1: Like, I put out a song 1192 00:51:25,360 --> 00:51:26,880 Speaker 2: the other day and I had Midjourney do the 1193 00:51:26,960 --> 00:51:28,560 Speaker 2: art for it, and it is, like, as good as 1194 00:51:28,600 --> 00:51:30,799 Speaker 2: any, maybe better than any, piece of art I could 1195 00:51:30,800 --> 00:51:33,160 Speaker 2: have gotten from another person, because it did exactly what 1196 00:51:33,200 --> 00:51:35,600 Speaker 2: I wanted, exactly what I was trying to get it 1197 00:51:35,640 --> 00:51:38,239 Speaker 2: to do. And that's scary. That's going to cost people jobs. 1198 00:51:38,239 --> 00:51:40,560 Speaker 2: There's no question, it already is costing people jobs. Like, 1199 00:51:40,600 --> 00:51:42,239 Speaker 2: I see it all over the place. Like, I see 1200 00:51:42,239 --> 00:51:45,400 Speaker 2: it on articles, on news stories, all the time, right? 1201 00:51:45,400 --> 00:51:48,399 Speaker 2: Where the content was devalued, so also has the 1202 00:51:48,600 --> 00:51:49,720 Speaker 2: art been devalued. 1203 00:51:49,760 --> 00:51:50,240 Speaker 1: Go figure. 1204 00:51:51,080 --> 00:51:53,000 Speaker 2: I want to believe that there's going to be a 1205 00:51:53,040 --> 00:51:56,120 Speaker 2: point, and maybe I'm wrong, and frankly, I don't know 1206 00:51:56,120 --> 00:51:59,439 Speaker 2: if I'm right, where the capability of the AI won't 1207 00:51:59,480 --> 00:52:01,760 Speaker 2: be what we think it will be, to deliver 1208 00:52:01,800 --> 00:52:03,560 Speaker 2: the thing that we think it can deliver. Like, I 1209 00:52:03,560 --> 00:52:06,279 Speaker 2: feel like in technology, and in life in some way, 1210 00:52:06,520 --> 00:52:10,279 Speaker 2: we're always looking for this magic single solution where you 1211 00:52:10,360 --> 00:52:13,520 Speaker 2: put the input in and the output is 1212 00:52:13,560 --> 00:52:16,680 Speaker 2: your solve, right? And I think that this happens 1213 00:52:16,719 --> 00:52:19,080 Speaker 2: all the time. I think pivot to video is one 1214 00:52:19,120 --> 00:52:21,640 Speaker 2: of these things, right? Or, like, newsletters is one of 1215 00:52:21,640 --> 00:52:23,719 Speaker 2: those things. Not a knock on your thing, but it's 1216 00:52:23,760 --> 00:52:27,360 Speaker 2: like, this is it. Yeah, we found the solution to 1217 00:52:27,480 --> 00:52:30,399 Speaker 2: the problem here, and if we just put all 1218 00:52:30,440 --> 00:52:31,480 Speaker 2: our bets on that thing, 1219 00:52:31,520 --> 00:52:32,960 Speaker 1: it's going to fix it. And, like, I think AI 1220 00:52:33,040 --> 00:52:33,879 Speaker 1: has a lot of, like, 1221 00:52:33,800 --> 00:52:36,799 Speaker 2: promise and obviously a lot of potential danger, but is 1222 00:52:36,840 --> 00:52:38,960 Speaker 2: it really... like, can it really do the things? Is 1223 00:52:39,000 --> 00:52:41,160 Speaker 2: there an AI that will be Mike White before Mike 1224 00:52:41,200 --> 00:52:41,840 Speaker 2: White exists? 1225 00:52:41,840 --> 00:52:41,920 Speaker 1: Like? 1226 00:52:42,000 --> 00:52:44,799 Speaker 2: Yeah, can AI do The White Lotus?
Just because it's 1227 00:52:44,840 --> 00:52:47,640 Speaker 2: able to synthesize what has come before it? Yeah, and 1228 00:52:47,719 --> 00:52:51,400 Speaker 2: spit something out, even if it's of some quality. Have 1229 00:52:51,480 --> 00:52:55,560 Speaker 2: we underestimated the randomness of the human mind? Like, I 1230 00:52:55,600 --> 00:52:57,640 Speaker 2: think we have, a little bit, to 1231 00:52:57,640 --> 00:52:58,080 Speaker 2: be honest. 1232 00:52:58,120 --> 00:53:00,640 Speaker 4: You can almost talk about this in purely technical terms, 1233 00:53:00,680 --> 00:53:03,759 Speaker 4: right, where it's like, we've seen these incredible... especially large 1234 00:53:03,840 --> 00:53:06,200 Speaker 4: language models have advanced by leaps and bounds just by 1235 00:53:06,280 --> 00:53:09,640 Speaker 4: throwing power at them, basically. But it doesn't follow from 1236 00:53:09,640 --> 00:53:13,440 Speaker 4: that that just by continuing to add more computers, you know, 1237 00:53:13,520 --> 00:53:16,600 Speaker 4: more chips, more power behind those LLMs, that we're 1238 00:53:16,600 --> 00:53:18,640 Speaker 4: going to get to, you know... maybe we can get 1239 00:53:18,640 --> 00:53:21,400 Speaker 4: to ninety percent of human capability or whatever, and maybe 1240 00:53:21,400 --> 00:53:24,040 Speaker 4: for a lot of applications that's good enough. Ninety percent 1241 00:53:24,120 --> 00:53:27,600 Speaker 4: is fine, maybe even for television. But that last 1242 00:53:27,600 --> 00:53:31,000 Speaker 4: ten percent, like the ten percent that is memorable and 1243 00:53:31,120 --> 00:53:35,319 Speaker 4: meaningful and original and new, we may never ever reach that, 1244 00:53:35,360 --> 00:53:36,400 Speaker 4: we may not ever get. 1245 00:53:36,200 --> 00:53:36,719 Speaker 1: There or whatever. 1246 00:53:36,840 --> 00:53:38,440 Speaker 4: Yeah. I think that, because I think the 1247 00:53:38,440 --> 00:53:41,000 Speaker 4: flip side of the sort of dynamic you're describing is, 1248 00:53:41,719 --> 00:53:43,719 Speaker 4: and I can't remember who, I feel like there was 1249 00:53:43,800 --> 00:53:46,719 Speaker 4: an old Wired article that first kind of articulated this 1250 00:53:46,800 --> 00:53:49,480 Speaker 4: idea, that like one of the things the Internet 1251 00:53:49,480 --> 00:53:52,400 Speaker 4: has done is sort of allowed the reign of the 1252 00:53:52,440 --> 00:53:55,320 Speaker 4: good enough. That there's just, like, once you 1253 00:53:55,320 --> 00:53:57,759 Speaker 4: can kind of penetrate into what people are actually accessing 1254 00:53:57,800 --> 00:53:59,600 Speaker 4: and seeing and doing and whatever else, you realize that, 1255 00:53:59,680 --> 00:54:01,960 Speaker 4: like, you don't have to make everything perfect. You can 1256 00:54:02,040 --> 00:54:04,080 Speaker 4: just make stuff good enough and people will consume it 1257 00:54:04,120 --> 00:54:06,960 Speaker 4: in vast quantities. In some ways, that's incredibly freeing as 1258 00:54:06,960 --> 00:54:09,360 Speaker 4: a creative, that like you can allow stuff that wouldn't 1259 00:54:09,360 --> 00:54:11,719 Speaker 4: necessarily hit your standards out there into the world. But 1260 00:54:11,719 --> 00:54:13,600 Speaker 4: the flip side of it is that, like, if good 1261 00:54:13,680 --> 00:54:17,080 Speaker 4: enough is good enough, then, like, what's the motivation to 1262 00:54:17,160 --> 00:54:20,200 Speaker 4: take it beyond good enough?
Especially like on a 1263 00:54:20,239 --> 00:54:22,520 Speaker 4: corporate level, where you're trying to please shareholders and make 1264 00:54:22,560 --> 00:54:24,960 Speaker 4: a profit, whatever else. Like, you don't get extra money 1265 00:54:25,000 --> 00:54:28,360 Speaker 4: by going from good enough to truly excellent. Like, the 1266 00:54:28,440 --> 00:54:30,799 Speaker 4: dystopian fear about AI, I think I'm just sort of 1267 00:54:30,800 --> 00:54:32,920 Speaker 4: echoing what you're saying, is like, maybe AI will 1268 00:54:32,920 --> 00:54:34,840 Speaker 4: never be able to write The White Lotus or Succession 1269 00:54:34,880 --> 00:54:36,919 Speaker 4: or whatever, The Sopranos. Certainly not The Sopranos, whatever 1270 00:54:36,960 --> 00:54:37,600 Speaker 4: TV show you love. 1271 00:54:37,680 --> 00:54:39,960 Speaker 1: Definitely not The Sopranos, like, you're like the other one. 1272 00:54:40,000 --> 00:54:42,759 Speaker 4: Well, The Sopranos to me is like, that's... 1273 00:54:42,880 --> 00:54:44,560 Speaker 4: that's memorable, that's art. 1274 00:54:44,600 --> 00:54:47,080 Speaker 2: I don't, I don't disagree. But they'll be remaking it 1275 00:54:47,080 --> 00:54:48,960 Speaker 2: in like twenty years with a new cast, or not 1276 00:54:49,000 --> 00:54:50,440 Speaker 2: even twenty years, probably. 1277 00:54:50,200 --> 00:54:52,919 Speaker 4: Yeah, right. Anyway, there's a million shows that would 1278 00:54:52,960 --> 00:54:55,919 Speaker 4: be fine to write with AI, and people might not notice 1279 00:54:55,920 --> 00:54:56,280 Speaker 4: the difference. 1280 00:54:56,320 --> 00:54:58,759 Speaker 1: But maybe Grey's Anatomy, for instance. 1281 00:54:58,440 --> 00:54:59,680 Speaker 4: I mean, I don't want to... again, I don't want... 1282 00:54:59,680 --> 00:55:01,080 Speaker 4: I have to, I have to get a job when 1283 00:55:01,080 --> 00:55:02,680 Speaker 4: the strike is over. So I'm not going to say 1284 00:55:02,680 --> 00:55:03,319 Speaker 4: anything out loud. 1285 00:55:03,400 --> 00:55:05,520 Speaker 1: No, I've never watched Grey's Anatomy, so I wouldn't be 1286 00:55:05,520 --> 00:55:05,960 Speaker 1: able to tell you. 1287 00:55:06,000 --> 00:55:08,200 Speaker 4: And I think, I mean, I think that's a 1288 00:55:08,280 --> 00:55:09,880 Speaker 4: legitimate fear. I mean, the other part of it is, 1289 00:55:09,920 --> 00:55:14,120 Speaker 4: I just think, like, there's something about having things created 1290 00:55:14,160 --> 00:55:16,440 Speaker 4: by humans that I think is really important to the 1291 00:55:16,480 --> 00:55:20,920 Speaker 4: way we consume art and journalism and any other thing. 1292 00:55:21,200 --> 00:55:23,640 Speaker 1: To you. To a lot of people, it's not important 1293 00:55:23,640 --> 00:55:23,799 Speaker 1: at all. 1294 00:55:23,840 --> 00:55:26,280 Speaker 2: The same people who would happily read a user-generated 1295 00:55:26,320 --> 00:55:27,880 Speaker 2: piece of content on BuzzFeed. 1296 00:55:27,680 --> 00:55:28,640 Speaker 1: I don't know, and enjoy it. 1297 00:55:28,680 --> 00:55:30,759 Speaker 4: I wonder, like, I do kind of.
Jay Kang had 1298 00:55:30,760 --> 00:55:32,000 Speaker 4: this great line in a piece he wrote for The 1299 00:55:32,000 --> 00:55:34,120 Speaker 4: New Yorker a few weeks ago, where he said that, like, 1300 00:55:34,440 --> 00:55:36,120 Speaker 4: it's really important to him when he reads an article 1301 00:55:36,200 --> 00:55:38,160 Speaker 4: that he's getting angry at a human. Like, when you... 1302 00:55:38,239 --> 00:55:40,200 Speaker 4: and Jay is like, as a writer, I love him, 1303 00:55:40,360 --> 00:55:41,879 Speaker 4: like, one of the things he does is get 1304 00:55:41,880 --> 00:55:42,400 Speaker 4: really mad 1305 00:55:42,280 --> 00:55:44,840 Speaker 1: at people, and he's mad online. 1306 00:55:45,040 --> 00:55:47,319 Speaker 4: Yeah, and the human component, like the fact that there's 1307 00:55:47,320 --> 00:55:49,439 Speaker 4: a human to be mad at, is important to him. 1308 00:55:49,480 --> 00:55:51,080 Speaker 4: And I think you can transpose that to 1309 00:55:51,120 --> 00:55:53,480 Speaker 4: all kinds of emotional states that you want to be in. 1310 00:55:54,120 --> 00:55:55,799 Speaker 4: Let's say this is a hope more than, like, a 1311 00:55:55,840 --> 00:55:57,880 Speaker 4: thing that I absolutely know to be true. 1312 00:55:58,000 --> 00:56:01,759 Speaker 4: But it's something that I hope, with reason: that 1313 00:56:02,640 --> 00:56:05,600 Speaker 4: people will feel some kind of difference between watching, let's 1314 00:56:05,600 --> 00:56:09,280 Speaker 4: pretend, some far-future AI that could from a single prompt 1315 00:56:09,360 --> 00:56:12,040 Speaker 4: create a ten-episode show, not just the script, the whole thing. 1316 00:56:12,440 --> 00:56:15,120 Speaker 4: That there's a difference between watching that and watching something 1317 00:56:15,160 --> 00:56:18,640 Speaker 4: that somebody else made. Right? Even then, like, there's 1318 00:56:18,640 --> 00:56:20,759 Speaker 4: a difference between you, Joshua, like, I really want to 1319 00:56:20,760 --> 00:56:24,960 Speaker 4: watch a cyberpunk adaptation of whatever, you type that into 1320 00:56:25,440 --> 00:56:28,400 Speaker 4: your TV-generating AI and get it, that there's a 1321 00:56:28,400 --> 00:56:32,080 Speaker 4: difference between that versus something that Mike White typed into 1322 00:56:32,080 --> 00:56:34,400 Speaker 4: his thing, I want to write a thing. Right? That 1323 00:56:34,480 --> 00:56:37,120 Speaker 4: even between those two uses of AI, there's a difference. That, 1324 00:56:37,480 --> 00:56:39,280 Speaker 4: like, so much of what we consume, there's this social 1325 00:56:39,320 --> 00:56:41,680 Speaker 4: basis to it that I think hasn't quite been worked 1326 00:56:41,680 --> 00:56:44,000 Speaker 4: out by places like Netflix. That they're 1327 00:56:44,040 --> 00:56:46,719 Speaker 4: betting on the other half, which is, like you say, 1328 00:56:46,760 --> 00:56:50,080 Speaker 4: that if we can sufficiently allow the prompt to just 1329 00:56:50,200 --> 00:56:53,080 Speaker 4: solve all of these problems, then we're all set. But, 1330 00:56:53,320 --> 00:56:55,920 Speaker 4: I mean, man, I've been so wrong. As I try 1331 00:56:55,960 --> 00:56:57,640 Speaker 4: to get people to subscribe to my newsletter about the future, 1332 00:56:57,680 --> 00:56:59,960 Speaker 4: let me just say, I'm so wrong all the time. 1333 00:57:00,080 --> 00:57:01,040 Speaker 1: But that's good to know.
1334 00:57:01,520 --> 00:57:03,680 Speaker 2: People love to read about a guy who's predicting 1335 00:57:03,680 --> 00:57:05,880 Speaker 2: the future but getting it wrong. That's one of their favorites. 1336 00:57:05,920 --> 00:57:07,560 Speaker 1: Yeah. I mean, actually, that gives you a person 1337 00:57:07,560 --> 00:57:08,280 Speaker 1: to hate online. 1338 00:57:08,320 --> 00:57:12,279 Speaker 2: I feel, just perhaps, I mean, maybe what saves humanity 1339 00:57:12,560 --> 00:57:14,120 Speaker 2: is that we need to be mad online, but we 1340 00:57:14,160 --> 00:57:18,600 Speaker 2: need to be mad online at someone specifically. We can't 1341 00:57:18,640 --> 00:57:20,880 Speaker 2: just be mad online. Like, being mad online at an 1342 00:57:20,960 --> 00:57:25,000 Speaker 2: AI feels unproductive, right? Just not enough, you know. 1343 00:57:25,120 --> 00:57:27,960 Speaker 2: That's really seizing the means of production, the 1344 00:57:28,120 --> 00:57:28,960 Speaker 2: productive anger. 1345 00:57:29,080 --> 00:57:30,240 Speaker 4: Yeah, all right. 1346 00:57:30,160 --> 00:57:31,440 Speaker 1: Yeah, we got to wrap up, Max. 1347 00:57:31,480 --> 00:57:35,880 Speaker 2: This was really great, first off. Not surprising, but really enjoyable. 1348 00:57:35,880 --> 00:57:38,400 Speaker 2: And I feel like there's about twenty things we didn't 1349 00:57:38,400 --> 00:57:38,680 Speaker 2: get to. 1350 00:57:38,800 --> 00:57:39,120 Speaker 1: I had. 1351 00:57:39,160 --> 00:57:41,120 Speaker 2: I know that during the time that you were talking, 1352 00:57:41,440 --> 00:57:43,360 Speaker 2: several times you were saying something and I'm like, I have 1353 00:57:43,400 --> 00:57:46,480 Speaker 2: a great, hilarious rebuttal to this, and I didn't get 1354 00:57:46,480 --> 00:57:48,600 Speaker 2: a chance to even get there because we got 1355 00:57:48,600 --> 00:57:50,240 Speaker 2: into so many other things. So you got to come 1356 00:57:50,240 --> 00:57:53,600 Speaker 2: back and do this again. Yeah, and people can find you. 1357 00:57:53,800 --> 00:57:55,640 Speaker 2: You're not on Twitter. You don't... you're not on Twitter. 1358 00:57:56,000 --> 00:57:58,560 Speaker 4: No, I'm on Bluesky now, though I'm 1359 00:57:58,440 --> 00:58:00,600 Speaker 2: not really... yeah, but nobody else is on Bluesky. 1360 00:58:00,640 --> 00:58:02,200 Speaker 2: So that's why you can't promote yourself. 1361 00:58:02,280 --> 00:58:03,959 Speaker 1: What is your name on Bluesky? How can people 1362 00:58:03,960 --> 00:58:04,280 Speaker 1: find you? 1363 00:58:04,560 --> 00:58:07,360 Speaker 4: It's just, it's max read dot info. It's my, it's 1364 00:58:07,400 --> 00:58:08,040 Speaker 4: my personal. 1365 00:58:07,840 --> 00:58:08,480 Speaker 1: Lives at r O. 1366 00:58:08,600 --> 00:58:09,640 Speaker 4: That's great dot info. 1367 00:58:09,960 --> 00:58:11,800 Speaker 2: I'm also, I'm also on Bluesky, I should say, 1368 00:58:11,800 --> 00:58:13,240 Speaker 2: because that's a cool thing to tell people. 1369 00:58:13,320 --> 00:58:14,520 Speaker 1: Josh, wait, Topolsky dot 1370 00:58:14,400 --> 00:58:18,080 Speaker 2: com is my, this is my Bluesky name, which 1371 00:58:18,120 --> 00:58:21,000 Speaker 2: is great. You obviously have this great Read Max newsletter 1372 00:58:21,120 --> 00:58:22,680 Speaker 2: that you can subscribe to. It's on Substack. 1373 00:58:23,520 --> 00:58:26,560 Speaker 4: The URL is max read dot substack dot com.
1374 00:58:26,600 --> 00:58:28,440 Speaker 4: And my last name is spelled R E A D, 1375 00:58:28,640 --> 00:58:29,000 Speaker 4: like a book. 1376 00:58:29,040 --> 00:58:31,480 Speaker 1: Yeah, we know how to spell read. Do people put an E 1377 00:58:31,640 --> 00:58:32,560 Speaker 1: on the end a lot of the time? 1378 00:58:33,160 --> 00:58:35,400 Speaker 4: It's, uh, R E E D sometimes. 1379 00:58:35,680 --> 00:58:37,919 Speaker 1: Yeah, sure. I guess I don't think of that Reed 1380 00:58:38,000 --> 00:58:40,800 Speaker 1: at all. What else? Instagram? You don't do Instagram? 1381 00:58:40,840 --> 00:58:41,400 Speaker 4: Not really. 1382 00:58:41,600 --> 00:58:45,720 Speaker 2: You know, I'm a podcast... no. But yeah, I think 1383 00:58:45,720 --> 00:58:47,280 Speaker 2: when the, when the higher-ups at iHeart 1384 00:58:47,360 --> 00:58:48,880 Speaker 2: hear this, they're going to be like, this guy. 1385 00:58:48,920 --> 00:58:50,200 Speaker 1: We got to get this guy on the air. 1386 00:58:50,280 --> 00:58:51,440 Speaker 4: One can only hope. 1387 00:58:51,400 --> 00:58:54,440 Speaker 1: Or a ChatGPT equivalent of this guy. No. 1388 00:58:54,600 --> 00:58:57,040 Speaker 4: I invite people, I invite people to subscribe to the newsletter. 1389 00:58:57,080 --> 00:58:59,480 Speaker 4: It's free. I should mention there is a subscription fee, 1390 00:58:59,480 --> 00:59:02,000 Speaker 4: you can pay for it, but I write one free 1391 00:59:02,080 --> 00:59:03,160 Speaker 4: weekly column. 1392 00:59:02,960 --> 00:59:05,480 Speaker 2: So I think you can get that, the nemesis sunglasses 1393 00:59:05,480 --> 00:59:07,360 Speaker 2: post, for free. And I think that if you want 1394 00:59:07,360 --> 00:59:09,400 Speaker 2: to go a deeper dive on nemesis, 1395 00:59:09,480 --> 00:59:10,439 Speaker 2: you have to pay a few bucks. 1396 00:59:10,520 --> 00:59:12,440 Speaker 4: That's exactly right, and you will. After you read the 1397 00:59:12,480 --> 00:59:14,200 Speaker 4: sunglasses post, you will want the deeper dive. 1398 00:59:14,280 --> 00:59:15,960 Speaker 2: I do think that's a big way to kind of 1399 00:59:16,040 --> 00:59:17,920 Speaker 2: dangle the subscription at people. 1400 00:59:17,800 --> 00:59:18,320 Speaker 1: To get that. 1401 00:59:18,760 --> 00:59:20,960 Speaker 2: The sunglasses posts. Anyhow, this was super fun. Thank 1402 00:59:21,040 --> 00:59:28,919 Speaker 2: you for taking the time. Well, that is our show 1403 00:59:28,960 --> 00:59:31,800 Speaker 2: for this week. I think, obviously, I've concluded the 1404 00:59:31,800 --> 00:59:34,840 Speaker 2: conversation, and now I have nothing left. All 1405 00:59:34,880 --> 00:59:38,920 Speaker 2: of my essence has been drained from me. That sounds disgusting, actually, 1406 00:59:39,080 --> 00:59:42,080 Speaker 2: but I go into a cryogenic chamber at the end 1407 00:59:42,120 --> 00:59:44,240 Speaker 2: of every show and I'm there until the next show, 1408 00:59:44,440 --> 00:59:49,080 Speaker 2: so I have to begin my long slumber. We'll be 1409 00:59:49,080 --> 00:59:52,480 Speaker 2: back next week with more What Future, and as always, 1410 00:59:52,520 --> 00:59:54,360 Speaker 2: I wish you and your family the very best.