Speaker 1: Thanks for tuning in to TechStuff. If you don't recognize my voice, my name is Oz Woloshyn, and I'm here because the inimitable Jonathan Strickland has passed the baton to Karah Preiss and myself to host TechStuff. The show will remain your home for all things tech, and all the old episodes will remain available in this feed. Thanks for listening.

Speaker 1: Welcome to TechStuff. I'm Oz Woloshyn.

Speaker 2: And I'm Karah Preiss.

Speaker 1: So it's Wednesday, and starting with today's TechStuff, and every Wednesday going forward, we're going to bring you an in-depth conversation with one of the brightest and farthest-seeing minds in all of technology. For me personally, hosting this podcast with you is kind of a dream come true. Partly because I love spending time with you, but also because I love getting the opportunity to sit down with people who are in many cases building the future, and asking them what they're looking at, how they're building, what they're scared of, and what they're excited about, and then bringing that back.

Speaker 2: And it is my dream to have you do all of the work and to respond to it.

Speaker 1: So thank you. So, for our first Wednesday episode of TechStuff, there was no one I wanted to reach out to more than Nicholas Thompson.

Speaker 2: I really like Nick Thompson. I remember when he was at Wired.

Speaker 1: Yeah, he was the editor-in-chief of Wired, and he's been a longtime chronicler of tech. In fact, I really like this thing he does on LinkedIn almost every single day, which is a kind of selfie video called The Most Interesting Thing in Tech This Week.

Speaker 2: It's a podcast series, and actually we found ourselves mentioned for a very particular reason.

Speaker 3: Hosted by Karah Preiss and Oz Woloshyn.

Speaker 1: And so, look, one day back in twenty nineteen, one of Nicholas Thompson's Most Interesting Things in Tech was our very own podcast that we hosted together, Sleepwalkers.
Speaker 3: But what I like about it is it's real reporting and analysis, but there's realism about the complicated trade-offs. They're both optimistic and pessimistic.

Speaker 2: Yeah, that was a very wild moment for us. I mean, you and I do like getting press hits. My mother's a publicist. I know what a big deal it is.

Speaker 1: Certainly. When Nicholas put this video up on LinkedIn, obviously the first thing I did was to get his email address and write to him and ask him to have a coffee, which he agreed to do. And subsequently, Wired magazine actually syndicated our podcast Sleepwalkers as a column, which was just very, very, very cool and very exciting. And I think in some ways it's part of the reason we're back in the seat a few years later, because he contributed to giving us the confidence, and maybe even the credibility, to be hosting TechStuff today.

Speaker 2: Absolutely. I think for us to have the real deal put his stamp of approval on things was very exciting for us. But you know, we can't assume that everyone knows who Nicholas Thompson is, so what does he do now?

Speaker 1: So Nicholas Thompson went on to become the CEO of The Atlantic. So yeah, he's kind of bounced around throughout a long career in journalism, and Nicholas has written about politics, about the law, and of course technology. He's been a writer, he's been an editor, and an author of books. He wrote The Hawk and the Dove: Paul Nitze, George Kennan, and the History of the Cold War. And because he never, ever stops, he's writing a new book called Running for Your Life, on middle-age marathons and the quest for peak performance.

Speaker 2: I just think about what I do in a day versus what he does in a day. But you know, I'm very excited to hear from him. I think we also gravitate towards him as a person because he is incredibly multifaceted.

Speaker 1: Highly, highly, highly energetic. Disturbingly energetic.
Speaker 1: In fact, he really is just a ball of kind of optimistic energy. And we had a lot to cover together. We talked about the deal he struck with OpenAI in twenty twenty-four as Atlantic CEO, which included licensing the magazine's archive to train AI.

Speaker 2: That was drama.

Speaker 1: That was drama. So I asked him about that, and there were a few other kind of big questions. But I started the conversation asking about running, and how he used tech to beat his best-ever marathon time well into his forties.

Speaker 3: I had to somehow convince myself at a subconscious level that I could go faster than I thought I could. And the funny thing about running, which I didn't quite understand then, is that what slows you down often isn't physiological pain. It's your body creating an illusion of physiological pain, because it's worried that you'll lose homeostasis if you continue a pace for a certain period of time. And if you can convince your mind that you can do more, well, then you can do more. But what do you use to convince your mind? You have to use your mind. So I started using an arm heart rate monitor, and so I actually had very accurate readings of my heart rate, as opposed to the highly inaccurate readings that we normally have. And that allowed me to both sort of titrate the effort during workouts and during races, but also to have confidence. Right? When you're running a race and you're running at a fast pace and your heart rate is, oh look, my heart rate is only one thirty-five, right, I'm okay, I can go harder. That is extremely useful. Now, of course, I use AI. I upload everything I've eaten and I ask for nutritional advice. Oh yeah, you know, this is what I had for breakfast, this is what I had for lunch, this is the workout I ran yesterday. What would you recommend I do between now and my next workout on Friday?
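The heart-rate titration Thompson describes comes down to a simple threshold check. Here is a minimal sketch of that logic; the zone numbers are illustrative assumptions, not his actual training values.

```python
# Minimal sketch of heart-rate-based effort titration during a race.
# Threshold values are illustrative assumptions, not Thompson's numbers.

CRUISE_FLOOR_BPM = 140   # below this, there is likely room to push harder
RACE_CEILING_BPM = 165   # above this, the current pace may not be sustainable

def pacing_cue(heart_rate_bpm: int) -> str:
    """Turn a live heart-rate reading into a pacing cue."""
    if heart_rate_bpm < CRUISE_FLOOR_BPM:
        return "You're okay, you can go harder."
    if heart_rate_bpm <= RACE_CEILING_BPM:
        return "Hold this effort."
    return "Ease off before you blow up."

print(pacing_cue(135))  # the "my heart rate is only 135" case -> go harder
```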
Speaker 1: It's great. How much of a paradigm for human-machine interaction do you think this kind of experience is, your experience with optimization through running?

Speaker 3: It's huge. I mean, if you think about AI, it's very good at tasks where it's better than the best human available and the answer doesn't have to be one hundred percent accurate, where a fast answer involving all of the inputs can be ninety-five percent accurate and that's good, right? And that's the case for "what should I eat for dinner tonight?" Right? Like, even if it tells me I need a little extra protein, and maybe I don't need the extra protein, who cares? But it still knows a lot more about nutrition than I know about nutrition, and it can analyze the content of the foods I've eaten in a much better way, and it's an extremely useful tool.

Speaker 1: Yeah, because, I mean, you're not a professional runner, although you were kind of in the elite, or sub-elite, category.

Speaker 3: Sub-sub-elite, which is better, if you like. Sub-elite sounds ridiculous. Elite, you know, excellent. Like, I've never, I've never... I'm never the elite elite. Well, not in running.

Speaker 1: Perhaps in other ways you are. But elite elite runners... I mean, you've talked about a runner, you know, who had a digital twin. Talk about that.

Speaker 3: Yeah. So Des Linden, who is a wonderful runner, she set the world record for women in the fifty-k run, and she's a force of nature. And so she had TCS build a digital twin of her heart. And it's still early days to see how useful that can be. Right now it sort of just explains in much finer detail how she recovers from a workout and how she benefits from her workout. But you can imagine, if I had a digital twin of my heart, I'm sure I could optimize workouts in a way that I can't now.
Speaker 1: Well, I guess you look at F1. There's only a certain number of hours the cars are allowed to be on the track, so that's why simulation is so important in F1. Similarly, in running, I mean, you don't want to be running way too much, right? So in a sense, this idea of simulating training allows you to do way more training than you could actually do. Is that the real point?

Speaker 3: Well, I mean, there are also specific things. Like, right now I run ultras, and I'm trying to run a fast fifty-miler, and the problem is, in ultra training you never run fifty miles in a workout. And so you can't actually test your body and see whether you're going to make yourself puke if you take in five hundred calories an hour for five hours. If you could do that through a digital twin, and you could say, okay, here's how my digestive system works, here's the rate at which I burn calories, here's how fast I'll be running, what is the optimal number of carbohydrates that I can take in without throwing up? That would be phenomenal.
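For flavor, here is a toy version of the fueling question Thompson poses: a back-of-envelope "digital twin" that sweeps hourly intake rates and keeps the unabsorbed backlog below a nausea threshold. Every parameter is an invented placeholder, not real physiology.

```python
# Toy "digital twin" for race fueling, in the spirit of Thompson's example.
# All parameters are invented placeholders, not real physiology.

GUT_ABSORPTION_CAL_PER_HR = 300  # assumed ceiling on calories absorbed per hour
NAUSEA_BACKLOG_CAL = 400         # assumed unabsorbed backlog that triggers nausea
RACE_HOURS = 5

def max_safe_intake(step: int = 10) -> int:
    """Highest hourly intake whose backlog never reaches the nausea limit."""
    best = 0
    for intake in range(0, 1001, step):
        backlog, ok = 0, True
        for _ in range(RACE_HOURS):
            backlog += intake                                   # calories swallowed
            backlog -= min(backlog, GUT_ABSORPTION_CAL_PER_HR)  # calories absorbed
            if backlog >= NAUSEA_BACKLOG_CAL:
                ok = False
                break
        if ok:
            best = intake
    return best

print(max_safe_intake())  # -> 370 cal/hour with these made-up numbers
```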
Speaker 1: Running is one of your key passions, and then there's writing. I think I read that you've put the interviews that you're doing for your book through an LLM, to kind of have new connections and themes suggested.

Speaker 3: Yeah, so this is a really interesting process. I try to use large language models in every way possible as I write this book about running, with the exception of writing any words. Not one word in the book will be written by AI. But I try to use it for everything else, to see where it's good and where it's not good, and also to accelerate the process. Because when you write a story at The Atlantic or The New Yorker, you have this team of editors behind you, right, helping you all the time. When you're writing a book, it's a much more solo project. And so the way the book is structured is, it's partly about my life, it's partly about my father, and then it's partly about different runners who I've encountered or competed with along the way. And some of them are people that I've interviewed episodically over a long time period. And so one of the characters, for example, is Bobbi Gibb, the first woman to run the Boston Marathon, the mother of a friend of mine in high school. I have all these interviews. And so the most useful task is, I wrote a section on her in the book, say it's three thousand words, and then I fed all the interviews into a large language model, and I said, here's the section I've written, here are all the interviews. Is there anything I've written that is inaccurate based on what she's said? Are there any quotes from her...

Speaker 1: Factually inaccurate, or an inaccurate characterization?

Speaker 3: Both. Yeah, it's less good on factually inaccurate. But, like, is anything I've said kind of unfair? Which is a test you should do as a journalist anyway. But is anything I've said unfair? And are there any quotes that she's given me that are better than the quotes I've included? And in fact it said, yes, you should include this and you should include that. And then I went back and I said, okay, great. Now, I have also, at different points, said, you know, here's the whole manuscript, you know, with the privacy protections on, so it's not fed back in: where should I add this? And it's less good at that. But there are specific narrow tasks where it's amazing. It's just like having a very smart research assistant right there.
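The workflow Thompson describes is easy to sketch. Below is a rough outline in Python; `ask_llm` is a placeholder for whatever model API you actually use, and the file names are hypothetical.

```python
# Sketch of the manuscript-checking workflow Thompson describes:
# draft section plus raw interviews in, accuracy/fairness/quote notes out.
from pathlib import Path

def ask_llm(prompt: str) -> str:
    """Placeholder: send the prompt to your model of choice, return its reply."""
    raise NotImplementedError("wire this to a real LLM API")

def review_section(draft_file: str, interview_files: list[str]) -> str:
    draft = Path(draft_file).read_text()
    interviews = "\n\n---\n\n".join(Path(f).read_text() for f in interview_files)
    prompt = (
        "Here is a draft book section, then the interview transcripts it draws on.\n\n"
        f"DRAFT:\n{draft}\n\nINTERVIEWS:\n{interviews}\n\n"
        "1. Is anything in the draft inaccurate or unfair, given the interviews?\n"
        "2. Are there quotes in the interviews stronger than the ones I used?"
    )
    return ask_llm(prompt)

# Hypothetical usage:
# notes = review_section("gibb_section.txt", ["gibb_interview_1.txt", "gibb_interview_2.txt"])
```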
Speaker 1: It's interesting that you added that little caveat about having the privacy settings on, because I think you had another experience, with a book you've already written, that had to do with AI and was less positive. Right?

Speaker 3: Well, yeah. So there's this big debate in my job as CEO of The Atlantic: how are we licensing The Atlantic's data to AI models? And there is a direct process whereby, in the last few years, during which they've trained their models, they have come to our site, have scraped it, have uploaded it. And that is something that we have some control over, right? We can say don't do that, we can say we're going to license it, we can say we'll sue you for doing it. But what is so interesting is that a huge percentage of the Atlantic content in these models doesn't come from reading The Atlantic. It comes from, well, The Atlantic's website was already captured as part of this process by which somebody captured the whole open web, or someone copied and pasted an article into Reddit, or put it on Instapaper or Pocket. And the same thing happens with books. So my book published by Macmillan, The Hawk and the Dove, you know, on Paul Nitze and George Kennan and the history of the Cold War: I never licensed it to an AI model. But, you know, it went out to libraries, and then somebody, you know, whoever, hacked the stream of all books, and, like, there are all these data sets that include the words in my book that have been fed into all these large language models.

Speaker 1: And that feels weird. Having realized that, what do you do?

Speaker 3: Well, it's complicated, because there's a question of whether, as the AI companies argue, what they've done is fair use. They've just taken data and they've transformed it, right? Because you can't write The Hawk and the Dove with one of these things. It's transformative. It's just like they went into the library and read. And then, not only that, because it's from a data set and they didn't know the contents, it's kind of a secondary copyright violation. It's a little bit different than if they had come and taken The Hawk and the Dove, photographed it, and fed it in.

Speaker 1: Yeah, like they found a wallet in the street rather than taking it out of somebody's bag.

Speaker 3: Correct, right.
Speaker 3: Or they went to a thrift store and they bought a big bucket of things and there were some wallets in there, right? So you don't have a lot of options. The option that I am most supportive of is being pursued by a company called ProRata, and what they are doing is they are building a kind of reverse AI tool that will evaluate the answer given by a large language model, weigh the sources that went into it, and then overlay a payments process. So it's a little bit like ASCAP, right? And the idea is, if an answer, like, OpenAI answers a question, and their answer, based on the work that ProRata has done, derives from...

Speaker 1: The Hawk and the Dove.

Speaker 3: You know, one percent derives from The Hawk and the Dove, and, you know, OpenAI makes one penny off of it, then I should be given some fraction of one percent of the one penny, right? And that's what ProRata is trying to...

Speaker 1: Develop as a business model.

Speaker 3: I'm on the board of them, as full disclosure. But there are a couple of companies out there that are trying to solve this compensation issue, because a lot of value has been created from copyrighted materials for which the copyright holders were given nothing. There's not been a fair exchange of value, and that's a problem you have to solve.
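The arithmetic Thompson sketches here (one percent of an answer derives from the book, so the author gets some fraction of one percent of the penny) looks like this. The attribution weights and the creator revenue share are made-up numbers for illustration, not ProRata's actual terms.

```python
# Back-of-envelope version of the attribution-based payout Thompson describes.
# The weights and the creator share are invented, not ProRata's actual terms.

def payouts(answer_revenue_usd: float, attribution: dict[str, float],
            creator_share: float = 0.5) -> dict[str, float]:
    """Split the creator share of one answer's revenue by attribution weight."""
    pool = answer_revenue_usd * creator_share
    return {source: pool * weight for source, weight in attribution.items()}

# An answer earns one penny; 1% of it is judged to derive from the book.
print(payouts(0.01, {"The Hawk and the Dove": 0.01, "everything else": 0.99}))
# -> {'The Hawk and the Dove': 5e-05, 'everything else': 0.00495}
```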
Speaker 1: So you're now CEO of The Atlantic, but you were previously editor of Wired, and I have to ask you, I mean, what's your advice, honestly, to us as we start this new podcast, as to how to approach this set of stories and problems?

Speaker 3: I'm drawn to what I call tech enthusiasm, where I love tech, right? And I think two things are true. One, tech is amazing, and two, in the long arc of time, technology makes the world better for people. That said, I was never a perfect fit with the sort of pure optimism of early Wired. It was also that the position of pure optimism kind of fit the tech industry when they were the underdogs, but once they were the dominant forces in the world, it was a less appropriate response. In any case, you should choose your own view of how you come to tech, right? But, like, my view of tech is, I'm constantly trying to learn about it, I'm constantly trying to understand it, and every now and then I stop and I'm like, wait, this is horrifying. But then I'm like, you know what? I'm enthusiastic. I'm just going to keep going. We're going to keep looking at this stuff. Because one of the risks in AI is that you look ahead and you're like, God, I'm terrified of this. And then you say, you know what I'm going to do? I'm going to be like King Canute. I'm going to say the AI is not happening. On principle, I refuse, and I'm not going to use AI, because I don't like what it's doing to the world. Well, that's not the answer, because it's not going to go away just because you find it scary. You're just going to miss the moment where you can shape it in a way that maybe makes it less scary.

Speaker 1: When we come back: The Atlantic's decision to partner with OpenAI, and how that decision was received in the newsroom.

Speaker 1: I also really enjoy your newsletter, which is another content output of yours that we haven't talked about.

Speaker 3: I didn't even mention that. Yeah, that's a fun one.

Speaker 1: In June, you picked up an essay by Leopold Aschenbrenner called Situational Awareness, which I was both, you know, quite curious about and also slightly put off by, the tone of it.

Speaker 3: Oh, I mean, I don't think I've ever read an essay that has the ratio of insight to alienation of that essay. It ends up, like, ten times as insightful as a lot of what you read, and, like, ten times as alienating.
Speaker 3: Right? And so, you know, it begins with this sense that, you know, because the author got really high grades at Columbia, you should trust him to, like, see the entire future, and he sees the future nobody else does. And you're like, okay.

Speaker 1: He's also an OpenAI researcher.

Speaker 3: Right. So he worked at OpenAI, left OpenAI, and he is, in fact, exceptionally bright, right? And I've spent time talking to him. He's exceptionally fun to talk to. And when you talk to him, it's a little easier than when you read the essay. But in any case, the essay says: hey, everybody, wake up, this is what happens if there's exponential AI. Here's how the improvement curve works, and here's what it will be able to do soon. And now we've watched AI go from being as smart as a toddler, to being as smart as a high school student, to being as smart as a PhD student. Now let's just extrapolate to when it's smarter than a Nobel Prize winner, and then when it's building a model by itself that we have no control over. And maybe he's wrong. Lots of people challenge his assumptions, and the question of whether AI will scale exponentially is hotly debated. The part that I found dangerous, that I think probably contributed to a series of mistakes that we are making right now, is the view: okay, if you play it out, then whoever controls AI will control the world, and so therefore we need to really make sure that it's controlled in the United States and not in China. So therefore we need to have a very antagonistic relationship with China. We need to make sure they can't hack in and get our systems, and we need to set our foreign policy to prevent them from getting AI.
Speaker 3: And you can see, not that Leopold Aschenbrenner is responsible for, you know, the Biden administration's CHIPS Act and anti-China policies in technology, but he contributed to a conversation that I think has led to a set of policies, very aggressive policies by the United States to try to slow down China's AI industry, that most people think are good, but that I think are bad.

Speaker 1: You credit that Situational Awareness essay with a real policy shift?

Speaker 3: Like, Leopold Aschenbrenner was probably in high school, right, when Trump started, like, going after Huawei. But if you look at the conversation this summer: why did SB 1047 in California, which was strict AI regulation, why was that knocked back by Gavin Newsom, the governor of California? I think in no small part because of a fear that if we regulate the AI industry, China will get ahead, right? And I do think that the sort of China-hawk element of the AI industry contributed to the defeat of the most systematic attempt at regulation. And I do think that Aschenbrenner's essay played a small role in that.

Speaker 1: Which I guess segues to: you're no longer just an editor, you're also a CEO.

Speaker 3: I am, yes.

Speaker 1: And I'm wondering, how do you think differently about AI as an editor versus as a CEO?

Speaker 3: Well, as a CEO you have all these other hard questions, right? So as an editor and a writer, you're just, like, finding things that are interesting, you're scratching your curiosity. As a CEO, I have to think about how it will literally change my company, and prepare for that, right? So, what will it make easier? What will it make harder? How will it change jobs in the future? If you assume that AI will be really powerful, should you be hiring a different kind of person, right? So you'd be hiring somebody who's more flexible about what they do, who has many different skills, as opposed to very narrow skills, right? So you make those decisions. That's one category of decisions.
Speaker 3: Probably the most pressing is, you have to anticipate how it will change your field, and then how you operate in it. So, how will it change the production of media? Well, one of the things it's doing is changing how search engines work, right? We're going from search engines to answer engines. Answer engines don't drive traffic. We get the plurality of our readers from search engines. So if search engines go away and we move to answer engines, where will our readers come from? Gosh, there won't be as many of them. Okay, can your business survive and thrive in that ecosystem? So that is a hard problem. Then, what will it mean if AI becomes as good as Leopold Aschenbrenner thinks? And I asked him this question, like, how long will, you know, serious publications have a moat? And he's like, oh, three years, right? I was like, great, right? But if you believe his view, in three years anybody will be able to say, hey, make me a magazine that's just like The Atlantic, right, in, you know, two seconds. And so a guy in Macedonia will make a pseudo-Atlantic, and so you'll have these new competitors that you're dealing with, right? So then there's a third category, which is, how do we interface with the large language model companies? And this is the question related to what you asked me earlier about my book, and that is, like, okay, which ones do we make deals with?

Speaker 1: Which ones do we sue?

Speaker 3: Right. And then there's kind of a fourth question of, you know, what products can we build? Are there things that we know about media that we can build using AI, that we can then productize and turn into companies?

Speaker 1: One of the questions this conversation raises, of course, is: if I'm a reporter at The Atlantic and I heard this conversation, I might think, you know, the CEO of the company thinks that in two or three years there may no longer be a need for me, and this company may be replaced by a Macedonian spoof.
Speaker 1: How do you respond to that?

Speaker 3: Well, so, first off, I don't believe that it's going to go that fast, right? That is the view of Leopold Aschenbrenner and others, right? When he said, you know, you have a moat for three years, I disagree. I think the moat is much longer, right? And why do we have a moat? Well, first off, there's no indication whatsoever that AI can write with any kind of style and voice. Like, it is terrible at it. Ask it to try to write with style and voice: it can write poems that are kind of silly. It cannot report, right? And it can't go out and have a conversation with a source. The stuff that makes Atlantic stories Atlantic stories, it can't do, right? We just had Robert Worth out reporting with Ukrainian fighters, right, in the streets of Ukraine. Do you really think, even if you believe the most optimistic AI scenarios, that somehow your AI bot is going to be able to get these guys on the phone and, like, will be able to talk? Honestly, there's no way in hell that's going to happen. So The Atlantic, and serious long-form publications that write with style, that do complicated stories with interesting narratives and do reporting, are going to be around for a long, long, long, long, long, long time, doing the things that they do. Now, all of that said, I would be a fool not to think about how AI is going to advantage competitors: the publications that will be started by Macedonians with really good prompt-engineering skills, and that will exist in a web where search is totally different, right? And so I think that The Atlantic will be publishing the kinds of stories that we publish for as far as I can see. And I also think that preparing for a world of AI is something that is extremely important for me as CEO.

Speaker 1: So, that said, your own magazine, I think, referred to the deal you made with OpenAI as a devil's bargain, right?
Speaker 3: Yes. This was a deal that members of our editorial team were not fully supportive of. Like, you know, I don't tell them what to write, and they don't tell me what to do. And I am one hundred percent, fully, completely, absolutely of the belief that that deal was good for the short term of The Atlantic, for the long term of The Atlantic, and for the long term of the journalism industry. And I believe...

Speaker 1: Can you explain exactly what the deal was?

Speaker 3: Yes. So the deal is that OpenAI agrees to pay The Atlantic a sum of money over a period of time. In return, it is given the right to train on The Atlantic's material, meaning that the models that are developed in that window, not afterwards, are allowed to train on Atlantic content. And when they build a search engine, they will be able to link to Atlantic stories and reference them. And so if you go to the search engine in ChatGPT and you ask about something that has happened, you will get links to Atlantic articles. You will not get links to The New York Times, because The New York Times, which is suing them, does not have a deal. And so we have gone through a process whereby we have been giving feedback on how that search engine works: when it doesn't work, when the links are appropriate, when they're not. So it is our belief that the elements of the deal were worth it: some influence on shaping the search product, which is massively important to media, right; some referral traffic, which is extremely important, because, as I mentioned, as we switch from search engines to answer engines, our traffic will decline substantially; and then an exchange of value over the data that was used to train. The reason why many journalists, including many at The Atlantic, didn't like it is that, you know, they don't trust OpenAI as a company. They feel like it wasn't a fair exchange of value, right? There are lots of reasons why they opposed it.
Speaker 3: Now, we have had no fair exchange of value, we have gotten nothing, from the other large language model companies that have trained on our data, and there are many of them. So the OpenAI deal was the one major deal with a big AI company that we signed and that we announced.

Speaker 1: And I guess one of the concerns, of course, is that the training data perhaps becomes less relevant, and that this may be a very advantageous deal for OpenAI in the short term, and at the other side of it, they won't need to renew.

Speaker 3: Well, that's interesting, right? Because if that were one's argument, then you would say, well, actually, we should have made a longer deal.

Speaker 1: Did you consider a longer deal?

Speaker 3: No, we didn't. And the reason we didn't is that the price for training on high-quality media content is going to change substantially in the next couple of years. And it's going to change based on a couple of factors, one of which is whether there's legislation mandating that there be an exchange of value. Another of which is, will The New York Times and the other lawsuits be successful? If they are, the price of this training will go up. If they're not, the price will go way down. And so we made a two-year deal on the expectation that maybe in two years the price will go up, and therefore we'll be able to get more money. It may be a risk, and in fact, the prices that are being reported in the press for training have gone down substantially since, you know, we made that deal in May. And that may be because the AI companies think they're going to win their lawsuits, right? It may be because they think that they don't need us, because synthetic data is so good. It may be that they've figured out how to train models on less data, which, as an environmentalist, I'd like, because it uses less energy.
Speaker 3: It may be that the AI companies are getting enough from elsewhere that they don't need, you know, Atlantic stories, right, or that they need the Atlantic stories less, so their perception of the value of the tokens that we have is dropping. So maybe I should have signed a five-year deal, if I could have seen into the future. On the other hand, if The New York Times wins their lawsuit, or the European Union passes legislation, or any of a number of other things happen, the price will go way up, in which case, great. One of the questions that will come up in the lawsuit is, can you prove that there is value to the content that was scraped? And clearly there is, because...

Speaker 1: They're paying somebody else.

Speaker 3: They're paying somebody else, right. So that was something we pointed out publicly, because there was a perception of, wait, you're actively working against The New York Times, why don't you stand in solidarity with our brothers in Times Square? And it's like, well, hold on a second, this actually does help them. Now, maybe it would have helped them more if we had joined the lawsuit, but our job is to find the best deal for The Atlantic, as much as we love The New York Times and want to help the larger cause of media.

Speaker 1: Were there any people you spoke to who were of the opposite opinion to you, who came around to your opinion through this process?

Speaker 3: I think one of the arguments that, for better or worse, shifted people's minds is: well, they've already done this scraping, right? And so if what you want is an OpenAI that has no knowledge whatsoever of The Atlantic, you can't ever get that. That doesn't exist. It's just sort of an unfortunate fact. Unless we could have prevented it, and had somehow, through some combination of heroic efforts, removed our stories from Reddit and the rest, right? Like, you know, if we had prevented that from happening. But I think a lot of people realized, oh, wait. I also think another argument did work.
Speaker 3: So Jessica Lessin, who's very smart and a good friend, published in The Atlantic, the day before we announced the deal, this argument saying, hey, media companies should not make deals, right? And look at what happened to all the companies that made deals with Facebook Watch. A lot of sort of young, social-media-based companies of ten years ago were screwed, right? And so the conclusion that I think many people have drawn is: don't do deals with big tech companies. And I think an argument that was somewhat persuasive in countering that was: hold on, don't do bad deals. But how do you think The Atlantic gets subscribers? What is the number one mechanism we have for driving subscriptions? It is Facebook ads.

Speaker 1: Is that really true?

Speaker 3: And so I think this kind of absolutist, pure position of no deals with tech companies, once you get a little more granular, becomes: oh wait, okay, no stupid deals. Now, you can still argue that this deal we made was a stupid deal, right? But I think we had some success in kind of moving people off the absolute position of no deals.

Speaker 1: More insights from Nicholas Thompson when we come back.

Speaker 1: I want to close with a quote from David Foster Wallace that I found in one of your newsletters, which was: the technology is just going to get better and better and better, and it's going to get easier and easier and more and more convenient and more and more pleasurable to be alone with images on a screen, given to us by people who don't love us but want our money, which is all right in low doses, right? But if that's the basic, main staple of your diet, you're going to die. In a meaningful way, you're going to die.

Speaker 3: It's one of the most prescient and wonderful quotes. And he was just talking in an interview. But "the people who don't love you but do want your money," right? Like, how can you say it better than that? And, you know, I am a tech enthusiast.
I love taking things apart. 610 00:29:29,840 --> 00:29:32,200 Speaker 3: I love trying to understand them. I actually try 611 00:29:32,240 --> 00:29:34,400 Speaker 3: really hard to make sure my kids are off their phones. 612 00:29:34,520 --> 00:29:36,960 Speaker 3: I, like, make sure I have lots of time off 613 00:29:37,000 --> 00:29:40,080 Speaker 3: my phone. I like to spend time in the mountains, 614 00:29:40,200 --> 00:29:44,040 Speaker 3: right? So I think what he said is perfect, which 615 00:29:44,080 --> 00:29:46,880 Speaker 3: is, and I probably would go for medium doses, not 616 00:29:46,880 --> 00:29:50,920 Speaker 3: low doses, but you do also have to disconnect, and 617 00:29:50,920 --> 00:29:52,520 Speaker 3: you do also have to be human. And I think 618 00:29:52,520 --> 00:29:54,040 Speaker 3: he said it better than anybody. He said that, I 619 00:29:54,040 --> 00:29:57,400 Speaker 3: think, in like nineteen ninety six. So, you know, an 620 00:29:57,440 --> 00:29:59,960 Speaker 3: incredible writer who saw far out into the future. 621 00:30:00,080 --> 00:30:01,600 Speaker 1: But that brings me back to the beginning of the 622 00:30:01,600 --> 00:30:05,760 Speaker 1: conversation and the running, because it's something you both use 623 00:30:05,840 --> 00:30:11,840 Speaker 1: technology to excel at and also, in a way, something which 624 00:30:11,880 --> 00:30:13,120 Speaker 1: is very meditative. 625 00:30:12,680 --> 00:30:15,800 Speaker 3: Totally. Disconnect. And as I said earlier, the process of 626 00:30:15,840 --> 00:30:19,680 Speaker 3: getting faster is like getting your body and your brain 627 00:30:19,800 --> 00:30:22,000 Speaker 3: more in sync with each other. And so when you 628 00:30:22,040 --> 00:30:25,200 Speaker 3: do a workout, you want a minimal number of mental distractions, 629 00:30:25,200 --> 00:30:28,040 Speaker 3: and so much of the benefit is the attention of 630 00:30:28,080 --> 00:30:30,360 Speaker 3: your brain and your body, in sync as you run 631 00:30:30,720 --> 00:30:34,520 Speaker 3: whatever pace it was, five forty six, right? And that 632 00:30:34,880 --> 00:30:37,880 Speaker 3: is a lot of what makes you better at the sport. 633 00:30:37,960 --> 00:30:40,320 Speaker 3: And so if you are allowing anything to interfere with 634 00:30:40,360 --> 00:30:43,760 Speaker 3: that mental and physical process, you are doing a disservice to 635 00:30:43,800 --> 00:30:46,920 Speaker 3: your training. And so there is a technological element of running. 636 00:30:46,920 --> 00:30:48,880 Speaker 3: I do analyze my training. I do, like, look at 637 00:30:48,880 --> 00:30:51,040 Speaker 3: my historical heart rate data. You know, before a race, 638 00:30:51,080 --> 00:30:53,400 Speaker 3: I will, you know, look very carefully at how I've 639 00:30:53,400 --> 00:30:55,480 Speaker 3: done in certain, you know, workouts and what that indicates, 640 00:30:55,480 --> 00:30:57,360 Speaker 3: because that helps me choose the pace that I'll run at. Right? 641 00:30:57,360 --> 00:30:59,600 Speaker 3: There's a whole process, but it is 642 00:30:59,600 --> 00:31:02,960 Speaker 3: also extremely important to disconnect, both as part of the 643 00:31:03,000 --> 00:31:12,920 Speaker 3: training and as part of meditation. 644 00:31:13,920 --> 00:31:15,440 Speaker 2: That was a really interesting interview. 645 00:31:15,520 --> 00:31:16,200 Speaker 1: Why, thank you, Cara.
646 00:31:16,240 --> 00:31:17,680 Speaker 2: I was going to thank you for doing it, but 647 00:31:18,160 --> 00:31:19,720 Speaker 2: you know, that's what you want to do; that's your job. 648 00:31:20,480 --> 00:31:22,560 Speaker 2: It's actually funny. Tori, one of our producers, was just 649 00:31:22,600 --> 00:31:26,400 Speaker 2: saying that every software engineer that she knows is an 650 00:31:26,440 --> 00:31:29,280 Speaker 2: avid rock climber just for this reason, like, get away 651 00:31:29,400 --> 00:31:29,760 Speaker 2: from the... 652 00:31:29,760 --> 00:31:30,760 Speaker 1: Tech, away from the phone. 653 00:31:30,960 --> 00:31:32,360 Speaker 2: Yeah, exactly, exactly. 654 00:31:32,640 --> 00:31:36,200 Speaker 1: Also why I'm such a big sauna enthusiast. It's the one place... 655 00:31:36,120 --> 00:31:37,840 Speaker 2: Oh, I thought that's because you're Ukrainian. 656 00:31:37,880 --> 00:31:41,600 Speaker 1: Well, it's partly that. But not being able 657 00:31:41,600 --> 00:31:43,400 Speaker 1: to have your phone in the sauna has to be part 658 00:31:43,400 --> 00:31:44,720 Speaker 1: of why the sauna is so great. 659 00:31:44,960 --> 00:31:47,880 Speaker 2: I have done something recently, and not, you know, to 660 00:31:47,960 --> 00:31:51,720 Speaker 2: sound so twenty twenty-one, but I don't 661 00:31:51,760 --> 00:31:53,280 Speaker 2: sleep with my phone in my room anymore. 662 00:31:53,480 --> 00:31:55,480 Speaker 1: I'm very proud of you. Yes, I bought an alarm clock, 663 00:31:55,480 --> 00:32:00,000 Speaker 1: but I haven't gotten around to setting it up, which is pathetic. 664 00:32:01,680 --> 00:32:04,400 Speaker 2: What I love in the discussion of, like, the way 665 00:32:04,440 --> 00:32:08,680 Speaker 2: that technology optimizes human performance is, like, there obviously is 666 00:32:08,720 --> 00:32:13,760 Speaker 2: something inherent to a great athlete's performance. Like, Michael Jordan 667 00:32:13,760 --> 00:32:16,920 Speaker 2: is Michael Jordan regardless of his shoe, in a certain sense. 668 00:32:17,320 --> 00:32:20,280 Speaker 2: But technology does sort of... 669 00:32:20,760 --> 00:32:23,920 Speaker 1: Truly enhance performance, truly push the human being and the 670 00:32:24,000 --> 00:32:24,920 Speaker 1: human body. 671 00:32:24,720 --> 00:32:27,040 Speaker 2: Well, and redefine, like, what it is to be a runner, 672 00:32:27,080 --> 00:32:28,320 Speaker 2: what it is to be an athlete. 673 00:32:28,400 --> 00:32:30,720 Speaker 1: I really like what Nick said about how the way you 674 00:32:30,760 --> 00:32:34,080 Speaker 1: get better at running is to quiet your mind, take 675 00:32:34,080 --> 00:32:36,800 Speaker 1: the fear away. And the only way you can quiet 676 00:32:36,840 --> 00:32:39,640 Speaker 1: your mind is with your mind. But for him to 677 00:32:39,680 --> 00:32:42,160 Speaker 1: be able to see his heart rate, even though 678 00:32:42,200 --> 00:32:44,840 Speaker 1: he was maybe approaching panic in terms of how hard 679 00:32:44,880 --> 00:32:47,560 Speaker 1: he was pushing himself, because of his heart rate monitor 680 00:32:47,640 --> 00:32:51,000 Speaker 1: he kind of knew that he was okay, and that 681 00:32:51,040 --> 00:32:54,400 Speaker 1: allowed him to generate better and better times.
So it 682 00:32:54,440 --> 00:32:56,840 Speaker 1: wasn't the technology per se. He didn't have, 683 00:32:56,920 --> 00:33:00,120 Speaker 1: like, you know, air boosters in his trainers. But the 684 00:33:00,120 --> 00:33:02,720 Speaker 1: technology allowed him to quiet his own mind in a 685 00:33:02,800 --> 00:33:03,360 Speaker 1: strange way. 686 00:33:03,520 --> 00:33:05,440 Speaker 2: Yeah. One of the things that I was thinking about 687 00:33:05,760 --> 00:33:08,959 Speaker 2: is that, like, there are two ways that technology affects us. 688 00:33:09,000 --> 00:33:11,120 Speaker 2: There are things that make us less human and there 689 00:33:11,120 --> 00:33:14,120 Speaker 2: are things that make us more human. Like, human enhancement 690 00:33:14,200 --> 00:33:20,360 Speaker 2: can be both: you become superhuman, or you become more 691 00:33:20,400 --> 00:33:23,840 Speaker 2: of who you are through personal optimization. Yeah, and I 692 00:33:23,920 --> 00:33:26,920 Speaker 2: just thought that was very interesting. The other thing is that 693 00:33:27,040 --> 00:33:30,560 Speaker 2: I was actually so happy you were interviewing him, 694 00:33:30,600 --> 00:33:34,080 Speaker 2: because I remember, and I'm not going to compare it to 695 00:33:34,200 --> 00:33:37,080 Speaker 2: some of, like, the great, you know, world events, but 696 00:33:37,120 --> 00:33:40,960 Speaker 2: I do remember where I was sitting on West Broadway 697 00:33:41,400 --> 00:33:43,840 Speaker 2: when I got the alert. And I remember saying that 698 00:33:43,960 --> 00:33:49,360 Speaker 2: the Atlantic making a deal with OpenAI to 699 00:33:50,480 --> 00:33:56,480 Speaker 2: basically allow them to mine the Atlantic's, what do you 700 00:33:56,480 --> 00:34:02,400 Speaker 2: call it, catalog or archive, is a turning point in the 701 00:34:02,440 --> 00:34:09,080 Speaker 2: history of journalism, where someone has decided that the way 702 00:34:09,120 --> 00:34:11,200 Speaker 2: to make money is to make a deal with the devil. 703 00:34:12,160 --> 00:34:15,160 Speaker 2: And you getting this as our first interview, I think, again, 704 00:34:15,200 --> 00:34:17,360 Speaker 2: it might not seem like it's that big of a 705 00:34:17,400 --> 00:34:20,840 Speaker 2: deal to people, but I think in the conversation about, 706 00:34:21,640 --> 00:34:26,600 Speaker 2: you know, what is the future of journalism, how do 707 00:34:27,560 --> 00:34:31,799 Speaker 2: these newsrooms monetize in a way that does not cannibalize 708 00:34:31,800 --> 00:34:35,080 Speaker 2: the thing that the newsroom does, and then to see 709 00:34:35,120 --> 00:34:39,040 Speaker 2: the Atlantic, and other newsrooms now too, and other 710 00:34:39,239 --> 00:34:43,239 Speaker 2: content providers, make that pact, I think is a 711 00:34:43,320 --> 00:34:44,400 Speaker 2: real turning point. 712 00:34:44,560 --> 00:34:47,840 Speaker 1: And actually, one of the people who wrote a piece 713 00:34:47,840 --> 00:34:52,200 Speaker 1: in the Atlantic that really is a broadside against media companies 714 00:34:52,239 --> 00:34:56,360 Speaker 1: partnering with AI companies was Jessica Lessin, the CEO and founder 715 00:34:56,400 --> 00:34:58,440 Speaker 1: of The Information, who we're going to be talking to 716 00:34:58,719 --> 00:35:02,000 Speaker 1: on the show soon. But this is a hot, hot, 717 00:35:02,000 --> 00:35:06,000 Speaker 1: hot-button issue, obviously.
I mean, Nick's point was basically, 718 00:35:06,600 --> 00:35:10,000 Speaker 1: this is happening anyway, and I got us, A, some 719 00:35:10,160 --> 00:35:13,840 Speaker 1: compensation, and, B, some ability to show up in 720 00:35:14,120 --> 00:35:17,800 Speaker 1: OpenAI's search engine, which will be useful for brand awareness 721 00:35:17,800 --> 00:35:19,640 Speaker 1: and to drive subs in the future. 722 00:35:19,920 --> 00:35:21,399 Speaker 2: It is, if you can't beat them, join them. 723 00:35:21,440 --> 00:35:23,120 Speaker 1: It's a little bit, if you can't beat them, join them. 724 00:35:23,239 --> 00:35:26,080 Speaker 2: All right, before I go into too much future tripping, 725 00:35:26,080 --> 00:35:27,920 Speaker 2: I think this is a good place to leave it. 726 00:35:28,040 --> 00:35:33,720 Speaker 2: And that is all for Tech Stuff today. This episode 727 00:35:33,760 --> 00:35:37,000 Speaker 2: was produced by Shena Ozaki and Eliza Dennis, with help 728 00:35:37,040 --> 00:35:40,760 Speaker 2: from Lizzie Jacobs and Victoria Dominguez. It was executive produced 729 00:35:40,760 --> 00:35:44,040 Speaker 2: by me, Kara Price, Oz Valoshian, and Kate Osbourne for 730 00:35:44,080 --> 00:35:48,120 Speaker 2: Kaleidoscope, and Katrina Norvell for iHeart. Our engineers are Biheed 731 00:35:48,120 --> 00:35:52,040 Speaker 2: Frasier at iHeart and Kathleen Kanti at CDM Studios. Kyle 732 00:35:52,120 --> 00:35:55,200 Speaker 2: Murdoch wrote our theme song. Thanks again to Nicholas Thompson. 733 00:35:56,080 --> 00:35:58,919 Speaker 1: Join us on Friday for Tech Stuff's The Week in Tech. 734 00:35:59,320 --> 00:36:02,000 Speaker 1: We'll run through our favorite headlines, talk with our friends 735 00:36:02,120 --> 00:36:06,520 Speaker 1: from 404 Media, and try to tackle a question: when 736 00:36:06,560 --> 00:36:10,640 Speaker 1: did this become a thing? And please rate and review us 737 00:36:10,719 --> 00:36:14,120 Speaker 1: on Apple Podcasts, Spotify, or wherever you listen, and reach 738 00:36:14,160 --> 00:36:17,319 Speaker 1: out to us at tech stuff podcast at gmail dot 739 00:36:17,360 --> 00:36:20,040 Speaker 1: com with thoughts and feedback. We really do want to 740 00:36:20,040 --> 00:36:21,040 Speaker 1: hear from you. Thank you.