1 00:00:15,320 --> 00:00:16,320 Speaker 1: Welcome to TechStuff. 2 00:00:16,440 --> 00:00:19,160 Speaker 2: I'm Oz Woloshyn, and this week I've been in Doha, 3 00:00:19,400 --> 00:00:22,919 Speaker 2: Qatar, attending the Web Summit. I had the opportunity to 4 00:00:22,960 --> 00:00:26,400 Speaker 2: have a conversation on stage with Katie Drummond, who is 5 00:00:26,440 --> 00:00:30,440 Speaker 2: the global editorial director of Wired and is widely 6 00:00:30,440 --> 00:00:33,240 Speaker 2: admired for having made the brand more essential and more 7 00:00:33,240 --> 00:00:36,560 Speaker 2: relevant than ever. So instead of the Week in Tech, 8 00:00:37,040 --> 00:00:40,239 Speaker 2: I wanted to share my conversation with Katie. We get 9 00:00:40,240 --> 00:00:45,640 Speaker 2: into all my favorite topics: humanoid robotics, world models, disrupting LLMs, 10 00:00:46,240 --> 00:00:50,080 Speaker 2: how Claude Code shook the AI industry, what US audiences 11 00:00:50,159 --> 00:00:54,520 Speaker 2: are missing about tech in China. And Katie also shared 12 00:00:54,680 --> 00:00:59,440 Speaker 2: how Wired owned the story of the tumultuous implementation and 13 00:00:59,480 --> 00:01:03,280 Speaker 2: subsequent implosion of DOGE in the White House. 14 00:01:06,560 --> 00:01:08,920 Speaker 3: Katie Drummond, very nice to see you. Very nice to 15 00:01:08,959 --> 00:01:09,480 Speaker 3: see you too. 16 00:01:09,840 --> 00:01:13,880 Speaker 2: You are the global editorial director of Wired, correct? The 17 00:01:14,400 --> 00:01:16,640 Speaker 2: kind of phrase to me that sums up what you've 18 00:01:16,680 --> 00:01:17,440 Speaker 2: been able to do is: 19 00:01:17,520 --> 00:01:21,880 Speaker 1: Wired is meeting the moment. How do you do that? 20 00:01:22,760 --> 00:01:24,679 Speaker 3: Oh my gosh, how do we? Well, first of all, 21 00:01:24,680 --> 00:01:27,080 Speaker 3: that's very nice of you to say. How do we 22 00:01:27,200 --> 00:01:30,240 Speaker 3: do that in this moment? I think, particularly in the 23 00:01:30,240 --> 00:01:32,800 Speaker 3: context of artificial intelligence, right, I mean, there's so much 24 00:01:33,000 --> 00:01:36,920 Speaker 3: changing so quickly, we move as quickly as the story does. 25 00:01:36,959 --> 00:01:41,600 Speaker 3: I think that we have a newsroom of really excited, hardworking, 26 00:01:41,880 --> 00:01:45,200 Speaker 3: animated subject matter experts, and I think that's something that 27 00:01:45,280 --> 00:01:48,960 Speaker 3: really differentiates Wired, is that we don't employ a bunch 28 00:01:48,960 --> 00:01:52,200 Speaker 3: of general interest reporters and editors, right. We really employ 29 00:01:52,320 --> 00:01:55,640 Speaker 3: people who understand the nitty gritty of what they cover, 30 00:01:55,760 --> 00:01:58,440 Speaker 3: who are very, very deeply immersed in the technology and 31 00:01:58,520 --> 00:02:02,680 Speaker 3: are able to explain it to people in very human terms. 32 00:02:02,760 --> 00:02:05,360 Speaker 3: I think the way we describe it is conversational authority, 33 00:02:05,360 --> 00:02:07,920 Speaker 3: and so I think we just wake up every day 34 00:02:09,120 --> 00:02:13,079 Speaker 3: really committed to the idea of helping audiences understand what's happening, 35 00:02:13,320 --> 00:02:16,760 Speaker 3: and doing it from a place of authority and expertise 36 00:02:16,800 --> 00:02:17,440 Speaker 3: and credibility.
37 00:02:17,960 --> 00:02:20,240 Speaker 1: Now, I think, did you start in twenty twenty three? 38 00:02:20,840 --> 00:02:22,840 Speaker 3: I did. I started in September twenty twenty three. 39 00:02:22,919 --> 00:02:24,359 Speaker 1: Yeah, because the history 40 00:02:24,000 --> 00:02:28,800 Speaker 2: of Wired magazine was more about kind of celebrating, in 41 00:02:28,800 --> 00:02:30,960 Speaker 2: a sense, the tech industry, right, or at least providing 42 00:02:31,000 --> 00:02:35,120 Speaker 2: a forum for tech visionaries to lay out their vision 43 00:02:35,160 --> 00:02:37,320 Speaker 2: of what the future could look like or what the 44 00:02:37,400 --> 00:02:41,400 Speaker 2: utopian ideas of technology were in the early nineties. So 45 00:02:41,680 --> 00:02:43,760 Speaker 2: did you have to do cultural change, or did you 46 00:02:43,800 --> 00:02:46,880 Speaker 2: have to push back on kind of old school ideas 47 00:02:46,880 --> 00:02:49,280 Speaker 2: of what Wired was, in order to be a much 48 00:02:49,280 --> 00:02:51,600 Speaker 2: more, frankly, political publication, as you are today? 49 00:02:52,200 --> 00:02:54,239 Speaker 3: I did, you know, and not so much from the 50 00:02:54,240 --> 00:02:56,040 Speaker 3: point of view of the newsroom and the staff. I 51 00:02:56,080 --> 00:03:00,120 Speaker 3: think they were very attuned to what had changed, and 52 00:03:00,160 --> 00:03:02,800 Speaker 3: what they wanted and needed really was just clear direction 53 00:03:02,960 --> 00:03:05,520 Speaker 3: about what we were there to do, which is not 54 00:03:05,760 --> 00:03:08,959 Speaker 3: so much to act as stenographers, right, we don't work 55 00:03:09,000 --> 00:03:12,639 Speaker 3: in comms for big tech. Our job is to tell 56 00:03:12,680 --> 00:03:15,920 Speaker 3: the truth, and tell the truth about the promise of 57 00:03:16,000 --> 00:03:19,680 Speaker 3: technology and scientific inquiry, but to tell the truth as 58 00:03:19,760 --> 00:03:23,320 Speaker 3: well about when that goes awry. And so what they 59 00:03:23,440 --> 00:03:27,119 Speaker 3: needed was that direction. I think that the biggest resistance 60 00:03:27,200 --> 00:03:31,480 Speaker 3: we encountered, interestingly, was from the audience. First of all, right, 61 00:03:31,560 --> 00:03:35,600 Speaker 3: Wired had, you know, a cohort of sort of newer 62 00:03:35,640 --> 00:03:37,840 Speaker 3: readers and viewers who were very attuned to what we 63 00:03:37,840 --> 00:03:39,840 Speaker 3: were doing. But then there was this sort of cohort 64 00:03:39,880 --> 00:03:42,760 Speaker 3: of people, probably all of whom have emailed me at 65 00:03:42,840 --> 00:03:47,000 Speaker 3: least once to say, like, I'm canceling 66 00:03:47,040 --> 00:03:50,120 Speaker 3: my subscription, and I can't believe that you would ask 67 00:03:50,200 --> 00:03:52,720 Speaker 3: these kinds of questions of these companies. I read Wired 68 00:03:53,160 --> 00:03:56,200 Speaker 3: to be inspired and excited. I don't read Wired to, 69 00:03:57,360 --> 00:04:00,040 Speaker 3: you know, see the industry challenged, or to dig 70 00:04:00,040 --> 00:04:01,840 Speaker 3: into politics, like I'm not here for that. 71 00:04:02,040 --> 00:04:03,720 Speaker 1: So do you respond to those emails? 72 00:04:03,800 --> 00:04:07,240 Speaker 3: Oh, sometimes. Yeah, sometimes. What do I do? Well, in particular,
73 00:04:07,280 --> 00:04:10,160 Speaker 3: you know, if someone emails me with something nice to say, 74 00:04:10,880 --> 00:04:13,160 Speaker 3: I always respond to those emails, because I think 75 00:04:13,160 --> 00:04:15,040 Speaker 3: for someone to take five minutes out of their day 76 00:04:15,080 --> 00:04:17,599 Speaker 3: to email me to tell me that they love the publication 77 00:04:17,680 --> 00:04:20,160 Speaker 3: I run, like, that's really meaningful. So I want to 78 00:04:20,200 --> 00:04:22,920 Speaker 3: make sure that I am engaging with those people. And 79 00:04:22,960 --> 00:04:24,800 Speaker 3: then sometimes I'll just write back and say, you know, 80 00:04:24,839 --> 00:04:27,120 Speaker 3: I'm really sorry to hear that you feel that way. 81 00:04:28,240 --> 00:04:31,599 Speaker 3: Best of luck to you on your reading journey with 82 00:04:31,720 --> 00:04:34,280 Speaker 3: other publications. You know, I am not going to get 83 00:04:34,279 --> 00:04:36,240 Speaker 3: into a back and forth over email with, like, Dave 84 00:04:36,279 --> 00:04:39,560 Speaker 3: from Kansas City. But I do pay attention to what 85 00:04:39,640 --> 00:04:41,800 Speaker 3: our audience is telling us and what they're telling me directly. 86 00:04:41,839 --> 00:04:45,360 Speaker 3: So the biggest, I think, point of contention really came 87 00:04:45,400 --> 00:04:49,359 Speaker 3: from the audience, and then I think from the industry itself. 88 00:04:49,360 --> 00:04:51,200 Speaker 3: I think there were a lot of people 89 00:04:51,640 --> 00:04:54,119 Speaker 3: who work in technology, or whom maybe we have dealt 90 00:04:54,120 --> 00:04:58,120 Speaker 3: with historically, who were maybe dismayed to find that we 91 00:04:58,240 --> 00:05:01,920 Speaker 3: no longer covered them the way they were accustomed to, 92 00:05:02,040 --> 00:05:04,560 Speaker 3: and so I think there was a transition there that 93 00:05:04,600 --> 00:05:05,159 Speaker 3: had to happen. 94 00:05:05,600 --> 00:05:07,800 Speaker 2: Do you think, as you look at your tenure so far, 95 00:05:07,960 --> 00:05:12,280 Speaker 2: is there one story or signature reporting that in some 96 00:05:12,320 --> 00:05:15,120 Speaker 2: sense defines what you've done and what you're aiming to be? 97 00:05:15,720 --> 00:05:18,480 Speaker 3: Yeah. I mean, I think we had a really big 98 00:05:18,560 --> 00:05:21,880 Speaker 3: year last year in twenty twenty five, and the 99 00:05:23,440 --> 00:05:26,080 Speaker 3: catalyst for that was, and you know, not to 100 00:05:26,120 --> 00:05:29,200 Speaker 3: get too much into politics, but when President Trump was 101 00:05:29,240 --> 00:05:32,320 Speaker 3: elected in the United States and he brought Elon Musk 102 00:05:32,360 --> 00:05:34,359 Speaker 3: into the White House and put him in charge of 103 00:05:34,480 --> 00:05:38,679 Speaker 3: the Department of Government Efficiency, DOGE, we were very quick 104 00:05:38,680 --> 00:05:42,280 Speaker 3: to that story. And it was because we had journalists 105 00:05:42,320 --> 00:05:44,680 Speaker 3: who had covered Elon Musk and his businesses for a 106 00:05:44,760 --> 00:05:46,760 Speaker 3: very long time, and then we had built out a 107 00:05:46,760 --> 00:05:49,880 Speaker 3: politics team who were, you know, very well sourced in 108 00:05:50,000 --> 00:05:52,880 Speaker 3: DC and sort of ready to cover the political piece 109 00:05:52,920 --> 00:05:55,640 Speaker 3: of that, and so we were very quick to that story.
110 00:05:55,680 --> 00:05:57,880 Speaker 3: We were very early to that story, and we really 111 00:05:58,000 --> 00:06:01,839 Speaker 3: led the charge in documenting sort of exactly what DOGE 112 00:06:01,920 --> 00:06:03,960 Speaker 3: was doing inside of the federal government. And I think 113 00:06:04,000 --> 00:06:07,400 Speaker 3: that's a perfect example for me where this idea that 114 00:06:07,440 --> 00:06:09,760 Speaker 3: technology is not political, or that you need to sort 115 00:06:09,760 --> 00:06:13,240 Speaker 3: of separate politics from tech, is just not true. Like, 116 00:06:13,279 --> 00:06:15,720 Speaker 3: that is just factually inaccurate. And so I think that 117 00:06:15,920 --> 00:06:19,400 Speaker 3: moment for us really crystallized sort of the narrative that 118 00:06:19,440 --> 00:06:21,440 Speaker 3: I had created around Wired and what I was trying 119 00:06:21,440 --> 00:06:24,840 Speaker 3: to accomplish with Wired, which is just to say technology 120 00:06:24,880 --> 00:06:27,920 Speaker 3: is ubiquitous. I mean, this is a mainstream phenomenon, and 121 00:06:27,960 --> 00:06:31,120 Speaker 3: you can't untangle it from any facet of how 122 00:06:31,160 --> 00:06:34,880 Speaker 3: we live, from culture, from politics, from security, from science. Everything 123 00:06:34,880 --> 00:06:37,159 Speaker 3: that Wired covers, everything about the way we all live, 124 00:06:38,400 --> 00:06:41,440 Speaker 3: has technology woven into it, and that includes politics. And 125 00:06:41,480 --> 00:06:43,839 Speaker 3: I think what happened with DOGE and Musk is a 126 00:06:43,839 --> 00:06:44,800 Speaker 3: perfect example of that. 127 00:06:45,240 --> 00:06:47,720 Speaker 2: I imagine if you did, like, a Google Trends search 128 00:06:47,960 --> 00:06:51,159 Speaker 2: for DOGE in, like, January, February last year, it would 129 00:06:51,160 --> 00:06:53,320 Speaker 2: have been pretty close to the top of the Internet, 130 00:06:54,160 --> 00:06:56,040 Speaker 2: which is both a good thing as a magazine or 131 00:06:56,040 --> 00:07:00,200 Speaker 2: as an editorial leader, but also a bad thing. Like, 132 00:07:00,240 --> 00:07:01,440 Speaker 2: how did you break through at 133 00:07:01,480 --> 00:07:01,640 Speaker 1: Wired? 134 00:07:01,760 --> 00:07:03,960 Speaker 2: Like, how did you move the conversation forward? What did 135 00:07:04,000 --> 00:07:06,600 Speaker 2: you say or do about DOGE that no one else could? 136 00:07:06,880 --> 00:07:08,719 Speaker 3: I mean, we broke a lot of news. And I 137 00:07:08,720 --> 00:07:13,040 Speaker 3: think this is another significant change that I've made at 138 00:07:13,040 --> 00:07:15,680 Speaker 3: Wired, is that when I started, I think they were 139 00:07:15,760 --> 00:07:18,800 Speaker 3: very much thinking of themselves as a print magazine with 140 00:07:18,840 --> 00:07:22,280 Speaker 3: an Internet website, right. And so the idea of, okay, well, 141 00:07:22,280 --> 00:07:24,840 Speaker 3: how do you translate magazine journalism to the Internet? Well, 142 00:07:24,960 --> 00:07:28,200 Speaker 3: you look at what's happening in the world. You look 143 00:07:28,240 --> 00:07:30,240 Speaker 3: at what news is breaking, and then you react to it, 144 00:07:30,360 --> 00:07:32,960 Speaker 3: you analyze it, you tell audiences how to think about it, 145 00:07:33,240 --> 00:07:35,280 Speaker 3: and you do that like two or three days after 146 00:07:35,320 --> 00:07:39,400 Speaker 3: a big story or a major event has happened.
My view, though, 147 00:07:39,560 --> 00:07:42,480 Speaker 3: is that Wired should not be the place reacting to 148 00:07:42,520 --> 00:07:44,920 Speaker 3: what's happening. We should be the place where it happens. 149 00:07:44,960 --> 00:07:47,600 Speaker 3: And so what that means is we should actually be 150 00:07:47,600 --> 00:07:51,000 Speaker 3: breaking the stories. We can offer that analysis and that synthesis, 151 00:07:51,040 --> 00:07:54,080 Speaker 3: but the origin point needs to come from us. The 152 00:07:54,200 --> 00:07:56,320 Speaker 3: Internet is not going to wait two or three days 153 00:07:56,320 --> 00:07:58,400 Speaker 3: for your essay. Like, that's not how it works. 154 00:07:59,000 --> 00:08:02,280 Speaker 3: So with DOGE, we differentiated ourselves because we were inside 155 00:08:02,320 --> 00:08:06,440 Speaker 3: all of these federal agencies as DOGE operatives were doing 156 00:08:06,440 --> 00:08:09,400 Speaker 3: what they were doing, and we were breaking stories often 157 00:08:09,480 --> 00:08:11,960 Speaker 3: like two, three, four, or five times a day for 158 00:08:12,040 --> 00:08:14,960 Speaker 3: weeks at a time. That's how we stood out. I mean, 159 00:08:14,960 --> 00:08:17,680 Speaker 3: we didn't stand out because we were offering you the 160 00:08:17,720 --> 00:08:19,840 Speaker 3: best hot take or the best essay. We stood out 161 00:08:19,880 --> 00:08:24,120 Speaker 3: because we were providing new information to audiences, particularly in 162 00:08:24,160 --> 00:08:27,840 Speaker 3: the United States, who were deeply concerned and sort of 163 00:08:27,920 --> 00:08:31,200 Speaker 3: desperate to understand what was happening inside of the federal government, 164 00:08:31,360 --> 00:08:34,400 Speaker 3: and they turned to Wired to answer those questions for them. 165 00:08:34,800 --> 00:08:37,800 Speaker 2: At the same time, you came up with this new 166 00:08:37,920 --> 00:08:40,120 Speaker 2: kind of mission statement, in a sense, for Wired, which 167 00:08:40,160 --> 00:08:42,880 Speaker 2: is: for future reference. Like, what will people look at, 168 00:08:43,240 --> 00:08:47,280 Speaker 2: ten, twenty years from now, as significant stories? So I 169 00:08:47,280 --> 00:08:50,040 Speaker 2: have to ask you about AI, obviously, the two letters 170 00:08:50,040 --> 00:08:54,160 Speaker 2: on everyone's lips permanently. How do you, when you think 171 00:08:54,160 --> 00:08:57,440 Speaker 2: about covering AI, sort the signal from the noise and 172 00:08:57,559 --> 00:09:01,479 Speaker 2: make sure that you are actually covering the people, ideas, 173 00:09:01,679 --> 00:09:04,600 Speaker 2: and things that will be meaningful in ten years, rather than 174 00:09:04,800 --> 00:09:07,320 Speaker 2: just whatever the trend of the day or the hour is? 175 00:09:07,440 --> 00:09:08,960 Speaker 3: Sure. Yeah, I mean, I will say I had the 176 00:09:09,000 --> 00:09:12,160 Speaker 3: great fortune, or misfortune depending on how you look at it, 177 00:09:12,200 --> 00:09:15,680 Speaker 3: of starting my job at Wired maybe like four to 178 00:09:15,760 --> 00:09:18,080 Speaker 3: six months after ChatGPT first came out, and so 179 00:09:18,120 --> 00:09:20,000 Speaker 3: it was all anyone wanted to talk to me about. 180 00:09:20,080 --> 00:09:22,080 Speaker 3: Even, I remember, interviewing for the job, they were like, 181 00:09:22,120 --> 00:09:24,840 Speaker 3: how would you cover AI? And that has not changed.
182 00:09:25,120 --> 00:09:27,080 Speaker 3: It's been two and a half years. I expect that 183 00:09:27,120 --> 00:09:28,760 Speaker 3: in another two and a half years we're still going 184 00:09:28,800 --> 00:09:31,840 Speaker 3: to be covering the technology. I mean, it is pervasive 185 00:09:32,000 --> 00:09:34,360 Speaker 3: and ubiquitous at this point. But I think it really 186 00:09:34,360 --> 00:09:37,160 Speaker 3: comes back to the idea of expertise. So I'll give 187 00:09:37,160 --> 00:09:39,839 Speaker 3: an example. With AI, we have a journalist named Will 188 00:09:39,880 --> 00:09:43,440 Speaker 3: Knight who has been covering AI for his entire career, 189 00:09:43,480 --> 00:09:46,280 Speaker 3: I mean fifteen, twenty years. He writes a weekly newsletter 190 00:09:46,280 --> 00:09:50,360 Speaker 3: for us, AI Lab, and he spends his days talking 191 00:09:50,400 --> 00:09:54,120 Speaker 3: to academics, reading papers, you know, talking to people inside 192 00:09:54,120 --> 00:09:58,080 Speaker 3: of these companies, which means that he is able to 193 00:09:58,200 --> 00:09:59,760 Speaker 3: look at a new paper or look at an 194 00:09:59,800 --> 00:10:03,640 Speaker 3: announcement from OpenAI or Anthropic or Microsoft or whomever else, 195 00:10:04,200 --> 00:10:07,640 Speaker 3: and be able to distill very quickly what is meaningful 196 00:10:07,640 --> 00:10:10,160 Speaker 3: from that and what is not, right. He has the 197 00:10:10,240 --> 00:10:13,720 Speaker 3: expertise to be able to not only explain to the 198 00:10:13,800 --> 00:10:18,079 Speaker 3: audience what's new and interesting and novel about a certain, 199 00:10:18,280 --> 00:10:21,320 Speaker 3: you know, moment in time with regards to AI, but 200 00:10:21,600 --> 00:10:23,840 Speaker 3: he's also able to tell them this matters and this 201 00:10:23,880 --> 00:10:26,680 Speaker 3: doesn't matter. So we have Will, for example, on the 202 00:10:26,720 --> 00:10:28,480 Speaker 3: sort of research piece of it, and then we have 203 00:10:28,800 --> 00:10:32,040 Speaker 3: another journalist, Max Zeff, based in San Francisco, who actually 204 00:10:32,040 --> 00:10:35,320 Speaker 3: covers what's going on inside of these companies, so he 205 00:10:35,440 --> 00:10:38,200 Speaker 3: sort of is positioned differently, where he's tracking, you know, 206 00:10:38,320 --> 00:10:41,160 Speaker 3: staffing changes, you know, the latest drama at OpenAI, 207 00:10:42,320 --> 00:10:44,960 Speaker 3: what it means that, you know, Microsoft is investing this 208 00:10:45,080 --> 00:10:46,800 Speaker 3: much money in this company and this much money in 209 00:10:46,840 --> 00:10:49,960 Speaker 3: this company. So we're able to cover AI and really 210 00:10:49,960 --> 00:10:53,040 Speaker 3: sort of cut through all of the noise by having 211 00:10:53,520 --> 00:10:56,680 Speaker 3: beat reporters, essentially subject matter experts, who've been doing this 212 00:10:56,720 --> 00:10:58,720 Speaker 3: for a long time, who know the players, who know 213 00:10:58,800 --> 00:11:01,480 Speaker 3: the technology, and Will can sort of tackle it from 214 00:11:01,520 --> 00:11:04,400 Speaker 3: a research point of view, and Max can tackle it 215 00:11:04,440 --> 00:11:06,400 Speaker 3: from, like, an industry point of view.
And if you 216 00:11:06,440 --> 00:11:08,480 Speaker 3: combine those two things together, they each write a 217 00:11:08,480 --> 00:11:10,920 Speaker 3: newsletter every week, I think you're going to get a 218 00:11:11,040 --> 00:11:14,720 Speaker 3: very clear-eyed and sort of very hype-free version 219 00:11:15,120 --> 00:11:17,360 Speaker 3: of what's actually happening with the tech. 220 00:11:17,640 --> 00:11:21,199 Speaker 2: Over the holidays, Claude Code kind of went viral, and 221 00:11:21,679 --> 00:11:25,680 Speaker 2: it seemed like sort of regular people using AI went 222 00:11:25,880 --> 00:11:30,240 Speaker 2: from sort of summarizing text and creating Studio Ghibli images 223 00:11:30,760 --> 00:11:34,400 Speaker 2: to regular people creating programs that help them in some 224 00:11:34,520 --> 00:11:38,160 Speaker 2: way with their lives, even though they had no programming background. 225 00:11:38,160 --> 00:11:39,840 Speaker 2: And some people said, okay, this is the true, like, 226 00:11:40,280 --> 00:11:44,680 Speaker 2: watershed moment where the promise of gen AI kind of diffuses. 227 00:11:45,880 --> 00:11:48,080 Speaker 1: How do you think about that moment? Was it significant 228 00:11:48,120 --> 00:11:48,400 Speaker 1: for you? 229 00:11:48,480 --> 00:11:51,360 Speaker 3: It was, actually. You know, I will 230 00:11:51,640 --> 00:11:55,320 Speaker 3: call out a friend I spoke to a couple weeks 231 00:11:55,360 --> 00:11:57,760 Speaker 3: ago who's a software engineer. He's been doing it for 232 00:11:57,760 --> 00:12:01,400 Speaker 3: twenty five years, has a long history in technology, and for 233 00:12:01,679 --> 00:12:06,200 Speaker 3: years he was saying, this AI stuff with programming is ridiculous. 234 00:12:06,480 --> 00:12:08,520 Speaker 3: I don't use it. I'm never going to use it. 235 00:12:08,520 --> 00:12:11,360 Speaker 3: It hallucinates, it makes mistakes, it is deeply flawed. The 236 00:12:11,360 --> 00:12:13,920 Speaker 3: fact that the CEOs at the companies I work for 237 00:12:14,000 --> 00:12:16,920 Speaker 3: are talking about this stuff like a holy grail is insane. 238 00:12:17,160 --> 00:12:19,720 Speaker 3: Like, we are all experiencing, like, a 239 00:12:19,720 --> 00:12:23,680 Speaker 3: mass delusion in the tech industry. And this was a 240 00:12:23,720 --> 00:12:27,880 Speaker 3: moment for him with Claude Code where that changed very 241 00:12:27,920 --> 00:12:31,840 Speaker 3: profoundly and very quickly, and he is now actively using 242 00:12:31,920 --> 00:12:34,920 Speaker 3: the technology in his work. So I think that, again, 243 00:12:35,000 --> 00:12:38,880 Speaker 3: just as a journalist, my job is to look 244 00:12:38,920 --> 00:12:41,440 Speaker 3: to the people who are actually on the front lines 245 00:12:41,480 --> 00:12:44,199 Speaker 3: of this stuff and be able to sort of suss 246 00:12:44,240 --> 00:12:46,960 Speaker 3: out what's a moment and what's not based on their input.
247 00:12:47,000 --> 00:12:48,960 Speaker 3: And I think that was a really powerful example for 248 00:12:49,040 --> 00:12:51,520 Speaker 3: me of, like, this is someone who was as skeptical 249 00:12:51,559 --> 00:12:54,000 Speaker 3: as they come and as smart as they come about 250 00:12:54,040 --> 00:12:56,839 Speaker 3: what he does, who just said, like, oh no, this 251 00:12:56,960 --> 00:13:00,040 Speaker 3: is actually a really big deal and I need to 252 00:13:00,080 --> 00:13:02,240 Speaker 3: interrogate all of the assumptions that I've held for the 253 00:13:02,320 --> 00:13:05,040 Speaker 3: last several years about how I do my job. So 254 00:13:05,120 --> 00:13:06,400 Speaker 3: I do think it was very meaningful. 255 00:13:06,480 --> 00:13:08,320 Speaker 2: So what do you do with that, as the editorial 256 00:13:08,400 --> 00:13:10,400 Speaker 2: director of Wired? Like, you get an insight like this, 257 00:13:11,040 --> 00:13:13,160 Speaker 2: it really is a moment. Like, what do you tell 258 00:13:13,160 --> 00:13:13,800 Speaker 2: a newsroom? 259 00:13:13,840 --> 00:13:15,320 Speaker 1: How do you go forward with that? 260 00:13:15,640 --> 00:13:17,560 Speaker 3: Well, I mean, they're usually way ahead of me, so 261 00:13:18,760 --> 00:13:20,960 Speaker 3: I usually will be like, Will, I talked to him, 262 00:13:21,080 --> 00:13:25,600 Speaker 3: and he's like, yeah, gotcha. You know, again, it's 263 00:13:25,679 --> 00:13:29,480 Speaker 3: just really being in constant communication. I mean, we run 264 00:13:29,559 --> 00:13:33,360 Speaker 3: a newsroom that is about twenty five percent San Francisco, 265 00:13:33,840 --> 00:13:36,200 Speaker 3: twenty five percent New York, twenty five percent London, twenty 266 00:13:36,200 --> 00:13:38,680 Speaker 3: five percent remote. So we are in Slack and on 267 00:13:38,760 --> 00:13:40,960 Speaker 3: Zoom all day. That's how I do my job. 268 00:13:41,040 --> 00:13:44,840 Speaker 3: It's virtually entirely remote. But it's just about being in 269 00:13:44,840 --> 00:13:48,000 Speaker 3: constant communication and really sort of stress testing different ideas, 270 00:13:49,480 --> 00:13:51,960 Speaker 3: always talking about sort of new moments and what they mean, 271 00:13:52,040 --> 00:13:54,640 Speaker 3: making sure that our coverage is on point. But again, 272 00:13:54,800 --> 00:13:57,840 Speaker 3: someone like Will or Max probably would have said, like, yeah, 273 00:13:57,840 --> 00:13:59,839 Speaker 3: I could have told you that a week ago. In fact, 274 00:13:59,880 --> 00:14:02,520 Speaker 3: I wrote a story about it, you know, five days ago. 275 00:14:02,600 --> 00:14:04,280 Speaker 3: And we are working on a couple of sort of 276 00:14:04,320 --> 00:14:07,360 Speaker 3: bigger stories about exactly what you're talking about, with Anthropic 277 00:14:08,000 --> 00:14:09,440 Speaker 3: and sort of what that means for the rest of 278 00:14:09,440 --> 00:14:11,440 Speaker 3: the AI industry, because I do think it was a 279 00:14:11,600 --> 00:14:15,000 Speaker 3: very seminal moment for a lot of those other companies 280 00:14:15,480 --> 00:14:17,720 Speaker 3: to look over there and say, like, oh boy, like, 281 00:14:17,920 --> 00:14:19,760 Speaker 3: we're all of a sudden very far behind.
282 00:14:20,320 --> 00:14:22,280 Speaker 2: So do you look at this kind of three horse 283 00:14:22,360 --> 00:14:26,920 Speaker 2: race between Anthropic, OpenAI, and Google and think about 284 00:14:26,960 --> 00:14:28,960 Speaker 2: handicapping it? Or, I mean, who do you think, what 285 00:14:29,000 --> 00:14:30,240 Speaker 2: do you think the stakes are today? 286 00:14:30,400 --> 00:14:33,280 Speaker 3: If I tried, I would fail miserably. I mean, I 287 00:14:33,280 --> 00:14:35,480 Speaker 3: think if you asked me two years ago about Google, 288 00:14:36,000 --> 00:14:40,200 Speaker 3: I remember we were talking very seriously about a big 289 00:14:40,280 --> 00:14:42,800 Speaker 3: story, maybe two years ago, a year and a half ago, 290 00:14:42,920 --> 00:14:46,320 Speaker 3: about, like, how Google lost the AI race. I mean, 291 00:14:46,320 --> 00:14:50,840 Speaker 3: that's how we were all thinking internally. And if you 292 00:14:51,040 --> 00:14:52,800 Speaker 3: asked me that same question three months ago, I 293 00:14:52,800 --> 00:14:55,840 Speaker 3: would have said, like, they're killing it. Like, they're integrating 294 00:14:55,840 --> 00:14:58,080 Speaker 3: this technology into all of the products and services that 295 00:14:58,120 --> 00:15:00,840 Speaker 3: we all use every single day, being very smart and 296 00:15:00,880 --> 00:15:06,080 Speaker 3: strategic about it. And they already have that commercial advantage 297 00:15:06,120 --> 00:15:09,840 Speaker 3: because we all use Google's suite of products. So it's 298 00:15:09,920 --> 00:15:12,440 Speaker 3: changing so quickly. And again, you know, Anthropic is another 299 00:15:12,480 --> 00:15:14,640 Speaker 3: great example, where they just came out and kicked 300 00:15:14,720 --> 00:15:17,960 Speaker 3: OpenAI's butt on coding. I don't think it would 301 00:15:18,000 --> 00:15:22,040 Speaker 3: be wise or reasonable to try to game out a 302 00:15:22,080 --> 00:15:23,920 Speaker 3: winner or a loser here. I think that there will 303 00:15:23,960 --> 00:15:26,320 Speaker 3: be many winners. I think there will be many, many losers. 304 00:15:26,640 --> 00:15:28,400 Speaker 3: What I would say is that when you look at, 305 00:15:28,400 --> 00:15:31,280 Speaker 3: like, the big incumbents, right, like the Microsofts and the Googles, 306 00:15:31,880 --> 00:15:35,400 Speaker 3: they have a much stronger foundation to stand on than 307 00:15:35,440 --> 00:15:37,280 Speaker 3: the OpenAIs or the Anthropics, right. I mean, 308 00:15:37,280 --> 00:15:43,520 Speaker 3: these are big, big, big companies who can spend astronomical 309 00:15:43,520 --> 00:15:46,840 Speaker 3: sums of money to commercialize this technology, to do it 310 00:15:46,880 --> 00:15:48,640 Speaker 3: at a loss for as long as they need to, 311 00:15:48,800 --> 00:15:51,480 Speaker 3: essentially, in order to win that race. So I think 312 00:15:51,520 --> 00:15:55,360 Speaker 3: the question really is, of these very well-moneyed startups, 313 00:15:56,200 --> 00:15:59,680 Speaker 3: who will be able to compete and win in that space 314 00:15:59,720 --> 00:16:01,800 Speaker 3: when you have the Googles kind of, like, threatening to 315 00:16:01,840 --> 00:16:02,760 Speaker 3: eat your lunch every other day? 316 00:16:02,720 --> 00:16:31,480 Speaker 4: After the break: why we need to look to 317 00:16:31,600 --> 00:16:34,120 Speaker 4: China to see where the future of tech is going. 318 00:16:34,640 --> 00:16:55,920 Speaker 1: Stay with us.
319 00:16:55,960 --> 00:16:58,000 Speaker 2: It's interesting. I mean, I think we've both come from a background 320 00:16:58,000 --> 00:17:01,920 Speaker 2: in journalism where your kind of natural tendency is to 321 00:17:01,960 --> 00:17:04,440 Speaker 2: be skeptical, right. And I remember about five years ago 322 00:17:04,520 --> 00:17:07,920 Speaker 2: somebody talking to me about, you know, drones, ubiquitous drones, 323 00:17:08,000 --> 00:17:10,320 Speaker 2: and fully self driving vehicles, and I said, you know what, 324 00:17:11,320 --> 00:17:14,879 Speaker 2: be careful about swallowing too many of 325 00:17:14,840 --> 00:17:16,800 Speaker 2: the, you know, dictums of the tech companies who 326 00:17:16,840 --> 00:17:19,200 Speaker 2: want you to believe in this future, because both 327 00:17:19,200 --> 00:17:22,440 Speaker 2: of those things seemed kind of far off. Now they're 328 00:17:22,440 --> 00:17:25,480 Speaker 2: both here, right. And so, I mean, presumably you, like me, 329 00:17:25,520 --> 00:17:29,000 Speaker 2: are skeptical of some of these, you know, visions painted 330 00:17:29,080 --> 00:17:32,439 Speaker 2: by the leaders of these companies, and yet a lot 331 00:17:32,480 --> 00:17:34,200 Speaker 2: of them do seem to be coming true. So how 332 00:17:34,200 --> 00:17:38,080 Speaker 2: do you balance interrogating those visions 333 00:17:38,400 --> 00:17:40,399 Speaker 2: at the same time as giving them enough credit for 334 00:17:40,480 --> 00:17:43,320 Speaker 2: some of the, frankly, unexpected wins they may have had? 335 00:17:44,040 --> 00:17:45,760 Speaker 3: Yeah, that's a really good question. I mean, I think 336 00:17:45,800 --> 00:17:47,320 Speaker 3: the way I talk to the team about it is: 337 00:17:47,359 --> 00:17:50,520 Speaker 3: we are skeptical, but we're not cynical. So I 338 00:17:50,600 --> 00:17:54,760 Speaker 3: never want our team to be so dismissive or so 339 00:17:54,920 --> 00:17:58,360 Speaker 3: down on technology, or so down on innovation and inquiry, 340 00:17:58,400 --> 00:18:01,639 Speaker 3: that they can't see, like, the light that's kind of 341 00:18:02,119 --> 00:18:04,960 Speaker 3: beckoning at them through the haze of hype and marketing 342 00:18:05,040 --> 00:18:06,600 Speaker 3: and all of that stuff, right. So we should always 343 00:18:06,600 --> 00:18:08,920 Speaker 3: be skeptical, we should never be cynical. And I think 344 00:18:08,920 --> 00:18:13,600 Speaker 3: again it comes down to really interrogating the research, right, 345 00:18:13,720 --> 00:18:17,600 Speaker 3: really interrogating the work, interrogating who's behind it, right? Who 346 00:18:17,640 --> 00:18:20,840 Speaker 3: is saying that? What is their background, what are their credentials? 347 00:18:20,840 --> 00:18:24,479 Speaker 3: Like, asking those hard questions, and always being sure that 348 00:18:24,560 --> 00:18:27,280 Speaker 3: when you talk to a company or a person making 349 00:18:27,280 --> 00:18:30,160 Speaker 3: those kinds of promises, right, about self driving cars, for example, 350 00:18:30,440 --> 00:18:32,560 Speaker 3: that you ask them the hard questions, press them on 351 00:18:32,560 --> 00:18:35,639 Speaker 3: the research, know your stuff, have the expertise to know 352 00:18:35,680 --> 00:18:38,199 Speaker 3: what to ask, to be able to walk away with 353 00:18:38,240 --> 00:18:41,640 Speaker 3: an informed conclusion.
Again, we never want to be regurgitating 354 00:18:41,680 --> 00:18:44,120 Speaker 3: a press release, but if there is a there there, 355 00:18:44,160 --> 00:18:47,080 Speaker 3: if there is something really exciting going on, we want 356 00:18:47,080 --> 00:18:48,919 Speaker 3: to be able to translate that for our audience. I mean, 357 00:18:48,960 --> 00:18:51,000 Speaker 3: I think self driving car technology is such a great 358 00:18:51,000 --> 00:18:53,760 Speaker 3: example, because it's been talked about and talked about and 359 00:18:53,800 --> 00:18:57,000 Speaker 3: talked about, and I remember, sort of like five or 360 00:18:57,080 --> 00:18:59,719 Speaker 3: six years ago, there were all of these companies racing 361 00:18:59,800 --> 00:19:02,040 Speaker 3: to win the self driving car thing. A lot 362 00:19:02,080 --> 00:19:04,320 Speaker 3: of them are now out of business, and it was 363 00:19:04,480 --> 00:19:06,920 Speaker 3: impossible at the time to discern who was going to 364 00:19:07,040 --> 00:19:10,399 Speaker 3: emerge victorious, or if anyone was. And you look around 365 00:19:10,400 --> 00:19:14,000 Speaker 3: now at what Waymo is doing, it's incredible. Like, I 366 00:19:14,040 --> 00:19:17,879 Speaker 3: don't know if enough people appreciate what a fierce battle 367 00:19:17,960 --> 00:19:22,439 Speaker 3: that has been and how remarkable it is that they 368 00:19:22,440 --> 00:19:25,240 Speaker 3: are now operating in, what, dozens of cities, launching in 369 00:19:25,359 --> 00:19:29,159 Speaker 3: dozens more this year. Like, that is unbelievable. And 370 00:19:29,200 --> 00:19:32,760 Speaker 3: so we don't want to lose sight of those really 371 00:19:32,840 --> 00:19:36,600 Speaker 3: interesting races and that really interesting research and those really 372 00:19:36,600 --> 00:19:39,920 Speaker 3: interesting wins, because again, we've been talking about self driving cars 373 00:19:39,920 --> 00:19:42,080 Speaker 3: probably since before I was born, right, like years 374 00:19:42,080 --> 00:19:44,240 Speaker 3: and years and years and years, and now it is 375 00:19:44,280 --> 00:19:47,520 Speaker 3: actually here. If you go to San Francisco, they're everywhere. 376 00:19:47,800 --> 00:19:51,879 Speaker 3: And I want to maintain our sort of excitement and 377 00:19:52,000 --> 00:19:54,880 Speaker 3: enthusiasm for that, because, like, that's cool as hell, you know. 378 00:19:54,840 --> 00:19:55,680 Speaker 1: I agree. 379 00:19:55,800 --> 00:19:58,240 Speaker 2: And let's talk about two things that emerged at the 380 00:19:58,240 --> 00:20:01,359 Speaker 2: beginning of this year as kind of promises about the 381 00:20:01,400 --> 00:20:05,879 Speaker 2: near future. One is humanoid robots, the obsession with humanoid 382 00:20:06,000 --> 00:20:10,440 Speaker 2: robots coming out of CES, and the other is world models. Right, 383 00:20:10,520 --> 00:20:14,359 Speaker 2: there's a new paradigm of AI which learns from observation 384 00:20:14,440 --> 00:20:17,360 Speaker 2: of the physical world rather than from language and numbers. 385 00:20:17,760 --> 00:20:20,080 Speaker 2: Maybe let's start with humanoid robotics. Like, do you think 386 00:20:20,760 --> 00:20:23,639 Speaker 2: this is going to be the next frontier of technology, 387 00:20:23,640 --> 00:20:25,560 Speaker 2: in the same way that self driving cars have been? 388 00:20:25,600 --> 00:20:27,080 Speaker 1: Do you think the excitement is warranted?
389 00:20:27,960 --> 00:20:30,640 Speaker 3: You know, it's so tough, because I do love 390 00:20:30,680 --> 00:20:33,479 Speaker 3: being a skeptic. And humanoid robots is another great example 391 00:20:33,480 --> 00:20:36,600 Speaker 3: of a technology that's been promised for so, so, so long. 392 00:20:37,000 --> 00:20:39,640 Speaker 3: But again, I look at the experts that I work with, right. 393 00:20:39,680 --> 00:20:41,440 Speaker 3: And so we just put out a big issue about 394 00:20:41,480 --> 00:20:44,480 Speaker 3: China and about sort of how far along China 395 00:20:44,600 --> 00:20:46,919 Speaker 3: is in so many ways where the US is lagging behind, 396 00:20:47,200 --> 00:20:50,320 Speaker 3: and robotics is a really salient example there, right, where 397 00:20:50,400 --> 00:20:53,560 Speaker 3: China is doing so much incredible work with physical robotics. 398 00:20:53,880 --> 00:20:56,399 Speaker 3: Gen AI is allowing us to train robots in very 399 00:20:56,400 --> 00:20:59,800 Speaker 3: different ways. We have some really interesting stories coming out 400 00:21:00,480 --> 00:21:05,760 Speaker 3: about physical robotics being run by certain large companies, global companies, 401 00:21:06,240 --> 00:21:08,720 Speaker 3: and what that means for the workforce. And I look 402 00:21:08,760 --> 00:21:11,480 Speaker 3: at our reporters and I trust them, and I look 403 00:21:11,520 --> 00:21:14,439 Speaker 3: at the work that they're doing on this subject, and 404 00:21:14,480 --> 00:21:17,760 Speaker 3: I have to say, it's very, very compelling what is 405 00:21:17,760 --> 00:21:20,000 Speaker 3: happening with robotics. I don't know if ultimately the shape 406 00:21:20,040 --> 00:21:22,000 Speaker 3: and form of these is humanoid, right. I think the 407 00:21:22,040 --> 00:21:25,080 Speaker 3: idea of, like, a Jetsons-like robot doing your 408 00:21:25,119 --> 00:21:27,919 Speaker 3: dishes for you, I think less about it in that 409 00:21:28,119 --> 00:21:31,440 Speaker 3: context and more about it in the context of, like, warehouses, 410 00:21:31,560 --> 00:21:34,360 Speaker 3: fulfillment centers, like, robots doing a lot of the kind 411 00:21:34,400 --> 00:21:38,480 Speaker 3: of grunt work in those industrial settings. But that is 412 00:21:38,960 --> 00:21:41,760 Speaker 3: very real, and the research being done in those spaces 413 00:21:42,119 --> 00:21:44,600 Speaker 3: is very real. It's very interesting, it's very promising. I mean, 414 00:21:44,600 --> 00:21:48,119 Speaker 3: this is still incredibly expensive technology to deploy, right, to 415 00:21:48,200 --> 00:21:52,639 Speaker 3: actually commercialize physical robots at scale. So again, this is 416 00:21:52,680 --> 00:21:54,840 Speaker 3: not something that will be in your house in five years. 417 00:21:55,280 --> 00:21:58,800 Speaker 3: But will robots and sort of physical robotics take a 418 00:21:58,880 --> 00:22:04,760 Speaker 3: much larger role in industrial contexts? I think... I hate 419 00:22:04,800 --> 00:22:06,880 Speaker 3: to make a prediction. I guess it's part of my job. 420 00:22:07,160 --> 00:22:09,400 Speaker 3: I would be willing to predict that, yes, we will 421 00:22:09,400 --> 00:22:10,840 Speaker 3: see that happen, and I think we will see it 422 00:22:10,840 --> 00:22:12,640 Speaker 3: happen in a way that, in the same way we've 423 00:22:12,680 --> 00:22:15,800 Speaker 3: seen Waymo kind of, like, take over the streets in 424 00:22:16,359 --> 00:22:18,720 Speaker 3: many cities.
I think we will have that same moment 425 00:22:18,800 --> 00:22:21,040 Speaker 3: with robotics in a few years. I really do. Big call, 426 00:22:21,760 --> 00:22:24,200 Speaker 3: big call. Come back in five years and ask me. 427 00:22:24,400 --> 00:22:27,159 Speaker 2: Okay, what about world models? Because obviously you have Anthropic 428 00:22:27,320 --> 00:22:33,040 Speaker 2: and OpenAI racing towards IPOs at, you know, 429 00:22:33,280 --> 00:22:35,800 Speaker 2: valuations approaching a trillion dollars later this year. 430 00:22:36,720 --> 00:22:37,480 Speaker 1: And then you have 431 00:22:37,520 --> 00:22:40,560 Speaker 2: Fei-Fei Li and Yann LeCun, kind of pioneers of 432 00:22:40,600 --> 00:22:44,679 Speaker 2: this AI moment, both of whom are saying no, no, the 433 00:22:44,720 --> 00:22:47,399 Speaker 2: next wave of AI won't come from this same approach. 434 00:22:47,440 --> 00:22:49,879 Speaker 2: There are these world models coming. It's quite hard to 435 00:22:49,880 --> 00:22:50,600 Speaker 2: get your head around. 436 00:22:50,680 --> 00:22:52,119 Speaker 1: But how do you look at that? 437 00:22:52,640 --> 00:22:55,440 Speaker 3: I first interviewed Yann LeCun in two thousand and eight, 438 00:22:55,560 --> 00:22:57,880 Speaker 3: when I was, like, I don't know, twenty years old. 439 00:22:57,920 --> 00:23:00,640 Speaker 3: I think he's brilliant. I think he has always been 440 00:23:01,640 --> 00:23:04,159 Speaker 3: someone to zig where others are zagging, right, and I 441 00:23:04,160 --> 00:23:06,200 Speaker 3: think that makes him really interesting. I think he's 442 00:23:06,240 --> 00:23:08,680 Speaker 3: a pioneer in his field. I mean, Fei-Fei as well. 443 00:23:08,920 --> 00:23:10,320 Speaker 3: And I think there's a real there there. I mean, 444 00:23:10,320 --> 00:23:11,960 Speaker 3: I think we've seen a lot of research in the 445 00:23:12,040 --> 00:23:15,399 Speaker 3: last several months about the limitations of LLMs and the 446 00:23:15,440 --> 00:23:17,680 Speaker 3: fact that there is sort of a natural and necessary 447 00:23:17,760 --> 00:23:20,480 Speaker 3: endpoint to what they can accomplish. So I think that 448 00:23:20,600 --> 00:23:26,400 Speaker 3: the idea of a different paradigm for leveling up artificial 449 00:23:26,440 --> 00:23:30,000 Speaker 3: intelligence is right. Whether it's world models, or whether there 450 00:23:30,040 --> 00:23:32,199 Speaker 3: is sort of something else out there, that remains to 451 00:23:32,200 --> 00:23:36,359 Speaker 3: be seen. But I think there's really no question 452 00:23:36,400 --> 00:23:39,399 Speaker 3: that LLMs at some point, like, will have a finite... 453 00:23:39,840 --> 00:23:43,199 Speaker 3: there will be finite applications for how much we 454 00:23:43,200 --> 00:23:45,119 Speaker 3: can utilize those, right, like how much further we can 455 00:23:45,160 --> 00:23:47,879 Speaker 3: really press them. I think that's really true.
And so 456 00:23:47,920 --> 00:23:50,480 Speaker 3: I think what will be interesting to watch there is 457 00:23:50,480 --> 00:23:53,679 Speaker 3: what that means for the open ayes and the anthropics 458 00:23:53,720 --> 00:23:56,040 Speaker 3: of the world right, and what it means potentially for 459 00:23:56,160 --> 00:23:58,520 Speaker 3: new companies that might come in to play, like what 460 00:23:58,600 --> 00:24:01,040 Speaker 3: Yan is doing now, and whether they can sort of 461 00:24:01,080 --> 00:24:04,639 Speaker 3: compete and make you know, a major leap forward with 462 00:24:04,680 --> 00:24:05,040 Speaker 3: the tech. 463 00:24:06,040 --> 00:24:08,119 Speaker 1: And you mentioned this China issue. 464 00:24:08,520 --> 00:24:12,440 Speaker 2: What do you think American audiences or readers are missing 465 00:24:12,520 --> 00:24:15,120 Speaker 2: about the China tech story, like your what was your 466 00:24:15,200 --> 00:24:18,120 Speaker 2: hope in terms of publishing those twenty three ways? We're 467 00:24:18,119 --> 00:24:19,640 Speaker 2: already living in a Chinese century. 468 00:24:20,480 --> 00:24:24,240 Speaker 3: Look. I mean I grew up in Canada and looked 469 00:24:24,320 --> 00:24:26,520 Speaker 3: to the United States with a great deal of reverence 470 00:24:26,560 --> 00:24:29,040 Speaker 3: and aspiration. I think it's it's an aspirational place for 471 00:24:29,119 --> 00:24:34,720 Speaker 3: so many people. I think though, that very often in 472 00:24:34,760 --> 00:24:39,080 Speaker 3: the United States we tend to look inward, and we 473 00:24:39,200 --> 00:24:42,040 Speaker 3: don't tend to look past our own borders. Often enough, 474 00:24:42,040 --> 00:24:47,600 Speaker 3: we don't appreciate or acknowledge the innovation and the technological 475 00:24:47,640 --> 00:24:49,320 Speaker 3: progress happening in other parts of the world. I mean, 476 00:24:49,359 --> 00:24:51,359 Speaker 3: even sitting here with you in Doha, right, there's no 477 00:24:51,440 --> 00:24:54,080 Speaker 3: denying there is a ton of money and a ton 478 00:24:54,119 --> 00:24:57,200 Speaker 3: of research and a ton of really interesting technological progress 479 00:24:57,200 --> 00:25:00,159 Speaker 3: happening in this part of the world, let alone China, 480 00:25:00,240 --> 00:25:03,440 Speaker 3: that American audiences, I would argue, don't spend enough time 481 00:25:03,520 --> 00:25:06,120 Speaker 3: engaging with. And so I think, you know, we really 482 00:25:06,160 --> 00:25:08,600 Speaker 3: wanted to be a little bit cheeky and maybe a 483 00:25:08,640 --> 00:25:12,199 Speaker 3: little bit clever with that issue by saying, we have 484 00:25:12,280 --> 00:25:14,320 Speaker 3: a president in the United States right now, who is 485 00:25:14,400 --> 00:25:16,680 Speaker 3: you know, promising to make this the best country in 486 00:25:16,720 --> 00:25:20,320 Speaker 3: the world, make America great again. And I would argue 487 00:25:20,320 --> 00:25:22,639 Speaker 3: that in many ways that is not exactly what we 488 00:25:22,680 --> 00:25:26,800 Speaker 3: are seeing. 
And I would urge American audiences to look elsewhere, 489 00:25:26,880 --> 00:25:29,199 Speaker 3: to look to China, to look to, you know, the 490 00:25:29,240 --> 00:25:31,760 Speaker 3: Middle East, to look to other parts of the world 491 00:25:32,320 --> 00:25:35,560 Speaker 3: and acknowledge and grapple with the fact that there's a 492 00:25:35,600 --> 00:25:39,479 Speaker 3: lot of really interesting, profound stuff happening, and it just 493 00:25:39,520 --> 00:25:42,159 Speaker 3: so happens that it's not happening in the United States. 494 00:25:42,160 --> 00:25:44,040 Speaker 3: And I think we all need to 495 00:25:44,080 --> 00:25:46,840 Speaker 3: really think very hard about what that means for the 496 00:25:46,840 --> 00:25:48,800 Speaker 3: future of the country that we all live in. 497 00:25:49,320 --> 00:25:53,880 Speaker 2: It's interesting, this nine-nine-six concept, like nine 498 00:25:53,920 --> 00:25:56,000 Speaker 2: hours a day, six days a week, which was 499 00:25:56,080 --> 00:25:58,280 Speaker 2: kind of a little bit of a stick to beat 500 00:25:58,359 --> 00:26:00,960 Speaker 2: China with, in terms of, oh, you know, they're only 501 00:26:00,960 --> 00:26:04,120 Speaker 2: obsessed with work, and they haven't, you know, left room 502 00:26:04,160 --> 00:26:07,320 Speaker 2: for personal self realization the same way that we have 503 00:26:07,400 --> 00:26:10,160 Speaker 2: in the US. Five years ago that was a real 504 00:26:10,200 --> 00:26:13,200 Speaker 2: criticism of China, and now in the US tech industry 505 00:26:13,240 --> 00:26:15,199 Speaker 2: it's almost like an aspiration. You have this idea of, 506 00:26:15,240 --> 00:26:18,440 Speaker 2: like, cracked engineers, and basically learning as much as possible 507 00:26:18,480 --> 00:26:21,920 Speaker 2: from Chinese work culture amongst the Silicon Valley youth. I 508 00:26:22,000 --> 00:26:25,560 Speaker 2: just find it such an interesting inversion. Wired was born 509 00:26:25,640 --> 00:26:29,639 Speaker 2: in the early nineties, when today's tech overlords were in 510 00:26:29,680 --> 00:26:33,120 Speaker 2: their twenties, essentially. Yeah, what's the culture of the new 511 00:26:33,800 --> 00:26:37,520 Speaker 2: eighteen to twenty fives in Silicon Valley? What are you 512 00:26:37,600 --> 00:26:39,880 Speaker 2: seeing about them and how they're relating to the world 513 00:26:39,920 --> 00:26:40,760 Speaker 2: that really grabs you? 514 00:26:41,200 --> 00:26:43,240 Speaker 3: Well, it's funny, because I feel like nine-nine-six 515 00:26:43,320 --> 00:26:47,240 Speaker 3: was really always the way people in startup world worked. 516 00:26:47,280 --> 00:26:49,840 Speaker 3: I mean, we just took a name 517 00:26:49,880 --> 00:26:52,640 Speaker 3: from China and sort of applied it to our own industry, 518 00:26:52,640 --> 00:26:56,200 Speaker 3: which I think speaks to China's sort of outsize cultural influence. 519 00:26:56,600 --> 00:26:58,720 Speaker 3: I mean, I think the most interesting thing to me 520 00:26:58,920 --> 00:27:02,120 Speaker 3: about sort of young tech culture and startup culture now 521 00:27:02,800 --> 00:27:06,840 Speaker 3: is how completely consumed it is with AI.
And I 522 00:27:06,840 --> 00:27:10,399 Speaker 3: think that the smartest, most strategic founders today in that 523 00:27:10,480 --> 00:27:15,840 Speaker 3: cohort are using AI in ways that their predecessors twenty 524 00:27:15,920 --> 00:27:19,280 Speaker 3: years ago could only imagine. I mean, they are building 525 00:27:19,320 --> 00:27:21,640 Speaker 3: and running their entire companies. It's not just that they 526 00:27:21,680 --> 00:27:24,119 Speaker 3: have an idea for a company that they can build 527 00:27:24,600 --> 00:27:27,800 Speaker 3: based on AI. They're using AI to build the entire 528 00:27:27,840 --> 00:27:30,200 Speaker 3: company, right, and they're doing it at a very accelerated pace. 529 00:27:30,280 --> 00:27:33,919 Speaker 3: So that to me is fascinating, and I am fascinated 530 00:27:33,920 --> 00:27:36,840 Speaker 3: to see in ten or twenty years what that yields, 531 00:27:36,880 --> 00:27:40,160 Speaker 3: because the barrier to entry for a startup founder now 532 00:27:40,600 --> 00:27:43,480 Speaker 3: is drastically lower than it was twenty years ago. I mean, 533 00:27:43,480 --> 00:27:45,320 Speaker 3: if you have access to an LLM, if you have 534 00:27:45,359 --> 00:27:49,439 Speaker 3: access to gen AI, you can get a lot done in 535 00:27:49,600 --> 00:27:52,360 Speaker 3: a day, let alone six days, nine hours a day, 536 00:27:52,760 --> 00:27:54,360 Speaker 3: compared to twenty or thirty years ago. 537 00:27:54,680 --> 00:27:56,399 Speaker 1: Maybe the unicorn of one will happen. 538 00:27:56,680 --> 00:28:01,159 Speaker 2: Maybe. Katie, just to close: we're here in Doha. Wired has 539 00:28:01,240 --> 00:28:03,439 Speaker 2: recently integrated Wired Middle East. 540 00:28:03,880 --> 00:28:05,920 Speaker 1: What are you hoping for from that team? 541 00:28:05,960 --> 00:28:08,400 Speaker 2: I mean, obviously some of the stories are around data centers, 542 00:28:09,000 --> 00:28:13,119 Speaker 2: some of them are around investment in US companies, some 543 00:28:13,200 --> 00:28:16,520 Speaker 2: of them around regional competition. What do you think the 544 00:28:16,600 --> 00:28:18,600 Speaker 2: key storyline coming out of Wired Middle East in 545 00:28:18,640 --> 00:28:20,040 Speaker 2: the next year will be? 546 00:28:20,080 --> 00:28:22,560 Speaker 3: Oh wow, I mean, there's so many. And so, we're launching Wired 547 00:28:22,600 --> 00:28:25,120 Speaker 3: Middle East in a couple weeks, in mid February. It's 548 00:28:25,160 --> 00:28:28,359 Speaker 3: an amazing team. I mean, we operate Wired in a 549 00:28:28,359 --> 00:28:30,560 Speaker 3: lot of markets around the world. I would say there 550 00:28:30,640 --> 00:28:32,320 Speaker 3: is no place in the world I'm more excited for 551 00:28:32,359 --> 00:28:34,760 Speaker 3: Wired to be showing up right now than here. And 552 00:28:34,840 --> 00:28:37,359 Speaker 3: it is because the Middle East is a 553 00:28:37,520 --> 00:28:41,479 Speaker 3: hub of so much change and so much transformation, and 554 00:28:41,520 --> 00:28:44,360 Speaker 3: I am much less interested, I would say, in how 555 00:28:44,960 --> 00:28:47,280 Speaker 3: this part of the world interacts with the United States 556 00:28:47,280 --> 00:28:49,040 Speaker 3: than I am in how this part of the world 557 00:28:49,080 --> 00:28:51,640 Speaker 3: stands on its own two feet, right.
What is happening 558 00:28:51,680 --> 00:28:55,000 Speaker 3: here with regards to science, with regards to technology, with 559 00:28:55,080 --> 00:28:57,840 Speaker 3: regards to digital culture, the way people live their lives 560 00:28:58,520 --> 00:29:03,400 Speaker 3: in a connected way, I think is deserving of more 561 00:29:03,440 --> 00:29:06,880 Speaker 3: attention on a global stage. And I think this team 562 00:29:07,120 --> 00:29:10,360 Speaker 3: will bring that same skepticism, but that same sort of 563 00:29:10,480 --> 00:29:14,520 Speaker 3: enthusiasm and rigor and those same journalistic values that we 564 00:29:14,560 --> 00:29:16,520 Speaker 3: do in every other Wired that we operate and in the 565 00:29:16,600 --> 00:29:18,360 Speaker 3: United States, and they will bring that to bear on 566 00:29:18,400 --> 00:29:21,000 Speaker 3: what's happening here. And I can't wait to see what 567 00:29:21,040 --> 00:29:22,600 Speaker 3: they're going to do. I know that they plan to 568 00:29:22,680 --> 00:29:24,480 Speaker 3: go at an eleven out of ten, so I'm just 569 00:29:24,520 --> 00:29:25,480 Speaker 3: going to sit back and watch. 570 00:29:25,880 --> 00:29:29,320 Speaker 2: Katie Drummond, global editorial director of Wired, thank you. Thank you. 571 00:29:56,480 --> 00:29:59,240 Speaker 2: That's it for this week for TechStuff. I'm Oz Woloshyn. 572 00:29:59,680 --> 00:30:02,720 Speaker 2: This episode was produced by Eliza Dennis and Melissa Slaughter. 573 00:30:03,280 --> 00:30:04,520 Speaker 1: It was executive 574 00:30:04,040 --> 00:30:07,600 Speaker 2: produced by me, Karah Preiss, Julia Nutter, and Kate Osborne 575 00:30:07,640 --> 00:30:12,400 Speaker 2: for Kaleidoscope and Katrina Norvell for iHeart Podcasts. Jack Insley 576 00:30:12,400 --> 00:30:15,120 Speaker 2: mixed this episode, and Kyle Murdock wrote our theme song. 577 00:30:15,760 --> 00:30:18,360 Speaker 2: Please do rate and review the show wherever you listen 578 00:30:18,360 --> 00:30:26,480 Speaker 2: to your podcasts.