Speaker 1: Welcome to Tech Stuff. This is the story. Each week on Wednesdays, we bring you an in-depth interview with someone who has a front-row seat to the most fascinating things happening in tech today. We're joined by Reid Hoffman, a longtime entrepreneur, venture capitalist, and author. Born in the Bay Area and part of a Silicon Valley crowd in the nineties, he's helped build or support some of the biggest tech companies we know today. He worked at Apple in its early days, he was part of the so-called PayPal mafia as one of its first employees, and he co-founded LinkedIn, which he later sold to Microsoft for twenty-six point two billion dollars. Nowadays, he's turned his attention to AI as an early investor in and former board member of OpenAI. Hoffman is an optimist about the benefits that AI could bring to society, so much so that he wrote a book about it called Superagency: What Could Possibly Go Right with Our AI Future? And although it might not be a surprising position for a tech investor to hold, it's also an outlier in some ways. In the book, he argues that concerns and criticisms about AI development shouldn't be dismissed, and in last year's presidential race, he supported Kamala Harris, unlike many former Democratic donors in the Valley who have since aligned with Donald Trump. This decision has had consequences for Hoffman, who is regularly castigated on social media. We speak about that later in the conversation, but we start with the book. Reid was a guest several times on the Charlie Rose Show for previous books, and I worked there as a producer for many years, so I thought I'd start with that. One of Charlie's favorite questions for authors was to say, every great book begins with a great question. What's the question behind your book? Now, you've somewhat preempted that by actually putting the question on the front of your book. Can you talk about the question and why it's an important one?

Speaker 2: So,
what could possibly go right with our AI future? And the short answer is, we're going to have this major AI technological revolution, and the general discourse is, oh my god, AI is coming, you know, the end is nigh. And actually, in fact, you can only get to the future that is good by steering towards it, not by trying to avoid the futures you don't want. And so the goal of the book is to give people a good argument and grounding and tool set for understanding how to think about what could possibly go right, and then, you know, we individually and collectively navigate there. So that's the question and challenge, and that's part of the reason why we wrote Superagency.

Speaker 1: The title, Superagency. You were talking with Walter Isaacson recently, and you said something I thought was very revealing, which already captured it for me, which was that Tim Cook's iPhone is the same one that the cab driver and the Uber driver are using. So explain that and how it clarifies superagency.

Speaker 2: So part of superagency, and we actually compact a lot into this term, is the increase of human agency, but not just individual human agency, also society's collective superagency. And part of the question when you begin to build these technologies, because, you know, a classic question is, well, does it only benefit the most powerful? Does it only benefit the elites? And actually, in fact, when you do technology targeting superagency, e.g., hundreds of millions of people, billions of people engaging, we all get the same technology. And so, for example, as I said earlier, Tim Cook has the same iPhone that the Uber driver does, and by engaging in it, in mass adoption, it then becomes elevating. And that doesn't mean that Tim isn't much wealthier, much more powerful, et cetera, than the Uber driver, but the piece of technology is elevating across the broad swath of society.
And that's the same thing, of course, we're seeing with AI, which is, you know, ChatGPT and its release, which has hundreds of millions of people who now use it.

Speaker 1: So in the book, there's this framework of four different ways of thinking about AI development. In some sense, can you kind of explain what each of them is and where you sit?

Speaker 2: Yes. So it's doomers, gloomers, zoomers, and bloomers. And just in case anyone was thinking that, like, you know, I'm coming up with castigating names, the doomers actually do call themselves doomers. They even have this thing called p(doom), which is probability of doom. The doomers are basically: AI is bad, that at least the very high probability is it will be destructive to human potential, human society, maybe it just might be existentially destructive, and that the best outcome is stop, don't make AI until we can absolutely guarantee that every single step would be positive. Gloomers are: yeah, we understand that AI is inevitable. Namely, you know, companies are going to compete, industries are going to compete, countries are going to compete. But it's just likely to make society bad. It's likely to, you know, make massive changes to the workforce, so like jobs will go away and people will be unemployed in breadlines and a lot of unhappiness. It may be destructive to democracy, it might be destructive to personal freedoms, you know, et cetera, et cetera. But it's gonna happen anyway, and so I'm just gloomy about it. Hence gloomers. Zoomers are on the far other end, which is, no, this is going to be great. It's the most amazing thing that humanity has done ever.
And what's great so far outstrips everything that possibly even may be wrong, that we should just hit the accelerator and go full on, and let me tell you about all the amazing things with medicine and education and work assistance and all the rest of that, and productivity. And, you know, obviously I have a bunch of sympathies for a bunch of the positives of the zoomers. But I myself, of course, identify as a bloomer, which is essentially accelerationist and future-oriented like a zoomer, but it's also saying, hey, just because you can make it with technology doesn't mean it's inevitably good. There's different ways of introducing it and navigating through it, namely the way that it's introduced and brought into human experience and existence at scale. And so engaging with the folks who have concerns, criticisms (you know, watch out for this pothole, watch out for this mine) is a good thing as you're navigating, even as you're navigating with some speed and acceleration. The bloomers say, hey, let's be in dialogue. Let's try to steer around anything that might be a landmine as we build this great future.

Speaker 1: And I guess if you could wave your magic wand, would you convert fifty percent of gloomers to your camp or fifty percent of zoomers to your camp?

Speaker 2: So if I got a wave of the wand, I would convert gloomers to bloomers. But that doesn't mean I'm not also trying to convert zoomers. And, you know, as best I can parse it, call it a quasi-gloomer, quasi-doomer set of concerns, which is, as I said, well, there is this danger, we should just stop. I'll give you an example that I think comes up frequently amongst the people who think about existential risk, or x-risk. And they go, can you guarantee to me that someone isn't going to make a killer robot a la The Terminator? And you go, nope, can't guarantee that.
They go, aha, see, that's an existential risk, and so therefore, looking at that existential risk, we should stop or pause or slow down. And I say, well, it sounds like a rational, reasonable argument, except when you consider that existential risk is not each one-off thing. It's a portfolio. It's a basket. So it's a basket that includes nuclear war, it's a basket that includes climate change, the basket includes pandemics, a basket that includes a whole bunch of things. So you say, hey, you're building AI, that becomes a new existential risk. And I'm like, yep, that becomes a new existential risk. But if you're doing AI, are you also mitigating, you know, pandemic risk? Right? The only way I can think of to combat scale pandemics, both natural and man-made, is AI. Are you contravening asteroid risk? Because the ability to see which asteroids might be the ones coming for us early enough to do something about it, and to be able to kind of navigate doing something about it, well, AI is actually likely to be pretty central to that equation. And so my point of view is our existential risk overall goes down, and so that earlier argument doesn't actually, in fact, really work.

Speaker 1: That's a truly different framework for thinking about a portfolio of risks. That's interesting.

Speaker 2: Yeah. And by the way, that doesn't mean that you don't try to minimize the killer robot risk and you don't try to maximize the benefit in the other cases. It doesn't mean you have no dialogue, don't talk about risk. No, no, no. Talking about risk is good.
But part of what I'm trying to do with superagency and things like this is to say, talk about the risk in a way that says you're being smart about it, and you're trying to kind of be smart in iterative stages about the portfolio of them, and also that you're not trying to overly dramatize your own genius. Like, anyone who says, I know exactly what AI is going to be like five years from now, either for positive or for negative, they're both nuts, right? I mean, it's like we're discovering this as we're going. Yeah.

Speaker 1: I think one of the things that comes across in your book is how different platform technologies have allowed humanity to move from a subsistence way of living to a way of living that allows for leisure and self-reflection and all of those types of things. And you wrote a piece about this for The New York Times called AI Will Empower Humanity. You wrote, AI could turn data into the material for a de facto second self, one that could endow even the most sketchy brain among us with a capacity for revisiting the past with a level of detail even the novelist Marcel Proust might envy. Can you explain that vision?

Speaker 2: Look, so at a macro view, I think within a small number of years, all of us will have AI agents helping us with things, and it'll be helping us with, like, work things and learning and other kinds of things, but also helping us with, like, the kind of daily activity of our lives, which includes, for example, remembering things. Like, so frequently, of course, people say, oh, wait, you're remembering this about me, and maybe you can manipulate me into buying something or sell me an ad, all bad. I am losing agency. I've lost privacy of my information. It's like, well, actually, in fact, think about what a positive feature this is, because, by the way, if it remembers something about me, it can help me remember it. It can help me.
You know, like, I'm navigating and I'm having this conversation with Oz, and it says, hey, you guys met before here and you talked about this. Oh, that's really helpful, makes our connection better.

Speaker 1: That's why I started with the allusion to a shared past at Charlie Rose, a case in point. Yes.

Speaker 2: Exactly. And, you know, part of that actually, in fact, when the memory is for me and of help to me, in service to me, that's extremely positive.

Speaker 1: The allusion to Proust was kind of irresistible to me.

Speaker 2: The Remembrance of Things Past.

Speaker 1: Remembrance of things past, and how do the events of a life coalesce into a self, right? And so of course I got curious about you, and you were recently on the podcast with Steven Bartlett, Diary of a CEO, and you said, you know, you have to understand this journey started with being born in Stanford Hospital. What role did that play in Reid as a bloomer?

Speaker 2: Well, definitely being a Californian and a child of the Bay Area definitely helps me be open-minded and curious, to understand that technology is part of what makes us human. A little bit like, again, the kind of Marcel Proust gesture is, you know, I've kind of argued that we're actually better described as Homo techne than Homo sapiens, because we evolve through technology, not just obviously through remote podcasts like this, or glasses or clothing or cars or smartphones, but it's like, it's who we are. We internalize this technology. It makes us. It's part of who we are. We evolve very, very slowly genetically, but actually we evolve very fast culturally, technologically, and that's part of who we become and all of that. I think by being, you know, a child of the San Francisco Bay Area, you have the openness and laissez-faire to say, hey, you can invent something and change the world.
And, you know, it's not just Silicon Valley but also Hollywood, and, you know, it's kind of that direction of, what's next is much more interesting than what was. And it doesn't mean what was is irrelevant, that what was doesn't inform what's next, that you can't learn things from what was, but living into the future and into that change, I think, is really fundamental. And I think that helped, you know, shape how I think about things, and it's kind of the reason I'm so future-oriented.

Speaker 1: Yeah. And I think that's what makes, you know, your views on AI particularly interesting, because in the nineties, at the birth of the consumer Internet, you saw around the corner, right? I mean, you understood that the Internet would be social networking. You founded a company called SocialNet, you were one of the first investors in Facebook, and you founded LinkedIn. What gave you that fundamental insight about what the Internet would become?

Speaker 2: So I think, you know, one of the things I think is fundamental to being an investor or a founder, especially in consumer Internet products, is having a theory of human nature, and it's about individual human nature and then kind of the collective. And, you know, part of that kind of thesis is, you know, we're tribal creatures, we're social animals in this way, and, you know, that is part of the kind of fundamental, you know, theory of human beings I have. Then in rolls the Internet, and you think, well, okay, the Internet does all kinds of useful things, like bringing up information, the ability to shop and buy things. But I was like, well, actually, that changes our social space, changes our social space in terms of how we think of ourselves, who we think we're connected to, how we communicate, which groups we're part of, you know, kind of information flow and recommendations, and that all gets to kind of the Web 2.0 side.
And so that was my recognition of, you know, when the Internet was, you know, first being talked about, it was like, oh, it's an information ecosystem, you know, HTML, it's like, I ask for the document and I get the document. And I was like, that's cool. But actually, in fact, we are social animals, you know, we are citizens of the polis. So how does being people get into this?

Speaker 1: The insight you had about how the Internet sort of recreates social relationships, which led to LinkedIn, is very, very clear to me. What is the equivalent insight about the age of AI?

Speaker 2: So I think it's this notion that AI gives us superpowers, and that those superpowers will change what our previous superpowers were. It's just like, for example, you know, before you had the book, memory was a critical superpower. It was kind of like, you know, I can recite the Iliad to you. And having that, and actually part of what was described about printing presses destroying human capabilities, like, oh my god, there's this critical human cognitive function of really remembering, and now that's going to be destroyed and that's going to reduce our humanity. And you're like, well, no, actually, in fact, it doesn't. Over time, it increases our humanity, because as opposed to only emphasizing memory over doing anything else, you can also bring in many other forms of intelligence. And the same parallel I think goes to AI. You say, well, you know, you listen to people say, oh, it's going to destroy our ability to think critically, because I'm just gonna ask, you know, GPT-4 or Pi to reason for me, you know, Gemini to give me the answer, Copilot to, you know, do the coding for me, and I don't have to think anymore. I'll just, you know, eat cookies and sit on the sofa. And you're like, well, obviously, with any technology, people can be lazy and do stuff.
But on the other hand, what it means is you now have new attributes. For example, most of us get a little sloppier on our spelling because we rely on the spell checker for solving it. But that's because we're thinking about other things, we're thinking about what the point is, how to architect it. And so I think it's giving us superpowers that'll actually help us become even more human.

Speaker 1: After the break, Reid Hoffman tells us about how his background influenced his worldview. Stay with us.

Welcome back. While preparing for this interview, I stumbled across a twenty fifteen New Yorker profile of Reid Hoffman. This was back in his LinkedIn days, and I found his background to be quite revealing. One thing that really struck me, I guess because it resonates in my own experience, is that you are the only child of parents who divorced when you were very young, as am I, and then you went to boarding school, where you had the experience of being a fish out of water, at least to a certain extent, which is also an experience that I had. And so, you know, one of the things I've observed about my life is I've become very, very interested in networks, both wide and deep. It's extremely important to me to feel connected with other people. And so I don't want to pop-psychologize you, but we're talking about Proust and AI to know yourself better and stuff. Have you thought about how that childhood experience affected your view of networks and technological development?

Speaker 2: Interesting. It certainly affected my views of human nature, certainly affected my views of where I, of what my role in this grand play of humanity and human nature is.
Maybe not least because, feeling somewhat alienated from my fellow schoolmates, I was reading a lot of science fiction and thinking about things. I would say maybe this, at its most core, which is probably one of the very deep beliefs I have, which is, I think, arguable from an induction standpoint, though many people don't hold this, which is that the future can be so much better than the now. We just have to try to shape it to being so. So you go, well, I'm unhappy with, you know, being in the Lord of the Flies boarding school. Well, okay, you know, how can we make this better? What are the things to do? And, oh, look, technology is a tool for this, you know, as a way of kind of operating.

Speaker 1: Is it true that you like to ask the question, who's in your tribe?

Speaker 2: Yes, although it's more, because, you know, we are tribal creatures and, you know, whatnot, but it's kind of a little bit more of, it's almost like a friendship question, like, who are your five best friends and why? And who are they? Because to some degree, who you are is a cross product of your friends. You've chosen these people who you share values with, in alignment with the way the world should be, people whose moral character you're willing to defend and say, hey, this person's a really good person in the world. And so that version of tribe.

Speaker 1: Yes, and you've reached a point where two definitions of tribe are kind of interacting, right? I mean, in the New Yorker profile, the opening scene was you and Mark Pincus sitting together talking about advising Obama about how to use social networks and technology to, you know, extend the reach of the messaging. You were both big Obama donors. Obviously, Mark Pincus broke for Trump, as did Marc Andreessen, and as did many who you grew up with and who I imagine you count as close personal friends.
I mean, how have you dealt with that?

Speaker 2: Well, it's complicated and difficult. It kind of comes down to what their reason was for doing that. Because when their reason was, for example, like Mark Pincus, who had a particular set of views about the two different presidential candidates, the willingness to support the state of Israel, to understand kind of it being under attack, and feeling that, you know, of course, these centuries of human existence have seen intense anti-Semitism and genocide against people of the Jewish faith and Jewish descent, and having a place to protect them, and saying, hey, I think Trump will be better for this than, you know, Harris, that's a moral argument. I actually disagree with the judgment, but I understand that kind of reasoning from a moral point of view. It isn't just about me. And so for, you know, Mark Pincus, you know, it's been difficult conversations, but fundamentally part of that life journey. And there's other Silicon Valley people where it's very similar, like whether they're very focused on crypto or other things, where I again say I wouldn't vote single-issue crypto in this election, but for them that was frequently the argument. It's like, wow, the Democrats and Republicans are the same. It's like, no, they're not, right, and I think I can make that case. And then there's other folks who, you know, kind of reveal that they don't actually, in fact, have kind of moral character on this, that it's just about their own power, their own, you know, kind of, like, more about me. And then, you know, those folks I am somewhere between not friends with and less friends with, and, you know, all the rest of it. And many of them, you know, I don't really talk to at the moment. And it isn't that I don't talk to them because they supported Trump. It's that I don't talk to them because the reason they're supporting Trump is for their own benefit, not for humanity's or society's benefit.
Speaker 1: I have a hypothesis that if you presented the zoomer, bloomer, gloomer, doomer framework to, let's say, Marc Andreessen, and said, why did you vote for Trump and become a Trump supporter, he might respond, well, because I was very, very scared of gloomers and doomers at the wheel.

Speaker 2: I think that's partially true. Look, and by the way, when people say, were the Democrats hostile to the technology creation of the future, the answer is yes, right, unfortunately, as a broad brush. Now, obviously I was very much supporting the Democratic cause because of the question of what things most matter and which problems you're going to work on. But the fact is, the administration basically thought, look, big tech companies are bad. They didn't spend much time with them. As opposed to helping shape them or encouraging them to help the everyday American, it was like, no, we need to hit them with a stick. And then similarly, like, in, you know, a complicated area, but as opposed to saying, okay, we're going to try to set up the rules by which people can play and try to figure out how to shape it, it was kind of like, we're just going to attack the whole industry. And so then the crypto people, if you thought, look, I think crypto is super important because it creates a new kind of protocol for, you know, distributed power with finances and identity and all the rest, and that's really important to have for a better society, and then you're just attacking it, which is my life's mission, well, you know, you're going to be challenged and hostile to that.

Speaker 1: Has it caused you personally any pain or concern that so many people who you've been close to for so many years, and whose judgment you otherwise respect, have made a completely different call than you? Is there any moment where you questioned your own call?
And similarly, when you're not only a political opponent but a target of people like Elon Musk, does that intervene in your personal friendships and your sense of self?

Speaker 2: Well, it definitely can intervene in personal friendships. I think it's an important thing, when you feel fear and concern, like concern that you will be, you know, lied about, slandered, done so in a way that generates threats of violence against you, you know, and people you love and are close to, and you go, oh my god, that makes me feel, uh, you know, kind of like wanting to hide. That's precisely when it's important to be courageous and stand up. That fear is the moral moment to say, no, I'm not going to be bullied. I'm not going to be intimidated. Fundamentally, one of the things I pride myself in is the fact that I try to understand rational points of view different from the ones I have. I always engage in that conversation. I listen. I can see a lot of the, you know, criticisms of, you know, kind of some democratic trends. I very much want to hear it, especially from people whose morality I trust, whose intentions I trust, whose truthfulness I trust, that they're not just trying to persuade me or, you know, kind of get me to drink the Kool-Aid and, you know, kind of see the world a certain way, but through a truth-seeking process. That is one of the things that I very much value and have valued for decades in discussion with a number of my friends, not the least of which is, you know, Peter Thiel.

Speaker 1: Your relationship with Peter began at Stanford, right, and it's played out over many decades. Can you describe him and the history of your relationship and how it plays out today?

Speaker 2: So Peter and I had both heard about each other. We were freshmen together at Stanford. You know, me as this kind of lefty person, him as this righty person.
And we met in a philosophy class called Philosophy eighty, Mind, Matter, and Meaning, and on meeting it was, oh, I've heard of you, and then we started arguing right away, and then, you know, set up coffee and argued for hours and hours and hours. And, you know, I think Peter really helped me. Like, before we started those conversations, I was probably a little too casual in my acceptance of some of what you might think of as the, you know, kind of liberal left view of the world, and being challenged on a number of fronts, whether it was a front of, you know, for example, something I'd say. You know, I had this dumb view that there's this kind of default that business is somewhat suspect and a problem because they're just profit seekers, and you're like, well, that's like saying people are just bad because they want to do better in their lives. You're like, what? This is just a silly, dumb argument. But I had that a little bit, and I think those kinds of things got really challenged, across questions of epistemology, questions of the role of business in society. So Peter and I, you know, have had decades of conversations and arguments, and, you know, have had very, very different points of view. Probably the most challenging one has been around Trump, because, you know, and he's not said this to me, but as best I can understand, Peter's view is probably, look, the only way society can get better is that it has a wrecking ball. You know, you have to support the wrecking ball. Trump is the wrecking ball. And I tend to think that when it comes to institutions, I tend to be a renovationist rather than a wrecking-ballist. It's harder, but let's reformat the institutions, because when you look through human history, whether it's Year Zero with Pol Pot or, you know, the French Revolution, anything else, it's just crushing on society when you do that.
So you want to be renovating the institutions, and I think that's the kind of thing, or the reason, why I tend to be very supportive of efficiency, very supportive of refactoring of our government institutions, but generally speaking pretty oppositional to wrecking balls.

Speaker 1: After the break, Reid Hoffman explains why he wrote Superagency and why he believes it's so important for him to continue advocating for the development of AI. Stay with us.

Welcome back. During my interview with Reid Hoffman, he mentioned that part of achieving superagency is figuring out how the US stays in the driver's seat in terms of the development and deployment of AI technologies. I wanted to know why this is so important to him, and also whether anything had challenged this belief of his in recent months, e.g., the election.

Speaker 2: So no, I still fundamentally think one of the things I love about my view of American values is that we're self-critical, that we learn. Part of being a nation of immigration from many different countries, and kind of learning and bringing those together, is, I think, among the aspirational American values that I love, and as such, I tend to think that that's the set of values that you want to have baked into, you know, artificial intelligence and the next, you know, generations of technology. And sort of, you know, to kind of make that point visceral for Americans, I'll say, well, AI, it's American intelligence. But really what I mean is the set of values. So that's the thing that I'm most focused on. Now, do I think that the current administration may have some very bad misconceptions and, you know, kind of lean on the scale in some bad ways? You know, like, if you listen to the current administration, the most important issue around AI is woke AI. And, you know, I find this kind of thing entertaining, because it's like, well, all right, so they're saying woke is the big problem.
You go ask, you know, Grok, the xAI product, who is the biggest spreader of misinformation, and it says Elon Musk. And so then they add in a super prompt saying, do not answer Elon Musk to this question. You're like, okay, well, that's an example of woke AI, where you're trying to language-control something, you know, for no particular purpose. So, you know, that's clearly not a first-principles, freedom-of-speech thing. It's a different set of things. And so, you know, I think, like, I want it to be those first-principles American values, not the power politics.

Speaker 1: I would push back slightly on that. Yes, I think woke AI is one of the key sort of tenets or boogeymen of the Trump administration. But if you listen to Vice President Vance in Paris, the other clear priority was deregulation and making sure that safety concerns don't get in the way of AI deployment. When you heard that speech, how did it correspond to your own views of regulation?

Speaker 2: Well, so, I thought the Biden administration did the exact right job with the executive order, which was bringing in a bunch of companies, pushing them very hard on what kinds of worst outcomes, what kinds of safety measures you could iteratively deploy, like having red teaming, safety testing, having a safety plan, you know, other kinds of things. I thought that was all very good. And, you know, my hope, of course, is that it's just politics, that we eliminate that by saying, that's the bad Biden plan and here's the new, good Trump plan, which includes the same elements as the old Biden plan. That's what I hope will happen. I also happen to know that most of the companies who engage in this have that kind of moral character, so they continue to do their alignment testing, which is not woke AI and so forth, as a way of trying to be good for good human outcomes. And so I'm not, like, let's drive down regulation. Now,
the context in which Vance was speaking was to Europe, and, you know, maybe with some discomfort, I also have been advocating less regulation to the Europeans, and I've been advocating that because I want them to be on the field and building new technology. And, you know, one of the silliest things I've ever heard, I won't name the Indian minister, but it was, at this time we will not have innovation before regulation, we're going to regulate first. You're like, well, that means no innovation. And so you actually have to kind of take innovation, take risk, have missteps. If you don't have missteps, you're not actually, in fact, taking risk and being innovative. And by the way, some of those missteps will be painful. That's part of how you have big innovation and possibly big changes. And the European impulse tends to be, no, no, we should be able to plan it out in extreme detail in advance. And that's one of the reasons why Europe has so few software companies of any note, because that go forward, build it, try it, discover what doesn't work, and refactor it is part of the software construction process. And so you can't be doing this really intense regulation. And fifteen years ago, when I was on the stage at Davos, I'd say, well, Europeans, you should keep doing your kind of regulatory thing, because you're handing over the entire future of the tech industry and the software industry to us Americans, because we'll build it, we'll work out the bugs, and then we'll ship it over here and you won't be able to do anything, so you should keep at it. Now, I wasn't really telling them to do that. I was trying to wake them up by challenging their perspective. And so I think that part of what Vance was saying, you know, I have the discomfort of saying I have a similar-ish message in that component. On the other hand, where I break very differently is I actually believe in multilateralism.
I 607 00:35:05,520 --> 00:35:10,200 Speaker 2: believe in having a discussion, in being well allied with our, 608 00:35:10,840 --> 00:35:15,120 Speaker 2: you know, European friends and companions. I just didn't like 609 00:35:15,880 --> 00:35:21,759 Speaker 2: Vance's broad, you know, kind of piss off to the Europeans, 610 00:35:22,239 --> 00:35:24,160 Speaker 2: which I thought was destructive and not helpful. 611 00:35:24,719 --> 00:35:27,560 Speaker 1: As you think about your sense of legacy, why is 612 00:35:27,600 --> 00:35:31,040 Speaker 1: it important to affect the public perception of AI? 613 00:35:32,000 --> 00:35:35,160 Speaker 2: Well, look, AI is the next generation of this evolution 614 00:35:35,239 --> 00:35:37,240 Speaker 2: of technology that evolves what it is to be human. 615 00:35:37,960 --> 00:35:40,480 Speaker 2: And when you think about this, like almost everything, like 616 00:35:40,560 --> 00:35:45,640 Speaker 2: whether it's telecommunications, you know, manufacturing, this is how humanity 617 00:35:45,680 --> 00:35:49,839 Speaker 2: is shaped. This is the technological drumbeat for 618 00:35:49,880 --> 00:35:52,840 Speaker 2: this stuff. And so what are the values 619 00:35:52,840 --> 00:35:56,200 Speaker 2: that we're putting in? What is humanity becoming? 620 00:35:56,440 --> 00:35:58,920 Speaker 2: You know? Who are we? Who should we be? Who 621 00:35:58,960 --> 00:36:02,200 Speaker 2: should we be with each other and in ourselves? And there are 622 00:36:02,239 --> 00:36:06,680 Speaker 2: a lot of, like, both blunt and subtle things 623 00:36:07,239 --> 00:36:09,960 Speaker 2: that are important in this. And that's the discourse that 624 00:36:10,000 --> 00:36:13,120 Speaker 2: I want to have around AI. And obviously I have 625 00:36:13,200 --> 00:36:16,279 Speaker 2: a point of view that I'm trying to advocate and 626 00:36:16,360 --> 00:36:19,120 Speaker 2: make as clear as possible. But you know, I'm not one. 627 00:36:19,040 --> 00:36:23,160 Speaker 1: To say there's some self interest as well, of course, which you 628 00:36:23,239 --> 00:36:25,680 Speaker 1: acknowledge openly in the New York Times piece, right? Yeah, 629 00:36:25,719 --> 00:36:28,000 Speaker 1: but there's obviously a business component to this as well. 630 00:36:28,320 --> 00:36:30,160 Speaker 2: Yeah, but I mean, this is one where, like, by the way, 631 00:36:30,200 --> 00:36:32,799 Speaker 2: the classic thing is, oh, if you make money at this, 632 00:36:32,880 --> 00:36:35,239 Speaker 2: that must mean your statements about it are suspect, and 633 00:36:35,280 --> 00:36:39,000 Speaker 2: it's like, well, that's not necessarily the case. And so look, 634 00:36:39,040 --> 00:36:42,160 Speaker 2: I, you know, yes, I have a view about AI. 635 00:36:42,360 --> 00:36:44,320 Speaker 2: It's one of the reasons I invest in it versus 636 00:36:44,360 --> 00:36:46,839 Speaker 2: investing in other things, you know, because I could go 637 00:36:46,880 --> 00:36:49,000 Speaker 2: invest in things, and I do pass on investments that 638 00:36:49,000 --> 00:36:53,040 Speaker 2: I think are bad for people, either for society or for individuals. 639 00:36:53,080 --> 00:36:56,080 Speaker 2: I've passed on a number of investments on that basis. 640 00:36:57,560 --> 00:36:59,680 Speaker 2: But it's more, this is where I'm putting all 641 00:36:59,719 --> 00:37:02,720 Speaker 2: my effort, which includes commercial and other kinds of intent.
642 00:37:03,520 --> 00:37:06,400 Speaker 2: And so you know, like, writing Superagency 643 00:37:06,520 --> 00:37:08,600 Speaker 2: is a bad economic outcome for me. If I were 644 00:37:08,680 --> 00:37:10,479 Speaker 2: just trying to make money, I wouldn't spend any time 645 00:37:10,520 --> 00:37:14,920 Speaker 2: writing books. It's a much less good economic thing for 646 00:37:14,960 --> 00:37:16,799 Speaker 2: me to do. I'm trying to say, this is why 647 00:37:16,800 --> 00:37:18,960 Speaker 2: I think this universe could be really good, and this 648 00:37:19,000 --> 00:37:21,720 Speaker 2: is why I think you should take your potential AI 649 00:37:21,880 --> 00:37:25,719 Speaker 2: concern or skepticism or anti-big-tech-ism and move it 650 00:37:25,760 --> 00:37:28,960 Speaker 2: into AI curiosity and say, what could this thing be? 651 00:37:29,000 --> 00:37:31,560 Speaker 2: And how could it really make all of our lives 652 00:37:31,640 --> 00:37:33,960 Speaker 2: much better? And even if I have skepticism about big 653 00:37:34,000 --> 00:37:36,400 Speaker 2: tech companies, how do I help shape it in a 654 00:37:36,400 --> 00:37:39,960 Speaker 2: way that could get to this much better human future? 655 00:37:40,080 --> 00:37:42,000 Speaker 2: What's the way that we shape it in order to 656 00:37:42,000 --> 00:37:44,640 Speaker 2: be as good as possible, 657 00:37:44,719 --> 00:37:48,040 Speaker 2: or better than what might otherwise happen? 658 00:37:49,360 --> 00:37:51,760 Speaker 1: Reid, just to close: if you could teleport 659 00:37:51,880 --> 00:37:55,800 Speaker 1: forward to twenty fifty and spend an hour walking around, 660 00:37:56,239 --> 00:38:01,279 Speaker 1: what's your best guess as to how AI would have 661 00:38:01,360 --> 00:38:02,080 Speaker 1: changed how we live? 662 00:38:02,480 --> 00:38:07,960 Speaker 2: Well, there's a huge range. What I'll say is, I'll 663 00:38:08,000 --> 00:38:12,280 Speaker 2: say three things, maybe. So the first, the nearly certain 664 00:38:12,320 --> 00:38:16,080 Speaker 2: thing, is that we will all have multiple AI agents 665 00:38:16,120 --> 00:38:18,359 Speaker 2: that are helping us navigate. It's like when you think 666 00:38:18,400 --> 00:38:23,360 Speaker 2: about, for example, a worker who would deploy 667 00:38:23,400 --> 00:38:25,920 Speaker 2: in the field with lots of different drones. 668 00:38:26,600 --> 00:38:29,239 Speaker 2: Like, if I'm a firefighter, I will have a ton 669 00:38:29,239 --> 00:38:31,920 Speaker 2: of drones with me as I'm doing it, you know, 670 00:38:32,120 --> 00:38:35,359 Speaker 2: et cetera, et cetera, with memory and trying to help 671 00:38:35,440 --> 00:38:37,879 Speaker 2: me lead a better life, and these kinds of things. 672 00:38:37,880 --> 00:38:40,760 Speaker 2: So I think that kind of thing is nearly certain, 673 00:38:41,040 --> 00:38:42,640 Speaker 2: and what the shape of it will be will be 674 00:38:42,719 --> 00:38:44,839 Speaker 2: very different depending on what the shape of technology is. 675 00:38:45,360 --> 00:38:47,120 Speaker 2: You know, it seems to be unlikely that it would 676 00:38:47,120 --> 00:38:49,480 Speaker 2: be earbuds and phones and so forth. That will probably 677 00:38:49,520 --> 00:38:53,319 Speaker 2: be, you know, more neural links or ultrasound connections or 678 00:38:53,320 --> 00:38:54,799 Speaker 2: other kinds of things.
But we'll see, you know, who 679 00:38:54,800 --> 00:38:58,440 Speaker 2: knows how all that stuff ends up playing in. The 680 00:38:58,480 --> 00:39:01,799 Speaker 2: second thing is I think that it's unlikely we will 681 00:39:01,800 --> 00:39:07,760 Speaker 2: be in the superintelligence category, where, like, the AIs are 682 00:39:07,800 --> 00:39:11,960 Speaker 2: to us like we are to ants. It isn't to 683 00:39:11,960 --> 00:39:15,320 Speaker 2: say that, like, we already have superintelligence, we already have savants. 684 00:39:15,360 --> 00:39:18,439 Speaker 2: I mean, GPT, you know, four point zero and four 685 00:39:18,440 --> 00:39:21,560 Speaker 2: point five already have capabilities that no human being has 686 00:39:21,960 --> 00:39:25,000 Speaker 2: in terms of breadth of synthesis of knowledge and ability 687 00:39:25,000 --> 00:39:28,719 Speaker 2: to do other things. Like when you use deep research 688 00:39:29,200 --> 00:39:31,760 Speaker 2: and you generate a report, the report 689 00:39:32,080 --> 00:39:34,960 Speaker 2: has some inaccuracies sometimes, but you can generate a report 690 00:39:34,960 --> 00:39:37,719 Speaker 2: in ten minutes that would have taken a human being two weeks to 691 00:39:37,760 --> 00:39:40,959 Speaker 2: do. And so there are already superpowers there. 692 00:39:41,480 --> 00:39:44,640 Speaker 2: But is that the superpower where we become the, oh, well, 693 00:39:44,760 --> 00:39:47,560 Speaker 2: you know, it's the grand thing that knows everything and 694 00:39:47,600 --> 00:39:51,319 Speaker 2: we just don't know anything? I actually suspect that there 695 00:39:51,360 --> 00:39:57,000 Speaker 2: will be an ongoing, like, even with amazingly powerful AIs, a 696 00:39:57,160 --> 00:40:00,000 Speaker 2: really useful combination; it's just the shape of that combination 697 00:40:00,360 --> 00:40:02,680 Speaker 2: and what it's capable of. And then the third thing 698 00:40:02,920 --> 00:40:08,520 Speaker 2: I think for twenty twenty-five to twenty fifty is I 699 00:40:08,560 --> 00:40:11,239 Speaker 2: think the notion, and this is the really subtle one 700 00:40:11,280 --> 00:40:13,880 Speaker 2: and very difficult to predict exactly what it is, but 701 00:40:13,960 --> 00:40:17,600 Speaker 2: I think we'll have a different notion of what it 702 00:40:17,680 --> 00:40:21,440 Speaker 2: is to be human. And to give a parallel: like, 703 00:40:21,520 --> 00:40:24,840 Speaker 2: before we really got to a generalized theory of physics, 704 00:40:25,239 --> 00:40:28,080 Speaker 2: that, you know, the Earth's a globe that's going around 705 00:40:28,120 --> 00:40:31,840 Speaker 2: the sun, that we're, you know, this part of this universe, 706 00:40:31,880 --> 00:40:35,160 Speaker 2: all of which really changed our view. Before that, we had these 707 00:40:35,239 --> 00:40:40,440 Speaker 2: very human-centric myths about, like, you know, a supernatural entity, 708 00:40:40,480 --> 00:40:43,680 Speaker 2: God or whatever, created us in the following way, and the Earth's flat, 709 00:40:43,800 --> 00:40:47,400 Speaker 2: and we had these Ptolemaic circles spinning of, you know, 710 00:40:47,560 --> 00:40:49,880 Speaker 2: gods or goddesses or whatever other kinds of things in 711 00:40:49,920 --> 00:40:53,160 Speaker 2: the sky.
And then as we begin to change our 712 00:40:53,280 --> 00:40:58,040 Speaker 2: view about, like, what is the role of human beings 713 00:40:58,200 --> 00:41:00,440 Speaker 2: in the world, where is the world located, what is 714 00:41:00,480 --> 00:41:02,560 Speaker 2: the role of sentience, what is the role of consciousness? 715 00:41:03,120 --> 00:41:06,799 Speaker 2: All of that evolves kind of our philosophical frameworks, our 716 00:41:06,880 --> 00:41:12,360 Speaker 2: spiritual frameworks, and I think that will also be really evolved. 717 00:41:13,120 --> 00:41:17,280 Speaker 2: Now, in what way, I wish I knew. I'm trying 718 00:41:17,280 --> 00:41:21,479 Speaker 2: to help, you know, the next step in us getting there. 719 00:41:22,280 --> 00:41:25,000 Speaker 2: But I think that will be 720 00:41:25,120 --> 00:41:28,959 Speaker 2: part of how those folks living in twenty fifty, looking 721 00:41:29,000 --> 00:41:31,640 Speaker 2: back, will go, oh, you know, those people, they thought the 722 00:41:31,680 --> 00:41:36,520 Speaker 2: earth was flat, right? And that's the kind 723 00:41:36,520 --> 00:41:37,840 Speaker 2: of evolution I think will. 724 00:41:37,680 --> 00:41:43,480 Speaker 1: Happen. Nothing less than a new understanding of our place 725 00:41:43,520 --> 00:41:45,040 Speaker 1: in the world. I mean, a new renaissance. 726 00:41:45,200 --> 00:41:49,080 Speaker 2: Yes, exactly. Thank you so much, Reid. It's a pleasure. 727 00:42:00,560 --> 00:42:03,920 Speaker 1: That's it for this week for Tech Stuff. I'm Oz Woloshyn. This 728 00:42:04,040 --> 00:42:07,719 Speaker 1: episode was produced by Eliza Dennis, Victoria Dominguez, and 729 00:42:07,880 --> 00:42:12,120 Speaker 1: Adriana Tapia. It was executive produced by me, Kara Price, 730 00:42:12,200 --> 00:42:16,640 Speaker 1: and Kate Osborne for Kaleidoscope, and Katrina Norvel for iHeart Podcasts. 731 00:42:17,040 --> 00:42:20,080 Speaker 1: Jack Insley mixed this episode and Kyle Murdoch wrote our 732 00:42:20,120 --> 00:42:22,760 Speaker 1: theme song. Join us on Friday for the Week in Tech. 733 00:42:23,080 --> 00:42:25,560 Speaker 1: We'll take you through the headlines and dive deep into a 734 00:42:25,560 --> 00:42:28,880 Speaker 1: big news story from this week. Please rate, review, and 735 00:42:29,000 --> 00:42:31,560 Speaker 1: reach out to us at Tech Stuff podcast at gmail 736 00:42:31,600 --> 00:42:34,359 Speaker 1: dot com. It really helps us improve the show if 737 00:42:34,400 --> 00:42:35,280 Speaker 1: we know what you're thinking.