1 00:00:15,760 --> 00:00:18,520 Speaker 1: Welcome to Tech Stuff. I'm Oz Woloshyn, and today we 2 00:00:18,600 --> 00:00:21,840 Speaker 1: get the opportunity to go behind the curtain at Google's 3 00:00:21,960 --> 00:00:25,560 Speaker 1: DeepMind. For almost three years, in the upstairs room 4 00:00:25,600 --> 00:00:29,440 Speaker 1: of a pub in North London, journalist Sebastian Mallaby met 5 00:00:29,480 --> 00:00:33,600 Speaker 1: regularly with the company's CEO and co-founder Demis Hassabis. 6 00:00:34,240 --> 00:00:40,080 Speaker 1: They spoke about artificial intelligence, philosophy, neuroscience, motivation and consequence, 7 00:00:40,880 --> 00:00:44,160 Speaker 1: all against the backdrop of an increasingly intense three-way 8 00:00:44,280 --> 00:00:48,000 Speaker 1: contest between OpenAI, Anthropic and Google to win the 9 00:00:48,080 --> 00:00:52,519 Speaker 1: race towards AGI. Sebastian, congratulations on your new book, The 10 00:00:52,560 --> 00:00:54,800 Speaker 1: Infinity Machine, and welcome to Tech Stuff. 11 00:00:55,040 --> 00:00:56,160 Speaker 2: Thank you, Oz. Great to be here. 12 00:00:56,720 --> 00:00:59,560 Speaker 1: You begin the book with a quote from one of the scientists 13 00:00:59,600 --> 00:01:03,320 Speaker 1: who worked on the Manhattan Project, who said, what we're creating 14 00:01:03,360 --> 00:01:06,280 Speaker 1: now is a monster whose influence is going to change history. 15 00:01:06,760 --> 00:01:09,039 Speaker 1: Yet it would be impossible not to see it through. 16 00:01:09,520 --> 00:01:12,160 Speaker 1: The energy source which is now being made available will 17 00:01:12,240 --> 00:01:16,440 Speaker 1: make scientists the most hated and the most wanted citizens 18 00:01:16,720 --> 00:01:18,160 Speaker 1: of any country. 19 00:01:18,240 --> 00:01:21,880 Speaker 2: We're reliving that now with AI, I agree. I mean, I began 20 00:01:21,959 --> 00:01:26,280 Speaker 2: this project wanting to capture the tingling sensation of human beings, 21 00:01:26,319 --> 00:01:31,000 Speaker 2: like Demis Hassabis, creating the new version of atomic weapons. Right, 22 00:01:31,080 --> 00:01:35,240 Speaker 2: this incredibly powerful AI technology that has enormous upsides but 23 00:01:35,319 --> 00:01:38,240 Speaker 2: could also be very, very dangerous. And the surprise was 24 00:01:38,520 --> 00:01:39,840 Speaker 2: I didn't have to bring it up to them. They 25 00:01:39,840 --> 00:01:41,319 Speaker 2: brought it up to me. I mean, it's so much 26 00:01:41,360 --> 00:01:43,679 Speaker 2: on their minds. And so that's why I put this 27 00:01:44,080 --> 00:01:46,600 Speaker 2: quotation about the Manhattan Project at the start of the book, 28 00:01:46,600 --> 00:01:49,680 Speaker 2: because it kind of sums up one of 29 00:01:49,680 --> 00:01:53,360 Speaker 2: the main threads, which is this: you know, scientists can't 30 00:01:53,400 --> 00:01:58,400 Speaker 2: resist inventing something which is exciting technically, and then they're 31 00:01:58,440 --> 00:02:00,840 Speaker 2: going to be the most hated and most wanted people 32 00:02:01,680 --> 00:02:02,360 Speaker 2: in the country. 33 00:02:02,760 --> 00:02:06,800 Speaker 1: You described the question of motivation, when it comes to 34 00:02:06,880 --> 00:02:10,919 Speaker 1: Demis, as hanging in the air like the mushroom cloud over 35 00:02:10,960 --> 00:02:16,519 Speaker 1: Los Alamos. A very arresting visual image. Why that image?
And 36 00:02:17,040 --> 00:02:19,040 Speaker 1: what did you understand in the end about his motivation? 37 00:02:19,600 --> 00:02:21,360 Speaker 2: I think when you look at that picture of the 38 00:02:21,440 --> 00:02:26,400 Speaker 2: mushroom cloud over Los Alamos, you're kind of thinking both wow, 39 00:02:26,760 --> 00:02:30,560 Speaker 2: but also why? Why did human beings do this? You know, 40 00:02:30,760 --> 00:02:33,840 Speaker 2: it's so destructive. Why did you do it? And I 41 00:02:33,840 --> 00:02:37,360 Speaker 2: guess, you know, part of the thread of my book is 42 00:02:37,680 --> 00:02:41,040 Speaker 2: he had a series of ideas about how he could 43 00:02:41,040 --> 00:02:44,240 Speaker 2: build AI and make it safe for humanity and 44 00:02:44,280 --> 00:02:48,160 Speaker 2: beneficial for humanity. And one by one these ideas come 45 00:02:48,240 --> 00:02:51,240 Speaker 2: unraveled as they collide with reality. So the story of 46 00:02:51,240 --> 00:02:54,560 Speaker 2: Demis Hassabis is in some ways, you know, a story... I think 47 00:02:54,560 --> 00:02:56,600 Speaker 2: there's two categories of screw-up in the world. Right? 48 00:02:57,520 --> 00:03:02,200 Speaker 2: Sometimes you get something where basically idiots are in charge, 49 00:03:02,400 --> 00:03:04,640 Speaker 2: they don't understand what they're doing, and they make a 50 00:03:04,680 --> 00:03:08,520 Speaker 2: humongous mistake, the Iraq War for example, right. Then you have 51 00:03:08,560 --> 00:03:12,320 Speaker 2: another much more interesting category of screw-up, and that 52 00:03:12,480 --> 00:03:16,080 Speaker 2: is where intelligent people know from the beginning exactly what 53 00:03:16,120 --> 00:03:18,280 Speaker 2: they're doing. They can see the risks, they think they 54 00:03:18,280 --> 00:03:21,119 Speaker 2: can manage them, but then forces which are larger than them, 55 00:03:21,720 --> 00:03:24,160 Speaker 2: in this case with AI a race dynamic between 56 00:03:24,200 --> 00:03:27,680 Speaker 2: multiple labs and multiple countries, take over, and they can 57 00:03:27,720 --> 00:03:32,880 Speaker 2: no longer control the technology that they've invented. And I 58 00:03:32,919 --> 00:03:35,920 Speaker 2: think those episodes where you couldn't just switch out the 59 00:03:35,920 --> 00:03:43,200 Speaker 2: individuals and have a better outcome, where the individual is sincerely good, intelligent, thoughtful, 60 00:03:43,320 --> 00:03:45,480 Speaker 2: has foresight, and yet you still end up in a 61 00:03:45,520 --> 00:03:48,920 Speaker 2: bad place, that's what's really fascinating. When Demis sold his 62 00:03:48,960 --> 00:03:52,160 Speaker 2: company DeepMind to Google in twenty fourteen, there was 63 00:03:52,200 --> 00:03:56,120 Speaker 2: a condition, which was: this will never be used for weapons. Well, 64 00:03:56,200 --> 00:03:58,760 Speaker 2: you know, now it's twenty twenty six, it is being 65 00:03:58,840 --> 00:04:01,840 Speaker 2: used for weapons. And you know, so time after 66 00:04:01,920 --> 00:04:05,640 Speaker 2: time he tried to draw lines in the sand, and 67 00:04:05,680 --> 00:04:06,720 Speaker 2: they've all been erased. 68 00:04:06,840 --> 00:04:10,280 Speaker 1: It's harder and harder to say what these AI companies 69 00:04:10,360 --> 00:04:13,800 Speaker 1: are today.
I mean, for a moment last year, OpenAI 70 00:04:13,840 --> 00:04:17,120 Speaker 1: was the most popular social video app company in 71 00:04:17,160 --> 00:04:19,600 Speaker 1: the world, and now it doesn't do Sora anymore. But like, 72 00:04:20,080 --> 00:04:22,440 Speaker 1: what is your working definition of what DeepMind is? 73 00:04:23,040 --> 00:04:25,880 Speaker 2: I mean, it's a laboratory for the invention of machine intelligence. 74 00:04:26,000 --> 00:04:29,680 Speaker 2: And machine intelligence is a very capacious thing. You know, 75 00:04:29,720 --> 00:04:34,880 Speaker 2: it goes from text to video to images to a 76 00:04:34,880 --> 00:04:39,080 Speaker 2: system like AlphaFold, which divined all the shapes of 77 00:04:39,120 --> 00:04:42,640 Speaker 2: proteins in nature, and won Demis the Nobel Prize. So 78 00:04:42,680 --> 00:04:46,040 Speaker 2: it's a huge field, and indeed the creation of it 79 00:04:46,080 --> 00:04:49,360 Speaker 2: is a huge thing, because you bring in experts in neuroscience, 80 00:04:49,480 --> 00:04:53,480 Speaker 2: experts in chemistry, experts in physics, experts in computer science, ethical 81 00:04:53,839 --> 00:04:57,080 Speaker 2: experts who can philosophize about the personality that an AI 82 00:04:57,160 --> 00:04:59,919 Speaker 2: system should have. I mean, it's a very multi- 83 00:05:00,000 --> 00:05:02,400 Speaker 2: disciplinary thing, which is part of what makes the story fascinating. 84 00:05:02,560 --> 00:05:04,000 Speaker 1: When did you first meet Demis? 85 00:05:04,080 --> 00:05:07,599 Speaker 2: Now, I first met him when I'd become interested in, you know, 86 00:05:07,680 --> 00:05:10,400 Speaker 2: technology generally. My last book was about Silicon Valley and 87 00:05:10,480 --> 00:05:14,440 Speaker 2: venture capital, and in the process of writing that book, 88 00:05:14,960 --> 00:05:17,640 Speaker 2: I would go to tech conferences in Europe, and there 89 00:05:17,680 --> 00:05:20,840 Speaker 2: would be this sort of diminutive figure with a big, 90 00:05:20,880 --> 00:05:23,799 Speaker 2: big smile and sort of a kind of boyish charm, 91 00:05:23,960 --> 00:05:27,200 Speaker 2: a really unassuming kind of guy with, you know, a sort 92 00:05:27,240 --> 00:05:30,800 Speaker 2: of round-neck sweater and his hair falling forward in 93 00:05:30,839 --> 00:05:33,800 Speaker 2: a fringe, and he would get up on the stage 94 00:05:34,120 --> 00:05:36,599 Speaker 2: with a big grin and sort of, just almost as 95 00:05:36,640 --> 00:05:38,960 Speaker 2: if he was talking about how he was about to 96 00:05:39,200 --> 00:05:41,520 Speaker 2: wash the dishes after lunch, you know, he would, in 97 00:05:41,560 --> 00:05:45,000 Speaker 2: a very plain-spoken way, talk about, well, when I 98 00:05:45,040 --> 00:05:47,440 Speaker 2: was a child, I had two ambitions. One was to understand 99 00:05:47,440 --> 00:05:49,520 Speaker 2: all of science, and the other was to understand all 100 00:05:49,520 --> 00:05:53,120 Speaker 2: of philosophy. So I resolved this dilemma by deciding to 101 00:05:53,120 --> 00:05:55,640 Speaker 2: build AI, which would help me to understand both. And 102 00:05:55,680 --> 00:05:59,919 Speaker 2: so he had this mind-blowing mental reach combined with 103 00:06:00,080 --> 00:06:04,479 Speaker 2: this totally approachable friend-next-door kind of attitude.
104 00:06:04,680 --> 00:06:07,360 Speaker 1: And when did you have the idea to pitch him 105 00:06:07,360 --> 00:06:09,719 Speaker 1: on being a biographical subject for you? 106 00:06:10,040 --> 00:06:12,120 Speaker 2: So after finishing my last book, The Power Law, about 107 00:06:12,160 --> 00:06:14,080 Speaker 2: venture capital, I was thinking, and you know, this is 108 00:06:14,120 --> 00:06:17,200 Speaker 2: now mid twenty twenty two, what would be a good 109 00:06:17,279 --> 00:06:20,279 Speaker 2: next subject. And because I had met Demis several times 110 00:06:20,279 --> 00:06:22,760 Speaker 2: and I kind of followed what DeepMind was up to, 111 00:06:22,839 --> 00:06:26,080 Speaker 2: I knew about the protein folding system, I knew about 112 00:06:26,120 --> 00:06:29,159 Speaker 2: AlphaGo, the Go-playing system before that, and so forth, 113 00:06:29,839 --> 00:06:32,279 Speaker 2: and I had a sense that it would probably go 114 00:06:32,400 --> 00:06:35,320 Speaker 2: from the fringe to the mainstream at some point in 115 00:06:35,360 --> 00:06:37,360 Speaker 2: the next year or so. And then it took me 116 00:06:37,400 --> 00:06:40,919 Speaker 2: a few months after that conversation inside my head to 117 00:06:41,160 --> 00:06:43,599 Speaker 2: get my act together, listen to every podcast that Demis 118 00:06:43,600 --> 00:06:45,600 Speaker 2: had ever done, read all his lectures, really think my 119 00:06:45,640 --> 00:06:48,480 Speaker 2: way into his brain, and then go and see him 120 00:06:48,480 --> 00:06:51,200 Speaker 2: to pitch him on giving me a ton of time, 121 00:06:51,240 --> 00:06:53,440 Speaker 2: because I need a lot of time with people if 122 00:06:53,480 --> 00:06:57,360 Speaker 2: I'm going to write a book about them. And I said, look, 123 00:06:57,400 --> 00:06:58,760 Speaker 2: you know, I want to write this book about you. 124 00:06:59,360 --> 00:07:01,080 Speaker 2: And it seems to me, Demis, that you may not 125 00:07:01,120 --> 00:07:05,120 Speaker 2: want a book about you, but you've said repeatedly in 126 00:07:05,160 --> 00:07:07,159 Speaker 2: all of your lectures that AI will be the most 127 00:07:07,160 --> 00:07:11,040 Speaker 2: important invention in all of human history, Demis. So that 128 00:07:11,160 --> 00:07:15,320 Speaker 2: means if you're the creator of this AI, you must 129 00:07:15,360 --> 00:07:18,360 Speaker 2: be one of the most important people in human history. 130 00:07:18,400 --> 00:07:20,560 Speaker 2: And if that's the case, you don't have a choice. 131 00:07:20,680 --> 00:07:23,200 Speaker 2: Somebody is going to write a book, right? And furthermore, 132 00:07:23,200 --> 00:07:26,239 Speaker 2: you should welcome this, because if you're going to invent 133 00:07:26,280 --> 00:07:30,520 Speaker 2: a technology that is going to disrupt people's lives so thoroughly, 134 00:07:30,640 --> 00:07:33,280 Speaker 2: you know, your job will be different, how you raise 135 00:07:33,320 --> 00:07:35,600 Speaker 2: your children will be different, how you think of yourself as 136 00:07:35,640 --> 00:07:38,480 Speaker 2: a human will be different, because you now have this 137 00:07:38,480 --> 00:07:42,320 Speaker 2: different source of intelligence competing with you. You can't disrupt 138 00:07:42,320 --> 00:07:44,880 Speaker 2: people from head to toe and then not explain to them 139 00:07:45,280 --> 00:07:48,280 Speaker 2: why you did it.
You need to explain your motives, right? 140 00:07:48,960 --> 00:07:52,640 Speaker 2: And that's the project I'm proposing to you. And he 141 00:07:52,680 --> 00:07:54,720 Speaker 2: thought about it, and he seemed well disposed to this. 142 00:07:55,040 --> 00:07:59,720 Speaker 2: And then one week later ChatGPT came out. Oh my goodness, 143 00:07:59,760 --> 00:08:02,320 Speaker 2: and my expectation of the technology going from the fringe to 144 00:08:02,360 --> 00:08:04,600 Speaker 2: the mainstream happened a whole lot quicker than I expected. 145 00:08:05,520 --> 00:08:08,120 Speaker 1: And I mean, obviously a lot of the book is about 146 00:08:08,400 --> 00:08:11,600 Speaker 1: your interest in the technology, how it might change the world, 147 00:08:12,240 --> 00:08:16,040 Speaker 1: the kind of financing and deal-making shenanigans that made 148 00:08:16,040 --> 00:08:18,200 Speaker 1: DeepMind in many ways what it is today. But 149 00:08:18,640 --> 00:08:23,000 Speaker 1: what about the personal side? It's an extraordinary biographical portrait 150 00:08:23,160 --> 00:08:28,880 Speaker 1: too: very specific parents, a prodigious talent for chess, which 151 00:08:28,880 --> 00:08:30,560 Speaker 1: he then gave up because he thought it wasn't in 152 00:08:30,560 --> 00:08:33,040 Speaker 1: some sense consequential enough. I mean, when 153 00:08:33,040 --> 00:08:34,920 Speaker 1: you got into it, you knew that AI was the next thing, 154 00:08:34,960 --> 00:08:37,000 Speaker 1: and you know, there he was. But did you know 155 00:08:37,040 --> 00:08:40,160 Speaker 1: what an extraordinary personal story he had before you really 156 00:08:40,160 --> 00:08:41,000 Speaker 1: got into it with him? 157 00:08:41,120 --> 00:08:43,520 Speaker 2: No, I didn't. And in fact, I remember very clearly 158 00:08:44,600 --> 00:08:48,320 Speaker 2: two early experiences in the first discussions. You know, 159 00:08:48,440 --> 00:08:51,760 Speaker 2: one was I was going to have this dinner I 160 00:08:51,800 --> 00:08:54,319 Speaker 2: told you about, and he told me to read a 161 00:08:54,360 --> 00:08:57,240 Speaker 2: book before I came to the dinner, and the book 162 00:08:57,320 --> 00:08:59,439 Speaker 2: was Ender's Game. Now, this is a science fiction 163 00:08:59,520 --> 00:09:04,560 Speaker 2: story about a sort of diminutive boy genius hero who 164 00:09:04,600 --> 00:09:08,880 Speaker 2: has to save planet Earth from invading space aliens. And 165 00:09:08,920 --> 00:09:10,800 Speaker 2: at the end of the book he saves all 166 00:09:10,800 --> 00:09:14,280 Speaker 2: of humanity from space aliens. And Demis said to me, well, 167 00:09:14,320 --> 00:09:15,920 Speaker 2: I wanted you to read this book because I really 168 00:09:15,920 --> 00:09:19,880 Speaker 2: identify with that character, Ender. And I'm thinking, wait, so 169 00:09:20,040 --> 00:09:22,319 Speaker 2: you're telling me you're the savior of humanity? I mean, 170 00:09:22,360 --> 00:09:24,719 Speaker 2: even if you think that, Demis, maybe you shouldn't be 171 00:09:24,720 --> 00:09:26,400 Speaker 2: announcing it to the person who's about to write a 172 00:09:26,400 --> 00:09:28,719 Speaker 2: book about you. I mean, surely that's too messianic, too 173 00:09:28,840 --> 00:09:32,000 Speaker 2: over the top, too ridiculous. But he's right out there 174 00:09:32,000 --> 00:09:33,680 Speaker 2: with it. I mean, that is how he thinks, and 175 00:09:33,720 --> 00:09:36,120 Speaker 2: he's not ashamed to tell you.
And so that was 176 00:09:36,320 --> 00:09:39,760 Speaker 2: pretty extraordinary. And then the second thing was I went 177 00:09:39,840 --> 00:09:42,720 Speaker 2: to see Shane Legg, his scientific co-founder, and he 178 00:09:42,760 --> 00:09:44,840 Speaker 2: told me this story. I said, you know, 179 00:09:44,840 --> 00:09:47,439 Speaker 2: what was it like to work with Demis? And he said, well, 180 00:09:47,720 --> 00:09:51,080 Speaker 2: you know, Demis has crazy determination. I said, well, what 181 00:09:51,120 --> 00:09:53,520 Speaker 2: do you mean? He said, well, you know, one day, 182 00:09:54,720 --> 00:09:57,040 Speaker 2: according to Demis, his dad said to him, during the 183 00:09:57,160 --> 00:10:00,360 Speaker 2: chess period of his life, listen, you're gonna go play 184 00:10:00,440 --> 00:10:03,880 Speaker 2: chess today. You know, you just have to try your best. Now, 185 00:10:03,880 --> 00:10:05,520 Speaker 2: when I say that to my son, I mean, you know, 186 00:10:05,559 --> 00:10:07,360 Speaker 2: it's fine to lose so long as you try your best. 187 00:10:07,520 --> 00:10:10,880 Speaker 2: The way that Demis apparently interpreted it, according to Shane, 188 00:10:10,960 --> 00:10:13,959 Speaker 2: was you have to try your absolute, absolute, absolute best. 189 00:10:14,320 --> 00:10:16,240 Speaker 2: And it's like running a race, and at the end 190 00:10:16,280 --> 00:10:19,600 Speaker 2: of the marathon you fall over the tape and you're 191 00:10:19,640 --> 00:10:21,280 Speaker 2: on the ground and you have to be taken to 192 00:10:21,360 --> 00:10:24,280 Speaker 2: hospital because you're almost dead. And if you haven't been 193 00:10:24,280 --> 00:10:27,360 Speaker 2: taken to hospital, it means you didn't try hard enough. 194 00:10:28,040 --> 00:10:30,800 Speaker 2: That is what try your best meant to Demis aged 195 00:10:30,840 --> 00:10:34,520 Speaker 2: about ten or twelve. And I went to see Demis 196 00:10:34,520 --> 00:10:37,400 Speaker 2: the next time and I replayed this back to him 197 00:10:37,520 --> 00:10:39,160 Speaker 2: and said, is that really true? Is that how you 198 00:10:39,200 --> 00:10:41,760 Speaker 2: interpreted it? He said, oh, yeah, absolutely. You know, you 199 00:10:41,840 --> 00:10:44,480 Speaker 2: have to give it every single drop all the time. 200 00:10:44,760 --> 00:10:48,120 Speaker 1: And there's an amazing moment where Demis describes 201 00:10:48,240 --> 00:10:53,160 Speaker 1: hearing nature or science screaming at him, and him struggling 202 00:10:53,240 --> 00:10:54,400 Speaker 1: to hear and to understand. 203 00:10:54,800 --> 00:10:58,199 Speaker 2: Yeah, that was the most extreme expression of his desire 204 00:10:58,280 --> 00:11:01,280 Speaker 2: to invent AI. So one day I was with him 205 00:11:01,320 --> 00:11:03,720 Speaker 2: on, you know, Hampstead Heath, which is a park in 206 00:11:03,760 --> 00:11:05,760 Speaker 2: North London, not in the pub. Not in the pub 207 00:11:05,800 --> 00:11:07,959 Speaker 2: this time; it was a nice day, so we went 208 00:11:08,000 --> 00:11:10,600 Speaker 2: to this cafe instead, and you know, there he was. 209 00:11:10,600 --> 00:11:12,520 Speaker 2: It was kind of a classic English scene.
There was 210 00:11:12,559 --> 00:11:14,680 Speaker 2: somebody in front of me who was on his cell 211 00:11:14,679 --> 00:11:16,679 Speaker 2: phone doing some sort of sales job, and two 212 00:11:16,720 --> 00:11:19,320 Speaker 2: women behind me talking about their friend who had a 213 00:11:19,360 --> 00:11:22,160 Speaker 2: medical incident and had to get to hospital. So all 214 00:11:22,200 --> 00:11:25,400 Speaker 2: these quotidian noises in the background, and there is Demis 215 00:11:25,400 --> 00:11:27,600 Speaker 2: Hassabis looking at me, talking about the creation of this 216 00:11:27,640 --> 00:11:30,520 Speaker 2: godlike machine and saying that when he's up at two 217 00:11:30,559 --> 00:11:33,480 Speaker 2: in the morning at his desk at home thinking about this, 218 00:11:33,880 --> 00:11:37,880 Speaker 2: he can sort of feel reality summoning him, screaming at him, 219 00:11:37,960 --> 00:11:41,000 Speaker 2: understand me, understand me. And you know, he would then 220 00:11:41,040 --> 00:11:44,880 Speaker 2: slam the table and say, look, Sebastian, this table, it's 221 00:11:44,920 --> 00:11:47,760 Speaker 2: made of atoms, buzzing around with electrons. Why should it 222 00:11:47,800 --> 00:11:50,199 Speaker 2: be solid? Why should that laptop you've got there, which 223 00:11:50,200 --> 00:11:52,640 Speaker 2: is, you know, pieces of sand and metal, how 224 00:11:52,640 --> 00:11:55,040 Speaker 2: could that turn into something which can think? I mean, 225 00:11:55,080 --> 00:11:57,720 Speaker 2: what's going on here? There must be some intelligent thought 226 00:11:57,840 --> 00:12:01,040 Speaker 2: designing all these things. And so he kind of basically 227 00:12:01,040 --> 00:12:05,120 Speaker 2: told me that inventing AI and understanding the universe is 228 00:12:05,200 --> 00:12:07,520 Speaker 2: like getting closer to what he thinks of as God. 229 00:12:08,080 --> 00:12:09,160 Speaker 1: So he's a religious man? 230 00:12:09,280 --> 00:12:11,240 Speaker 2: I don't know if he would agree with religious, because 231 00:12:11,240 --> 00:12:14,760 Speaker 2: he doesn't go to organized religious services, but he's spiritual, I would say. 232 00:12:14,800 --> 00:12:18,280 Speaker 1: Interesting. And I mean, that scene you 233 00:12:18,320 --> 00:12:20,520 Speaker 1: described could be a scene from Oppenheimer, right? I mean, 234 00:12:21,520 --> 00:12:24,320 Speaker 1: it's so cinematic. Did you ever think, is he doing 235 00:12:24,360 --> 00:12:26,160 Speaker 1: this for me? Or is he crazy? Or is it 236 00:12:26,240 --> 00:12:30,120 Speaker 1: just absolutely captivating, the energy and the sense of purpose 237 00:12:30,160 --> 00:12:31,200 Speaker 1: that he brings to this? 238 00:12:31,440 --> 00:12:34,880 Speaker 2: He just exudes both energy and intelligence, but also natural 239 00:12:34,960 --> 00:12:37,680 Speaker 2: storytelling talent. It's just amazing. I mean, you know, one 240 00:12:37,720 --> 00:12:40,680 Speaker 2: time I asked him about his first office in London, 241 00:12:40,720 --> 00:12:43,480 Speaker 2: in Russell Square, which is a sort of storied square, 242 00:12:43,600 --> 00:12:46,120 Speaker 2: you know, near the British Museum and so forth. And 243 00:12:46,160 --> 00:12:48,280 Speaker 2: you know, normally, as a writer, you ask somebody to 244 00:12:48,320 --> 00:12:53,000 Speaker 2: recapture the emotion of opening their first office fifteen years ago. 245 00:12:53,679 --> 00:12:55,560 Speaker 2: It's fifteen years ago.
They're going to say, oh, yeah, 246 00:12:55,559 --> 00:12:57,240 Speaker 2: it was cool. You know, that's all you'll get out 247 00:12:57,240 --> 00:13:00,520 Speaker 2: of them. But Demis just flows with stories. He said, wow, 248 00:13:00,559 --> 00:13:02,720 Speaker 2: you know, I was in the attic, that's where the 249 00:13:02,720 --> 00:13:04,240 Speaker 2: office was, and of course you had to come down 250 00:13:04,280 --> 00:13:06,040 Speaker 2: the stairs. They were all rickety, so I came down, 251 00:13:06,200 --> 00:13:08,280 Speaker 2: ding ding ding ding ding, bang bang bang. And then 252 00:13:08,320 --> 00:13:09,800 Speaker 2: I come out on the square and there's these beautiful 253 00:13:09,800 --> 00:13:11,640 Speaker 2: trees in front of me. Beyond, to the right, if 254 00:13:11,640 --> 00:13:13,640 Speaker 2: you just go three doors down, Sebastian, that's where you 255 00:13:13,679 --> 00:13:18,640 Speaker 2: see the London Mathematical Society, where Turing invented the origins 256 00:13:18,679 --> 00:13:21,600 Speaker 2: of computer science, which we are now completing. And then 257 00:13:21,920 --> 00:13:24,760 Speaker 2: if you go beyond that to the pedestrian crossing, black, white, 258 00:13:24,760 --> 00:13:27,960 Speaker 2: black, white, crossing the street, that is 259 00:13:28,040 --> 00:13:32,360 Speaker 2: where the Hungarian nuclear scientist Szilard had the idea for 260 00:13:32,400 --> 00:13:35,160 Speaker 2: a nuclear chain reaction back in the nineteen thirties, which 261 00:13:35,240 --> 00:13:37,920 Speaker 2: led to the atom bomb. And of course we are 262 00:13:37,960 --> 00:13:40,240 Speaker 2: now creating the equivalent of the atom bomb with AI. 263 00:13:40,720 --> 00:13:45,120 Speaker 2: What a subject! Yeah, I mean, he is such a storyteller. 264 00:13:44,559 --> 00:13:46,079 Speaker 1: And I heard that he has a sense of humor, 265 00:13:46,160 --> 00:13:47,880 Speaker 1: or perhaps a sense of humor about himself in some 266 00:13:47,920 --> 00:13:50,880 Speaker 1: ways as well. Didn't he say, when he lost 267 00:13:50,960 --> 00:13:53,400 Speaker 1: the table football competition in the office, that his soul 268 00:13:53,520 --> 00:13:54,080 Speaker 1: was on fire? 269 00:13:54,320 --> 00:13:56,680 Speaker 2: He did say that, yes. You know, one can mock 270 00:13:56,760 --> 00:14:00,600 Speaker 2: him for being too competitive and taking trivial things like 271 00:14:00,640 --> 00:14:02,920 Speaker 2: table football seriously, but he actually really feels it. 272 00:14:03,000 --> 00:14:06,319 Speaker 1: And we talked about the sort of mushroom cloud of motivation. 273 00:14:07,040 --> 00:14:08,440 Speaker 1: One of the things that doesn't seem to be so 274 00:14:08,600 --> 00:14:11,640 Speaker 1: motivating to him is money. I mean, there's a story 275 00:14:11,679 --> 00:14:14,320 Speaker 1: about the offer, as an eighteen-year-old, 276 00:14:14,800 --> 00:14:19,240 Speaker 1: of half a million pounds to join a game development studio, right, 277 00:14:19,280 --> 00:14:21,880 Speaker 1: which he turned down, right, despite coming from, I mean, I 278 00:14:21,920 --> 00:14:24,640 Speaker 1: know his mother had gone through homelessness in her youth. 279 00:14:24,680 --> 00:14:27,120 Speaker 1: I mean, was that a hard decision for him? Why 280 00:14:27,120 --> 00:14:27,760 Speaker 1: did he make it? 281 00:14:27,960 --> 00:14:30,320 Speaker 2: He said it was completely easy. In today's money,
it 282 00:14:30,440 --> 00:14:32,640 Speaker 2: was well over a million dollars that he was being offered. 283 00:14:33,040 --> 00:14:35,280 Speaker 2: He was eighteen. As you say, his parents were not rich. 284 00:14:35,800 --> 00:14:39,400 Speaker 2: I mean, you know, any self-respecting sort of Stanford character 285 00:14:39,840 --> 00:14:42,120 Speaker 2: at this point would have, you know, taken the 286 00:14:42,160 --> 00:14:45,000 Speaker 2: money, dropped out of Stanford, and, you know, ridden off 287 00:14:45,040 --> 00:14:47,840 Speaker 2: into the sunset with the loot. No, Demis is different. 288 00:14:47,840 --> 00:14:51,200 Speaker 2: Demis wanted to understand science. That was his primary motivation. 289 00:14:51,240 --> 00:14:53,000 Speaker 2: That's why he's up at two o'clock 290 00:14:53,040 --> 00:14:55,440 Speaker 2: in the morning. He's thinking, how do I understand nature? 291 00:14:55,840 --> 00:14:58,360 Speaker 2: And so he turned down the cash to go and 292 00:14:58,400 --> 00:14:59,840 Speaker 2: study computer science instead. 293 00:15:00,080 --> 00:15:03,920 Speaker 1: Fast forward a few years and he meets Peter Thiel, 294 00:15:05,080 --> 00:15:09,080 Speaker 1: who gives him an A-plus for science fiction and 295 00:15:09,120 --> 00:15:12,280 Speaker 1: an F for business model, but nonetheless decides to give him 296 00:15:12,280 --> 00:15:12,680 Speaker 1: some money. 297 00:15:13,880 --> 00:15:16,040 Speaker 2: Well, actually there's a Freudian slip there. You said 298 00:15:16,040 --> 00:15:18,040 Speaker 2: an A-plus for science fiction. I think you meant 299 00:15:18,040 --> 00:15:19,840 Speaker 2: an A-plus for science. 300 00:15:19,760 --> 00:15:24,000 Speaker 2: But science fiction, maybe that would have been better, 301 00:15:24,000 --> 00:15:27,240 Speaker 2: because in fact Demis was spinning this vision, and this 302 00:15:27,440 --> 00:15:30,400 Speaker 2: is twenty ten, right? He was saying, I'm going to invent 303 00:15:30,720 --> 00:15:34,000 Speaker 2: very powerful AI. This is at a time when AI 304 00:15:34,280 --> 00:15:38,160 Speaker 2: literally couldn't recognize a photograph of a cat, like nothing 305 00:15:38,280 --> 00:15:40,680 Speaker 2: was working, and you have this character coming in and saying, oh, 306 00:15:40,680 --> 00:15:43,560 Speaker 2: I'm going to create artificial general intelligence. It was nuts. 307 00:15:43,600 --> 00:15:45,160 Speaker 2: So it kind of was science fiction. 308 00:15:45,920 --> 00:15:49,280 Speaker 1: What was his entree to the world of technology investors, 309 00:15:49,320 --> 00:15:51,480 Speaker 1: and when did DeepMind actually start as a company? 310 00:15:51,640 --> 00:15:54,000 Speaker 2: DeepMind started in twenty ten, having raised the money 311 00:15:54,000 --> 00:15:57,560 Speaker 2: from Peter Thiel. The entree is very interesting, because in 312 00:15:57,600 --> 00:16:00,600 Speaker 2: fact what happened was, you know, Demis had done a 313 00:16:00,640 --> 00:16:03,680 Speaker 2: small games company before, and he made some money. It wasn't 314 00:16:03,680 --> 00:16:07,040 Speaker 2: a terrific success, but nonetheless it wasn't a total failure. And 315 00:16:07,080 --> 00:16:08,960 Speaker 2: he went back to the same investors. They all said, 316 00:16:09,000 --> 00:16:11,840 Speaker 2: you must be joking. There's no product. If you're doing 317 00:16:11,880 --> 00:16:14,280 Speaker 2: AI, we're not putting money into that.
So then he 318 00:16:14,280 --> 00:16:17,360 Speaker 2: had to think again. And his entree into the world 319 00:16:17,400 --> 00:16:21,000 Speaker 2: of Peter Thiel was what's called the Singularity Summit, where 320 00:16:21,040 --> 00:16:24,560 Speaker 2: all these very early believers in AI would gather, people 321 00:16:24,600 --> 00:16:27,920 Speaker 2: like Ray Kurzweil, and they would dream about a future 322 00:16:28,240 --> 00:16:31,760 Speaker 2: of an AI that totally did not exist. And when 323 00:16:31,760 --> 00:16:33,720 Speaker 2: they got up on the stage, actually, they did often 324 00:16:33,880 --> 00:16:39,000 Speaker 2: draw more on science fiction novels than on science when 325 00:16:39,000 --> 00:16:42,040 Speaker 2: they were kind of imagining a future with AI. And so, 326 00:16:42,360 --> 00:16:46,360 Speaker 2: in this strange cauldron of mythology and reality, with all 327 00:16:46,480 --> 00:16:50,600 Speaker 2: kinds of weirdos trotting about, Demis Hassabis, who by this 328 00:16:50,600 --> 00:16:53,080 Speaker 2: point has a computer science degree and a PhD in 329 00:16:53,080 --> 00:16:56,360 Speaker 2: neuroscience, as a proper scientist, shows up, and he's asked 330 00:16:56,360 --> 00:17:00,520 Speaker 2: by a journalist, what do you think of the Singularity conference? 331 00:17:00,640 --> 00:17:03,480 Speaker 2: Are you a Singularitarian? And he says, it's a bit 332 00:17:03,600 --> 00:17:06,080 Speaker 2: Californian for me. And, oh, you could sort of 333 00:17:06,240 --> 00:17:09,240 Speaker 2: feel the kind of anxiety of being seen in this crowd. 334 00:17:09,680 --> 00:17:11,600 Speaker 2: But that's where you had to go to meet Peter Thiel. 335 00:17:12,240 --> 00:17:13,959 Speaker 2: And then when he met Peter Thiel, he had this 336 00:17:14,000 --> 00:17:16,600 Speaker 2: clever trick. Peter Thiel is a chess player. Demis is 337 00:17:16,600 --> 00:17:20,200 Speaker 2: a chess player. So rather than pitch Peter Thiel on 338 00:17:20,280 --> 00:17:22,200 Speaker 2: some idea about a company, he said, well, I 339 00:17:22,240 --> 00:17:24,960 Speaker 2: think the interesting thing about chess is that the knight 340 00:17:25,119 --> 00:17:28,919 Speaker 2: and the bishop are supremely well balanced. And it's in 341 00:17:28,960 --> 00:17:31,439 Speaker 2: that tension between those two pieces that much of the 342 00:17:31,560 --> 00:17:34,600 Speaker 2: joy of the game resides. So Peter Thiel is like, whoa, 343 00:17:34,800 --> 00:17:37,479 Speaker 2: that's a conversation I want to pursue. And so that 344 00:17:37,520 --> 00:17:40,160 Speaker 2: got him, got Demis Hassabis, an invitation to Peter 345 00:17:40,200 --> 00:17:42,600 Speaker 2: Thiel's house the next day, and then that's when he 346 00:17:42,640 --> 00:17:45,080 Speaker 2: pitched him on DeepMind and got the money he 347 00:17:45,119 --> 00:17:46,199 Speaker 2: needed to start the company. 348 00:17:46,520 --> 00:17:52,440 Speaker 1: And then fast forward to twenty thirteen, and the excerpt 349 00:17:52,440 --> 00:17:54,199 Speaker 1: of your book in the Wall Street Journal tells the 350 00:17:54,240 --> 00:17:58,000 Speaker 1: story of a birthday party for Elon Musk, replete with 351 00:17:58,040 --> 00:18:01,640 Speaker 1: all kinds of costumes and strange things and fake battlements. 352 00:18:01,680 --> 00:18:06,520 Speaker 1: But this is, perhaps, apart from the founding, the most 353 00:18:06,560 --> 00:18:09,480 Speaker 1: crucial moment in DeepMind's genesis as a company.
354 00:18:09,560 --> 00:18:12,240 Speaker 2: Right, yeah, that's right. So you know, by this point 355 00:18:12,840 --> 00:18:17,040 Speaker 2: Demis had raised three rounds of venture capital, including from 356 00:18:17,040 --> 00:18:19,600 Speaker 2: Elon Musk, and you know, there were various people who could 357 00:18:19,600 --> 00:18:22,040 Speaker 2: come in, but it was a total pain in the neck. 358 00:18:22,080 --> 00:18:25,480 Speaker 2: He hated it. You know, he would sometimes have this expression, 359 00:18:25,720 --> 00:18:27,920 Speaker 2: I don't want this part of my brain to expand. 360 00:18:28,440 --> 00:18:31,439 Speaker 2: He wanted to be doing science, and so what he 361 00:18:31,520 --> 00:18:35,040 Speaker 2: wanted was to be liberated from this hamster wheel of fundraising. 362 00:18:35,880 --> 00:18:39,000 Speaker 2: And at this party, this birthday party that Elon 363 00:18:39,080 --> 00:18:43,520 Speaker 2: Musk had, along comes Larry Page from Google, who's also there, 364 00:18:44,280 --> 00:18:46,440 Speaker 2: and says, let's go for a walk. And they walk 365 00:18:46,480 --> 00:18:50,240 Speaker 2: around the castle grounds, and in this bizarre setting, Larry 366 00:18:50,280 --> 00:18:52,800 Speaker 2: Page says to him, well, you know, you could spend 367 00:18:52,800 --> 00:18:56,080 Speaker 2: your career building another company like Google. That's fine, but 368 00:18:56,160 --> 00:18:59,080 Speaker 2: if you really want to do science, just join Google 369 00:18:59,359 --> 00:19:03,280 Speaker 2: and we'll give you the resources, use our platform, and you'll 370 00:19:03,320 --> 00:19:06,440 Speaker 2: be able to do what you really love. And Demis 371 00:19:06,480 --> 00:19:09,880 Speaker 2: not only agreed with that pitch, in the sense that, yes, 372 00:19:09,920 --> 00:19:12,640 Speaker 2: he preferred to do science than to be a billionaire, 373 00:19:13,440 --> 00:19:16,040 Speaker 2: but he felt that Larry Page himself would have accepted 374 00:19:16,040 --> 00:19:19,480 Speaker 2: that pitch. That Larry Page cared about science, he could 375 00:19:19,480 --> 00:19:22,880 Speaker 2: have been a Stanford professor of computer science. So Demis 376 00:19:22,880 --> 00:19:25,760 Speaker 2: really identified with Larry Page, and that was why he 377 00:19:25,760 --> 00:19:26,800 Speaker 2: sold to Google. 378 00:19:26,480 --> 00:19:30,000 Speaker 1: And Page had his eye on Demis, or was this impulsive? 379 00:19:30,280 --> 00:19:33,080 Speaker 1: Had he planned out his chess game for this party 380 00:19:33,200 --> 00:19:35,000 Speaker 1: like Demis had three years before? 381 00:19:34,920 --> 00:19:37,159 Speaker 2: He had totally planned the chess game. He'd been thinking for a 382 00:19:37,200 --> 00:19:41,800 Speaker 2: while about buying up nascent AI companies, and he'd bought 383 00:19:41,800 --> 00:19:45,679 Speaker 2: the boutique founded by the Toronto professor Geoffrey Hinton together 384 00:19:45,720 --> 00:19:50,040 Speaker 2: with Ilya Sutskever and one other person, and so he 385 00:19:50,200 --> 00:19:51,639 Speaker 2: was in a buying mode. 386 00:19:52,920 --> 00:19:55,440 Speaker 1: That was twenty twelve, right? The ImageNet team. 387 00:19:55,480 --> 00:19:57,560 Speaker 2: Exactly. He bought the ImageNet team, and then the next 388 00:19:57,600 --> 00:20:01,359 Speaker 2: obvious company to buy was DeepMind, because they 389 00:20:01,440 --> 00:20:05,119 Speaker 2: had a different approach to AI.
It wasn't just deep learning, 390 00:20:05,680 --> 00:20:08,919 Speaker 2: which is the ImageNet secret sauce, which is kind of 391 00:20:08,960 --> 00:20:12,560 Speaker 2: pattern recognition, learning from data. It was also what's 392 00:20:12,560 --> 00:20:16,800 Speaker 2: called reinforcement learning, which is learning through trial and error 393 00:20:17,119 --> 00:20:19,080 Speaker 2: in a simulation. So you have a game like the 394 00:20:19,119 --> 00:20:23,000 Speaker 2: Atari games, or Go later on, and you try lots 395 00:20:23,000 --> 00:20:25,119 Speaker 2: of different... the computer tries thousands of moves, sees 396 00:20:25,160 --> 00:20:27,840 Speaker 2: what works, and then learns through trial and error. And 397 00:20:27,840 --> 00:20:30,760 Speaker 2: in some ways, another strand in my book is the 398 00:20:31,160 --> 00:20:35,840 Speaker 2: interplay between deep learning on the one hand and reinforcement 399 00:20:35,920 --> 00:20:39,399 Speaker 2: learning on the other hand. And these two fields of 400 00:20:39,520 --> 00:20:43,240 Speaker 2: artificial intelligence, you know, have their different moments in the sun 401 00:20:43,320 --> 00:20:44,960 Speaker 2: as the story progresses. 402 00:20:45,280 --> 00:20:47,320 Speaker 1: Hinton came on Tech Stuff and talked about 403 00:20:47,320 --> 00:20:50,920 Speaker 1: how he ran an auction to sell ImageNet, with Google, Microsoft, 404 00:20:51,000 --> 00:20:52,960 Speaker 1: and Baidu. But in the end, all he 405 00:20:53,000 --> 00:20:56,879 Speaker 1: really wanted was to go to Google. As for Demis, 406 00:20:57,480 --> 00:21:00,800 Speaker 1: he was being courted as well by others, including a 407 00:21:00,880 --> 00:21:04,040 Speaker 1: dinner at Mark Zuckerberg's house in, I guess, the weeks 408 00:21:04,119 --> 00:21:06,920 Speaker 1: or months after this first meeting with Larry Page at 409 00:21:06,960 --> 00:21:12,600 Speaker 1: Elon Musk's birthday party. And he submitted Mark Zuckerberg to 410 00:21:12,640 --> 00:21:14,360 Speaker 1: a test at this dinner. 411 00:21:14,200 --> 00:21:18,800 Speaker 2: Right, yeah, that's right. So the test was a bit subtle. Predictably, 412 00:21:18,880 --> 00:21:21,800 Speaker 2: they sit down to dinner, and Mark Zuckerberg, who's longing 413 00:21:21,800 --> 00:21:24,880 Speaker 2: to buy DeepMind to get one over on Google... 414 00:21:25,760 --> 00:21:27,840 Speaker 1: This was not recently, this was ten years ago. 415 00:21:27,880 --> 00:21:32,600 Speaker 2: This is twenty thirteen. So Mark Zuckerberg says, well, I 416 00:21:32,600 --> 00:21:36,760 Speaker 2: think AI is the most important technology in human history. 417 00:21:36,760 --> 00:21:39,159 Speaker 2: It's extraordinary, and you know, I really hope you agree 418 00:21:39,200 --> 00:21:43,760 Speaker 2: to join me at Facebook, because, you know, we could 419 00:21:43,760 --> 00:21:46,080 Speaker 2: just do great things together, blah blah blah blah. And 420 00:21:46,119 --> 00:21:48,400 Speaker 2: then, you know, the conversation moves on, time goes by, 421 00:21:48,440 --> 00:21:51,800 Speaker 2: and then Demis slyly says, you know, three D printing 422 00:21:51,880 --> 00:21:56,040 Speaker 2: is extraordinary. And Zuckerberg goes, yeah, I agree, you know, incredible, 423 00:21:56,080 --> 00:21:58,040 Speaker 2: that's just going to unlock so many things.
And then 424 00:21:58,040 --> 00:22:01,600 Speaker 2: a bit later, Demis says, you know, virtual reality, that 425 00:22:01,680 --> 00:22:05,600 Speaker 2: really is going to be transformative. And Zuckerberg's like, yeah, 426 00:22:05,640 --> 00:22:08,359 Speaker 2: it's transformative. It's so exciting. I'm so excited by that. 427 00:22:08,880 --> 00:22:11,160 Speaker 2: And then Demis's mind is whirring. He said, okay, he's 428 00:22:11,200 --> 00:22:14,560 Speaker 2: a bullshit artist. He does not believe that AI is 429 00:22:14,560 --> 00:22:16,560 Speaker 2: the most important thing ever. He does not get it. 430 00:22:16,640 --> 00:22:19,680 Speaker 2: I'm selling to Google. Forget, forget Facebook. 431 00:22:19,320 --> 00:22:20,600 Speaker 1: Even though there was more money on the table. 432 00:22:21,359 --> 00:22:23,399 Speaker 2: Yeah, that's right. In fact, Facebook was offering to make 433 00:22:23,440 --> 00:22:25,960 Speaker 2: Demis a lot richer. But he was consistent throughout his career, 434 00:22:26,000 --> 00:22:28,480 Speaker 2: Demis, in turning down the money: the money to go to 435 00:22:28,640 --> 00:22:31,800 Speaker 2: Cambridge University, the money from selling to Facebook. 436 00:22:32,000 --> 00:22:33,800 Speaker 2: It's not about the money for him, it's really about 437 00:22:33,800 --> 00:22:34,240 Speaker 2: the science. 438 00:22:34,240 --> 00:22:36,640 Speaker 1: And you mentioned him using his scientific method to see 439 00:22:36,640 --> 00:22:40,280 Speaker 1: two or three years into the future. Instead, Facebook went with 440 00:22:40,560 --> 00:22:43,560 Speaker 1: Yann LeCun and gave him plenty of resources, and I 441 00:22:43,560 --> 00:22:45,879 Speaker 1: think he was trying to poach some of Demis's employees. 442 00:22:46,440 --> 00:22:48,159 Speaker 1: Demis told them that the Google deal was going to 443 00:22:48,160 --> 00:22:50,119 Speaker 1: happen, and therefore to sit tight, and 444 00:22:50,440 --> 00:22:53,720 Speaker 1: they did. But you know, fast forward to twenty twenty 445 00:22:53,760 --> 00:22:57,960 Speaker 1: six, and Yann LeCun has essentially been dumped from Meta, and 446 00:22:58,240 --> 00:23:01,600 Speaker 1: Demis, where is he sitting in Google? Is 447 00:23:01,720 --> 00:23:04,439 Speaker 1: he the successor to Sundar? Is he, you know, 448 00:23:04,480 --> 00:23:06,879 Speaker 1: the ego and the id? I mean, what is his 449 00:23:07,040 --> 00:23:08,080 Speaker 1: role in Google today? 450 00:23:08,720 --> 00:23:11,240 Speaker 2: Well, what his role is today is to be the 451 00:23:11,359 --> 00:23:14,439 Speaker 2: chief executive of Google DeepMind, which is the AI engine 452 00:23:14,440 --> 00:23:16,760 Speaker 2: which is basically powering all the new products in Google. 453 00:23:17,320 --> 00:23:20,159 Speaker 2: So he's super important. Sundar is the chief executive of 454 00:23:20,200 --> 00:23:23,879 Speaker 2: Alphabet and Google, and I would argue that the relationship 455 00:23:23,920 --> 00:23:26,920 Speaker 2: between Sundar and Demis is the most important relationship in 456 00:23:27,000 --> 00:23:31,520 Speaker 2: business anywhere at the moment. Because Sundar has Demis's back. 457 00:23:32,320 --> 00:23:35,000 Speaker 2: Sundar gives him the resources.
Sundar takes care of the 458 00:23:35,080 --> 00:23:38,040 Speaker 2: kind of, all that kind of corporate leadership stuff that 459 00:23:38,040 --> 00:23:40,080 Speaker 2: Demis is good at, but it's really not what he 460 00:23:40,080 --> 00:23:43,520 Speaker 2: wants to do full time, and that gives Demis the 461 00:23:43,520 --> 00:23:47,400 Speaker 2: oxygen to pursue AI to the fullest of his abilities, 462 00:23:47,480 --> 00:23:51,000 Speaker 2: which are considerable. You know, in the future, if Sundar 463 00:23:51,080 --> 00:23:53,000 Speaker 2: were to go, I don't think that's happening anytime soon, 464 00:23:53,040 --> 00:23:55,439 Speaker 2: by the way, but I think if he were to go, 465 00:23:55,480 --> 00:23:57,639 Speaker 2: you know, Demis would obviously be talked about as a candidate. 466 00:23:57,680 --> 00:24:00,600 Speaker 2: And it's a really interesting question, because he is, on 467 00:24:00,640 --> 00:24:04,919 Speaker 2: the one hand, somebody who is a leader, has vision, 468 00:24:05,160 --> 00:24:09,160 Speaker 2: can motivate people, would have the credibility to lead Google 469 00:24:09,840 --> 00:24:11,560 Speaker 2: as an AI company. I mean, how often do you 470 00:24:11,600 --> 00:24:14,119 Speaker 2: get somebody who's the CEO and also has a Nobel 471 00:24:14,160 --> 00:24:16,959 Speaker 2: Prize? That would be quite something. But at the same time, 472 00:24:17,040 --> 00:24:18,800 Speaker 2: Demis has a side to him that wants to be a 473 00:24:18,840 --> 00:24:22,000 Speaker 2: pure scientist, that talks to me about, you know, there's 474 00:24:22,000 --> 00:24:24,040 Speaker 2: too much noise in Silicon Valley. I want to go 475 00:24:24,080 --> 00:24:27,800 Speaker 2: and think. I want to have a research professorship at Princeton. 476 00:24:27,920 --> 00:24:31,600 Speaker 2: That's where Oppenheimer went after the Manhattan Project. That's where 477 00:24:31,600 --> 00:24:34,119 Speaker 2: Einstein went. That's where I should be. You know, he 478 00:24:34,160 --> 00:24:37,000 Speaker 2: has that kind of, you know, retreat to the idyll 479 00:24:37,280 --> 00:24:41,360 Speaker 2: of abstract contemplation side to him, and he's so good 480 00:24:41,400 --> 00:24:43,600 Speaker 2: at both of these things. It's what makes him exceptional. 481 00:24:43,640 --> 00:24:46,480 Speaker 2: I mean, you mentioned Yann LeCun, you know, a very 482 00:24:46,520 --> 00:24:49,800 Speaker 2: good scientist, but clearly not a great operator inside a business. 483 00:24:50,440 --> 00:24:53,520 Speaker 2: You know, one could talk about Sam Altman, a great business operator, 484 00:24:53,640 --> 00:24:55,800 Speaker 2: but not a scientist. Dropped out of Stanford, doesn't have 485 00:24:55,800 --> 00:24:59,280 Speaker 2: a degree. You know, it's very rare to find both 486 00:24:59,480 --> 00:25:00,439 Speaker 2: in the same person. 487 00:25:09,000 --> 00:25:13,280 Speaker 1: After the break: is Demis an evil genius? Stay with us. 488 00:25:28,560 --> 00:25:33,120 Speaker 1: You mentioned earlier in the conversation this kind of journey 489 00:25:33,960 --> 00:25:38,879 Speaker 1: Demis had been on, where one sort of safety mechanism 490 00:25:38,920 --> 00:25:43,320 Speaker 1: after another that he believed in fell away, and thus 491 00:25:43,440 --> 00:25:49,040 Speaker 1: these kinds of metaphors about the atomic bomb.
But ironically, 492 00:25:49,880 --> 00:25:53,960 Speaker 1: in some sense, the kind of safety-by-the-wayside 493 00:25:54,720 --> 00:25:58,960 Speaker 1: race that we're in today with AI was kicked off 494 00:25:59,520 --> 00:26:02,240 Speaker 1: by Demis's desire for a safety board. 495 00:26:03,160 --> 00:26:06,120 Speaker 2: Yes, that's a good irony, you're right. So what happened 496 00:26:06,200 --> 00:26:10,240 Speaker 2: was that Demis sold the company to Google in twenty fourteen, 497 00:26:10,640 --> 00:26:12,200 Speaker 2: and one of the conditions was there had to be 498 00:26:12,240 --> 00:26:15,440 Speaker 2: a safety oversight board, whereby Google would allow 499 00:26:15,480 --> 00:26:18,680 Speaker 2: DeepMind to sort of appoint some, you know, important philosophers 500 00:26:18,760 --> 00:26:23,359 Speaker 2: or other people of independent stature to make a final 501 00:26:23,400 --> 00:26:27,840 Speaker 2: decision on when AI would be deployed into the world. 502 00:26:27,880 --> 00:26:30,120 Speaker 2: And the idea was, AI is too big 503 00:26:30,200 --> 00:26:32,639 Speaker 2: just to let the corporate board of Google do whatever 504 00:26:32,640 --> 00:26:33,760 Speaker 2: it wants with it. You know, there has to be 505 00:26:33,840 --> 00:26:37,680 Speaker 2: a check. So the first of these safety meetings was 506 00:26:37,800 --> 00:26:41,399 Speaker 2: arranged, and Demis had the idea, we'll invite Elon Musk 507 00:26:42,000 --> 00:26:45,399 Speaker 2: to chair it. And he invited Reid Hoffman and various 508 00:26:45,400 --> 00:26:49,640 Speaker 2: other people, and they all met at SpaceX. And basically 509 00:26:49,680 --> 00:26:53,080 Speaker 2: what happened is, Elon Musk sat there listening, absorbed all 510 00:26:53,440 --> 00:26:56,399 Speaker 2: the presentations from DeepMind about their plans to build AI, 511 00:26:57,240 --> 00:27:01,320 Speaker 2: and a few months later he announces OpenAI, which 512 00:27:01,359 --> 00:27:04,320 Speaker 2: is going to be the rival company. And so all 513 00:27:04,320 --> 00:27:07,280 Speaker 2: of a sudden, this singleton vision, the idea that, you know, 514 00:27:07,359 --> 00:27:11,080 Speaker 2: only one AI lab would shepherd AI into the world 515 00:27:11,080 --> 00:27:15,440 Speaker 2: on behalf of all humanity, that just goes by the wayside, 516 00:27:15,520 --> 00:27:18,560 Speaker 2: and you've now got two competing labs, and the race 517 00:27:18,640 --> 00:27:20,040 Speaker 2: dynamic begins to set in. 518 00:27:20,560 --> 00:27:22,840 Speaker 1: How did Demis feel about what Elon did? 519 00:27:23,119 --> 00:27:27,160 Speaker 2: Betrayed. Elon had sat there listening to all his plans, 520 00:27:28,240 --> 00:27:30,680 Speaker 2: and he'd been invited to chair that meeting in good 521 00:27:30,680 --> 00:27:33,920 Speaker 2: faith, to ensure safety for the world, which of course 522 00:27:33,960 --> 00:27:36,080 Speaker 2: is what, at the time, Elon was about. He was a big doomer 523 00:27:36,760 --> 00:27:40,440 Speaker 2: and was constantly talking about AI safety and existential risk. 524 00:27:40,840 --> 00:27:44,880 Speaker 2: And so the idea that, rather than uniting with 525 00:27:44,920 --> 00:27:47,800 Speaker 2: DeepMind and Google in a single effort to make the 526 00:27:47,880 --> 00:27:51,680 Speaker 2: technology safe, Elon Musk preferred to go off and start 527 00:27:51,720 --> 00:27:54,760 Speaker 2: a rival in OpenAI... to Demis, this was a 528 00:27:54,760 --> 00:27:58,480 Speaker 2: total betrayal.
Of course, Elon thought of this as, Demis 529 00:27:58,520 --> 00:28:03,800 Speaker 2: is dangerous, he's an evil genius, and therefore I need 530 00:28:03,840 --> 00:28:06,760 Speaker 2: to be the one. Because, you know, all of these actors, 531 00:28:07,359 --> 00:28:10,000 Speaker 2: they basically say, I know that I'm a good person. Yeah, 532 00:28:10,119 --> 00:28:12,400 Speaker 2: if I'm the leader of the AI race, I will 533 00:28:12,400 --> 00:28:15,040 Speaker 2: make it safe because I'm good. But those other guys 534 00:28:15,040 --> 00:28:18,920 Speaker 2: over there, you can't trust those guys, because, you know, whatever. Now, 535 00:28:19,000 --> 00:28:22,320 Speaker 2: if you quizzed Elon Musk about why did he say 536 00:28:22,359 --> 00:28:25,360 Speaker 2: that Demis was an Elon... was an evil genius... 537 00:28:25,480 --> 00:28:27,000 Speaker 1: Your term for a Freudian slip. 538 00:28:29,000 --> 00:28:32,760 Speaker 2: Elon, evil genius? Why was Demis an evil genius? Well, 539 00:28:32,760 --> 00:28:34,679 Speaker 2: the only good reason, or not a good reason, but 540 00:28:34,760 --> 00:28:38,480 Speaker 2: the reason, was apparently that Demis, in his game design days, 541 00:28:38,520 --> 00:28:41,720 Speaker 2: had worked on a game called Evil Genius. Now, that 542 00:28:41,760 --> 00:28:44,320 Speaker 2: is a pretty thin basis on which to call him 543 00:28:44,320 --> 00:28:47,960 Speaker 2: an evil genius. But whatever, I mean, they all had... 544 00:28:48,240 --> 00:28:54,640 Speaker 1: Sharp elbows. So then, this 545 00:28:54,720 --> 00:28:57,040 Speaker 1: meeting is in twenty sixteen... twenty fifteen? Twenty fifteen. 546 00:28:57,120 --> 00:28:59,640 Speaker 1: And when is the AlphaGo moment? 547 00:28:59,720 --> 00:29:03,880 Speaker 2: Twenty sixteen. Okay, so coming out of that moment, when Elon 548 00:29:03,960 --> 00:29:09,120 Speaker 2: Musk decides to set up OpenAI, Demis decides, well, 549 00:29:09,160 --> 00:29:11,960 Speaker 2: I'm just going to accelerate as fast as possible. And 550 00:29:12,160 --> 00:29:15,160 Speaker 2: the first thing he manages to score is this victory 551 00:29:15,240 --> 00:29:19,720 Speaker 2: over the Korean Go champion Lee Sedol. And it's a 552 00:29:19,800 --> 00:29:23,479 Speaker 2: huge exhibition match in South Korea with all the media 553 00:29:23,640 --> 00:29:26,280 Speaker 2: in attendance, and it's kind of, it's not quite 554 00:29:26,320 --> 00:29:30,640 Speaker 2: ChatGPT, but it's a moment when AI had what one 555 00:29:30,760 --> 00:29:35,880 Speaker 2: might call the Kasparov Deep Blue moment. Nineteen ninety seven was the 556 00:29:36,280 --> 00:29:40,280 Speaker 2: first time the human champion got defeated, and then twenty sixteen, 557 00:29:40,800 --> 00:29:43,200 Speaker 2: so that's nineteen years later, the same thing happens with 558 00:29:43,240 --> 00:29:44,000 Speaker 2: Go. 559 00:29:43,960 --> 00:29:46,959 Speaker 1: And two hundred million people tune in, and the defeated Korean 560 00:29:47,000 --> 00:29:51,480 Speaker 1: player apologizes to humanity. It's a huge moment, but it's 561 00:29:51,520 --> 00:29:53,840 Speaker 1: nothing like the ChatGPT moment six years later. 562 00:29:54,320 --> 00:29:59,280 Speaker 2: Yeah, because Go, people watched. Whereas ChatGPT, you used it. 563 00:29:59,280 --> 00:30:01,280 Speaker 2: It was personal, it was visceral.
564 00:30:01,480 --> 00:30:04,080 Speaker 1: And within a week of you pitching Demis on the book, 565 00:30:04,320 --> 00:30:05,360 Speaker 1: ChatGPT came out. 566 00:30:05,480 --> 00:30:07,160 Speaker 2: That's right. And I went to see him right after 567 00:30:07,200 --> 00:30:11,200 Speaker 2: that, and he said, you know, this is war. Those 568 00:30:11,200 --> 00:30:14,920 Speaker 2: guys have parked their tanks in our front yard. He actually 569 00:30:14,920 --> 00:30:19,440 Speaker 2: said on our lawn, but translating for an American audience, in 570 00:30:19,440 --> 00:30:23,720 Speaker 2: our front yard. And so you could see that competitive 571 00:30:24,040 --> 00:30:26,440 Speaker 2: glint in his eye, and you knew he was going 572 00:30:26,480 --> 00:30:27,360 Speaker 2: to try and fight back. 573 00:30:27,760 --> 00:30:30,280 Speaker 1: Was he self-aware about the risk of using that 574 00:30:30,400 --> 00:30:35,080 Speaker 1: language, even for himself, given all these Manhattan Project analogies, 575 00:30:35,440 --> 00:30:35,760 Speaker 1: you know? 576 00:30:36,160 --> 00:30:40,360 Speaker 2: He's a person with many different dimensions, and he's both 577 00:30:40,440 --> 00:30:44,200 Speaker 2: capable of worrying about safety and also using military metaphors 578 00:30:44,600 --> 00:30:48,000 Speaker 2: to express this determination to crush the opposition. And I 579 00:30:48,000 --> 00:30:49,680 Speaker 2: think actually it's going to be a business school case 580 00:30:49,680 --> 00:30:54,400 Speaker 2: study, how DeepMind made the comeback. Because they 581 00:30:54,480 --> 00:30:58,400 Speaker 2: merged DeepMind, the London lab, with Google Brain, the 582 00:30:58,480 --> 00:31:02,640 Speaker 2: Mountain View Google AI lab. Normally, mergers are super difficult, 583 00:31:02,680 --> 00:31:05,680 Speaker 2: they don't work. And here was a merger you had 584 00:31:05,680 --> 00:31:08,400 Speaker 2: to do in the middle of an AI race which 585 00:31:08,400 --> 00:31:11,280 Speaker 2: had been kicked off by ChatGPT. You had eight time 586 00:31:11,360 --> 00:31:14,560 Speaker 2: zones between California and London. You had a record of 587 00:31:14,640 --> 00:31:17,960 Speaker 2: bitter rivalry between the AI scientists from Google and the 588 00:31:18,000 --> 00:31:20,600 Speaker 2: ones from DeepMind. And yet they pulled it off. 589 00:31:20,600 --> 00:31:23,240 Speaker 2: They did the merger, they blended the cultures, and within 590 00:31:23,280 --> 00:31:25,120 Speaker 2: two and a half years they had a model that 591 00:31:25,320 --> 00:31:27,120 Speaker 2: was outclassing OpenAI's models. 592 00:31:27,440 --> 00:31:30,960 Speaker 1: See, that's just extraordinary. So I remember when the ChatGPT 593 00:31:31,120 --> 00:31:34,920 Speaker 1: moment happened, and I would say, up until twenty twenty... 594 00:31:35,680 --> 00:31:37,840 Speaker 1: the beginning of twenty twenty five, people were saying Google 595 00:31:37,920 --> 00:31:41,440 Speaker 1: is down and out, Google might be over. I mean, 596 00:31:41,440 --> 00:31:43,280 Speaker 1: you knew, because you were reporting along the way, that that 597 00:31:43,320 --> 00:31:46,480 Speaker 1: probably wasn't true.
But what were the indications that you saw 598 00:31:46,600 --> 00:31:48,720 Speaker 1: that the rest of the world didn't, that convinced you 599 00:31:49,080 --> 00:31:52,640 Speaker 1: along the way that Demis and DeepMind might be 600 00:31:53,120 --> 00:31:55,760 Speaker 1: roaring back? Do you put them in first place now? 601 00:31:55,920 --> 00:31:58,640 Speaker 2: I think it's sort of a pretty close race between 602 00:31:59,080 --> 00:32:02,640 Speaker 2: the Gemini models from Demis, and then Claude, 603 00:32:02,640 --> 00:32:05,120 Speaker 2: the Anthropic model, is doing really well at the moment. People love 604 00:32:05,160 --> 00:32:08,440 Speaker 2: it for coding and so forth. So, you know, I'm 605 00:32:08,520 --> 00:32:10,960 Speaker 2: not sure. I think the race is still ongoing. 606 00:32:11,120 --> 00:32:13,640 Speaker 2: What I would say, though, is that, you know, I'm 607 00:32:13,640 --> 00:32:16,200 Speaker 2: on record as having written in the New York Times. 608 00:32:15,920 --> 00:32:18,320 Speaker 1: That OpenAI would run out of money, right. Probably run. 609 00:32:18,240 --> 00:32:20,560 Speaker 2: Out of money. I mean, they may pull it out, 610 00:32:20,600 --> 00:32:23,920 Speaker 2: but basically, in fact, since I wrote that piece, they 611 00:32:23,960 --> 00:32:27,280 Speaker 2: do seem to have focused their business quite a bit 612 00:32:27,960 --> 00:32:30,320 Speaker 2: by giving up on Sora, for example. Sora was a 613 00:32:30,320 --> 00:32:33,880 Speaker 2: classic money-losing idea. You know, it costs enormous amounts 614 00:32:33,920 --> 00:32:38,040 Speaker 2: to generate video, but people don't pay you to generate videos, 615 00:32:38,040 --> 00:32:40,920 Speaker 2: so quite rightly, they canned it. So maybe they 616 00:32:40,960 --> 00:32:45,240 Speaker 2: can cut costs enough to survive. But they have a 617 00:32:45,360 --> 00:32:48,280 Speaker 2: huge cash need and they do not have Google's deep 618 00:32:48,280 --> 00:32:49,880 Speaker 2: pockets behind them, unlike Demis. 619 00:32:50,000 --> 00:32:52,360 Speaker 1: So Demis is kind of winning. But he said to you, 620 00:32:52,400 --> 00:32:55,400 Speaker 1: it doesn't necessarily feel like that, right? He said, this 621 00:32:55,440 --> 00:32:58,239 Speaker 1: is a paradoxical moment. It should feel amazing, but it 622 00:32:58,280 --> 00:33:00,400 Speaker 1: doesn't feel how I thought it would feel. 623 00:33:00,600 --> 00:33:03,680 Speaker 2: Yeah, because early on he had this rather naive idea 624 00:33:03,720 --> 00:33:06,360 Speaker 2: that there would be one lab building AI, and so 625 00:33:06,400 --> 00:33:08,560 Speaker 2: you could take your time about releasing the models, and, 626 00:33:09,080 --> 00:33:10,880 Speaker 2: you know, if you were worried about safety, you could 627 00:33:10,920 --> 00:33:13,400 Speaker 2: just take another six months to test them.
And now 628 00:33:13,440 --> 00:33:16,080 Speaker 2: you have this race, and, you know, the Chinese have 629 00:33:16,160 --> 00:33:18,280 Speaker 2: plenty of models. And the other thing, it's not just 630 00:33:18,320 --> 00:33:20,840 Speaker 2: a race, it's actually also the open source nature of 631 00:33:20,840 --> 00:33:24,080 Speaker 2: these models, where they're being released out into the wild, 632 00:33:24,920 --> 00:33:29,080 Speaker 2: and some weird group can just download the model, have 633 00:33:29,160 --> 00:33:31,720 Speaker 2: it on their own computer, and then you can't pull 634 00:33:31,760 --> 00:33:34,240 Speaker 2: it back anymore. And so there was a big cyber 635 00:33:34,280 --> 00:33:38,360 Speaker 2: attack in Mexico recently where all of the electoral records 636 00:33:38,360 --> 00:33:42,280 Speaker 2: were stolen, and Anthropic realized that its Claude model was 637 00:33:42,320 --> 00:33:46,000 Speaker 2: being used. But because that model is proprietary, they could 638 00:33:46,000 --> 00:33:49,440 Speaker 2: immediately shut off access and stop the attack. You couldn't 639 00:33:49,440 --> 00:33:51,920 Speaker 2: do that with an open weight, open source model. And 640 00:33:52,040 --> 00:33:55,160 Speaker 2: yet we have open weight models, you know, being put 641 00:33:55,160 --> 00:33:57,880 Speaker 2: out there, both by Meta and by the Chinese and 642 00:33:57,920 --> 00:34:01,160 Speaker 2: by Mistral in France. A lot of open source models 643 00:34:01,160 --> 00:34:04,960 Speaker 2: are out there, and so in many ways the way 644 00:34:05,040 --> 00:34:09,640 Speaker 2: AI is being deployed is frightening. The obvious safety measures 645 00:34:09,680 --> 00:34:12,560 Speaker 2: one might take are not happening. In addition to banning 646 00:34:12,640 --> 00:34:15,920 Speaker 2: open source, I think there should be much more powerful 647 00:34:16,480 --> 00:34:20,200 Speaker 2: government oversight, so that, just like with a pharmaceutical, 648 00:34:20,280 --> 00:34:22,799 Speaker 2: before you release it to be used in people, it has 649 00:34:22,840 --> 00:34:25,279 Speaker 2: to go through clinical trials, so too, I think, there 650 00:34:25,320 --> 00:34:26,840 Speaker 2: should be a sort of equivalent of the Food and 651 00:34:26,880 --> 00:34:31,560 Speaker 2: Drug Administration, an AI agency that can actually veto the 652 00:34:31,600 --> 00:34:34,719 Speaker 2: release of really powerful models. And we don't have that, 653 00:34:35,000 --> 00:34:37,320 Speaker 2: and we should have that, and we should be negotiating 654 00:34:37,360 --> 00:34:39,200 Speaker 2: with China about doing it in both places at once, 655 00:34:39,680 --> 00:34:41,759 Speaker 2: because this is a global race and both sides have 656 00:34:41,800 --> 00:34:44,799 Speaker 2: to slow down. I was in China recently for eight 657 00:34:44,880 --> 00:34:49,560 Speaker 2: days, because they always publish books faster. So I was 658 00:34:49,880 --> 00:34:54,839 Speaker 2: meeting AI leaders, both from industry and from academia, and 659 00:34:54,880 --> 00:34:57,040 Speaker 2: I was surprised by how much they do talk about safety. 660 00:34:57,760 --> 00:34:59,719 Speaker 2: So I think there is a discussion to be had 661 00:34:59,760 --> 00:35:04,560 Speaker 2: with the Chinese about safety, but the US administration at 662 00:35:04,600 --> 00:35:05,960 Speaker 2: this moment doesn't want to do that.
663 00:35:06,360 --> 00:35:10,800 Speaker 1: I mean, coming back to the Manhattan Project again, Demis 664 00:35:10,840 --> 00:35:13,239 Speaker 1: has said, I think, that he thinks this may end 665 00:35:13,239 --> 00:35:15,520 Speaker 1: in a bunker. What does he mean by that? 666 00:35:15,560 --> 00:35:20,680 Speaker 1: And has he primed himself psychologically for an AI 667 00:35:21,440 --> 00:35:24,520 Speaker 1: Hiroshima that he may feel in some sense responsible for? 668 00:35:25,200 --> 00:35:28,160 Speaker 2: Yeah, I mean, when I was doing the research, interviewing 669 00:35:28,200 --> 00:35:30,680 Speaker 2: not just Demis but all the scientists that he works with, 670 00:35:30,719 --> 00:35:32,919 Speaker 2: you know, a hundred or something of them at DeepMind, 671 00:35:33,360 --> 00:35:35,839 Speaker 2: I would hear these references to the bunker come up, 672 00:35:35,880 --> 00:35:39,479 Speaker 2: and I assumed it wasn't literal, you know, a real 673 00:35:39,520 --> 00:35:42,279 Speaker 2: thing, that Demis wanted to disappear into a bunker at 674 00:35:42,280 --> 00:35:44,399 Speaker 2: the moment when he thought the AI models were becoming 675 00:35:44,440 --> 00:35:47,759 Speaker 2: dangerously powerful. And I would have these dinners every six 676 00:35:47,800 --> 00:35:51,279 Speaker 2: months with a friend who had been at DeepMind 677 00:35:51,280 --> 00:35:54,440 Speaker 2: but had left, and I tested this on him one 678 00:35:54,800 --> 00:35:57,160 Speaker 2: evening and I said, surely this bunker is just a 679 00:35:57,160 --> 00:36:01,280 Speaker 2: metaphor. He can't be serious. This guy said, well, actually, 680 00:36:01,320 --> 00:36:05,080 Speaker 2: you know, I had my bag packed. It was serious. 681 00:36:05,680 --> 00:36:09,440 Speaker 2: There was actually this vision that AI would become so 682 00:36:09,600 --> 00:36:12,040 Speaker 2: powerful that bad guys would try and get it off you. 683 00:36:12,520 --> 00:36:14,239 Speaker 2: So you had to hide in some place a bit 684 00:36:14,280 --> 00:36:18,760 Speaker 2: like Los Alamos and develop it in sort of isolation and secrecy, 685 00:36:19,520 --> 00:36:22,440 Speaker 2: and also be isolated because you needed maximum focus on 686 00:36:22,480 --> 00:36:24,759 Speaker 2: the science to get it right when you were at 687 00:36:24,760 --> 00:36:27,240 Speaker 2: this moment of maximum danger, because the model was suddenly 688 00:36:27,320 --> 00:36:30,919 Speaker 2: very powerful. And that was his vision. Now, I think 689 00:36:30,960 --> 00:36:34,120 Speaker 2: today he doesn't believe that anymore, because we're so far 690 00:36:34,200 --> 00:36:39,759 Speaker 2: from a single lab, you know, midwifing AI. So I 691 00:36:39,760 --> 00:36:43,360 Speaker 2: think now he's more inclined to speak of some version 692 00:36:43,440 --> 00:36:47,000 Speaker 2: of CERN, the European center for nuclear research, which is 693 00:36:47,040 --> 00:36:50,759 Speaker 2: a sort of technical agency that oversees nuclear research on 694 00:36:50,800 --> 00:36:53,759 Speaker 2: a multinational basis. I think he would like some sort 695 00:36:53,800 --> 00:36:58,600 Speaker 2: of global body to impose rules on what kind of 696 00:36:58,640 --> 00:37:02,759 Speaker 2: AI should be let out into the wild.
But, you know, 697 00:37:02,960 --> 00:37:05,160 Speaker 2: at the same time he knows that politically that's not 698 00:37:05,200 --> 00:37:07,960 Speaker 2: on the cards, and he has a sense of timing 699 00:37:08,840 --> 00:37:12,960 Speaker 2: about when you should raise these issues. And so, you know, 700 00:37:13,360 --> 00:37:17,320 Speaker 2: whereas Dario Amodei took on the Pentagon by trying to 701 00:37:17,360 --> 00:37:21,360 Speaker 2: assert safety principles and then just got rolled, I think Demis, 702 00:37:21,360 --> 00:37:23,960 Speaker 2: when he does that, is going to feel that 703 00:37:23,960 --> 00:37:26,640 Speaker 2: the door is half open and he can give 704 00:37:26,640 --> 00:37:28,799 Speaker 2: it a push, and we'll see. You know, of course, 705 00:37:28,800 --> 00:37:32,400 Speaker 2: sometimes people keep their powder dry for so long that 706 00:37:32,440 --> 00:37:34,800 Speaker 2: they never use it, but we'll see. If the moment 707 00:37:34,840 --> 00:37:36,600 Speaker 2: comes when he does use it, it will be very interesting. 708 00:37:37,080 --> 00:37:39,000 Speaker 1: Sebastian, just to close, there was a great review of 709 00:37:39,000 --> 00:37:42,600 Speaker 1: your book in the Financial Times which ends with this: 710 00:37:43,480 --> 00:37:48,160 Speaker 1: Whether and how Demis ever achieves AGI will form the 711 00:37:48,200 --> 00:37:53,399 Speaker 1: defining chapters of his extraordinary and unfinished biography. What did 712 00:37:53,400 --> 00:37:55,000 Speaker 1: you think about that? And what is the next chapter 713 00:37:55,120 --> 00:37:57,160 Speaker 1: for him? And will you write another follow-up book, 714 00:37:57,160 --> 00:37:57,560 Speaker 1: do you think? 715 00:37:57,680 --> 00:37:59,399 Speaker 2: You know, I tend not to write follow-ups about 716 00:37:59,440 --> 00:38:01,879 Speaker 2: the same thing, the same person. I prefer to plow 717 00:38:02,080 --> 00:38:05,320 Speaker 2: fresh ground. But look, I mean, you know, Demis is 718 00:38:05,320 --> 00:38:08,800 Speaker 2: turning fifty this year. He's got a lot of runway. 719 00:38:08,920 --> 00:38:11,280 Speaker 2: I'm sure he'll do more incredible things in the future. 720 00:38:11,880 --> 00:38:15,239 Speaker 2: So probably I am offering an interim report. But the disadvantage, 721 00:38:15,719 --> 00:38:17,200 Speaker 2: you know, if you wait: I did this before with 722 00:38:17,239 --> 00:38:20,360 Speaker 2: Alan Greenspan. I wrote the definitive biography after he retired, 723 00:38:20,960 --> 00:38:23,680 Speaker 2: and by that time, you know, people are interested, but 724 00:38:23,800 --> 00:38:25,440 Speaker 2: less so than when he was still in the seat. I 725 00:38:25,440 --> 00:38:29,960 Speaker 2: think capturing a portrait of, you know, the most interesting 726 00:38:29,960 --> 00:38:32,799 Speaker 2: figure in artificial intelligence in real time, while he's still 727 00:38:32,840 --> 00:38:35,120 Speaker 2: in the seat and he's still doing it, is sometimes 728 00:38:35,120 --> 00:38:37,520 Speaker 2: the fun of it, right? I mean, who wants to 729 00:38:37,520 --> 00:38:40,000 Speaker 2: wait for the definitive biography in twenty years' time? 730 00:38:40,480 --> 00:38:42,120 Speaker 1: Well, what's next for him? 731 00:38:41,960 --> 00:38:44,799 Speaker 2: You know, I think he's going to carry on running Google DeepMind. There's going 732 00:38:44,840 --> 00:38:47,680 Speaker 2: to be more agentic models coming out this year.
There 733 00:38:47,719 --> 00:38:52,040 Speaker 2: will be, you know, world models and more robotics coming. 734 00:38:52,480 --> 00:38:55,560 Speaker 2: There will probably be much more AI for science, both 735 00:38:55,600 --> 00:38:58,319 Speaker 2: in terms of drug discovery and in terms of, you know, 736 00:38:58,400 --> 00:39:02,239 Speaker 2: material sciences, chemistry and so forth. So I think, you know... 737 00:39:02,840 --> 00:39:05,399 Speaker 2: One day, I remember, towards the end of my time 738 00:39:05,480 --> 00:39:08,520 Speaker 2: interviewing him, he showed up at the pub and he 739 00:39:08,600 --> 00:39:10,759 Speaker 2: had a backpack and he fished something out of it, 740 00:39:11,480 --> 00:39:14,359 Speaker 2: and he got this little box out and he said, 741 00:39:14,360 --> 00:39:16,319 Speaker 2: I've got to show you this. And he opened the 742 00:39:16,320 --> 00:39:20,680 Speaker 2: box up and inside was the Nobel Prize medal. And 743 00:39:21,000 --> 00:39:23,520 Speaker 2: either at that meeting or another one, he said to me, 744 00:39:24,480 --> 00:39:26,600 Speaker 2: I wonder if I can get another one. He's not 745 00:39:26,680 --> 00:39:27,280 Speaker 2: done yet. 746 00:39:28,160 --> 00:39:29,560 Speaker 1: Sebastian Malabi, thank you. 747 00:39:29,960 --> 00:39:59,000 Speaker 2: It's been great fun to talk. 748 00:39:46,160 --> 00:39:49,320 Speaker 1: For tech Stuff, I'm Os Voloscian. This episode was produced 749 00:39:49,320 --> 00:39:52,960 Speaker 1: by Eliza Dennis and Melissa Slaughter. It was executive produced 750 00:39:52,960 --> 00:39:56,320 Speaker 1: by me, Julian Nutter, and Kate Osborne for Kaleidoscope, and 751 00:39:56,400 --> 00:40:00,400 Speaker 1: Katrina Norvell for iHeart Podcasts. The engineer is P. Bowman, 752 00:40:00,600 --> 00:40:03,839 Speaker 1: and Jack Insley mixed this episode. Kyle Murdoch wrote our 753 00:40:03,880 --> 00:40:07,080 Speaker 1: theme song. Please rate, review, and reach out to us 754 00:40:07,120 --> 00:40:10,200 Speaker 1: at tech Stuff podcast at gmail dot com. We'd also 755 00:40:10,200 --> 00:40:11,920 Speaker 1: love to hear what you think our panels should cover 756 00:40:12,000 --> 00:40:12,399 Speaker 1: next time.