1 00:00:00,320 --> 00:00:03,000 Speaker 1: Thanks for tuning into Tech Stuff. If you don't recognize 2 00:00:03,000 --> 00:00:06,120 Speaker 1: my voice, my name is Oz Woloshyn and I'm here because 3 00:00:06,160 --> 00:00:09,240 Speaker 1: the inimitable Jonathan Strickland has passed the baton to Karah 4 00:00:09,360 --> 00:00:12,680 Speaker 1: Preiss and myself to host Tech Stuff. The show will 5 00:00:12,720 --> 00:00:15,480 Speaker 1: remain your home for all things tech, and all the 6 00:00:15,480 --> 00:00:17,960 Speaker 1: old episodes will remain available in this feed. 7 00:00:18,440 --> 00:00:19,200 Speaker 2: Thanks for listening. 8 00:00:20,360 --> 00:00:23,120 Speaker 3: I was in a hotel in California and I saw 9 00:00:23,160 --> 00:00:26,120 Speaker 3: that the phone lit up, and I thought, who's calling 10 00:00:26,120 --> 00:00:27,840 Speaker 3: me at one o'clock in the morning? And then this 11 00:00:27,920 --> 00:00:30,520 Speaker 3: Swedish voice came on. Then they said I won the 12 00:00:30,560 --> 00:00:33,840 Speaker 3: Nobel Prize in Physics, and I thought, this is very odd. 13 00:00:33,880 --> 00:00:36,519 Speaker 3: I don't do physics, so that's when I thought it 14 00:00:36,600 --> 00:00:37,280 Speaker 3: might be a prank. 15 00:00:38,360 --> 00:00:42,280 Speaker 1: Geoffrey Hinton won the twenty twenty four Nobel Prize in Physics, 16 00:00:42,880 --> 00:00:46,519 Speaker 1: an honor held by Albert Einstein and Marie Curie. A 17 00:00:46,560 --> 00:00:50,000 Speaker 1: certain J. Robert Oppenheimer was shortlisted but never won. 18 00:00:50,400 --> 00:00:52,519 Speaker 3: My big hope was to win the Nobel Prize in 19 00:00:52,520 --> 00:00:55,080 Speaker 3: Physiology or Medicine for figuring out how the brain worked. 20 00:00:55,960 --> 00:00:58,120 Speaker 3: And what I didn't realize is you could fail to 21 00:00:58,120 --> 00:00:59,640 Speaker 3: figure out how the brain worked and still get a 22 00:00:59,680 --> 00:01:00,640 Speaker 3: Nobel Prize anyway. 23 00:01:02,560 --> 00:01:06,000 Speaker 1: Welcome to Tech Stuff. This is The Story, with our guest, 24 00:01:06,000 --> 00:01:09,880 Speaker 1: the Nobel laureate Geoffrey Hinton. Each week on Wednesdays, we 25 00:01:10,000 --> 00:01:12,400 Speaker 1: bring you an in-depth interview with someone who's at 26 00:01:12,400 --> 00:01:15,800 Speaker 1: the forefront of technology or who can unlock a world 27 00:01:15,840 --> 00:01:19,520 Speaker 1: where tech is at its most fascinating. His recent Nobel 28 00:01:19,560 --> 00:01:23,959 Speaker 1: Prize win was for, quote, foundational discoveries and inventions that 29 00:01:24,120 --> 00:01:29,200 Speaker 1: enable machine learning with artificial neural networks. Now, artificial neural 30 00:01:29,240 --> 00:01:33,319 Speaker 1: networks are learning models inspired by the network of neurons 31 00:01:33,360 --> 00:01:36,679 Speaker 1: present in the human brain, and Hinton's desire to figure 32 00:01:36,720 --> 00:01:39,759 Speaker 1: out the brain was a key inspiration for his pioneering 33 00:01:39,800 --> 00:01:44,399 Speaker 1: work on AI. I was particularly fascinated by Hinton because 34 00:01:44,480 --> 00:01:47,960 Speaker 1: his work went completely against the mainstream of computer science 35 00:01:48,040 --> 00:01:51,880 Speaker 1: for decades, and yet he stuck to his guns. It's 36 00:01:51,920 --> 00:01:55,800 Speaker 1: an incredible story of dedication in the face of personal loss.
37 00:01:56,720 --> 00:02:01,680 Speaker 1: Also fascinating is Hinton's relationship to his own creation. Here's 38 00:02:01,720 --> 00:02:04,120 Speaker 1: what he said at the Nobel Prize banquet. 39 00:02:04,800 --> 00:02:08,399 Speaker 3: There is also a longer term existential threat that will 40 00:02:08,400 --> 00:02:12,000 Speaker 3: arise when we create digital beings that are more intelligent 41 00:02:12,080 --> 00:02:16,040 Speaker 3: than ourselves. We have no idea whether we can stay 42 00:02:16,040 --> 00:02:20,120 Speaker 3: in control. But we now have evidence that if they 43 00:02:20,160 --> 00:02:24,280 Speaker 3: are created by companies motivated by short term profits, our 44 00:02:24,400 --> 00:02:28,799 Speaker 3: safety will not be the top priority. We urgently need 45 00:02:28,880 --> 00:02:32,239 Speaker 3: research on how to prevent these new beings from wanting 46 00:02:32,320 --> 00:02:37,240 Speaker 3: to take control. They are no longer science fiction. Thank you. 47 00:02:40,200 --> 00:02:42,560 Speaker 1: So when I got the opportunity to sit down with 48 00:02:42,639 --> 00:02:46,200 Speaker 1: Geoffrey Hinton, I wanted to know how he went from 49 00:02:46,240 --> 00:02:49,680 Speaker 1: someone who wanted to understand the relationship between the mind 50 00:02:49,760 --> 00:02:53,240 Speaker 1: and the brain to someone who paved the path for 51 00:02:53,320 --> 00:02:56,519 Speaker 1: AI as we know it. And he obliged me by 52 00:02:56,560 --> 00:03:01,240 Speaker 1: telling me about his trajectory from student to researcher, to professor, 53 00:03:01,520 --> 00:03:05,280 Speaker 1: to Google employee, to finally AI safety advocate. 54 00:03:06,280 --> 00:03:09,239 Speaker 2: Am I right in thinking, with all due respect to Steve 55 00:03:09,360 --> 00:03:11,520 Speaker 2: Jobs and Bill Gates and Mark Zuckerberg, that 56 00:03:12,160 --> 00:03:15,320 Speaker 2: you were, in a sense, the original college dropout? 57 00:03:15,360 --> 00:03:17,280 Speaker 3: Not in the sense they dropped out, because what I did 58 00:03:17,400 --> 00:03:19,600 Speaker 3: was I went to Cambridge and after a month 59 00:03:19,639 --> 00:03:21,800 Speaker 3: I dropped out, but then I went back the next year. 60 00:03:22,160 --> 00:03:25,480 Speaker 3: And then while I was doing my PhD, I dropped out, 61 00:03:26,680 --> 00:03:30,480 Speaker 3: but then I finished it. So I'm not like them. 62 00:03:30,760 --> 00:03:33,679 Speaker 4: I went back. I'm a failed dropout. 63 00:03:34,120 --> 00:03:37,400 Speaker 2: A failed dropout, that's good. What were the reasons? 64 00:03:38,640 --> 00:03:42,800 Speaker 2: Was it uncertainty or ambivalence or curiosity? 65 00:03:42,880 --> 00:03:45,800 Speaker 3: So, when I first went to Cambridge, it was the first 66 00:03:45,840 --> 00:03:49,320 Speaker 3: time I'd ever lived away from home, and I'd always... 67 00:03:49,560 --> 00:03:51,480 Speaker 3: my image of myself was that I was one of 68 00:03:51,480 --> 00:03:54,200 Speaker 3: the clever ones. And when I went to Cambridge, I 69 00:03:54,240 --> 00:03:57,840 Speaker 3: wasn't one of the clever ones. Everybody was clever, and 70 00:03:57,920 --> 00:04:00,560 Speaker 3: I found it very stressful.
I worked very, very hard, 71 00:04:00,720 --> 00:04:03,440 Speaker 3: so I was working like twelve hours a day doing science, 72 00:04:04,280 --> 00:04:07,240 Speaker 3: and the combination of working very hard to keep up 73 00:04:07,800 --> 00:04:10,760 Speaker 3: and living away from home for the first time was 74 00:04:10,920 --> 00:04:11,800 Speaker 3: just too much for me. 75 00:04:12,520 --> 00:04:14,400 Speaker 2: So there was a fantastic profile of you in the 76 00:04:14,440 --> 00:04:18,200 Speaker 2: New Yorker which said, quote, Hinton tried different fields but 77 00:04:18,320 --> 00:04:20,560 Speaker 2: was dismayed to find that he was never the brightest 78 00:04:20,560 --> 00:04:22,960 Speaker 2: student in any given class, which made me smile. But 79 00:04:23,040 --> 00:04:26,719 Speaker 2: I guess dismay and stress are quite close cousins, 80 00:04:27,040 --> 00:04:30,560 Speaker 2: and I suppose the stakes for you were high given 81 00:04:30,600 --> 00:04:32,760 Speaker 2: the family you came from. Can you speak a little 82 00:04:32,760 --> 00:04:33,159 Speaker 2: bit about that? 83 00:04:33,839 --> 00:04:36,080 Speaker 3: Yes, I had a lot of pressure from my father 84 00:04:36,160 --> 00:04:40,479 Speaker 3: to be an academic success, and my mother sort of 85 00:04:41,040 --> 00:04:44,919 Speaker 3: went along with it. So from an early age I 86 00:04:45,000 --> 00:04:48,000 Speaker 3: realized that's what I had to achieve, and that's a 87 00:04:48,000 --> 00:04:48,600 Speaker 3: lot of pressure. 88 00:04:49,040 --> 00:04:51,760 Speaker 2: How did they exert that pressure? How were you aware 89 00:04:51,760 --> 00:04:52,040 Speaker 2: of it? 90 00:04:53,360 --> 00:04:57,080 Speaker 3: My father was a slightly strange character. He grew up 91 00:04:57,120 --> 00:05:02,040 Speaker 3: in Mexico during all the revolutions without a mother, somewhat odd. 92 00:05:04,279 --> 00:05:06,480 Speaker 3: Every morning when I went to school, not every morning, 93 00:05:06,480 --> 00:05:09,320 Speaker 3: but quite often, as I left, he would say, get 94 00:05:09,320 --> 00:05:11,840 Speaker 3: in there pitching. If you work very hard, when you're 95 00:05:11,839 --> 00:05:13,839 Speaker 3: twice as old as me, you might be half as good. 96 00:05:14,800 --> 00:05:17,000 Speaker 4: Well, that's a sort of pressure. 97 00:05:18,279 --> 00:05:20,120 Speaker 2: Did you find that motivating? 98 00:05:21,200 --> 00:05:24,560 Speaker 4: I found it irritating, but I think it probably was motivating. 99 00:05:25,800 --> 00:05:27,360 Speaker 4: He very inconsiderately... 100 00:05:27,400 --> 00:05:31,479 Speaker 3: He died while I was writing my thesis, and he 101 00:05:31,600 --> 00:05:33,839 Speaker 3: never saw me being a success. 102 00:05:34,600 --> 00:05:37,320 Speaker 2: You were at Cambridge, you left briefly and you came back. 103 00:05:38,400 --> 00:05:42,039 Speaker 2: I think you settled on experimental psychology as your degree in the end. 104 00:05:42,040 --> 00:05:44,440 Speaker 4: I was doing natural sciences. 105 00:05:45,120 --> 00:05:50,279 Speaker 3: I started off doing physics, chemistry and crystalline state, because 106 00:05:50,279 --> 00:05:53,159 Speaker 3: of the success in decoding the structure of DNA, crystalline 107 00:05:53,160 --> 00:05:56,640 Speaker 3: state was a big thing, and I left after a month. 108 00:05:56,880 --> 00:06:00,480 Speaker 3: Then I reapplied to do architecture.
I've always liked architecture, 109 00:06:01,400 --> 00:06:03,480 Speaker 3: and after a day I decided I was never going 110 00:06:03,520 --> 00:06:06,479 Speaker 3: to be any good at architecture because I wasn't 111 00:06:06,560 --> 00:06:09,840 Speaker 3: artistic enough. I loved the engineering, but the artistic bit 112 00:06:09,839 --> 00:06:13,000 Speaker 3: I couldn't do very well. So I switched back to science. 113 00:06:13,440 --> 00:06:16,920 Speaker 3: But then I did physics, chemistry and physiology, and I 114 00:06:16,960 --> 00:06:20,200 Speaker 3: really liked physiology. I'd never been allowed to do biology 115 00:06:20,240 --> 00:06:20,760 Speaker 3: at school. 116 00:06:21,760 --> 00:06:22,960 Speaker 4: My father wouldn't allow it. 117 00:06:23,480 --> 00:06:26,919 Speaker 3: Why not? Ah well, he said, if you do biology, 118 00:06:26,920 --> 00:06:30,359 Speaker 3: they'll teach you genetics. And he was a Stalinist and 119 00:06:30,360 --> 00:06:34,600 Speaker 3: he didn't believe in genetics. Now, he was also a 120 00:06:34,640 --> 00:06:39,159 Speaker 3: fellow of the British Royal Society in biology who didn't 121 00:06:39,160 --> 00:06:41,000 Speaker 3: believe in genetics. 122 00:06:41,520 --> 00:06:46,680 Speaker 2: Gosh, complicated man. Yes, but I mean, he really, you're 123 00:06:46,720 --> 00:06:50,240 Speaker 2: not exaggerating, he said to you, you can't study biology? 124 00:06:50,560 --> 00:06:50,800 Speaker 4: Now, 125 00:06:50,920 --> 00:06:53,760 Speaker 3: he had other reasons which weren't so bad, which is, 126 00:06:54,120 --> 00:06:57,719 Speaker 3: you can always pick up biology when you're older. What 127 00:06:57,839 --> 00:06:59,800 Speaker 3: you can't pick up when you're older is math. And 128 00:07:00,760 --> 00:07:02,880 Speaker 3: I think that's probably true, and so that was a 129 00:07:02,960 --> 00:07:03,839 Speaker 3: more valid reason. 130 00:07:04,360 --> 00:07:07,479 Speaker 2: Yeah, yeah. So you ended up graduating from 131 00:07:07,480 --> 00:07:10,040 Speaker 2: Cambridge with a degree in psychology? 132 00:07:10,320 --> 00:07:14,880 Speaker 3: So I did physics, chemistry and physiology for a year, 133 00:07:15,840 --> 00:07:17,440 Speaker 3: and I did very well in physics. I got a 134 00:07:17,480 --> 00:07:20,800 Speaker 3: first in physics. That's obviously a good predictor of a 135 00:07:20,840 --> 00:07:21,720 Speaker 3: Nobel Prize. 136 00:07:22,400 --> 00:07:23,880 Speaker 2: As your tutors would have said, I'm sure. 137 00:07:24,960 --> 00:07:29,760 Speaker 3: Then I dropped it all and did philosophy. 138 00:07:29,800 --> 00:07:33,640 Speaker 3: I did a year of philosophy and I developed strong antibodies, 139 00:07:34,640 --> 00:07:37,600 Speaker 3: and then I switched to psychology, and so my final 140 00:07:37,640 --> 00:07:38,720 Speaker 3: degree was in psychology. 141 00:07:39,160 --> 00:07:42,520 Speaker 2: Mhm. And was there a single question or set of 142 00:07:42,640 --> 00:07:44,960 Speaker 2: questions that you were in search of? 143 00:07:45,400 --> 00:07:49,040 Speaker 3: Yes, I wanted to know how the brain worked, 144 00:07:49,040 --> 00:07:51,280 Speaker 3: and how the mind worked, and what the relationship was. 145 00:07:52,640 --> 00:07:55,000 Speaker 3: I decided fairly early on that you're never going to 146 00:07:55,040 --> 00:07:56,920 Speaker 3: understand the mind unless you understood the brain.
147 00:07:58,280 --> 00:07:59,800 Speaker 2: Was that a popular view at the time? 148 00:08:00,200 --> 00:08:01,000 Speaker 4: No, not really. 149 00:08:01,320 --> 00:08:03,920 Speaker 3: There was this kind of functionalist view, basically the 150 00:08:04,000 --> 00:08:07,280 Speaker 3: view that came from computer software, which is that the 151 00:08:07,360 --> 00:08:10,760 Speaker 3: software is totally different from the hardware, and what the 152 00:08:10,800 --> 00:08:13,440 Speaker 3: mind is all about is software, the heuristics you use 153 00:08:13,480 --> 00:08:15,600 Speaker 3: and the way you represent things. The hardware has got 154 00:08:15,640 --> 00:08:19,200 Speaker 3: nothing to do with it. That's a completely crazy view, 155 00:08:19,200 --> 00:08:22,040 Speaker 3: but it seemed very plausible at the time, and so 156 00:08:22,120 --> 00:08:25,400 Speaker 3: the computers we designed so that we could program them 157 00:08:25,720 --> 00:08:29,440 Speaker 3: had programs as a completely separate domain from hardware. 158 00:08:30,360 --> 00:08:32,080 Speaker 4: But that's not how the real brain is. 159 00:08:33,440 --> 00:08:37,719 Speaker 2: It's funny, this constant dance between us expecting our 160 00:08:37,720 --> 00:08:40,560 Speaker 2: computers to be like us, and then expecting us to 161 00:08:40,640 --> 00:08:44,320 Speaker 2: be like our computers, right? A kind of continual 162 00:08:44,400 --> 00:08:45,800 Speaker 2: dance between those two things. 163 00:08:46,320 --> 00:08:49,360 Speaker 3: Yes, well, you always try and understand things in terms 164 00:08:49,400 --> 00:08:54,280 Speaker 3: of the latest technology. So when telephones were new, the 165 00:08:54,320 --> 00:08:56,559 Speaker 3: brain was clearly a very large telephone switchboard. 166 00:08:58,800 --> 00:09:01,160 Speaker 2: But is it different this time, now that AI has 167 00:09:01,800 --> 00:09:04,720 Speaker 2: taken off and become ubiquitous? Do you think this is 168 00:09:04,840 --> 00:09:07,199 Speaker 2: indeed more than the telephone? 169 00:09:07,480 --> 00:09:11,040 Speaker 3: Yes, I think these artificial neural networks we're training are 170 00:09:11,320 --> 00:09:14,280 Speaker 3: in many respects quite like real neural networks. Obviously, the 171 00:09:14,320 --> 00:09:17,559 Speaker 3: neurons are much simpler. There's all sorts of properties that 172 00:09:17,600 --> 00:09:21,200 Speaker 3: are different in the brain, but basically they're working in 173 00:09:21,240 --> 00:09:24,240 Speaker 3: the same way. They learn things by changing connection strengths 174 00:09:24,240 --> 00:09:26,520 Speaker 3: between neurons, just like the brain does. 175 00:09:27,400 --> 00:09:29,559 Speaker 2: And when did that question for you, of wanting to 176 00:09:29,640 --> 00:09:32,840 Speaker 2: understand how the brain works, really begin? 177 00:09:33,080 --> 00:09:36,280 Speaker 3: When I was at high school, so even before I 178 00:09:36,280 --> 00:09:39,199 Speaker 3: went to Cambridge, I had a very bright friend who 179 00:09:39,240 --> 00:09:42,120 Speaker 3: was always smarter than me, called Inman Harvey, who came 180 00:09:42,160 --> 00:09:46,400 Speaker 3: into school one day and said, maybe memories in the 181 00:09:46,400 --> 00:09:48,760 Speaker 3: brain are spread out over the whole brain.
They're not 182 00:09:48,840 --> 00:09:52,960 Speaker 3: localized; they're like a hologram, because holograms, when you... So he 183 00:09:53,040 --> 00:09:55,080 Speaker 3: was trying to understand memories in the brain in terms 184 00:09:55,120 --> 00:09:57,520 Speaker 3: of this new technology of holograms. 185 00:09:57,280 --> 00:09:58,679 Speaker 2: And you were stimulated by this idea. 186 00:09:58,720 --> 00:09:59,599 Speaker 4: I was very stimulated by that. 187 00:09:59,679 --> 00:10:03,440 Speaker 3: It's a very interesting idea, and ever since then I've thought 188 00:10:03,480 --> 00:10:05,760 Speaker 3: a lot about how memories are represented in the brain. 189 00:10:06,520 --> 00:10:08,800 Speaker 3: And then that also led into, well, how does the 190 00:10:08,840 --> 00:10:09,720 Speaker 3: brain learn stuff? 191 00:10:11,559 --> 00:10:14,439 Speaker 1: Coming up, how Geoffrey Hinton ended up at Google. 192 00:10:14,920 --> 00:10:23,319 Speaker 2: Stay with us. So bear with me, but I need 193 00:10:23,360 --> 00:10:27,800 Speaker 2: to try and summarize what the Boltzmann machine is, because 194 00:10:27,840 --> 00:10:31,040 Speaker 2: the Nobel Prize Committee credited the Boltzmann machine with your win. 195 00:10:32,000 --> 00:10:35,600 Speaker 2: According to their press release, the Boltzmann machine can, quote, 196 00:10:36,000 --> 00:10:39,840 Speaker 2: learn to recognize characteristic elements in a given type of data. 197 00:10:40,640 --> 00:10:43,400 Speaker 2: The machine is trained by feeding it examples that are 198 00:10:43,520 --> 00:10:47,280 Speaker 2: very likely to arise. The Boltzmann machine can be used 199 00:10:47,280 --> 00:10:50,960 Speaker 2: to classify images or create new examples of the type 200 00:10:51,000 --> 00:10:54,199 Speaker 2: of pattern on which it was trained. Hinton has built 201 00:10:54,240 --> 00:10:58,400 Speaker 2: upon this work, helping initiate the current explosive development of 202 00:10:58,440 --> 00:11:02,840 Speaker 2: machine learning. Now, have I got the timeline right? You graduated 203 00:11:02,840 --> 00:11:05,920 Speaker 2: from Cambridge, you worked on your PhD, and in the 204 00:11:05,960 --> 00:11:08,880 Speaker 2: eighties you wound up at Carnegie Mellon, and that's where 205 00:11:08,880 --> 00:11:11,160 Speaker 2: your work on the Boltzmann machine really took off. 206 00:11:12,080 --> 00:11:14,480 Speaker 4: Uh, yes, just before I went to Carnegie Mellon. 207 00:11:14,679 --> 00:11:16,760 Speaker 2: Now am I right in thinking, at the time, there was 208 00:11:16,800 --> 00:11:20,840 Speaker 2: a kind of debate that you were on the unpopular 209 00:11:20,960 --> 00:11:23,280 Speaker 2: side of, about artificial intelligence? 210 00:11:23,640 --> 00:11:25,839 Speaker 3: Okay, so there's two things to say about that. From 211 00:11:25,920 --> 00:11:29,839 Speaker 3: the earliest days of AI in the nineteen fifties, there 212 00:11:29,840 --> 00:11:32,640 Speaker 3: were these two approaches, two kinds of paradigms for how 213 00:11:32,679 --> 00:11:36,160 Speaker 3: you build an intelligent system. One was inspired by the brain, 214 00:11:36,200 --> 00:11:40,000 Speaker 3: that was neural networks. Then the other paradigm was no, no, no, no. 215 00:11:40,840 --> 00:11:44,319 Speaker 3: Logic is the paradigm for intelligence. Intelligence is all about reasoning.
216 00:11:45,080 --> 00:11:47,439 Speaker 3: Learning comes later, once we've figured out how reasoning works, 217 00:11:47,760 --> 00:11:50,640 Speaker 3: and reasoning depends on having the right representations for things. 218 00:11:51,160 --> 00:11:54,520 Speaker 3: So we have to figure out what kind of logically 219 00:11:54,600 --> 00:11:58,040 Speaker 3: unambiguous language the brain is using so that it can 220 00:11:58,240 --> 00:12:01,720 Speaker 3: use rules to do reasoning. It's a completely different approach, 221 00:12:02,080 --> 00:12:05,200 Speaker 3: and it's very unbiological, because reasoning is something that comes very late, 222 00:12:06,360 --> 00:12:09,920 Speaker 3: and actually for most of the second half of the 223 00:12:09,960 --> 00:12:13,600 Speaker 3: last century, neural networks weren't seen as AI. AI was 224 00:12:13,640 --> 00:12:16,240 Speaker 3: believing you have symbolic rules in your head and then 225 00:12:16,320 --> 00:12:20,240 Speaker 3: manipulating them using rules, using discrete symbolic rules. 226 00:12:20,080 --> 00:12:21,720 Speaker 2: And what were neural networks seen as? 227 00:12:21,720 --> 00:12:23,160 Speaker 2: Statistics, or physics? 228 00:12:23,200 --> 00:12:26,040 Speaker 3: They were seen as this crazy idea that you could 229 00:12:26,040 --> 00:12:28,480 Speaker 3: take this random neural network and just learn everything, which 230 00:12:28,520 --> 00:12:29,440 Speaker 3: was obviously hopeless. 231 00:12:29,440 --> 00:12:30,680 Speaker 2: And what gave you this conviction? 232 00:12:32,040 --> 00:12:34,840 Speaker 3: Well, the brain has to learn somehow, and of course 233 00:12:34,840 --> 00:12:38,200 Speaker 3: there's a lot of innate structure wired in, which 234 00:12:38,200 --> 00:12:40,160 Speaker 3: explains why a goat can sort of drop out of 235 00:12:40,160 --> 00:12:44,440 Speaker 3: the womb and five minutes later it's walking. But 236 00:12:44,480 --> 00:12:47,360 Speaker 3: stuff like learning to read, that's all learned. That's not innate, 237 00:12:48,280 --> 00:12:49,760 Speaker 3: and we have to figure out how that works, and 238 00:12:49,800 --> 00:12:51,080 Speaker 3: it's not symbolic rules. 239 00:12:51,480 --> 00:12:55,560 Speaker 2: This conviction that you have now, did you always have it? 240 00:12:56,720 --> 00:12:57,040 Speaker 4: Yes. 241 00:13:00,000 --> 00:13:02,360 Speaker 3: Sorry to say, it seemed to me it was obviously right. 242 00:13:03,120 --> 00:13:04,760 Speaker 3: I think part of that is I was sent to 243 00:13:04,760 --> 00:13:08,000 Speaker 3: a private school when I was seven from an atheist family, 244 00:13:09,240 --> 00:13:13,720 Speaker 3: and they started telling me about religion and all this rubbish, 245 00:13:14,400 --> 00:13:18,760 Speaker 3: all the stuff that was obviously nonsense, and I had 246 00:13:18,760 --> 00:13:20,600 Speaker 3: the experience of being the only kid at school who 247 00:13:20,600 --> 00:13:23,040 Speaker 3: thought this was all nonsense and turning out to be right. 248 00:13:24,280 --> 00:13:25,880 Speaker 4: And that's very good training for a scientist.
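To make the Nobel committee's description of the Boltzmann machine quoted above a little more concrete, here is a minimal sketch of a small restricted Boltzmann machine trained with one-step contrastive divergence. It illustrates the idea Hinton keeps returning to, learning by changing connection strengths, rather than reproducing his actual experiments; the layer sizes, learning rate, and toy patterns below are arbitrary choices for illustration.

```python
import numpy as np

# A toy restricted Boltzmann machine trained with one-step contrastive
# divergence (CD-1): visible units hold the data, hidden units detect
# "characteristic elements", and learning adjusts the connection
# strengths W. All sizes and hyperparameters here are arbitrary.
rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

n_visible, n_hidden, lr = 6, 3, 0.1
W = rng.normal(0.0, 0.01, size=(n_visible, n_hidden))
b_v = np.zeros(n_visible)  # visible biases
b_h = np.zeros(n_hidden)   # hidden biases

def cd1_step(v0):
    """One CD-1 weight update on a batch of binary training vectors."""
    global W, b_v, b_h
    # Positive phase: clamp the data and sample the hidden units.
    p_h0 = sigmoid(v0 @ W + b_h)
    h0 = (rng.random(p_h0.shape) < p_h0).astype(float)
    # Negative phase: reconstruct the visibles, then recompute the hiddens.
    p_v1 = sigmoid(h0 @ W.T + b_v)
    v1 = (rng.random(p_v1.shape) < p_v1).astype(float)
    p_h1 = sigmoid(v1 @ W + b_h)
    # Nudge connection strengths toward the data statistics and away
    # from the model's own reconstructions.
    n = len(v0)
    W += lr * (v0.T @ p_h0 - v1.T @ p_h1) / n
    b_v += lr * (v0 - v1).mean(axis=0)
    b_h += lr * (p_h0 - p_h1).mean(axis=0)

# "Trained by feeding it examples": here, two made-up 6-bit patterns.
data = np.array([[1, 1, 1, 0, 0, 0],
                 [0, 0, 0, 1, 1, 1]] * 10, dtype=float)
for _ in range(1000):
    cd1_step(data)
```

After training, the machine assigns high probability to the patterns it was fed and can regenerate them from noise, which is the "create new examples of the type of pattern on which it was trained" behavior the press release describes.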
249 00:13:26,679 --> 00:13:29,000 Speaker 2: I brought it up earlier, but there was a wonderful 250 00:13:29,000 --> 00:13:31,880 Speaker 2: profile of you in the New Yorker titled Why the 251 00:13:31,920 --> 00:13:35,760 Speaker 2: Godfather of A.I. Fears What He's Built, and there was 252 00:13:35,760 --> 00:13:39,439 Speaker 2: a quote that I found quite stunning where you said, 253 00:13:39,880 --> 00:13:42,280 Speaker 2: I was dead in the water at forty six. 254 00:13:43,320 --> 00:13:50,160 Speaker 3: Yes. My wife and I adopted two children, two babies, 255 00:13:52,440 --> 00:13:58,120 Speaker 3: and then my wife got ovarian cancer. But she also, 256 00:13:59,040 --> 00:14:01,080 Speaker 3: even though she was a scientist, she started believing in 257 00:14:01,120 --> 00:14:05,040 Speaker 3: homeopathy, and she tried treating her ovarian cancer with homeopathy, 258 00:14:05,720 --> 00:14:08,400 Speaker 3: and so she died, and 259 00:14:09,760 --> 00:14:10,520 Speaker 4: I was left with 260 00:14:10,520 --> 00:14:12,480 Speaker 3: two young children, one of three and one of five, 261 00:14:15,320 --> 00:14:22,600 Speaker 3: who were both very upset, as was I, and I 262 00:14:22,760 --> 00:14:25,840 Speaker 3: began to appreciate what life is like for female academics, 263 00:14:27,240 --> 00:14:28,240 Speaker 3: which is impossible. 264 00:14:28,840 --> 00:14:29,840 Speaker 4: You can't. 265 00:14:30,400 --> 00:14:35,320 Speaker 3: Looking after small children makes it very, very hard to 266 00:14:35,400 --> 00:14:39,040 Speaker 3: spend long periods of time thinking about your current idea. 267 00:14:40,520 --> 00:14:41,680 Speaker 3: It's just very difficult. 268 00:14:42,600 --> 00:14:46,320 Speaker 2: So you were bereaved, you were looking after two small children, 269 00:14:46,440 --> 00:14:49,320 Speaker 2: and then, yes, that moment in twenty twelve when you 270 00:14:49,360 --> 00:14:55,040 Speaker 2: published a paper called ImageNet Classification with Deep Convolutional Neural Networks, 271 00:14:55,120 --> 00:14:58,680 Speaker 2: which to the layman doesn't sound like something that would 272 00:14:58,760 --> 00:15:02,520 Speaker 2: change the world and how we live, but it did. 273 00:15:02,560 --> 00:15:06,400 Speaker 3: Well, it changed the views of people in computer vision and 274 00:15:06,440 --> 00:15:08,760 Speaker 3: the views of people in other areas of computer science. 275 00:15:09,440 --> 00:15:13,800 Speaker 3: It basically showed neural networks actually do work. Now, people 276 00:15:13,840 --> 00:15:15,920 Speaker 3: had shown that before, but it hadn't convinced people the 277 00:15:16,000 --> 00:15:16,440 Speaker 3: same way. 278 00:15:16,760 --> 00:15:19,680 Speaker 2: So, you published that paper in twenty twelve with Ilya 279 00:15:19,760 --> 00:15:21,920 Speaker 2: Sutskever, who went on to be a very important 280 00:15:21,960 --> 00:15:23,840 Speaker 2: figure at OpenAI, and I want to talk more 281 00:15:23,840 --> 00:15:27,080 Speaker 2: about him. But within a few days of the publication 282 00:15:27,120 --> 00:15:30,600 Speaker 2: of that paper, you had an offer of millions and 283 00:15:30,600 --> 00:15:33,720 Speaker 2: millions of dollars to move to China. 284 00:15:33,960 --> 00:15:37,000 Speaker 3: Ah, yeah, either to move to China or to let 285 00:15:37,000 --> 00:15:38,720 Speaker 3: them invest in our group.
286 00:15:39,280 --> 00:15:41,200 Speaker 4: I think it was a bit longer than a few days, 287 00:15:41,200 --> 00:15:42,840 Speaker 4: but it was that fall, for sure. 288 00:15:43,080 --> 00:15:45,280 Speaker 2: Did you kind of know, okay, we're going to publish 289 00:15:45,520 --> 00:15:49,680 Speaker 2: and this is going to change everything, or were you 290 00:15:49,720 --> 00:15:51,040 Speaker 2: surprised by the response? 291 00:15:51,520 --> 00:15:53,400 Speaker 3: We thought it would have a big effect. We didn't 292 00:15:53,400 --> 00:15:54,760 Speaker 3: realize quite how big. 293 00:15:55,600 --> 00:15:59,120 Speaker 2: And when you got the call from Baidu, the Chinese tech company, 294 00:16:00,040 --> 00:16:02,080 Speaker 2: what did you think? I mean, it must have been tempting. 295 00:16:02,760 --> 00:16:07,800 Speaker 3: Yes, I think they said they could fund our group, 296 00:16:07,920 --> 00:16:10,000 Speaker 3: or we could go and work for them, 297 00:16:10,600 --> 00:16:14,200 Speaker 3: or various possible alternatives. And in the end I just 298 00:16:14,240 --> 00:16:16,160 Speaker 3: asked them, well, how much money are we talking about? 299 00:16:17,000 --> 00:16:20,680 Speaker 3: Are you talking about like ten million dollars? And they said yes. 300 00:16:21,120 --> 00:16:22,160 Speaker 2: You lowballed yourself. 301 00:16:23,440 --> 00:16:24,600 Speaker 4: I didn't think it at the time. 302 00:16:25,720 --> 00:16:27,240 Speaker 3: I thought of the biggest number I could think of, 303 00:16:27,280 --> 00:16:32,240 Speaker 3: which was five million dollars, and doubled it. So once 304 00:16:32,240 --> 00:16:34,520 Speaker 3: they said that, I realized we had no idea how 305 00:16:34,600 --> 00:16:37,760 Speaker 3: much we were worth. And so at that point we 306 00:16:37,840 --> 00:16:39,320 Speaker 3: decided to set up an auction. 307 00:16:39,720 --> 00:16:41,720 Speaker 2: What does it mean to set up an auction for 308 00:16:41,800 --> 00:16:43,200 Speaker 2: an academic paper? 309 00:16:43,480 --> 00:16:47,560 Speaker 3: The three of us founded a company. The company that 310 00:16:47,640 --> 00:16:50,680 Speaker 3: belonged to the three of us owned these six patent applications, 311 00:16:51,800 --> 00:16:54,360 Speaker 3: so then we had something to sell, but mainly we 312 00:16:54,360 --> 00:16:57,560 Speaker 3: were selling ourselves. But I insisted that I still be 313 00:16:57,600 --> 00:17:00,520 Speaker 3: an academic so I could continue to advise my current students. 314 00:17:01,400 --> 00:17:04,120 Speaker 3: That was a big problem because Google had never done 315 00:17:04,160 --> 00:17:06,400 Speaker 3: that before. They were one of the companies. 316 00:17:06,440 --> 00:17:08,760 Speaker 2: That bid for you. So Baidu, Google... 317 00:17:08,840 --> 00:17:12,560 Speaker 3: Microsoft and DeepMind, which wasn't owned by Google then. 318 00:17:13,000 --> 00:17:16,359 Speaker 2: And not only did you invent this, well, together with 319 00:17:16,400 --> 00:17:19,359 Speaker 2: your colleagues, this advancement of neural nets, but you also 320 00:17:19,400 --> 00:17:24,320 Speaker 2: invented your own process to sell this company, essentially, right? 321 00:17:24,720 --> 00:17:30,119 Speaker 3: Yeah. We decided we'd just have an auction by Gmail, 322 00:17:31,000 --> 00:17:34,480 Speaker 3: and you'd send me a Gmail with your bid, and the 323 00:17:35,080 --> 00:17:37,440 Speaker 3: time of the bid would be the timestamp on the Gmail.
324 00:17:38,560 --> 00:17:40,200 Speaker 3: I would then immediately send the bid to all the 325 00:17:40,240 --> 00:17:44,400 Speaker 3: other bidders, and you had to raise by a million dollars, 326 00:17:44,720 --> 00:17:47,480 Speaker 3: and if there was no bid within an hour of 327 00:17:47,520 --> 00:17:48,960 Speaker 3: the last bid, that was the end of the auction. 328 00:17:50,320 --> 00:17:52,320 Speaker 3: I was amazed to see that the bids would come 329 00:17:52,320 --> 00:17:55,560 Speaker 3: in fifty nine minutes after the previous bid. I'd be 330 00:17:55,560 --> 00:17:58,440 Speaker 3: sitting there thinking, okay, it's over, and then fifty nine 331 00:17:58,440 --> 00:17:59,720 Speaker 3: minutes later a bid would come in. 332 00:18:00,880 --> 00:18:02,200 Speaker 2: And everybody trusted Gmail. 333 00:18:02,880 --> 00:18:05,800 Speaker 3: I had worked at Google over the summer, and that 334 00:18:05,920 --> 00:18:08,199 Speaker 3: meant two things. One, I did trust that they wouldn't 335 00:18:08,200 --> 00:18:11,320 Speaker 3: read my Gmail. I knew they were very serious about that. 336 00:18:12,280 --> 00:18:15,960 Speaker 3: And also I really wanted Google to win the competition, 337 00:18:16,000 --> 00:18:18,840 Speaker 3: because I really like Jeff Dean, who ran the Brain 338 00:18:18,880 --> 00:18:22,640 Speaker 3: team at Google. So I wanted Google to win, and 339 00:18:23,359 --> 00:18:26,240 Speaker 3: in the end we slightly fudged it. So after DeepMind 340 00:18:26,280 --> 00:18:29,119 Speaker 3: and Microsoft had dropped out, it was between Google and 341 00:18:29,119 --> 00:18:32,800 Speaker 3: Baidu, and we were scared Baidu would win, 342 00:18:33,520 --> 00:18:35,159 Speaker 3: and I didn't want to go to China. 343 00:18:36,960 --> 00:18:39,240 Speaker 4: I couldn't travel at that point, well, so I felt 344 00:18:39,240 --> 00:18:39,960 Speaker 4: I wouldn't 345 00:18:39,680 --> 00:18:43,040 Speaker 3: really understand what was going on in a Chinese company. 346 00:18:43,160 --> 00:18:46,919 Speaker 3: So it got to forty four million, and we just 347 00:18:47,000 --> 00:18:50,120 Speaker 3: said, we've got an offer we can't refuse, and it's 348 00:18:50,160 --> 00:18:52,520 Speaker 3: the end of the auction. And the offer we couldn't 349 00:18:52,560 --> 00:18:54,320 Speaker 3: refuse was to work with Jeff Dean. 350 00:18:54,960 --> 00:18:58,240 Speaker 2: So you got what you wanted, in the sense of... 351 00:18:57,920 --> 00:19:00,200 Speaker 3: We got more money than we could imagine, and we 352 00:19:00,280 --> 00:19:03,040 Speaker 3: got to work at the company that I wanted most 353 00:19:03,080 --> 00:19:03,600 Speaker 3: to work with. 354 00:19:03,440 --> 00:19:10,119 Speaker 1: When we come back: Nobel laureate Geoffrey Hinton on 355 00:19:10,200 --> 00:19:27,520 Speaker 1: why he advocates for AI safety. In twenty eighteen, you 356 00:19:27,640 --> 00:19:29,639 Speaker 1: received the Turing Award, which is kind of like the 357 00:19:29,680 --> 00:19:33,320 Speaker 1: Nobel Prize for computer science. But also in twenty eighteen, 358 00:19:33,800 --> 00:19:36,399 Speaker 1: you were widowed for a second time. You lost your 359 00:19:36,440 --> 00:19:40,000 Speaker 1: wife Rosalind in nineteen ninety four, and then twenty four 360 00:19:40,040 --> 00:19:44,200 Speaker 1: years later your wife Jackie passed away, also from cancer. 361 00:19:45,119 --> 00:19:52,560 Speaker 3: Yes, that was... that was difficult.
I didn't have... my 362 00:19:52,640 --> 00:19:54,840 Speaker 3: children were much older then, so I didn't have the 363 00:19:54,840 --> 00:19:57,520 Speaker 3: problem of having to cope with young children at the 364 00:19:57,520 --> 00:20:01,720 Speaker 3: same time as everything else. And Google was very understanding. 365 00:20:02,560 --> 00:20:04,160 Speaker 3: Part of my deal had been that I would spend 366 00:20:04,200 --> 00:20:09,720 Speaker 3: three months a year in Silicon Valley, and they let 367 00:20:09,760 --> 00:20:11,000 Speaker 3: me out of that and said I could spend my 368 00:20:11,040 --> 00:20:14,800 Speaker 3: whole time in Toronto, and they helped me set up 369 00:20:14,800 --> 00:20:19,000 Speaker 3: a small lab doing basic research in Toronto, so that 370 00:20:19,119 --> 00:20:23,359 Speaker 3: was much less stressful, so I could be with my wife. 371 00:20:23,640 --> 00:20:26,479 Speaker 4: She had cancer as well; she got pancreatic cancer. 372 00:20:29,359 --> 00:20:32,520 Speaker 2: One of the most striking things, again in the New 373 00:20:32,560 --> 00:20:38,760 Speaker 2: Yorker piece, was the way you talked about observing the 374 00:20:38,800 --> 00:20:45,760 Speaker 2: way in which Rosalind and Jackie approached their cancer as 375 00:20:45,800 --> 00:20:50,760 Speaker 2: a mental model for how to think about the implications 376 00:20:50,760 --> 00:20:52,040 Speaker 2: of artificial intelligence. 377 00:20:52,680 --> 00:20:55,359 Speaker 4: Yeah, that's a rather sort of dark scenario. 378 00:20:57,440 --> 00:20:59,840 Speaker 3: So occasionally when I think, well, this stuff probably will 379 00:20:59,840 --> 00:21:05,679 Speaker 3: take over, which I sometimes think, then there's the issue 380 00:21:05,720 --> 00:21:10,800 Speaker 3: of should you go into denial and say no, no, no, no, 381 00:21:10,840 --> 00:21:15,080 Speaker 3: this can't possibly happen, which is what my first wife did. Actually, 382 00:21:15,119 --> 00:21:17,040 Speaker 3: she wasn't my first wife; I was married a long 383 00:21:17,040 --> 00:21:25,680 Speaker 3: time before that, just briefly. And so Ros went into 384 00:21:25,800 --> 00:21:31,040 Speaker 3: denial and Jackie was very realistic about it. And maybe 385 00:21:31,080 --> 00:21:34,440 Speaker 3: we should be very realistic about the possibility machines will 386 00:21:34,480 --> 00:21:37,040 Speaker 3: take over. We should do our best to make sure 387 00:21:37,080 --> 00:21:40,680 Speaker 3: that doesn't happen, but we should also think about, if 388 00:21:40,720 --> 00:21:43,440 Speaker 3: that does happen, whether 389 00:21:43,480 --> 00:21:46,280 Speaker 3: there's ways of making it not so bad, whether people 390 00:21:46,280 --> 00:21:48,639 Speaker 3: could still be around even if the machines were in control, 391 00:21:48,680 --> 00:21:49,240 Speaker 3: for example. 392 00:21:49,640 --> 00:21:51,760 Speaker 2: Yeah, I mean, there's something... I mean, you used the 393 00:21:51,800 --> 00:21:56,800 Speaker 2: word dark, but you've seen your life's work in many 394 00:21:56,800 --> 00:22:02,080 Speaker 2: ways kind of come to fruition, haunted by these thoughts of how 395 00:22:02,119 --> 00:22:06,680 Speaker 2: to deal with something as awful as a terminal cancer diagnosis. 396 00:22:07,200 --> 00:22:09,919 Speaker 4: Yeah, you have to be careful what you wish for.
397 00:22:11,440 --> 00:22:17,040 Speaker 2: Yeah. I mean, I mentioned Oppenheimer at the beginning in 398 00:22:17,160 --> 00:22:20,040 Speaker 2: terms of a Nobel. Well, he was almost a physics laureate. 399 00:22:20,080 --> 00:22:20,800 Speaker 2: He wasn't in the end. 400 00:22:22,000 --> 00:22:24,800 Speaker 3: Yes, it's rather absurd, isn't it, that I've got a 401 00:22:24,880 --> 00:22:26,800 Speaker 3: Nobel Prize in physics and Oppenheimer didn't. 402 00:22:27,240 --> 00:22:29,080 Speaker 4: That's utterly ridiculous. 403 00:22:30,119 --> 00:22:35,000 Speaker 3: I should say something. People, particularly journalists, like to say, well, 404 00:22:35,000 --> 00:22:36,680 Speaker 3: how would you compare yourself with Oppenheimer? 405 00:22:37,480 --> 00:22:39,199 Speaker 4: And there's two big differences. 406 00:22:40,520 --> 00:22:44,960 Speaker 3: One is that Oppenheimer really was crucial to the development 407 00:22:45,000 --> 00:22:50,360 Speaker 3: of the atomic bomb. He managed the science of it. 408 00:22:50,840 --> 00:22:54,040 Speaker 3: He was a single, extremely important figure. With the development 409 00:22:54,080 --> 00:22:57,840 Speaker 3: of AI, there's a bunch of us, and if I 410 00:22:57,880 --> 00:23:02,240 Speaker 3: hadn't been around, all this stuff would have happened. That's one difference. 411 00:23:02,440 --> 00:23:06,240 Speaker 3: The other difference is that atomic bombs aren't good for 412 00:23:06,320 --> 00:23:07,040 Speaker 3: anything good. 413 00:23:07,480 --> 00:23:08,560 Speaker 4: They're just for destruction. 414 00:23:10,080 --> 00:23:12,719 Speaker 3: They actually did try using them for fracking in Colorado, 415 00:23:12,960 --> 00:23:14,159 Speaker 3: but that didn't work out too 416 00:23:14,000 --> 00:23:16,680 Speaker 4: well, and you can't go there anymore. 417 00:23:16,920 --> 00:23:19,200 Speaker 3: The big difference is most of the uses of AI 418 00:23:19,280 --> 00:23:22,920 Speaker 3: are very good. It can lead to huge increases in productivity, 419 00:23:23,440 --> 00:23:28,679 Speaker 3: huge improvements in healthcare, might help a lot with climate change. 420 00:23:29,520 --> 00:23:34,040 Speaker 3: So AI is going to be developed because of all 421 00:23:34,080 --> 00:23:38,560 Speaker 3: those huge beneficial uses. And that's very different from atomic bombs, 422 00:23:38,560 --> 00:23:41,480 Speaker 3: where there was a possibility of not developing the H bomb. 423 00:23:42,680 --> 00:23:46,600 Speaker 2: And why have you taken it upon yourself as your responsibility? 424 00:23:46,600 --> 00:23:49,920 Speaker 2: You quit Google in twenty twenty three and since 425 00:23:49,960 --> 00:23:54,720 Speaker 2: then have become one of the most vocal and qualified 426 00:23:54,760 --> 00:23:56,600 Speaker 2: people in the world warning of these risks. 427 00:23:57,400 --> 00:23:58,800 Speaker 4: Well, I'm old. 428 00:24:00,440 --> 00:24:05,639 Speaker 3: I'm too old to do original research anymore, but people 429 00:24:05,680 --> 00:24:08,520 Speaker 3: listen to me, and I really believe these risks are 430 00:24:08,600 --> 00:24:12,000 Speaker 3: very real and very important, so I don't really have 431 00:24:12,080 --> 00:24:16,000 Speaker 3: much choice. We are going to develop AI because it's 432 00:24:16,000 --> 00:24:16,360 Speaker 3: got so 433 00:24:16,320 --> 00:24:17,240 Speaker 4: many good uses.
434 00:24:18,200 --> 00:24:20,240 Speaker 3: So I'm not warning against developing it, and I'm not 435 00:24:20,280 --> 00:24:25,000 Speaker 3: saying slow down. What I'm saying is, try and develop 436 00:24:25,040 --> 00:24:27,120 Speaker 3: it as safely as you can. Try and figure out, 437 00:24:27,320 --> 00:24:31,280 Speaker 3: in particular, how you can stop it eventually taking over, 438 00:24:32,920 --> 00:24:35,359 Speaker 3: but also think about all the other shorter term risks, 439 00:24:35,920 --> 00:24:39,960 Speaker 3: like fake videos corrupting elections, and loss of jobs. I'm 440 00:24:39,960 --> 00:24:42,320 Speaker 3: saying we need to worry about all those things, and 441 00:24:42,400 --> 00:24:46,719 Speaker 3: it might be rational to just stop developing it, but 442 00:24:46,800 --> 00:24:48,879 Speaker 3: I'm not... I don't think there's any hope of that, 443 00:24:48,920 --> 00:24:49,920 Speaker 3: so I'm not advocating that. 444 00:24:51,240 --> 00:24:55,080 Speaker 2: You said in December there was a ten to twenty 445 00:24:55,119 --> 00:24:58,800 Speaker 2: percent risk that AI would cause human extinction in the 446 00:24:58,840 --> 00:25:03,479 Speaker 2: next thirty years. How did you come up with those odds? 447 00:25:04,080 --> 00:25:05,000 Speaker 4: I just make them up. 448 00:25:05,280 --> 00:25:11,919 Speaker 3: That's... I'm joking. No, if you think about subjective probabilities, 449 00:25:12,440 --> 00:25:17,200 Speaker 3: they're based on intuition. I have a very strong intuition 450 00:25:17,960 --> 00:25:21,760 Speaker 3: that the chance of super intelligent machines taking over from 451 00:25:21,800 --> 00:25:25,560 Speaker 3: people is more than one percent, and I have a 452 00:25:25,640 --> 00:25:28,040 Speaker 3: very strong intuition that it's less than ninety nine percent. 453 00:25:29,040 --> 00:25:33,440 Speaker 3: We're dealing with something extremely unknown, so your best bet 454 00:25:33,440 --> 00:25:36,359 Speaker 3: for things totally unknown is maybe fifty percent, but that 455 00:25:36,400 --> 00:25:38,280 Speaker 3: doesn't work very well, because it depends on how you 456 00:25:38,320 --> 00:25:39,440 Speaker 3: partition things. 457 00:25:40,400 --> 00:25:42,199 Speaker 4: So clearly the chance is 458 00:25:42,240 --> 00:25:44,000 Speaker 3: much bigger than one percent and much less than ninety 459 00:25:44,080 --> 00:25:46,359 Speaker 3: nine percent, and maybe I should just stick at that. 460 00:25:47,480 --> 00:25:50,000 Speaker 3: But I'm hoping that we can figure out a way 461 00:25:50,320 --> 00:25:52,840 Speaker 3: that people can stay in control, because we're people and 462 00:25:52,840 --> 00:25:54,000 Speaker 3: what we care about is people. 463 00:25:54,160 --> 00:25:56,119 Speaker 2: Now, do you view it as inevitable that if we, 464 00:25:56,240 --> 00:26:00,480 Speaker 2: quote unquote, lose control, our destruction is 465 00:26:00,520 --> 00:26:02,280 Speaker 2: the next step? 466 00:26:02,240 --> 00:26:04,040 Speaker 4: No, they might not. Elon Musk, for example... 467 00:26:05,080 --> 00:26:08,119 Speaker 3: I talked to him, and he pushed the line that 468 00:26:08,720 --> 00:26:11,080 Speaker 3: they'll keep us around as pets because we're quite interesting. 469 00:26:12,080 --> 00:26:12,480 Speaker 2: Mhm. 470 00:26:13,640 --> 00:26:16,040 Speaker 4: It seems a rather sort of thin thread to hang 471 00:26:16,160 --> 00:26:17,160 Speaker 4: human existence by.
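Hinton's aside that a fifty percent "know nothing" estimate "depends on how you partition things" can be seen in a toy calculation. The outcome labels below are invented purely for illustration:

```python
# A uniform "know nothing" prior gives a different answer depending on how
# you carve up the outcomes. These labels are made up to illustrate the
# partitioning point, nothing more.
two_way = {"takeover": 1 / 2, "no takeover": 1 / 2}
three_way = {"takeover": 1 / 3,
             "superintelligence stays subservient": 1 / 3,
             "no superintelligence": 1 / 3}

# Same total ignorance, different probability for the same event:
print(two_way["takeover"])    # 0.5
print(three_way["takeover"])  # 0.333...
```

This is why Hinton retreats to the wide interval, more than one percent and less than ninety nine, rather than defending any single number.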
472 00:26:18,800 --> 00:26:21,399 Speaker 2: So, I mean, you've been vocal about the kind of 473 00:26:22,240 --> 00:26:29,360 Speaker 2: overall societal threat, but you've also been specifically critical of 474 00:26:29,920 --> 00:26:31,040 Speaker 2: Sam Altman in particular. 475 00:26:31,920 --> 00:26:35,199 Speaker 3: Yes, because I think OpenAI was set up to 476 00:26:35,280 --> 00:26:41,200 Speaker 3: develop AGI safely, and it's just been progressively moving 477 00:26:41,240 --> 00:26:47,800 Speaker 3: away from that towards developing it for profit, and so 478 00:26:48,160 --> 00:26:52,600 Speaker 3: its best safety researchers have left, and it's now trying 479 00:26:52,600 --> 00:26:55,080 Speaker 3: to convert itself from a not for profit company into 480 00:26:55,119 --> 00:26:58,919 Speaker 3: a for profit company. And that seems entirely wrong to me. 481 00:26:59,480 --> 00:27:03,479 Speaker 2: And your colleague and student Ilya Sutskever, who 482 00:27:03,600 --> 00:27:07,160 Speaker 2: worked on the twenty twelve paper with you, took a big 483 00:27:07,320 --> 00:27:09,720 Speaker 2: stand on this, which ultimately didn't break his way. 484 00:27:10,080 --> 00:27:13,280 Speaker 3: It didn't break his way in terms of Sam being 485 00:27:13,280 --> 00:27:16,600 Speaker 3: fired as the head of the company. It did break 486 00:27:16,640 --> 00:27:20,359 Speaker 3: his way in terms of people understanding that OpenAI 487 00:27:20,760 --> 00:27:23,840 Speaker 3: was going back on its pledge to develop AGI safely, 488 00:27:24,680 --> 00:27:26,200 Speaker 3: and Ilya has now set up a company 489 00:27:26,760 --> 00:27:27,800 Speaker 4: that's trying to do that. 490 00:27:28,840 --> 00:27:31,960 Speaker 2: I mean, if Sam Altman called you tomorrow, or flew to Toronto 491 00:27:32,240 --> 00:27:35,000 Speaker 2: and you were sitting with him, and he said, you know, 492 00:27:35,720 --> 00:27:37,400 Speaker 2: where have I gone wrong and what should I do, 493 00:27:39,280 --> 00:27:39,840 Speaker 2: what do you say? 494 00:27:41,600 --> 00:27:43,560 Speaker 4: I'd say, you're not Sam Altman. 495 00:27:46,560 --> 00:27:47,119 Speaker 2: Fair enough. 496 00:27:47,320 --> 00:27:47,639 Speaker 4: Okay. 497 00:27:47,760 --> 00:27:51,080 Speaker 3: So suppose something very surprising happens and Sam Altman suddenly 498 00:27:51,080 --> 00:27:53,840 Speaker 3: has an epiphany and says, oh my god, we shouldn't 499 00:27:53,880 --> 00:27:55,480 Speaker 3: be doing this for a profit, we should be doing 500 00:27:55,480 --> 00:27:58,639 Speaker 3: it to protect humanity. I'd be very happy to talk to him. 501 00:27:59,160 --> 00:28:02,120 Speaker 2: But what if you could affect his opinion in one 502 00:28:02,160 --> 00:28:03,640 Speaker 2: way, or that of others? 503 00:28:04,040 --> 00:28:07,200 Speaker 3: I would say, keep developing AI, but use a large 504 00:28:07,200 --> 00:28:11,160 Speaker 3: fraction of the resources, as you're developing it, to try 505 00:28:11,160 --> 00:28:12,720 Speaker 3: and figure out ways in which it might get out 506 00:28:12,760 --> 00:28:15,000 Speaker 3: of control and what we could do about that.
So 507 00:28:15,040 --> 00:28:19,000 Speaker 3: if Sam Altman said, we're going to use thirty percent 508 00:28:19,000 --> 00:28:21,879 Speaker 3: of our compute, however much compute we have, we'll use 509 00:28:21,960 --> 00:28:26,720 Speaker 3: thirty percent to get highly paid safety researchers to do 510 00:28:26,800 --> 00:28:30,840 Speaker 3: research on safety, not on making it better 511 00:28:30,920 --> 00:28:34,680 Speaker 3: but on making it safe, I'd take it all back, 512 00:28:34,680 --> 00:28:35,680 Speaker 3: and I'd say he is a great guy. 513 00:28:36,040 --> 00:28:38,600 Speaker 2: I mean, for a not for profit, that doesn't sound 514 00:28:38,640 --> 00:28:40,520 Speaker 2: unreasonable. 515 00:28:40,600 --> 00:28:44,239 Speaker 3: Exactly. That was what I thought was happening to begin with, 516 00:28:45,160 --> 00:28:46,800 Speaker 3: and that was what Ilya thought was happening. 517 00:28:47,320 --> 00:28:49,880 Speaker 2: Right before Christmas, there was an article in the Wall 518 00:28:49,920 --> 00:28:52,960 Speaker 2: Street Journal basically saying that GPT five was behind 519 00:28:52,960 --> 00:28:57,520 Speaker 2: schedule, and that kind of the pace of improvement in 520 00:28:58,240 --> 00:29:01,960 Speaker 2: deep learning was slowing, perhaps because of lack of 521 00:29:02,480 --> 00:29:05,440 Speaker 2: real world data, could be other reasons. But then o 522 00:29:05,600 --> 00:29:08,480 Speaker 2: three came out, and Sam Altman sort of said that AGI 523 00:29:08,680 --> 00:29:11,560 Speaker 2: is here. Where do you think we are on this AGI? 524 00:29:11,680 --> 00:29:13,479 Speaker 2: Do you think it's even a relevant metric? 525 00:29:14,000 --> 00:29:18,520 Speaker 3: So ever since twenty twelve, there've been people saying AI 526 00:29:18,600 --> 00:29:21,440 Speaker 3: is about to hit a wall. So Gary Marcus made 527 00:29:21,440 --> 00:29:24,640 Speaker 3: a strong prediction in twenty twenty two that AI was 528 00:29:24,680 --> 00:29:30,400 Speaker 3: hitting a wall and wouldn't get much further. So you 529 00:29:30,480 --> 00:29:33,200 Speaker 3: have to see that against a background of repeated predictions 530 00:29:33,240 --> 00:29:35,000 Speaker 3: that AI is about to hit a wall. 531 00:29:35,200 --> 00:29:36,280 Speaker 4: This is a bit more 532 00:29:36,120 --> 00:29:39,240 Speaker 3: real, in the sense that we really are reaching peak 533 00:29:39,360 --> 00:29:44,160 Speaker 3: data, or peak easily available data. There's actually hugely more 534 00:29:44,240 --> 00:29:48,520 Speaker 3: data in silos, in companies, and in videos. So yes, 535 00:29:48,560 --> 00:29:51,560 Speaker 3: we're running out of easily available data and that may 536 00:29:51,600 --> 00:29:53,920 Speaker 3: slow things down a bit. But if you can get 537 00:29:53,920 --> 00:29:56,800 Speaker 3: them to generate their own data, then you can overcome 538 00:29:56,800 --> 00:29:58,880 Speaker 3: that problem, and you can get them to do 539 00:29:58,920 --> 00:30:01,120 Speaker 3: that by reasoning. So if you look at, even with 540 00:30:01,160 --> 00:30:03,880 Speaker 3: things like chess, there's only a limited number of expert moves, 541 00:30:04,320 --> 00:30:06,680 Speaker 3: but you can overcome that by getting the system to 542 00:30:06,680 --> 00:30:10,600 Speaker 3: play itself, and then you get an infinite amount of 543 00:30:10,720 --> 00:30:13,479 Speaker 3: data to train on.
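As an illustration of the self-play idea Hinton describes here and continues in the next exchange, this is a minimal sketch of generating unbounded (position, outcome) training pairs from a game playing against itself. Tic-tac-toe keeps it self-contained, and the random move choice is a stand-in for a learned policy network; none of this code is from the interview itself.

```python
import random

# Self-play data generation, in the spirit of what Hinton describes for
# chess and Go: rather than a fixed pool of expert games, the system plays
# itself and every finished game labels all of its positions.
LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),
         (0, 3, 6), (1, 4, 7), (2, 5, 8),
         (0, 4, 8), (2, 4, 6)]

def winner(board):
    for a, b, c in LINES:
        if board[a] and board[a] == board[b] == board[c]:
            return board[a]                 # "X" or "O" wins
    return "draw" if all(board) else None   # None means still playing

def self_play_game():
    board, player, positions = [""] * 9, "X", []
    while winner(board) is None:
        # random.choice stands in for a learned policy choosing a move.
        move = random.choice([i for i in range(9) if not board[i]])
        board[move] = player
        positions.append(tuple(board))
        player = "O" if player == "X" else "X"
    outcome = winner(board)
    # Every position gets labeled with the final outcome, producing fresh
    # training pairs for a "how good is this position?" network.
    return [(pos, outcome) for pos in positions]

dataset = []
for _ in range(1000):   # unbounded: just keep playing more games
    dataset.extend(self_play_game())
print(len(dataset), "labeled positions; first:", dataset[0])
```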
And so the neural nets that 544 00:30:13,520 --> 00:30:17,200 Speaker 3: are saying, would this be a good move, or saying, 545 00:30:17,440 --> 00:30:20,160 Speaker 3: how good is this position for me, they now get 546 00:30:20,160 --> 00:30:24,280 Speaker 3: an infinite amount of data, or rather an unbounded amount of data. 547 00:30:24,760 --> 00:30:28,360 Speaker 3: You can always generate your data at the appropriate difficulty 548 00:30:28,440 --> 00:30:33,040 Speaker 3: level, too. And so nobody says neural networks for chess 549 00:30:33,080 --> 00:30:33,920 Speaker 3: and Go are going to run 550 00:30:33,840 --> 00:30:34,400 Speaker 4: out of data. 551 00:30:35,120 --> 00:30:38,080 Speaker 3: They're already far better than any person, and we can 552 00:30:38,160 --> 00:30:40,480 Speaker 3: make them much, much better than that if we wanted to. 553 00:30:41,400 --> 00:30:47,880 Speaker 2: Shortly before you quit Google, you tweeted, caterpillars extract nutrients, 554 00:30:47,880 --> 00:30:51,800 Speaker 2: which are then converted into butterflies. People have extracted billions 555 00:30:51,840 --> 00:30:56,600 Speaker 2: of nuggets of understanding, and GPT four is humanity's butterfly. 556 00:30:57,280 --> 00:30:58,000 Speaker 2: Can you explain that? 557 00:30:58,760 --> 00:31:04,200 Speaker 3: Okay. So if you look at insects, most insects have 558 00:31:04,400 --> 00:31:10,160 Speaker 3: larvae and they have adults. But let's take butterflies, the obvious example. 559 00:31:10,880 --> 00:31:14,000 Speaker 3: And if you look at a caterpillar, it's not optimized 560 00:31:15,000 --> 00:31:20,200 Speaker 3: for traveling and mating. A caterpillar is optimized for extracting stuff. 561 00:31:20,920 --> 00:31:23,120 Speaker 3: You then turn that stuff into soup, and then you 562 00:31:23,160 --> 00:31:27,760 Speaker 3: get something very different. So what humanity has been doing 563 00:31:27,800 --> 00:31:30,760 Speaker 3: for a long time is understanding little bits of the 564 00:31:30,720 --> 00:31:34,280 Speaker 2: world, translating the world into data through photographs and words. 565 00:31:34,320 --> 00:31:38,800 Speaker 3: Yes, and now you could take all that work 566 00:31:38,840 --> 00:31:44,400 Speaker 3: we've done at extracting structure from the world, like a 567 00:31:44,440 --> 00:31:48,200 Speaker 3: caterpillar extracting nutrients, and you could take that extracted stuff 568 00:31:48,200 --> 00:31:51,400 Speaker 3: and turn it into something different. You could turn it 569 00:31:51,440 --> 00:31:54,360 Speaker 3: into a single model that knows everything. 570 00:31:54,760 --> 00:31:56,960 Speaker 2: When I read it first, I thought it was quite 571 00:31:58,040 --> 00:32:03,160 Speaker 2: beautiful and optimistic, and then I read it again and I 572 00:32:03,160 --> 00:32:04,160 Speaker 2: didn't think that anymore. 573 00:32:05,200 --> 00:32:08,440 Speaker 4: Yes, I mean, you could read it as, we're 574 00:32:08,040 --> 00:32:13,480 Speaker 2: history. Yes, bidding farewell to our caterpillar. Yes. How do 575 00:32:13,560 --> 00:32:14,600 Speaker 2: you read your own metaphor? 576 00:32:15,400 --> 00:32:19,040 Speaker 3: I'm probably somewhat influenced by a piece of William Blake 577 00:32:19,080 --> 00:32:23,040 Speaker 3: poetry which goes, the caterpillar on the leaf repeats to 578 00:32:23,080 --> 00:32:27,000 Speaker 3: thee thy mother's grief, which is basically saying the butterfly 579 00:32:27,120 --> 00:32:30,720 Speaker 3: is much prettier than the caterpillar.
I think 580 00:32:30,720 --> 00:32:33,520 Speaker 3: that's what it's saying. You know, I don't know whether 581 00:32:33,520 --> 00:32:36,600 Speaker 3: we're going to get replaced. I hope we're not. I 582 00:32:36,640 --> 00:32:39,520 Speaker 3: hope people stay in control, but I hope we stay 583 00:32:39,520 --> 00:32:44,240 Speaker 3: in control with assistants that are much more intelligent than us. 584 00:32:44,600 --> 00:32:48,520 Speaker 2: Well, as you said, mothers are the only creatures 585 00:32:48,560 --> 00:32:51,520 Speaker 2: that we know of who are controlled by less 586 00:32:51,560 --> 00:32:55,520 Speaker 2: sophisticated beings. I think you said something along those lines. 587 00:32:55,400 --> 00:32:58,640 Speaker 3: And babies aren't much less intelligent than their mothers, like 588 00:32:58,720 --> 00:33:02,240 Speaker 3: at most a factor of two. We're talking about huge factors. 589 00:33:02,760 --> 00:33:04,600 Speaker 2: You mentioned Blake just now, but the other person I 590 00:33:04,600 --> 00:33:06,960 Speaker 2: thought of when I was reading that butterfly metaphor was 591 00:33:07,000 --> 00:33:10,080 Speaker 2: your father, who of course was an entomologist. 592 00:33:10,160 --> 00:33:13,400 Speaker 4: Yes, that's why I got my interest in metamorphosis. 593 00:33:13,440 --> 00:33:17,080 Speaker 2: Yes. So ChatGPT is humanity's butterfly, but 594 00:33:17,120 --> 00:33:19,680 Speaker 2: also in a sense your butterfly, and this is a 595 00:33:19,720 --> 00:33:21,360 Speaker 2: metaphor to, I don't know, honor your 596 00:33:21,200 --> 00:33:24,400 Speaker 4: father? I guess, grudgingly. 597 00:33:26,440 --> 00:33:31,000 Speaker 2: In the worst case scenario, where AI does cause our extinction, 598 00:33:31,680 --> 00:33:34,320 Speaker 2: what are the ways in which that could happen? And 599 00:33:34,360 --> 00:33:37,400 Speaker 2: the best case scenario, where it doesn't, what are the 600 00:33:37,440 --> 00:33:38,840 Speaker 2: ways in which that could happen? 601 00:33:39,480 --> 00:33:42,800 Speaker 3: Okay, so the obvious way it could happen is we 602 00:33:42,920 --> 00:33:47,720 Speaker 3: make AI agents that can create subgoals. And they realize, 603 00:33:47,760 --> 00:33:50,520 Speaker 3: because they're super intelligent, that a good subgoal is 604 00:33:50,560 --> 00:33:52,400 Speaker 3: to get more control, because if you get more control, 605 00:33:52,480 --> 00:33:54,760 Speaker 3: you can achieve your goals. So even if they're trying 606 00:33:54,760 --> 00:33:57,160 Speaker 3: to achieve goals we gave them, they'll try and get 607 00:33:57,200 --> 00:34:00,520 Speaker 3: more control. It'll be a bit like, I don't know 608 00:34:00,520 --> 00:34:02,000 Speaker 3: if you have children, but if you have a three 609 00:34:02,080 --> 00:34:04,680 Speaker 3: year old who's finally decided they want to try tying 610 00:34:04,720 --> 00:34:07,840 Speaker 3: their own shoelaces, but you're in a hurry to get somewhere, 611 00:34:08,719 --> 00:34:10,640 Speaker 3: you let them try tying their shoelaces for a few 612 00:34:10,680 --> 00:34:13,319 Speaker 3: minutes and then you say, no, no, I'm going to 613 00:34:13,360 --> 00:34:17,160 Speaker 3: do it. You can learn that when you're older.
AIs 614 00:34:17,280 --> 00:34:20,000 Speaker 3: will be like the parent and we'll be like the children, 615 00:34:20,640 --> 00:34:22,120 Speaker 3: and they'll just push us out of the way to get 616 00:34:22,120 --> 00:34:27,040 Speaker 3: things done. So that's the bad scenario. Even if they're 617 00:34:27,080 --> 00:34:28,920 Speaker 3: trying to achieve things that we've told them we want, 618 00:34:30,680 --> 00:34:35,319 Speaker 3: they'll basically take control. And that scenario gets worse if 619 00:34:35,320 --> 00:34:39,080 Speaker 3: ever one of those super intelligent AIs thinks, I'd 620 00:34:39,160 --> 00:34:40,920 Speaker 3: rather there were a few more copies of me and 621 00:34:41,000 --> 00:34:44,400 Speaker 3: a few less copies of the other super intelligent AIs. 622 00:34:45,000 --> 00:34:47,600 Speaker 3: As soon as that happens, if that ever happens, you'll 623 00:34:47,600 --> 00:34:50,920 Speaker 3: get evolution between super intelligent AIs and we'll be left in 624 00:34:50,960 --> 00:34:54,719 Speaker 3: the dust, and they'll develop all the nasty things you 625 00:34:54,719 --> 00:34:58,400 Speaker 3: get from evolution, being nasty and competitive and very loyal 626 00:34:58,440 --> 00:35:01,239 Speaker 3: to their own tribe and very aggressive to other tribes, all 627 00:35:01,280 --> 00:35:07,280 Speaker 3: that nonsense that we have. So that's the bad scenario. 628 00:35:07,960 --> 00:35:11,839 Speaker 3: The good scenario is we figure out a way where 629 00:35:11,880 --> 00:35:15,399 Speaker 3: we can guarantee they're never going to try and get 630 00:35:15,400 --> 00:35:18,560 Speaker 3: control away from us. They're always going to be subservient 631 00:35:18,600 --> 00:35:21,640 Speaker 3: to us, and we figure out how we can guarantee 632 00:35:21,640 --> 00:35:23,960 Speaker 3: that that'll happen, and then we will have these wonderful 633 00:35:24,000 --> 00:35:26,160 Speaker 3: intelligent assistants, and life 634 00:35:26,040 --> 00:35:27,960 Speaker 4: is just really easy. 635 00:35:29,719 --> 00:35:31,160 Speaker 2: Geoffrey, thank you. Thank you. 636 00:35:33,920 --> 00:35:36,840 Speaker 1: That's it for this week for Tech Stuff. I'm Oz Woloshyn. 637 00:35:37,480 --> 00:35:41,080 Speaker 1: This episode was produced by Eliza Dennis, Victoria Dominguez, 638 00:35:41,239 --> 00:35:44,759 Speaker 1: Shino Ozaki, and Lizzie Jacobs. It was executive produced by 639 00:35:44,800 --> 00:35:49,200 Speaker 1: me, Karah Preiss, and Kate Osborne for Kaleidoscope, and Katrina 640 00:35:48,880 --> 00:35:50,440 Speaker 2: Norvell for iHeart Podcasts. 641 00:35:51,000 --> 00:35:54,640 Speaker 1: The engineer is Beheath Fraser. Offspin mixed this episode, and 642 00:35:54,719 --> 00:35:57,480 Speaker 1: Kyle Murdoch wrote our theme song. Join us on Friday 643 00:35:57,520 --> 00:35:59,719 Speaker 1: for The Week in Tech. We'll break down the headlines 644 00:36:00,040 --> 00:36:02,160 Speaker 1: and hear from some of our expert friends about the 645 00:36:02,239 --> 00:36:05,400 Speaker 1: latest in tech. Please rate, review, and reach out to 646 00:36:05,480 --> 00:36:08,440 Speaker 1: us at tech stuff podcast at gmail dot com. 647 00:36:08,440 --> 00:36:08,840 Speaker 2: Thank you.