1 00:00:07,840 --> 00:00:09,800 Speaker 1: I feel like an old man whenever I buy a 2 00:00:09,800 --> 00:00:12,960 Speaker 1: new computer. I mean, why does my laptop need sixty 3 00:00:12,960 --> 00:00:15,680 Speaker 1: four gigabytes of memory when I learned to program on 4 00:00:15,720 --> 00:00:19,960 Speaker 1: a PC that had twenty kilobytes? No joke. Kids these 5 00:00:20,040 --> 00:00:22,799 Speaker 1: days don't understand how hard we had it back in 6 00:00:22,800 --> 00:00:26,159 Speaker 1: the day. Right? But it's also a nice feeling. It 7 00:00:26,239 --> 00:00:29,560 Speaker 1: tells me that we're making progress, and that's good. It's 8 00:00:29,640 --> 00:00:32,479 Speaker 1: creating new worlds and new ways of life. It's literally 9 00:00:32,600 --> 00:00:37,040 Speaker 1: saving lives by accelerating science. That's all great stuff, right? 10 00:00:37,440 --> 00:00:40,319 Speaker 1: But how long can it go on? What is the 11 00:00:40,400 --> 00:00:44,319 Speaker 1: engine of this exponential growth in computing power? And can 12 00:00:44,400 --> 00:00:46,680 Speaker 1: we count on it to take us to the stars, 13 00:00:46,720 --> 00:00:51,159 Speaker 1: to cure cancer, and to develop self-driving toothbrushes? Today 14 00:00:51,280 --> 00:00:54,120 Speaker 1: we'll dive into the physics underlying this trend and ask 15 00:00:54,600 --> 00:00:57,360 Speaker 1: whether there are fundamental limits that could block us from 16 00:00:57,440 --> 00:01:00,680 Speaker 1: achieving our dreams. And we'll talk about whether there's danger 17 00:01:00,760 --> 00:01:05,200 Speaker 1: in assuming technology will solve all of our problems. Welcome 18 00:01:05,240 --> 00:01:08,000 Speaker 1: to Daniel and Kelly's Extraordinary Universe. 19 00:01:21,319 --> 00:01:25,319 Speaker 2: Hello. I am Kelly Weinersmith. I study parasites and space, 20 00:01:25,560 --> 00:01:27,720 Speaker 2: and I realized when we were starting to do this 21 00:01:27,760 --> 00:01:30,200 Speaker 2: episode that I wasn't one hundred percent clear on what 22 00:01:30,240 --> 00:01:31,480 Speaker 2: Moore's law meant exactly. 23 00:01:31,880 --> 00:01:35,000 Speaker 1: Hi, I'm Daniel. I'm a particle physicist, and I've been 24 00:01:35,040 --> 00:01:39,360 Speaker 1: programming computers for more than forty years. They get faster 25 00:01:39,560 --> 00:01:40,560 Speaker 1: and I get slower. 26 00:01:40,760 --> 00:01:44,560 Speaker 2: Oh, you're not slowing down yet, Daniel, you stud. 27 00:01:45,600 --> 00:01:48,520 Speaker 1: So my question for you today, Kelly, is what was 28 00:01:48,560 --> 00:01:51,920 Speaker 1: your first computer? Let's age Kelly. 29 00:01:52,080 --> 00:01:55,080 Speaker 2: Okay. So later, when we talk to Adam Becker in 30 00:01:55,120 --> 00:01:57,160 Speaker 2: our interview, he mentions that there was a while there 31 00:01:57,200 --> 00:01:59,600 Speaker 2: where folks wouldn't get a computer because you'd wait as 32 00:01:59,680 --> 00:02:01,880 Speaker 2: long as you could, because the computers kept getting so 33 00:02:02,000 --> 00:02:05,320 Speaker 2: much better so quickly that if you could wait, your 34 00:02:05,320 --> 00:02:07,600 Speaker 2: computer would be much better. Yeah, and so my family 35 00:02:07,640 --> 00:02:11,200 Speaker 2: waited way too long. We didn't get one until I 36 00:02:11,320 --> 00:02:15,040 Speaker 2: was in like high school, and I know, and I 37 00:02:15,040 --> 00:02:17,600 Speaker 2: don't even remember what it was.
But in the meantime 38 00:02:17,639 --> 00:02:19,520 Speaker 2: I had to write my essays on, like, it was 39 00:02:19,560 --> 00:02:21,919 Speaker 2: like a Brother typewriter, but it also had a little 40 00:02:22,000 --> 00:02:25,960 Speaker 2: electronic screen, and so I could very slowly and laboriously 41 00:02:26,560 --> 00:02:29,120 Speaker 2: click through my essays, and then I would print it 42 00:02:29,160 --> 00:02:30,880 Speaker 2: and something would be wrong. It would take me forever 43 00:02:30,960 --> 00:02:33,680 Speaker 2: to find where the error was. It was very annoying. 44 00:02:34,120 --> 00:02:36,519 Speaker 2: But what about you, did you have like the first 45 00:02:36,560 --> 00:02:37,600 Speaker 2: Apple computer ever? 46 00:02:38,000 --> 00:02:38,200 Speaker 3: Oh? 47 00:02:38,320 --> 00:02:41,639 Speaker 1: Apple was way too advanced. I go way back before that. 48 00:02:41,760 --> 00:02:45,359 Speaker 1: My first computer was a Commodore VIC twenty, which 49 00:02:45,400 --> 00:02:48,480 Speaker 1: I think had twenty kilobytes of RAM, and we stored 50 00:02:48,520 --> 00:02:51,280 Speaker 1: stuff on an audio tape, you know, like you'd write 51 00:02:51,320 --> 00:02:54,040 Speaker 1: a little program and then you'd store it on these cassette 52 00:02:54,080 --> 00:02:56,000 Speaker 1: tapes that you could later listen to and be like, ooh, 53 00:02:56,040 --> 00:02:59,960 Speaker 1: what does that sound like? So yeah, we were very very early. 54 00:03:00,120 --> 00:03:02,200 Speaker 1: In fact, I remember hanging out with my dad in 55 00:03:02,240 --> 00:03:04,880 Speaker 1: grad school while he was doing his research, and he 56 00:03:04,919 --> 00:03:08,400 Speaker 1: was literally feeding punch cards into those punch card machines. 57 00:03:08,480 --> 00:03:12,000 Speaker 1: So I feel like I've personally experienced a huge fraction 58 00:03:12,200 --> 00:03:16,200 Speaker 1: of the transformation of computers into basically the supercomputers we 59 00:03:16,240 --> 00:03:19,320 Speaker 1: have today. I mean, my smartphone is so much more 60 00:03:19,360 --> 00:03:23,040 Speaker 1: powerful than anything my dad ever used in his research. 61 00:03:23,320 --> 00:03:25,560 Speaker 2: That is absolutely amazing. I didn't even know that we 62 00:03:25,560 --> 00:03:28,760 Speaker 2: were storing data on like cassette tapes. Oh yeah, that's 63 00:03:28,840 --> 00:03:29,520 Speaker 2: amazing to me. 64 00:03:29,720 --> 00:03:31,640 Speaker 1: Yeah, before magnetic floppies for sure. 65 00:03:31,760 --> 00:03:34,280 Speaker 2: So when you're, like, using your... are you a 66 00:03:34,320 --> 00:03:34,960 Speaker 2: MacBook guy? 67 00:03:35,160 --> 00:03:35,720 Speaker 1: I am, yeah. 68 00:03:35,760 --> 00:03:37,560 Speaker 2: It's like, when you're using your Mac, do you every 69 00:03:37,640 --> 00:03:39,680 Speaker 2: day think, I am so lucky I'm not doing this 70 00:03:39,760 --> 00:03:42,080 Speaker 2: on punch cards, or do you just take it 71 00:03:42,080 --> 00:03:42,839 Speaker 2: for granted now? 72 00:03:43,280 --> 00:03:46,400 Speaker 1: I think it's awesome. It's incredible. I mean, every time 73 00:03:46,440 --> 00:03:49,040 Speaker 1: I get a new MacBook, I'm like, whoa, this drive 74 00:03:49,160 --> 00:03:51,600 Speaker 1: is ten times bigger than anything I've ever seen, and 75 00:03:52,040 --> 00:03:55,320 Speaker 1: the memory is just shocking. And it's also then incredible 76 00:03:55,320 --> 00:03:59,200 Speaker 1: to me how rapidly our computational ambitions grow.
You know, 77 00:03:59,280 --> 00:04:01,920 Speaker 1: my group does a lot of computation, and we're basically 78 00:04:02,000 --> 00:04:05,440 Speaker 1: limited by computation, and so every time we get more 79 00:04:05,440 --> 00:04:08,880 Speaker 1: powerful computers, we scale up our ambitions and solve bigger, 80 00:04:08,880 --> 00:04:12,360 Speaker 1: harder problems, and so we're always at the edge of 81 00:04:12,360 --> 00:04:14,480 Speaker 1: what the computers can do, right? Like, we have an 82 00:04:14,480 --> 00:04:18,080 Speaker 1: infinite number of questions we could ask with more powerful computers. 83 00:04:18,279 --> 00:04:21,760 Speaker 1: So yeah, I'm in awe of the MacBook, not just 84 00:04:21,839 --> 00:04:24,080 Speaker 1: because it's so much more powerful than anything I've used, 85 00:04:24,120 --> 00:04:26,599 Speaker 1: but it's so reliable. I mean, I spend hours and 86 00:04:26,640 --> 00:04:28,440 Speaker 1: hours a day in front of this thing. It almost 87 00:04:28,520 --> 00:04:32,960 Speaker 1: never gives me problems. So yeah, it's incredible. What engineers 88 00:04:32,960 --> 00:04:34,720 Speaker 1: have provided is incredible. 89 00:04:34,800 --> 00:04:37,000 Speaker 2: And today we're going to talk about one way in 90 00:04:37,040 --> 00:04:40,279 Speaker 2: which that incredible ability has been expanded, which has to 91 00:04:40,279 --> 00:04:43,000 Speaker 2: do with Moore's law, which I thought was about how 92 00:04:43,080 --> 00:04:46,440 Speaker 2: much data you can store on your computer, and I 93 00:04:46,440 --> 00:04:47,520 Speaker 2: think it's much more than that. 94 00:04:47,960 --> 00:04:50,120 Speaker 1: You're right, it's much more than that. It's also about 95 00:04:50,120 --> 00:04:52,800 Speaker 1: how things are growing over time and how long that 96 00:04:52,880 --> 00:04:57,200 Speaker 1: will continue. And there's this lore about Moore's law which 97 00:04:57,279 --> 00:05:00,600 Speaker 1: has permeated Silicon Valley and broader culture. So in a minute, 98 00:05:00,600 --> 00:05:03,359 Speaker 1: we're gonna also talk to Adam Becker about how this 99 00:05:03,400 --> 00:05:07,760 Speaker 1: has impacted philosophy and politics and policy and how it 100 00:05:07,839 --> 00:05:10,520 Speaker 1: might affect our future. But first we wanted to know 101 00:05:10,640 --> 00:05:13,760 Speaker 1: how long people thought Moore's law might continue to make 102 00:05:13,839 --> 00:05:16,760 Speaker 1: all of our computers faster. So I went out there 103 00:05:16,800 --> 00:05:19,480 Speaker 1: and I talked to our group of volunteers. Here's what 104 00:05:19,520 --> 00:05:22,279 Speaker 1: they had to say about the future of Moore's law. 105 00:05:22,920 --> 00:05:26,040 Speaker 1: So I'll give it six years and then I'm gonna 106 00:05:26,120 --> 00:05:32,119 Speaker 1: sell my Nvidia stock. But quantum computing will change the game. 107 00:05:32,680 --> 00:05:35,960 Speaker 1: We are restricted by things like how small 108 00:05:36,000 --> 00:05:40,240 Speaker 4: we can make stuff, so I think it's not true anymore. 109 00:05:40,800 --> 00:05:44,120 Speaker 4: My first thought was to say no, but I know 110 00:05:44,480 --> 00:05:48,440 Speaker 4: very little about quantum computing. I would say until the 111 00:05:48,440 --> 00:05:52,000 Speaker 4: computer speed hits the speed of light, maybe ten years, 112 00:05:52,320 --> 00:05:53,880 Speaker 4: another good decade or so.
113 00:05:54,680 --> 00:05:56,440 Speaker 3: My understanding is that it's already done. 114 00:05:56,560 --> 00:06:01,480 Speaker 1: I don't think we are doubling in raw processing speed. 115 00:06:01,839 --> 00:06:05,400 Speaker 4: In my understanding, Moore's law kind of slowed down for 116 00:06:05,600 --> 00:06:08,720 Speaker 4: laptop and desktop chips a number of years back, but has 117 00:06:08,800 --> 00:06:11,960 Speaker 4: continued with mobile just because they were a little behind. 118 00:06:12,240 --> 00:06:15,440 Speaker 4: But then you also have graphics and neural processing units 119 00:06:15,520 --> 00:06:19,120 Speaker 4: that power our AI platforms of today. So can we 120 00:06:19,160 --> 00:06:21,440 Speaker 4: go into the future? I think we can go for 121 00:06:21,520 --> 00:06:26,039 Speaker 4: a number of years more, given the innovations in transistor stacking. 122 00:06:26,400 --> 00:06:29,400 Speaker 2: I thought Moore's law had to do with cost decreasing 123 00:06:29,480 --> 00:06:32,200 Speaker 2: as speed increased. We do seem to be close to 124 00:06:32,200 --> 00:06:34,840 Speaker 2: a tipping point with electrons being too large for the 125 00:06:35,240 --> 00:06:39,680 Speaker 2: tiny circuitry. However, it seems that optical circuitry might be 126 00:06:39,720 --> 00:06:40,840 Speaker 2: a good replacement for that. 127 00:06:41,279 --> 00:06:47,360 Speaker 1: They're currently reaching the lower limits of workability before they 128 00:06:47,360 --> 00:06:50,719 Speaker 1: start reaching quantum effects with silicon. 129 00:06:51,240 --> 00:06:54,760 Speaker 3: We are getting closer to the particle level, and that 130 00:06:55,640 --> 00:06:56,320 Speaker 3: stops us. 131 00:06:56,440 --> 00:06:59,400 Speaker 1: Honestly, I thought it had already stopped. So do you 132 00:06:59,440 --> 00:07:01,919 Speaker 1: think these are optimistic or pessimistic? 133 00:07:02,279 --> 00:07:04,919 Speaker 2: I mean, I think they're realistic, which seems to be 134 00:07:04,920 --> 00:07:07,800 Speaker 2: the option that always gets left out. So I think 135 00:07:07,839 --> 00:07:09,880 Speaker 2: there were a lot of people who said, you know, 136 00:07:09,920 --> 00:07:11,760 Speaker 2: I thought we'd already reached the limits, or we're getting 137 00:07:11,760 --> 00:07:13,880 Speaker 2: close to reaching the limits. And I'll admit that I 138 00:07:14,040 --> 00:07:16,320 Speaker 2: did not realize we were getting close to reaching the limits. 139 00:07:16,320 --> 00:07:18,239 Speaker 2: But it seems like a lot of our listeners are 140 00:07:18,600 --> 00:07:19,720 Speaker 2: on top of this trend. 141 00:07:20,040 --> 00:07:22,560 Speaker 1: Yeah, and so we're not giving financial advice, so I 142 00:07:22,600 --> 00:07:24,440 Speaker 1: won't tell you whether or not to buy or sell 143 00:07:24,440 --> 00:07:29,000 Speaker 1: Nvidia stock. But you know, sometimes I wonder about 144 00:07:29,040 --> 00:07:31,559 Speaker 1: these tech companies, because their stocks also seem to follow 145 00:07:31,560 --> 00:07:34,840 Speaker 1: Moore's law. Like, how can Google just keep getting more valuable? 146 00:07:35,400 --> 00:07:37,240 Speaker 1: I keep missing out on buying Google. 147 00:07:38,000 --> 00:07:39,960 Speaker 2: I can tell you that if you can make a 148 00:07:40,000 --> 00:07:41,880 Speaker 2: time machine, one of the first things you should do 149 00:07:42,000 --> 00:07:43,880 Speaker 2: is go buy Nvidia and Google stock.
150 00:07:44,360 --> 00:07:46,720 Speaker 1: All right. So let's dig into it. What in the 151 00:07:46,880 --> 00:07:51,920 Speaker 1: end is Moore's law? So Moore's law was something postulated 152 00:07:52,000 --> 00:07:53,000 Speaker 1: by Gordon Moore. 153 00:07:53,320 --> 00:07:54,600 Speaker 2: He has a first name. 154 00:07:58,480 --> 00:08:01,520 Speaker 1: And he was one of the founders of Intel, so 155 00:08:01,760 --> 00:08:05,400 Speaker 1: a big dude in, like, you know, semiconductors and electronics. 156 00:08:06,000 --> 00:08:10,720 Speaker 1: And he suggested initially that the number of transistors you 157 00:08:10,760 --> 00:08:14,360 Speaker 1: could squeeze onto a chip would double every year. And 158 00:08:14,400 --> 00:08:16,280 Speaker 1: it's a little bit more complicated than that. He also 159 00:08:16,360 --> 00:08:18,720 Speaker 1: was talking about the power usage and the cost, but 160 00:08:19,080 --> 00:08:22,720 Speaker 1: roughly speaking, he was talking about the density of transistors 161 00:08:22,720 --> 00:08:26,200 Speaker 1: on chips getting higher every single year, which means the 162 00:08:26,200 --> 00:08:29,040 Speaker 1: speed of these computers is growing very quickly. 163 00:08:29,520 --> 00:08:33,160 Speaker 2: So transistors are about speed and not storage, or are 164 00:08:33,160 --> 00:08:33,920 Speaker 2: they about both? 165 00:08:34,120 --> 00:08:37,880 Speaker 1: They're about both. The transistor fundamentally is a tiny programmable switch. 166 00:08:38,559 --> 00:08:41,640 Speaker 1: And the reason that computers got small and got fast 167 00:08:41,800 --> 00:08:44,400 Speaker 1: is because we were able to make transistors small and 168 00:08:44,480 --> 00:08:47,000 Speaker 1: make them fast, which allows us to have lots and 169 00:08:47,040 --> 00:08:49,720 Speaker 1: lots and lots of switches in a small area, which 170 00:08:49,760 --> 00:08:51,680 Speaker 1: is what allows the computer to be complex and to 171 00:08:51,760 --> 00:08:54,840 Speaker 1: be fast. And so essentially it's saying we can make 172 00:08:54,880 --> 00:08:58,680 Speaker 1: computers denser every year, and that makes computers faster and 173 00:08:58,720 --> 00:08:59,360 Speaker 1: more powerful. 174 00:08:59,559 --> 00:09:02,079 Speaker 2: Okay, so this has to do with why we went 175 00:09:02,080 --> 00:09:05,160 Speaker 2: from computers that took up entire rooms to something you 176 00:09:05,160 --> 00:09:06,720 Speaker 2: can now stick in your bag and take with you. 177 00:09:06,960 --> 00:09:09,520 Speaker 1: Yeah, exactly. And we'll dig into that in a minute. 178 00:09:09,800 --> 00:09:12,600 Speaker 1: But the history here is that in nineteen sixty five 179 00:09:13,160 --> 00:09:16,080 Speaker 1: Moore predicted this, and then ten years later he revised it. 180 00:09:16,080 --> 00:09:18,280 Speaker 1: He was like, well, every year, maybe that's too optimistic. 181 00:09:18,360 --> 00:09:21,280 Speaker 1: Let's go for every two years. And so that was 182 00:09:21,320 --> 00:09:24,440 Speaker 1: the prediction in seventy five, and you'll see that it 183 00:09:24,480 --> 00:09:27,640 Speaker 1: mostly held up until fairly recently. It's sort of an 184 00:09:27,679 --> 00:09:31,040 Speaker 1: extraordinary prediction in that sense, though.
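To make the doubling rule concrete, here is a minimal sketch in Python. The baseline (Intel's 4004 from nineteen seventy one, with about twenty three hundred transistors, a figure Daniel quotes later in the episode) is used purely as an illustrative starting point, and the strict factor-of-two-every-two-years rule is the nineteen seventy five form of the prediction, not a claim about any particular product line.

    # Moore's law, 1975 form: transistor counts double roughly every two years.
    # Baseline: Intel 4004 (1971, ~2,300 transistors), an illustrative anchor only.
    def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
        """Projected transistors per chip under a strict doubling law."""
        return base_count * 2 ** ((year - base_year) / doubling_years)

    for year in (1971, 1985, 2000, 2020):
        print(year, f"{transistors(year):,.0f}")
    # The 2020 projection lands in the tens of billions, which is roughly
    # where real flagship chips ended up: the prediction held for decades.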
You know, anytime there's 185 00:09:31,040 --> 00:09:33,360 Speaker 1: a prediction that holds up, you got to wonder, like, well, 186 00:09:33,400 --> 00:09:36,160 Speaker 1: what were the other predictions this person made? Like, you 187 00:09:36,200 --> 00:09:40,160 Speaker 1: just spew predictions constantly, eventually you're going to get one right. 188 00:09:39,800 --> 00:09:42,280 Speaker 2: Yeah, yeah, well, especially he gave himself another decade to 189 00:09:42,360 --> 00:09:45,600 Speaker 2: like fit the trend line. That was pretty generous to himself. 190 00:09:45,679 --> 00:09:48,319 Speaker 1: But okay, exactly. So let's dig into what a transistor 191 00:09:48,480 --> 00:09:51,600 Speaker 1: is and why it allows computers to be faster, because 192 00:09:51,640 --> 00:09:54,760 Speaker 1: that's crucial to understand why Moore's law has worked, how 193 00:09:54,800 --> 00:09:56,679 Speaker 1: we've made it work, and whether it's going to work 194 00:09:56,679 --> 00:10:00,200 Speaker 1: in the future. Basically, a transistor is a programmable switch. 195 00:10:00,559 --> 00:10:03,880 Speaker 1: Like, computers operate on digital logic. I have a number 196 00:10:03,880 --> 00:10:07,280 Speaker 1: in the computer, the number four. It's stored in binary. 197 00:10:07,720 --> 00:10:10,520 Speaker 1: But to store things in binary, you need a physical system 198 00:10:10,640 --> 00:10:13,319 Speaker 1: that can store a zero or a one, right, 199 00:10:13,360 --> 00:10:15,160 Speaker 1: the way you can like write a digit on a 200 00:10:15,160 --> 00:10:18,080 Speaker 1: piece of paper. That's like, I'm representing the number four 201 00:10:18,160 --> 00:10:22,120 Speaker 1: by scratching this graphite onto this sheet of paper. I 202 00:10:22,120 --> 00:10:25,080 Speaker 1: want to store things on my computer in zeros and 203 00:10:25,160 --> 00:10:28,960 Speaker 1: ones, because binary is the code for computers, and physically 204 00:10:28,960 --> 00:10:31,440 Speaker 1: that means a switch, you know. You can imagine 205 00:10:31,720 --> 00:10:34,640 Speaker 1: it just like, literally, a light switch, but here 206 00:10:34,679 --> 00:10:36,280 Speaker 1: we're doing an electronic switch. 207 00:10:36,600 --> 00:10:40,000 Speaker 2: Okay. And so, just to ask: if we switched from using 208 00:10:40,559 --> 00:10:45,439 Speaker 2: transistors to things like DNA to store data, or quantum computing, 209 00:10:46,360 --> 00:10:48,720 Speaker 2: could you still apply Moore's law, like if we switched 210 00:10:48,840 --> 00:10:52,240 Speaker 2: to some other method, or is Moore's law specifically about 211 00:10:52,720 --> 00:10:55,440 Speaker 2: the transistors that we're talking about now? Yeah. 212 00:10:55,400 --> 00:10:58,360 Speaker 1: Great question. You're talking about fundamental changes in how we 213 00:10:58,400 --> 00:11:02,120 Speaker 1: do computing. So currently computing operates on bits, zeros and ones, 214 00:11:02,360 --> 00:11:05,079 Speaker 1: and we're saying those are represented by transistors, which is 215 00:11:05,120 --> 00:11:07,760 Speaker 1: like a physical implementation of that bit. You switch to 216 00:11:07,880 --> 00:11:11,120 Speaker 1: quantum computing, the fundamental element of that is a qubit, 217 00:11:11,480 --> 00:11:14,160 Speaker 1: which isn't necessarily a zero or a one.
It has the probability 218 00:11:14,160 --> 00:11:16,400 Speaker 1: to be in several different states, and so it requires 219 00:11:16,520 --> 00:11:18,800 Speaker 1: a different physical system to model that. We don't use 220 00:11:18,840 --> 00:11:23,400 Speaker 1: transistors, not even like quantum transistors. In fact, transistors 221 00:11:23,440 --> 00:11:27,480 Speaker 1: are already relying deeply on quantum mechanics, so quantum transistor 222 00:11:27,559 --> 00:11:30,560 Speaker 1: is redundant. But yeah, qubits, there's no guarantee that you 223 00:11:30,600 --> 00:11:33,280 Speaker 1: can like build qubits and then build them more densely 224 00:11:33,320 --> 00:11:36,280 Speaker 1: and more rapidly. There's certainly no Moore's law for quantum 225 00:11:36,320 --> 00:11:40,520 Speaker 1: computing that's guaranteed. And biological computing like DNA is 226 00:11:40,600 --> 00:11:43,240 Speaker 1: super awesome as an idea, but there you have like 227 00:11:43,400 --> 00:11:47,200 Speaker 1: four possibilities, right, DNA is basically base four, and so 228 00:11:47,320 --> 00:11:49,880 Speaker 1: it's a question of like how do you encode numbers 229 00:11:49,920 --> 00:11:52,040 Speaker 1: into DNA? Do you use all four bases? Do you 230 00:11:52,160 --> 00:11:55,319 Speaker 1: group them into two to make binary? The technology is 231 00:11:55,320 --> 00:11:58,760 Speaker 1: fundamentally different. So again, you wouldn't necessarily expect it to 232 00:11:58,800 --> 00:12:00,719 Speaker 1: follow Moore's law, but you might get some other law 233 00:12:00,720 --> 00:12:03,559 Speaker 1: which could be better. But yeah, Moore's law reflects 234 00:12:03,640 --> 00:12:07,120 Speaker 1: the details of the technology we're using to represent the 235 00:12:07,120 --> 00:12:10,160 Speaker 1: fundamental element of computing, which is a zero or one, 236 00:12:10,240 --> 00:12:13,720 Speaker 1: and then crucially the logic that operates on those zeros 237 00:12:13,720 --> 00:12:14,160 Speaker 1: and ones. 238 00:12:14,400 --> 00:12:15,400 Speaker 2: Let's get into that logic. 239 00:12:15,600 --> 00:12:17,800 Speaker 1: Yeah, because what you want to do is represent like 240 00:12:17,880 --> 00:12:19,720 Speaker 1: numbers in your computer. I want to put the number 241 00:12:19,800 --> 00:12:22,120 Speaker 1: four in, but also I want to calculate stuff. I 242 00:12:22,120 --> 00:12:23,959 Speaker 1: don't just want to write four into my computer. I 243 00:12:24,040 --> 00:12:25,920 Speaker 1: want to be able to add four to two. I 244 00:12:25,920 --> 00:12:28,520 Speaker 1: want to be able to compare four and seven. Right? 245 00:12:28,559 --> 00:12:30,760 Speaker 1: That's what allows you to program a computer, for it 246 00:12:30,800 --> 00:12:34,120 Speaker 1: to do useful computation. And if you know something about computing, 247 00:12:34,120 --> 00:12:36,560 Speaker 1: you know like the basis of computation is a Turing 248 00:12:36,600 --> 00:12:39,959 Speaker 1: machine, which can like read in numbers and write numbers 249 00:12:40,000 --> 00:12:44,240 Speaker 1: onto this infinite tape. And so in order to do logic, 250 00:12:44,360 --> 00:12:46,480 Speaker 1: you need to be able to have things that respond 251 00:12:46,559 --> 00:12:50,559 Speaker 1: to different inputs. So in logic you have things like gates. 252 00:12:50,600 --> 00:12:52,640 Speaker 1: Like a NOT gate is something which, if you give 253 00:12:52,640 --> 00:12:54,520 Speaker 1: it a zero, it responds with a one.
If you give 254 00:12:54,520 --> 00:12:56,600 Speaker 1: it a one, it responds with a zero. It's like a 255 00:12:56,679 --> 00:13:00,880 Speaker 1: logical map from inputs to outputs. Or an AND gate, right? 256 00:13:00,920 --> 00:13:03,240 Speaker 1: An AND gate gives you a one if both inputs 257 00:13:03,280 --> 00:13:06,640 Speaker 1: are one, and a zero otherwise. Or the converse of 258 00:13:06,679 --> 00:13:10,240 Speaker 1: that is a NAND gate, N-A-N-D, which is 259 00:13:10,280 --> 00:13:12,680 Speaker 1: the combination of an AND gate and a NOT gate. 260 00:13:13,320 --> 00:13:15,600 Speaker 1: And the really cool thing is that if you can 261 00:13:15,640 --> 00:13:18,760 Speaker 1: build a NAND gate, you can build any logical map. 262 00:13:19,280 --> 00:13:22,200 Speaker 1: NAND gates are like the basis function of logic. So if 263 00:13:22,240 --> 00:13:24,560 Speaker 1: you have NANDs, people have shown that you can build 264 00:13:24,640 --> 00:13:27,760 Speaker 1: any map from inputs to outputs and essentially any sort 265 00:13:27,760 --> 00:13:31,800 Speaker 1: of computer logic. So you can build NOT gates and 266 00:13:31,800 --> 00:13:35,440 Speaker 1: AND gates out of transistors. Transistors are like this digital switch, 267 00:13:35,840 --> 00:13:37,400 Speaker 1: and we'll go into the detail of the physics of 268 00:13:37,440 --> 00:13:40,120 Speaker 1: how they work, but essentially they're a programmable switch. You 269 00:13:40,160 --> 00:13:42,760 Speaker 1: can turn them on or off in response to other stuff. 270 00:13:43,160 --> 00:13:45,320 Speaker 1: So from that you can build logic, and from that 271 00:13:45,360 --> 00:13:47,400 Speaker 1: you can build NAND gates, and from that you can build 272 00:13:47,559 --> 00:13:51,440 Speaker 1: literally anything, like adders and comparators and anything you need 273 00:13:51,559 --> 00:13:54,840 Speaker 1: in computers. So this is like basically the smallest 274 00:13:54,840 --> 00:13:59,199 Speaker 1: little Lego brick of computing: a switch, a programmable 275 00:13:59,240 --> 00:14:00,880 Speaker 1: switch that goes from zero to one. And that's what 276 00:14:00,920 --> 00:14:03,720 Speaker 1: a transistor is an implementation of. And it didn't have 277 00:14:03,760 --> 00:14:05,640 Speaker 1: to be a transistor. It could have been something else. 278 00:14:05,679 --> 00:14:07,520 Speaker 1: It could have been DNA, it could have been whatever. 279 00:14:07,920 --> 00:14:10,760 Speaker 1: But this is like the best, fastest, smallest thing that 280 00:14:10,800 --> 00:14:14,400 Speaker 1: we have invented, and this is what revolutionized our society. 281 00:14:14,640 --> 00:14:16,080 Speaker 2: What does the transistor look like? 282 00:14:17,000 --> 00:14:19,160 Speaker 1: Yeah, what does the transistor look like? It looks like 283 00:14:19,200 --> 00:14:22,520 Speaker 1: nothing, because it's super duper tiny, right? Like the ones 284 00:14:22,520 --> 00:14:26,560 Speaker 1: that we're building these days are order nanometers, right, so 285 00:14:26,680 --> 00:14:28,880 Speaker 1: like you put one on your finger, you can't see it. 286 00:14:29,320 --> 00:14:32,640 Speaker 1: The number of transistors on a typical chip is billions, 287 00:14:33,080 --> 00:14:35,000 Speaker 1: so you can't see an individual one.
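Back to the universality claim about NAND for a moment: it's easy to check in code. Here is a minimal Python sketch in which only NAND is defined as a primitive, and NOT, AND, and OR are derived from it. The De Morgan construction for OR is a textbook identity, added here for illustration rather than something spelled out in the episode.

    # NAND as the single primitive; every other gate is built from it.
    def nand(a, b):
        return 0 if (a and b) else 1

    def not_(a):
        # NOT x == NAND(x, x)
        return nand(a, a)

    def and_(a, b):
        # AND == NOT of NAND
        return not_(nand(a, b))

    def or_(a, b):
        # OR(x, y) == NAND(NOT x, NOT y), by De Morgan's law
        return nand(not_(a), not_(b))

    for a in (0, 1):
        for b in (0, 1):
            print(f"a={a} b={b}  nand={nand(a, b)}  and={and_(a, b)}  or={or_(a, b)}")

Chain enough of these maps together and you get the adders and comparators mentioned above, which is exactly the Lego-brick point.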
You used to 288 00:14:35,040 --> 00:14:36,720 Speaker 1: be able to, like when they were first building them 289 00:14:36,720 --> 00:14:39,360 Speaker 1: in the fifties, you would like make one, you know, 290 00:14:39,960 --> 00:14:41,880 Speaker 1: on a bench. You could think of it as sort 291 00:14:41,880 --> 00:14:44,720 Speaker 1: of like three wires coming together. You have a source, 292 00:14:45,160 --> 00:14:48,400 Speaker 1: a drain, and then a gate, and the gate basically 293 00:14:48,440 --> 00:14:51,080 Speaker 1: decides, do I connect the source and the drain? Do 294 00:14:51,160 --> 00:14:54,040 Speaker 1: I open or close this switch? And so it's sort 295 00:14:54,040 --> 00:14:55,840 Speaker 1: of like a wire with a lever in it that 296 00:14:55,880 --> 00:14:58,320 Speaker 1: you know you can open or close, and then another 297 00:14:58,400 --> 00:15:01,720 Speaker 1: wire that determines whether or not that's open or closed. 298 00:15:02,160 --> 00:15:04,960 Speaker 1: So that's not a physical description of what they look like. 299 00:15:05,560 --> 00:15:07,760 Speaker 1: We can get into like the semiconductors in a minute, 300 00:15:07,800 --> 00:15:10,200 Speaker 1: but that's sort of the logical construction. And when I 301 00:15:10,200 --> 00:15:12,720 Speaker 1: think about Moore's law, I think, well, what exactly 302 00:15:12,800 --> 00:15:16,920 Speaker 1: is the connection between more transistors and speed? Like, it's 303 00:15:17,000 --> 00:15:19,080 Speaker 1: cool to have things small because then you can put 304 00:15:19,080 --> 00:15:22,840 Speaker 1: a computer in your watch or whatever. But why do 305 00:15:22,960 --> 00:15:26,440 Speaker 1: smaller computers operate faster? Because that's really the crucial key. 306 00:15:26,480 --> 00:15:29,120 Speaker 1: When you sit down at your laptop, you're not like, wow, 307 00:15:29,240 --> 00:15:33,600 Speaker 1: the transistors are super dense. You're like, wow, you know, 308 00:15:33,720 --> 00:15:36,360 Speaker 1: Word opened in a millisecond instead of, you know, 309 00:15:36,440 --> 00:15:39,880 Speaker 1: spinning my beach ball forever. Yeah. So it's the speed 310 00:15:39,920 --> 00:15:42,880 Speaker 1: that's really crucial, and that's really transformed society. Right? It's 311 00:15:42,920 --> 00:15:49,040 Speaker 1: computational power, and miniaturization means faster operation for a few reasons. 312 00:15:49,520 --> 00:15:52,000 Speaker 1: Number one, things just don't have to go as far. Right? 313 00:15:52,080 --> 00:15:55,520 Speaker 1: Electronics is limited by the speed of light. It's not instantaneous. 314 00:15:55,760 --> 00:15:58,920 Speaker 1: You close a switch, the electrons don't move instantly. Right? 315 00:15:58,920 --> 00:16:02,360 Speaker 1: The current doesn't change instantly, and so we are still 316 00:16:02,400 --> 00:16:05,080 Speaker 1: limited by the speed of light. And so if the 317 00:16:05,160 --> 00:16:08,560 Speaker 1: distances between the transistors are smaller and the transistors themselves 318 00:16:08,640 --> 00:16:11,720 Speaker 1: are smaller, things just happen faster, because there's a speed 319 00:16:11,760 --> 00:16:14,120 Speaker 1: limit to information in the universe. 320 00:16:14,480 --> 00:16:17,560 Speaker 2: That's awesome. I guess I hadn't imagined that as a 321 00:16:17,640 --> 00:16:20,520 Speaker 2: limiting factor. Okay, super cool. What's next? 322 00:16:20,720 --> 00:16:23,720 Speaker 1: Yeah, that's one.
The other is you can have wider 323 00:16:23,800 --> 00:16:26,880 Speaker 1: data paths, like instead of just using thirty two bits 324 00:16:26,920 --> 00:16:29,840 Speaker 1: to store your numbers, you can use sixty four bits, right? 325 00:16:30,120 --> 00:16:32,840 Speaker 1: Remember, bits are this essential element of binary numbers, and 326 00:16:32,880 --> 00:16:35,400 Speaker 1: so if you have like a two bit number, you 327 00:16:35,400 --> 00:16:38,040 Speaker 1: can only store between zero and three. If you have 328 00:16:38,080 --> 00:16:40,600 Speaker 1: an eight bit number, you can store many more numbers. 329 00:16:40,600 --> 00:16:42,880 Speaker 1: These days computing uses thirty two, sixty four, 330 00:16:42,920 --> 00:16:44,920 Speaker 1: one hundred and twenty eight bits. If you hear about 331 00:16:44,960 --> 00:16:47,600 Speaker 1: these numbers as sort of the core computing 332 00:16:47,760 --> 00:16:51,040 Speaker 1: of your CPU or your operating system, that's what it describes, 333 00:16:51,080 --> 00:16:54,920 Speaker 1: like what size numbers are we operating on? And this 334 00:16:55,000 --> 00:16:57,920 Speaker 1: is important because basically it's how much your computer can 335 00:16:57,960 --> 00:17:01,040 Speaker 1: do in parallel. Like if you can add two 336 00:17:01,040 --> 00:17:03,400 Speaker 1: one hundred and twenty eight bit numbers, it's really one hundred and 337 00:17:03,440 --> 00:17:07,560 Speaker 1: twenty eight bitwise operations done in parallel, whereas 338 00:17:07,840 --> 00:17:10,360 Speaker 1: if you're doing sixty four bit numbers, then you're only 339 00:17:10,359 --> 00:17:14,040 Speaker 1: doing sixty four operations in parallel. And so you can 340 00:17:14,080 --> 00:17:16,920 Speaker 1: do more operations in parallel, you can pass more data 341 00:17:16,960 --> 00:17:20,720 Speaker 1: at the same time, and so data flows more quickly. 342 00:17:20,920 --> 00:17:23,439 Speaker 1: Another thing that really limits the speed of computers is 343 00:17:23,720 --> 00:17:26,000 Speaker 1: how long does it take to get the data into 344 00:17:26,080 --> 00:17:29,440 Speaker 1: the actual CPU. Right? Like, you have these numbers in memory 345 00:17:29,440 --> 00:17:31,200 Speaker 1: you want to do some calculation on, you got to 346 00:17:31,240 --> 00:17:34,560 Speaker 1: slurp them from memory and put them into the registers 347 00:17:34,560 --> 00:17:37,119 Speaker 1: in your CPU that are actually doing the comparisons or the 348 00:17:37,160 --> 00:17:41,080 Speaker 1: adding or the subtracting or whatever. And so the wider 349 00:17:41,080 --> 00:17:43,080 Speaker 1: the data path, the faster the data gets loaded, and 350 00:17:43,160 --> 00:17:44,720 Speaker 1: the faster the computation happens. 351 00:17:45,119 --> 00:17:53,040 Speaker 2: And CPU probably means Caenorhabditis pirouetting underwater. What does 352 00:17:53,080 --> 00:17:53,760 Speaker 2: CPU mean? 353 00:17:54,280 --> 00:17:57,239 Speaker 1: CPU means central processing unit. It's a thing on your 354 00:17:57,240 --> 00:18:00,920 Speaker 1: computer that does the actual crunching, does the adding or 355 00:18:00,960 --> 00:18:04,440 Speaker 1: subtracting or comparing, or loading or unloading or writing to 356 00:18:04,560 --> 00:18:08,200 Speaker 1: memory. It's the closest thing we have to a digital brain.
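Backing up to the bit-width point for a moment, here is the range side of it as a tiny Python sketch; the counts are just standard unsigned-binary arithmetic (an n-bit number distinguishes two-to-the-n values), nothing specific to any particular CPU.

    # An n-bit unsigned number can hold 2**n distinct values: 0 through 2**n - 1.
    for bits in (2, 8, 32, 64, 128):
        top = 2 ** bits - 1
        print(f"{bits:>3}-bit: {2 ** bits:,} values (0 to {top:,})")
    # A 64-bit CPU also moves and operates on 64 of these bits at once,
    # which is the "wider data path" idea: more work done per step.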
Okay, 357 00:18:08,280 --> 00:18:11,040 Speaker 1: but there's another sort of mechanical element to, like, why 358 00:18:11,200 --> 00:18:14,280 Speaker 1: smaller means faster computers. And you know, back in the 359 00:18:14,320 --> 00:18:17,600 Speaker 1: nineteen fifties, people were doing electronics, and they were doing it 360 00:18:17,680 --> 00:18:19,600 Speaker 1: sort of the way you might do it in your garage. 361 00:18:19,800 --> 00:18:22,520 Speaker 1: You got resistors, you got capacitors, you solder them together, 362 00:18:22,600 --> 00:18:25,480 Speaker 1: you make these big sort of physical circuits. But in 363 00:18:25,520 --> 00:18:29,399 Speaker 1: the late nineteen fifties people invented what's called the integrated circuit. 364 00:18:29,800 --> 00:18:32,240 Speaker 1: An integrated circuit is just, like, you know, a big 365 00:18:32,400 --> 00:18:35,240 Speaker 1: green board, and it's got the whole circuit printed onto it. 366 00:18:35,280 --> 00:18:38,560 Speaker 1: You don't like solder the components together, and this really 367 00:18:38,600 --> 00:18:41,640 Speaker 1: allows for like the embedding of these transistors and other 368 00:18:41,680 --> 00:18:46,760 Speaker 1: components inside these protective layers, which enhances their reliability. And 369 00:18:46,800 --> 00:18:49,080 Speaker 1: so that means you can make them smaller, you can 370 00:18:49,119 --> 00:18:51,439 Speaker 1: make more complex circuits that you didn't have to like 371 00:18:51,480 --> 00:18:55,800 Speaker 1: wire together yourself with dripping hot bits of solder, and 372 00:18:55,880 --> 00:18:57,679 Speaker 1: so this makes them more reliable, so you don't need 373 00:18:57,720 --> 00:19:00,960 Speaker 1: as much error correction, et cetera. And that allows things 374 00:19:01,000 --> 00:19:03,920 Speaker 1: to be smaller and to be faster. So you've got 375 00:19:04,000 --> 00:19:07,200 Speaker 1: integrated circuits, you got wider data paths, you got shorter 376 00:19:07,320 --> 00:19:10,159 Speaker 1: distances to travel, and you have faster switching. All these 377 00:19:10,160 --> 00:19:13,160 Speaker 1: things are why more transistors means faster computing. 378 00:19:13,520 --> 00:19:16,399 Speaker 2: Okay. And so when did we get our first transistor? 379 00:19:16,520 --> 00:19:19,080 Speaker 1: Yeah, so the transistor was invented in Bell Labs in 380 00:19:19,200 --> 00:19:22,320 Speaker 1: nineteen forty seven, I think it was. And there was 381 00:19:22,400 --> 00:19:25,040 Speaker 1: a lot of research in the forties into different kinds of 382 00:19:25,040 --> 00:19:28,560 Speaker 1: technologies for transistors. Try this, try that, try the other thing. 383 00:19:28,600 --> 00:19:31,080 Speaker 1: But the basic concept was invented in the late forties 384 00:19:31,080 --> 00:19:33,480 Speaker 1: in Bell Labs. And you know, Bell Labs is one 385 00:19:33,480 --> 00:19:37,480 Speaker 1: of these like elements of another era, an institution that 386 00:19:37,560 --> 00:19:40,280 Speaker 1: I really miss. You know, it's a privately funded research 387 00:19:40,400 --> 00:19:43,080 Speaker 1: lab that did basic research. You know, this is an 388 00:19:43,200 --> 00:19:46,399 Speaker 1: arm of the telephone company.
But they just like gave 389 00:19:46,600 --> 00:19:50,680 Speaker 1: nerds money and said, hey, play around, figure stuff out, 390 00:19:50,880 --> 00:19:53,720 Speaker 1: and they came up with things like the transistor, which is, 391 00:19:53,760 --> 00:19:57,760 Speaker 1: I think... a trillion dollar idea would be underestimating it, right? Like, 392 00:19:57,880 --> 00:20:01,760 Speaker 1: it's literally the foundation of our entire economy. It's transformed 393 00:20:01,760 --> 00:20:02,560 Speaker 1: the way we live. 394 00:20:02,760 --> 00:20:03,119 Speaker 2: Wow. 395 00:20:03,359 --> 00:20:06,679 Speaker 1: And so I think even if every other piece of 396 00:20:06,720 --> 00:20:10,439 Speaker 1: science was a waste of money, this one brings the 397 00:20:10,600 --> 00:20:15,040 Speaker 1: average up. Like, this one idea, like, means all of 398 00:20:15,080 --> 00:20:19,200 Speaker 1: science has been worthwhile, just from a purely economic, cynical 399 00:20:19,240 --> 00:20:22,280 Speaker 1: point of view. And that's the way science works, right, 400 00:20:22,359 --> 00:20:26,040 Speaker 1: like, a lot of it fizzles out, and occasionally a huge, 401 00:20:26,640 --> 00:20:30,240 Speaker 1: huge payoff. Anyway, it was the late nineteen forties when people 402 00:20:30,240 --> 00:20:32,800 Speaker 1: figured this out. And you know, we've only had quantum 403 00:20:32,800 --> 00:20:34,160 Speaker 1: mechanics for a couple of decades 404 00:20:34,200 --> 00:20:34,360 Speaker 1: then. 405 00:20:34,760 --> 00:20:37,960 Speaker 1: People had ideas for making transistors before then, but weren't 406 00:20:37,960 --> 00:20:40,760 Speaker 1: able to make it work. But at Bell Labs, smart 407 00:20:40,800 --> 00:20:43,680 Speaker 1: guys figured this out. Won Nobel Prizes. It was really 408 00:20:43,720 --> 00:20:44,399 Speaker 1: pretty awesome. 409 00:20:44,680 --> 00:20:47,000 Speaker 2: Awesome. And when we get back, let's talk about how 410 00:20:47,160 --> 00:21:10,040 Speaker 2: we went about shrinking these transistors. All right. So in 411 00:21:10,119 --> 00:21:14,359 Speaker 2: nineteen forty seven, Bell Labs creates the transistor, just in 412 00:21:14,400 --> 00:21:16,560 Speaker 2: time for us to use it to get to space, 413 00:21:16,600 --> 00:21:18,520 Speaker 2: which is the most important topic that we have to 414 00:21:18,560 --> 00:21:21,360 Speaker 2: keep getting to every episode. All right. So now we've 415 00:21:21,359 --> 00:21:24,159 Speaker 2: got the transistor, how do we go about shrinking it? 416 00:21:24,359 --> 00:21:28,000 Speaker 1: Yes. The transistors are built out of semiconductors. You know, 417 00:21:28,040 --> 00:21:30,840 Speaker 1: you hear about the semiconductor industry everywhere. And what does that 418 00:21:30,880 --> 00:21:34,320 Speaker 1: really mean? Well, we understand what a conductor is, right? It's 419 00:21:34,320 --> 00:21:37,920 Speaker 1: something where electricity can flow. An insulator is something where 420 00:21:38,000 --> 00:21:41,680 Speaker 1: electricity cannot flow.
And to understand that, you have to 421 00:21:41,720 --> 00:21:43,960 Speaker 1: take your vision of the atom where you have like 422 00:21:44,000 --> 00:21:47,520 Speaker 1: electrons orbiting around the nucleus or being in fuzzy quantum 423 00:21:47,560 --> 00:21:50,000 Speaker 1: mechanical states, and think about what happens when you put 424 00:21:50,040 --> 00:21:52,680 Speaker 1: a lot of atoms together. Like, what is the energy 425 00:21:52,760 --> 00:21:55,879 Speaker 1: level of an electron around an iron atom? Well, it's 426 00:21:55,920 --> 00:21:58,399 Speaker 1: a bunch of levels. What happens when you have a 427 00:21:58,520 --> 00:22:02,399 Speaker 1: billion iron atoms in a lattice, what happens to those electrons? Well, 428 00:22:02,400 --> 00:22:05,560 Speaker 1: they don't really belong to any individual nucleus anymore. They 429 00:22:05,600 --> 00:22:08,639 Speaker 1: sort of like move around the iron super highway. They 430 00:22:08,640 --> 00:22:12,680 Speaker 1: can flow around from here to there. And what distinguishes 431 00:22:12,720 --> 00:22:16,120 Speaker 1: a conductor from an insulator is whether or not there's 432 00:22:16,160 --> 00:22:19,040 Speaker 1: a big gap between energy levels, like can the electrons 433 00:22:19,080 --> 00:22:21,520 Speaker 1: get up to those energy levels where they can flow 434 00:22:21,560 --> 00:22:25,600 Speaker 1: around between all the atoms or not. If they can 435 00:22:25,640 --> 00:22:27,600 Speaker 1: get up there, then it's a conductor. If there's a 436 00:22:27,640 --> 00:22:30,440 Speaker 1: really big gap so they can't get up there, then 437 00:22:30,480 --> 00:22:34,040 Speaker 1: it's an insulator. Semiconductors are things that are sort of 438 00:22:34,080 --> 00:22:36,959 Speaker 1: halfway in between. They have a medium sized gap between 439 00:22:36,960 --> 00:22:39,800 Speaker 1: the energy levels where the electrons are stuck around individual 440 00:22:39,880 --> 00:22:42,240 Speaker 1: atoms and the ones where they're just flowing across the 441 00:22:42,240 --> 00:22:45,920 Speaker 1: super highway. And so that's something you can control. If 442 00:22:45,960 --> 00:22:47,760 Speaker 1: you tweak it a little bit, by like adding 443 00:22:47,760 --> 00:22:49,960 Speaker 1: a little bit of germanium or some other kind of thing, 444 00:22:50,000 --> 00:22:52,960 Speaker 1: you can control that gap. And so what you want 445 00:22:53,000 --> 00:22:55,640 Speaker 1: when you're building circuits is you want places where things 446 00:22:55,680 --> 00:22:58,240 Speaker 1: conduct really well and then places where things conduct really 447 00:22:58,320 --> 00:23:01,720 Speaker 1: really terribly. So rather than having to have different kinds 448 00:23:01,720 --> 00:23:04,080 Speaker 1: of material, like if I build a circuit in my garage, 449 00:23:04,080 --> 00:23:06,199 Speaker 1: I use copper for the wires and then use 450 00:23:06,280 --> 00:23:08,920 Speaker 1: rubber for the insulators. It's better if you can have 451 00:23:08,960 --> 00:23:11,080 Speaker 1: a single kind of material and sort of tweak it, 452 00:23:11,280 --> 00:23:12,679 Speaker 1: and like, okay, I'm going to make this part of 453 00:23:12,720 --> 00:23:15,200 Speaker 1: it a conductor and that part of it an insulator, because 454 00:23:15,240 --> 00:23:18,520 Speaker 1: it allows you to print circuits onto your material. 455 00:23:18,840 --> 00:23:21,000 Speaker 2: Okay. And so what is the material you use?
456 00:23:21,320 --> 00:23:24,960 Speaker 1: So we use silicon. Silicon is the semiconductor of choice, 457 00:23:25,080 --> 00:23:27,240 Speaker 1: and then you dope it with various things to change 458 00:23:27,280 --> 00:23:30,280 Speaker 1: its behavior to make it a conductor. And the way 459 00:23:30,320 --> 00:23:33,320 Speaker 1: that we have shrunk transistors from pretty big stuff you 460 00:23:33,320 --> 00:23:37,320 Speaker 1: could see on your garage bench to tiny stuff almost 461 00:23:37,359 --> 00:23:41,040 Speaker 1: the size of atoms is through a technique called photolithography, 462 00:23:41,119 --> 00:23:44,360 Speaker 1: which essentially prints a circuit onto a piece of silicon. 463 00:23:44,680 --> 00:23:47,240 Speaker 1: We grow these huge silicon wafers that are like ten 464 00:23:47,320 --> 00:23:50,200 Speaker 1: inches, and then you want to print a circuit onto it, 465 00:23:50,400 --> 00:23:53,080 Speaker 1: and you want to print like billions and billions of transistors, 466 00:23:53,080 --> 00:23:54,720 Speaker 1: and you want them to be as small as possible, 467 00:23:54,720 --> 00:23:57,159 Speaker 1: for the reason we just pointed out. So like, 468 00:23:57,280 --> 00:23:59,800 Speaker 1: how do you print this stuff onto a piece of 469 00:23:59,800 --> 00:24:03,919 Speaker 1: silicon? So this is what photolithography is. Essentially, you 470 00:24:03,960 --> 00:24:07,119 Speaker 1: design your circuit on the computer, and then you print 471 00:24:07,240 --> 00:24:09,919 Speaker 1: on the surface of the silicon this thing called a 472 00:24:09,960 --> 00:24:13,880 Speaker 1: photomask, and the photomask like protects the silicon from 473 00:24:13,920 --> 00:24:15,639 Speaker 1: the next thing you're gonna do to it, which is 474 00:24:15,680 --> 00:24:19,160 Speaker 1: blast it with really high energy light. So you shoot 475 00:24:19,200 --> 00:24:22,479 Speaker 1: like super high energy light at the silicon, which is 476 00:24:22,840 --> 00:24:25,400 Speaker 1: partially covered by this mask, and the parts that are 477 00:24:25,440 --> 00:24:28,640 Speaker 1: exposed get a little bit chemically changed. Then you dip 478 00:24:28,680 --> 00:24:31,159 Speaker 1: the whole thing in like acid, and the parts that 479 00:24:31,160 --> 00:24:34,040 Speaker 1: were exposed get like eaten away, for example, and so 480 00:24:34,160 --> 00:24:36,879 Speaker 1: what you're left with is just the pattern that you wanted. 481 00:24:37,240 --> 00:24:41,240 Speaker 1: That's like a very hand wavy explanation of how photolithography works. 482 00:24:41,720 --> 00:24:44,560 Speaker 1: But the thing to understand is that it's limited by 483 00:24:44,640 --> 00:24:48,719 Speaker 1: those photons. Like if you use photons that have really 484 00:24:48,800 --> 00:24:51,720 Speaker 1: wide wavelengths, then you're going to get a fuzzy picture. 485 00:24:51,760 --> 00:24:55,000 Speaker 1: If you use photons with really narrow wavelengths, which means 486 00:24:55,160 --> 00:24:58,520 Speaker 1: high energy photons, right, then you're gonna get a much 487 00:24:58,520 --> 00:25:02,360 Speaker 1: crisper picture. And so over the decades, we've been trying 488 00:25:02,400 --> 00:25:05,439 Speaker 1: to shrink these transistors to get more and more transistors 489 00:25:05,480 --> 00:25:08,160 Speaker 1: on these chips and have faster computers.
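To put rough numbers on that wavelength-versus-crispness tradeoff, here is a small Python sketch using the standard photon-energy relation E = hc/lambda. The EUV wavelength matches the thirteen-or-fourteen-nanometer figure that comes up next; the visible and deep-UV rows are standard reference points added only for comparison.

    # Photon energy E = h*c / wavelength: shorter wavelength means higher
    # energy and, roughly, a sharper minimum feature you can print.
    H = 6.626e-34        # Planck's constant, J*s
    C = 2.998e8          # speed of light, m/s
    J_PER_EV = 1.602e-19 # joules per electron-volt

    def photon_energy_ev(wavelength_nm):
        return H * C / (wavelength_nm * 1e-9) / J_PER_EV

    for label, nm in [("visible (green)", 550.0), ("deep UV", 193.0), ("EUV", 13.5)]:
        print(f"{label:>15}: {nm:6.1f} nm -> {photon_energy_ev(nm):5.1f} eV")
    # EUV photons come out near 92 eV, which ordinary lenses can't focus;
    # hence the exotic optics discussed next.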
And one way 490 00:25:08,160 --> 00:25:10,560 Speaker 1: to do that is to crank up the energy of 491 00:25:10,560 --> 00:25:13,639 Speaker 1: those photons, and so now we're in the like extreme 492 00:25:13,840 --> 00:25:16,560 Speaker 1: ultraviolet limit, where the photons have a wavelength of like 493 00:25:16,640 --> 00:25:20,399 Speaker 1: thirteen or fourteen nanometers. Wow. And that's hard, because it 494 00:25:20,440 --> 00:25:23,840 Speaker 1: requires like special optics. You can't just use normal lenses 495 00:25:23,880 --> 00:25:26,320 Speaker 1: to bend this kind of light. It's why it's very 496 00:25:26,359 --> 00:25:29,160 Speaker 1: hard to do, like, X-ray optics. Also, the higher 497 00:25:29,200 --> 00:25:31,080 Speaker 1: the energy of the light, the harder it is to bend it. 498 00:25:31,400 --> 00:25:32,520 Speaker 2: Have we maxed this out? 499 00:25:32,920 --> 00:25:35,679 Speaker 1: We probably have maxed this out, because anything beyond this 500 00:25:35,840 --> 00:25:40,080 Speaker 1: requires insane optics. Like already the optics are insane. You know, 501 00:25:40,320 --> 00:25:43,800 Speaker 1: making a single mask for these things costs like hundreds 502 00:25:43,840 --> 00:25:47,240 Speaker 1: of thousands of dollars, and there's like a few places 503 00:25:47,240 --> 00:25:49,000 Speaker 1: in the world you can do this kind of stuff. 504 00:25:49,320 --> 00:25:52,920 Speaker 1: The equipment is extremely expensive, the operating conditions are very 505 00:25:53,000 --> 00:25:56,720 Speaker 1: very particular. You have to have specialized clean rooms. Like, 506 00:25:57,359 --> 00:26:01,239 Speaker 1: this is really the pinnacle of technology. It's incredible, and 507 00:26:01,280 --> 00:26:03,760 Speaker 1: that's why, you know, a few of these players in 508 00:26:03,800 --> 00:26:07,320 Speaker 1: this field, like the Taiwanese semiconductor industry, is so important 509 00:26:07,480 --> 00:26:11,520 Speaker 1: for the worldwide computing industry. Like, a single company goes 510 00:26:11,560 --> 00:26:14,560 Speaker 1: down and like we can't make computers anymore. 511 00:26:14,600 --> 00:26:18,240 Speaker 2: Wow, Oh my gosh. Right, all of the geopolitical tensions 512 00:26:18,280 --> 00:26:19,800 Speaker 2: just came into focus. 513 00:26:19,880 --> 00:26:22,679 Speaker 1: Exactly. That's one reason why Taiwan is so important, because a 514 00:26:22,720 --> 00:26:25,280 Speaker 1: lot of this stuff is done by Taiwanese firms. 515 00:26:25,520 --> 00:26:28,800 Speaker 2: All right, so we figured out photolithography and we've 516 00:26:28,920 --> 00:26:31,399 Speaker 2: kind of reached the limit. Yeah. Is that the end 517 00:26:31,440 --> 00:26:32,000 Speaker 2: of the story? 518 00:26:32,160 --> 00:26:34,439 Speaker 1: It's not quite the end of the story. And you know, 519 00:26:34,600 --> 00:26:36,439 Speaker 1: there's more to say about like how impressive it is. 520 00:26:36,480 --> 00:26:39,000 Speaker 1: Like in the mid nineties, we were doing things at 521 00:26:39,000 --> 00:26:42,280 Speaker 1: like three hundred and fifty nanometers scale, which sounds pretty awesome, 522 00:26:42,320 --> 00:26:45,399 Speaker 1: like that sounds pretty tiny. And then late nineties it 523 00:26:45,440 --> 00:26:47,520 Speaker 1: was like one hundred and eighty nanometers, and the two 524 00:26:47,600 --> 00:26:51,440 Speaker 1: thousands it was sub one hundred nanometers.
These days we're 525 00:26:51,440 --> 00:26:55,720 Speaker 1: getting down to like ten nanometers, single nanometers. It's amazing, 526 00:26:56,080 --> 00:26:59,440 Speaker 1: but it's getting harder and harder, because we're already beyond 527 00:26:59,560 --> 00:27:02,119 Speaker 1: the wavelength of the light that we're using, right, 528 00:27:02,720 --> 00:27:05,879 Speaker 1: and we're approaching the size of the atom. Right? Silicon 529 00:27:05,920 --> 00:27:09,679 Speaker 1: atoms are like zero point two nanometers across, and 530 00:27:09,840 --> 00:27:12,280 Speaker 1: like, you've got to build a transistor out of something. 531 00:27:12,760 --> 00:27:14,440 Speaker 1: It's like, you know, you can't make things out of 532 00:27:14,520 --> 00:27:17,000 Speaker 1: Legos if you only have a few of the bricks, right? 533 00:27:17,600 --> 00:27:21,479 Speaker 1: And so it's challenging to make transistors smaller than about 534 00:27:21,520 --> 00:27:25,440 Speaker 1: a nanometer, because you're really reaching that fundamental limit of 535 00:27:25,480 --> 00:27:28,760 Speaker 1: the size of the silicon atom. And every year it 536 00:27:28,800 --> 00:27:32,160 Speaker 1: gets harder. Like, it's true that we've increased the transistor 537 00:27:32,280 --> 00:27:35,919 Speaker 1: density every two years, we've doubled it, but the amount 538 00:27:35,960 --> 00:27:39,399 Speaker 1: of money spent in this research has increased by a 539 00:27:39,440 --> 00:27:41,879 Speaker 1: factor of ten or twenty, so it's not like a 540 00:27:41,960 --> 00:27:44,959 Speaker 1: constant effort every year to achieve this. We have to 541 00:27:45,080 --> 00:27:48,200 Speaker 1: ramp up the energy and the creativity. And that's great, 542 00:27:48,280 --> 00:27:50,760 Speaker 1: you know, it's like inspired all sorts of cool things 543 00:27:50,880 --> 00:27:54,520 Speaker 1: and spinoffs and whatever. But it gets really really complicated. 544 00:27:55,040 --> 00:27:57,439 Speaker 1: And the sort of cutting edge of this is to 545 00:27:57,440 --> 00:28:01,160 Speaker 1: now start stacking these transistors. So, well, don't just think 546 00:28:01,160 --> 00:28:03,760 Speaker 1: of it as a plane. Let's go up in the 547 00:28:03,800 --> 00:28:07,120 Speaker 1: third dimension. Let's make the transistors more powerful by shrinking 548 00:28:07,119 --> 00:28:09,200 Speaker 1: them further and then allowing them to grow in sort 549 00:28:09,200 --> 00:28:13,240 Speaker 1: of the third dimension above this sort of plane. And 550 00:28:13,359 --> 00:28:16,520 Speaker 1: the leading edge of technology right now are these transistors 551 00:28:16,520 --> 00:28:20,119 Speaker 1: called FinFETs. So FET, which stands for field 552 00:28:20,160 --> 00:28:24,120 Speaker 1: effect transistor, and then fin, meaning like they literally 553 00:28:24,160 --> 00:28:28,080 Speaker 1: have this like fin over the gate that controls it, 554 00:28:28,119 --> 00:28:29,840 Speaker 1: like a physical thing. It looks like a shark fin. 555 00:28:30,680 --> 00:28:34,120 Speaker 1: That makes it possible to be efficient while shrinking even further, 556 00:28:34,200 --> 00:28:36,600 Speaker 1: so you can make the sort of footprint of it 557 00:28:36,760 --> 00:28:40,400 Speaker 1: smaller while keeping its effectiveness, because you have this third dimension. 558 00:28:40,400 --> 00:28:43,800 Speaker 1: And so that's what stacking is.
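Before moving on, here is that Lego-brick limit as a back-of-the-envelope Python sketch, using the roughly-ten-nanometer feature size and the zero-point-two-nanometer silicon atom quoted above; treating each halving of feature size as one more doubling of linear density is a rough illustration, not a roadmap.

    import math

    feature_nm = 10.0  # roughly where leading-edge features are, per the discussion
    atom_nm = 0.2      # approximate diameter of a silicon atom, as quoted above

    atoms_across = feature_nm / atom_nm
    halvings_left = math.log2(feature_nm / atom_nm)

    print(f"A {feature_nm:.0f} nm feature is only about {atoms_across:.0f} atoms wide.")
    print(f"Only about {halvings_left:.1f} more halvings reach one-atom features.")
    # About 50 atoms wide, with ~5-6 halvings left: the wall is close.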
And really, people think 559 00:28:43,840 --> 00:28:46,760 Speaker 1: that we've reached the limit of what we can do technologically, 560 00:28:46,800 --> 00:28:48,920 Speaker 1: and as some of the listeners have said, we're going 561 00:28:48,920 --> 00:28:52,280 Speaker 1: in other directions, like instead of making your CPU more dense, 562 00:28:52,320 --> 00:28:55,800 Speaker 1: you just have multiple cores, or you start building other 563 00:28:55,920 --> 00:28:59,760 Speaker 1: dedicated stuff like graphics processing units that are really good 564 00:29:00,120 --> 00:29:03,080 Speaker 1: at linear algebra, which is needed for graphics and also 565 00:29:03,160 --> 00:29:06,400 Speaker 1: for machine learning. And so we're sort of like simultaneously 566 00:29:06,400 --> 00:29:08,520 Speaker 1: trying to go in many directions at once to improve 567 00:29:08,560 --> 00:29:11,360 Speaker 1: the power of computing. But it's not clear that we 568 00:29:11,400 --> 00:29:13,800 Speaker 1: can keep doing this, and a lot of people think 569 00:29:13,840 --> 00:29:16,040 Speaker 1: that we really are at the edge of what we 570 00:29:16,080 --> 00:29:17,720 Speaker 1: can do to improve computing speed. 571 00:29:17,880 --> 00:29:20,080 Speaker 2: Wow, and so stacking isn't going to be the magic 572 00:29:20,120 --> 00:29:22,280 Speaker 2: solution, because there's like limits on stacking. 573 00:29:22,560 --> 00:29:24,760 Speaker 1: Yeah, exactly, like stacking can get you a little further. 574 00:29:24,800 --> 00:29:27,080 Speaker 1: But if we're going to keep doubling, then it's hard. 575 00:29:27,640 --> 00:29:29,000 Speaker 1: And you know, I think there's something to be said 576 00:29:29,000 --> 00:29:33,600 Speaker 1: about the sociological impact of this doubling. You know, Moore's 577 00:29:33,680 --> 00:29:36,160 Speaker 1: law is not something that comes out of like the 578 00:29:36,160 --> 00:29:39,959 Speaker 1: fundamental laws of physics. It's something that was predicted and 579 00:29:40,000 --> 00:29:43,640 Speaker 1: that we maintained really over decades, which is really incredible. 580 00:29:43,720 --> 00:29:46,920 Speaker 1: Like, one of Intel's earliest processors, the 4004, 581 00:29:47,360 --> 00:29:50,400 Speaker 1: had twenty three hundred transistors in it, right, whereas like 582 00:29:50,640 --> 00:29:52,959 Speaker 1: the 80386, which I spent a lot 583 00:29:52,960 --> 00:29:56,280 Speaker 1: of time programming on as a teenager, had like hundreds 584 00:29:56,320 --> 00:29:58,800 Speaker 1: of thousands of transistors, and this MacBook I'm sitting in 585 00:29:58,840 --> 00:30:01,880 Speaker 1: front of has billions. It's incredible, but it's sort of 586 00:30:01,960 --> 00:30:04,760 Speaker 1: guided the field. I think because people thought this 587 00:30:04,920 --> 00:30:08,960 Speaker 1: was possible, and maybe even inevitable, they worked for it. 588 00:30:08,600 --> 00:30:11,280 Speaker 1: It's a target, you know. And so if you think 589 00:30:11,320 --> 00:30:14,000 Speaker 1: something is possible, then like you stay late and you 590 00:30:14,040 --> 00:30:16,040 Speaker 1: push hard and you come up with new ideas, and 591 00:30:16,080 --> 00:30:19,040 Speaker 1: so in some sense it's a self fulfilling prophecy. 592 00:30:19,240 --> 00:30:21,160 Speaker 2: Okay, so first of all, have we hit the limit 593 00:30:21,200 --> 00:30:23,520 Speaker 2: to Moore's law?
Already, or do you just think we're going 594 00:30:23,600 --> 00:30:26,240 Speaker 2: to hit it soon? Like, when is the first year 595 00:30:26,280 --> 00:30:28,560 Speaker 2: you think where we'll be like, Moore's law is gone? 596 00:30:30,480 --> 00:30:32,520 Speaker 1: I think we're right at that inflection point. You know, 597 00:30:32,560 --> 00:30:36,360 Speaker 1: we're still seeing improvements in speed, we're still seeing big 598 00:30:36,400 --> 00:30:40,160 Speaker 1: boosts in productivity, but we're sort of running out of 599 00:30:40,200 --> 00:30:43,400 Speaker 1: avenues. And so I still see that, like, my MacBook 600 00:30:43,480 --> 00:30:45,600 Speaker 1: is faster than the one I gave my son, which 601 00:30:45,640 --> 00:30:48,280 Speaker 1: is my two-year-old MacBook. But I don't know 602 00:30:48,360 --> 00:30:50,000 Speaker 1: that the one I'm going to get in two years 603 00:30:50,200 --> 00:30:53,120 Speaker 1: is going to be as much faster. So I think 604 00:30:53,120 --> 00:30:54,520 Speaker 1: we're right at that inflection point. 605 00:30:54,880 --> 00:30:57,240 Speaker 2: So that feels a little scary to me. So, like, 606 00:30:57,360 --> 00:30:59,960 Speaker 2: you know, over time we've gotten computers that are better, 607 00:31:00,200 --> 00:31:02,720 Speaker 2: and so at least, you know, in my field, almost 608 00:31:02,800 --> 00:31:05,920 Speaker 2: every five years you expect, you know, the statistical models 609 00:31:05,920 --> 00:31:08,680 Speaker 2: of the systems that we study to get more complicated, 610 00:31:08,760 --> 00:31:10,680 Speaker 2: so that we can get a better understanding out of 611 00:31:10,680 --> 00:31:13,640 Speaker 2: each one of our data sets. Are we not going 612 00:31:13,720 --> 00:31:15,360 Speaker 2: to be able to do that anymore? Or do you 613 00:31:15,360 --> 00:31:17,280 Speaker 2: think in twenty years our computers are just going to 614 00:31:17,280 --> 00:31:19,640 Speaker 2: start getting bigger again until they fill up a room, 615 00:31:19,960 --> 00:31:22,680 Speaker 2: because we're going to want to keep getting more complicated 616 00:31:23,120 --> 00:31:24,240 Speaker 2: in our analyses? 617 00:31:24,640 --> 00:31:28,000 Speaker 1: Yeah, well, I think we're already seeing our computing getting bigger. 618 00:31:28,080 --> 00:31:30,200 Speaker 1: I mean, think about like the data centers that are 619 00:31:30,240 --> 00:31:33,160 Speaker 1: being built by Meta, and Microsoft is like trying to 620 00:31:33,200 --> 00:31:36,160 Speaker 1: turn nuclear reactors back on because they need the power 621 00:31:36,160 --> 00:31:39,040 Speaker 1: for their AI data centers. These things are vast and 622 00:31:39,040 --> 00:31:42,320 Speaker 1: they're consuming huge amounts of our resources. So I think, yeah, 623 00:31:42,360 --> 00:31:45,040 Speaker 1: our appetite for computing is just growing, and even if 624 00:31:45,040 --> 00:31:46,920 Speaker 1: our computers don't get faster, we're just going to keep 625 00:31:46,920 --> 00:31:49,760 Speaker 1: building them bigger and bigger. But I also think that 626 00:31:49,800 --> 00:31:51,239 Speaker 1: for those of us who do things that are not 627 00:31:51,400 --> 00:31:55,760 Speaker 1: directly just computing, that there are other ways to increase speed. 628 00:31:56,000 --> 00:31:57,840 Speaker 1: I was talking to Katrina about this, and she was 629 00:31:57,880 --> 00:32:01,800 Speaker 1: saying that Moore's law also kind of applies to genomics.
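Daniel's transistor counts from a moment ago are enough to check the famous doubling pace directly. A quick sketch: the Intel 4004's 2,300 transistors and 1971 launch are historical, while the sixteen billion for a modern laptop-class chip is an assumed round number standing in for "billions."

import math

t0, n0 = 1971, 2_300             # Intel 4004: launch year, transistor count
t1, n1 = 2021, 16_000_000_000    # a recent laptop-class chip (assumed round figure)

doublings = math.log2(n1 / n0)   # how many times the count has doubled
print(f"{doublings:.1f} doublings in {t1 - t0} years: "
      f"one doubling every {(t1 - t0) / doublings:.1f} years")

That lands at about 22.7 doublings in fifty years, one every 2.2 years, right on the canonical Moore's law rate.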
630 00:32:02,280 --> 00:32:05,040 Speaker 1: You know, like the first study of a human genome 631 00:32:05,480 --> 00:32:09,040 Speaker 1: cost like many millions of dollars for one genome. And then 632 00:32:09,040 --> 00:32:11,320 Speaker 1: the NIH had a target, like it should cost less 633 00:32:11,360 --> 00:32:14,320 Speaker 1: than one thousand dollars to sequence a human genome, and 634 00:32:14,360 --> 00:32:16,480 Speaker 1: they hit that target, and now it's cheaper than one 635 00:32:16,520 --> 00:32:19,200 Speaker 1: thousand dollars. And where does this come from? Part of 636 00:32:19,240 --> 00:32:21,640 Speaker 1: it comes from computing, but also part of it comes 637 00:32:21,680 --> 00:32:25,240 Speaker 1: from like the miniaturization of biology. And I've seen this 638 00:32:25,440 --> 00:32:28,320 Speaker 1: just like observing her field. Something that used to be 639 00:32:28,480 --> 00:32:31,640 Speaker 1: like a PhD level of work then in a few 640 00:32:31,720 --> 00:32:34,280 Speaker 1: years becomes a little box on the lab bench. You 641 00:32:34,320 --> 00:32:37,120 Speaker 1: press a button and it's done while you're at lunch, right? Yeah, 642 00:32:37,120 --> 00:32:39,680 Speaker 1: and that allows you to now do things that were 643 00:32:39,760 --> 00:32:43,400 Speaker 1: impossible ten years earlier. And that kind of transformation of 644 00:32:43,440 --> 00:32:46,800 Speaker 1: the scope of the capacity of the field enables broader, 645 00:32:46,960 --> 00:32:49,880 Speaker 1: deeper thinking. And that's not just computing, right? That's the 646 00:32:49,880 --> 00:32:53,720 Speaker 1: miniaturization of like the actual biology, like micro little bits. 647 00:32:53,960 --> 00:32:56,920 Speaker 1: It's essentially what Theranos was tapping into, this feeling like, oh, 648 00:32:56,960 --> 00:32:59,800 Speaker 1: eventually we should be able to diagnose diseases with tiny 649 00:32:59,800 --> 00:33:02,480 Speaker 1: little drops of blood, in this kind of sense. So 650 00:33:02,480 --> 00:33:05,760 Speaker 1: I think that there's lots of dimensions that we can 651 00:33:05,960 --> 00:33:10,480 Speaker 1: follow for improving our scientific and technological industrial capacity. It's 652 00:33:10,560 --> 00:33:12,640 Speaker 1: not just, is my computer faster? 653 00:33:13,120 --> 00:33:16,720 Speaker 2: So when someone says, like you just did, such and 654 00:33:16,800 --> 00:33:21,880 Speaker 2: such follows Moore's law, do they essentially mean we do 655 00:33:21,920 --> 00:33:25,200 Speaker 2: it better with smaller stuff, and like we do it 656 00:33:25,240 --> 00:33:27,480 Speaker 2: exponentially better in particular? 657 00:33:27,360 --> 00:33:31,120 Speaker 1: Yeah, I think it's about exponential growth. That's a crucial 658 00:33:31,120 --> 00:33:34,640 Speaker 1: thing, because, you know, exponential growth builds on itself. You know, 659 00:33:34,680 --> 00:33:37,720 Speaker 1: it's like putting a dollar in the bank. Every year 660 00:33:37,760 --> 00:33:40,200 Speaker 1: you have more dollars, and those dollars earn more dollars, 661 00:33:40,200 --> 00:33:43,200 Speaker 1: and eventually you have all the dollars. Whereas like if 662 00:33:43,240 --> 00:33:46,000 Speaker 1: you're just selling lemonade and you're making a dollar every day, 663 00:33:46,160 --> 00:33:48,840 Speaker 1: you're making the same amount of dollars every day. It's 664 00:33:48,840 --> 00:33:51,760 Speaker 1: not increasing. So it's all about that exponential growth.
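The bank-versus-lemonade-stand contrast is easy to make concrete. A toy comparison with made-up numbers: the same starting dollars, one pile compounding, the other gaining a fixed amount each year.

compound, flat = 100.0, 100.0   # start both with $100 (made-up numbers)
rate, bonus = 0.05, 5.0         # 5% compound interest vs. a flat $5 per year
for year in range(50):
    compound *= 1 + rate        # growth proportional to what you already have
    flat += bonus               # the same fixed gain every year
print(f"after 50 years: compounding ~${compound:,.0f}, flat ${flat:,.0f}")

The compounding pile ends near $1,147 while the flat one reaches only $350, and the gap itself keeps accelerating; that self-reinforcement is the whole point of calling something exponential.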
And 665 00:33:51,800 --> 00:33:54,520 Speaker 1: I think that that's what people mean when they refer 666 00:33:54,640 --> 00:33:57,280 Speaker 1: to Moore's law sort of more colloquially than just like 667 00:33:57,440 --> 00:33:59,480 Speaker 1: the density of transistors on a chip. 668 00:33:59,600 --> 00:34:02,480 Speaker 2: That's kind of interesting, because like Moore's law isn't really 669 00:34:02,480 --> 00:34:05,040 Speaker 2: a law, like it's an observation. And so it seems 670 00:34:05,040 --> 00:34:07,560 Speaker 2: like now anytime we see exponential growth, we say the 671 00:34:07,600 --> 00:34:11,880 Speaker 2: words Moore's law instead of just saying exponential growth. Or 672 00:34:11,920 --> 00:34:13,200 Speaker 2: am I being negative? 673 00:34:14,040 --> 00:34:16,360 Speaker 1: No, I think you're right. And I think it says 674 00:34:16,400 --> 00:34:20,200 Speaker 1: something about our aspirations. You know, we live in a 675 00:34:20,280 --> 00:34:23,520 Speaker 1: time when we expect our children's lives to be very 676 00:34:23,560 --> 00:34:26,600 Speaker 1: different from our lives and our grandparents' lives, and that's 677 00:34:26,960 --> 00:34:30,080 Speaker 1: really unusual. Like, for most of human history, you could tell 678 00:34:30,120 --> 00:34:31,520 Speaker 1: your kids what their life was going to be like, 679 00:34:31,520 --> 00:34:33,440 Speaker 1: because it was going to be basically the same as yours 680 00:34:33,480 --> 00:34:37,120 Speaker 1: and your grandparents' for like the last ten thousand years, right, 681 00:34:37,160 --> 00:34:41,520 Speaker 1: because like change was inconceivable, because nobody had ever experienced it. 682 00:34:42,120 --> 00:34:43,960 Speaker 1: But now we live in a time when, like, we 683 00:34:44,080 --> 00:34:47,000 Speaker 1: know that's not true, and so I think it leaves 684 00:34:47,080 --> 00:34:49,880 Speaker 1: us with this like gap in our wisdom. And then 685 00:34:49,920 --> 00:34:52,480 Speaker 1: we project forward, and some of us are optimistic, and 686 00:34:52,520 --> 00:34:55,480 Speaker 1: we're like, yay, this is going to change our lives 687 00:34:55,480 --> 00:34:57,840 Speaker 1: in a way that solves all of our problems. And 688 00:34:57,840 --> 00:35:00,640 Speaker 1: as you'll hear from Adam, some of us are less optimistic, 689 00:35:01,200 --> 00:35:03,719 Speaker 1: you know, about what this means and whether it's the 690 00:35:03,800 --> 00:35:05,520 Speaker 1: right way to place our bets. 691 00:35:05,920 --> 00:35:08,080 Speaker 2: I do feel like that was a slightly simplified view 692 00:35:08,080 --> 00:35:11,400 Speaker 2: of history. But this isn't Daniel and Kelly's Historical Universe, 693 00:35:11,440 --> 00:35:12,239 Speaker 2: so we're moving on. 694 00:35:12,600 --> 00:35:14,360 Speaker 1: Hey, I have to fit it into about one minute, 695 00:35:14,360 --> 00:35:16,040 Speaker 1: so I'm not going to do a deep dive. But yeah, 696 00:35:16,160 --> 00:35:18,520 Speaker 1: I mean, do you disagree with me about the broader 697 00:35:18,520 --> 00:35:21,960 Speaker 1: assessment of the way that human experience has changed? 698 00:35:22,360 --> 00:35:24,839 Speaker 2: I do think human experience was similar for a really 699 00:35:24,880 --> 00:35:27,319 Speaker 2: long time.
You know, like when our ancestors moved out 700 00:35:27,360 --> 00:35:30,000 Speaker 2: of Africa, there was probably a lot that changed in 701 00:35:30,040 --> 00:35:34,040 Speaker 2: a couple generations, and the Industrial Revolution, and climate. Yeah, yeah, 702 00:35:34,239 --> 00:35:36,759 Speaker 2: I think there's probably been a lot of moments where 703 00:35:36,800 --> 00:35:39,359 Speaker 2: things were like, oh crud, but usually probably they were 704 00:35:39,400 --> 00:35:42,160 Speaker 2: getting worse, whereas now we're hoping that it's getting better. 705 00:35:42,280 --> 00:35:47,200 Speaker 2: But anyway, so when I was reading Adam Becker's new 706 00:35:47,239 --> 00:35:51,640 Speaker 2: book More Everything Forever, there was a discussion of Moore's 707 00:35:51,680 --> 00:35:54,759 Speaker 2: law where I realized, like, oh my gosh, I fundamentally 708 00:35:54,800 --> 00:35:58,239 Speaker 2: didn't understand Moore's law very well, or what like underpinned 709 00:35:58,280 --> 00:36:00,839 Speaker 2: Moore's law, and I didn't realize that we were, you know, 710 00:36:00,960 --> 00:36:04,200 Speaker 2: perhaps reaching the end of Moore's law. And so we 711 00:36:04,400 --> 00:36:07,120 Speaker 2: reached out to Adam Becker and asked if he would 712 00:36:07,120 --> 00:36:10,480 Speaker 2: talk to us about sort of the implication of, you know, 713 00:36:10,520 --> 00:36:13,359 Speaker 2: the death of Moore's law, I'll be super dramatic about it, 714 00:36:13,480 --> 00:36:17,759 Speaker 2: but how this expectation of exponential growth impacts our view 715 00:36:17,800 --> 00:36:22,000 Speaker 2: of the future in ways that are not always necessarily realistic. 716 00:36:22,080 --> 00:36:44,480 Speaker 1: Let's say. All right, so then we're very happy to 717 00:36:44,520 --> 00:36:48,799 Speaker 1: welcome to the podcast Adam Becker, who is an astrophysicist 718 00:36:48,920 --> 00:36:53,200 Speaker 1: turned author. He wrote the widely acclaimed book What Is Real?, 719 00:36:53,320 --> 00:36:56,000 Speaker 1: one of my favorite books about quantum mechanics. If you 720 00:36:56,120 --> 00:36:58,400 Speaker 1: write me to ask for a book about quantum mechanics 721 00:36:58,400 --> 00:37:01,600 Speaker 1: that explains stuff in an accessible way, it's often the one I recommend. And 722 00:37:01,680 --> 00:37:05,800 Speaker 1: he has a new book out called More Everything Forever, 723 00:37:06,080 --> 00:37:09,920 Speaker 1: about the rise of techno-utopians and how we can 724 00:37:09,920 --> 00:37:13,760 Speaker 1: project our future and the future of technology. Adam, welcome 725 00:37:13,800 --> 00:37:16,520 Speaker 1: back to the podcast. Thanks, it's great to be here. 726 00:37:16,800 --> 00:37:20,279 Speaker 1: So let's start just by talking about Moore's law. It's 727 00:37:20,320 --> 00:37:23,720 Speaker 1: the foundation of so much of the techno-utopian movement. 728 00:37:24,320 --> 00:37:27,360 Speaker 1: Why do you think that it has inspired sort of 729 00:37:27,400 --> 00:37:31,960 Speaker 1: this broader fanaticism, especially when it's just like an empirical observation, 730 00:37:32,160 --> 00:37:34,360 Speaker 1: not like a deep law of the universe? 731 00:37:34,760 --> 00:37:35,920 Speaker 3: Yeah, that's a good question. 732 00:37:37,200 --> 00:37:42,000 Speaker 5: I mean Moore's law. Yeah, it is an empirical observation.
733 00:37:42,920 --> 00:37:48,360 Speaker 5: But it's so regular, it's so comforting. And, 734 00:37:48,440 --> 00:37:52,160 Speaker 5: you know, the fact that Moore's law held more or 735 00:37:52,280 --> 00:37:57,560 Speaker 5: less accurately for, what, about fifty years? It did change 736 00:37:58,000 --> 00:38:00,760 Speaker 5: a lot of things about the world, and it took 737 00:38:01,520 --> 00:38:08,960 Speaker 5: computers from being these large, slow, you know, refrigerator-sized 738 00:38:09,040 --> 00:38:13,600 Speaker 5: things that live in mainframe rooms at corporations to, you know, 739 00:38:13,760 --> 00:38:15,480 Speaker 5: tiny little things that live in our pockets or on 740 00:38:15,560 --> 00:38:20,560 Speaker 5: our wrists and have much more power than all of 741 00:38:20,840 --> 00:38:23,440 Speaker 5: the mainframes that existed, you know, in the nineteen 742 00:38:23,480 --> 00:38:29,400 Speaker 5: seventies combined, right? It caused all sorts of changes in our society, 743 00:38:29,760 --> 00:38:32,400 Speaker 5: some for the better, some for the worse. But 744 00:38:32,520 --> 00:38:37,799 Speaker 5: you know, living through that, it seemed like clockwork, right, 745 00:38:37,960 --> 00:38:40,000 Speaker 5: you know. I mean, I only lived through like the 746 00:38:40,080 --> 00:38:42,960 Speaker 5: last part of it. But I remember when I was 747 00:38:42,960 --> 00:38:44,920 Speaker 5: a kid, it seemed like, you know, computers were just 748 00:38:45,040 --> 00:38:48,600 Speaker 5: always getting smaller and faster and better every single year, 749 00:38:48,640 --> 00:38:52,239 Speaker 5: and you could just get, you know, the advice was 750 00:38:52,360 --> 00:38:54,880 Speaker 5: wait as long as you can to get a new computer, 751 00:38:55,000 --> 00:38:58,040 Speaker 5: because the longer you wait, the better it'll be. Right? 752 00:38:58,600 --> 00:39:01,200 Speaker 5: And it was this amazing thing, and it made a 753 00:39:01,200 --> 00:39:03,759 Speaker 5: lot of people a lot of money, and a few 754 00:39:03,760 --> 00:39:07,120 Speaker 5: people truly enormous amounts of money. And so you put 755 00:39:07,160 --> 00:39:09,239 Speaker 5: all of that together, and it kind of makes 756 00:39:09,239 --> 00:39:12,920 Speaker 5: some sense that some people would take it extremely seriously 757 00:39:12,960 --> 00:39:17,279 Speaker 5: as this general thing, because it seemed to be, you know, 758 00:39:17,400 --> 00:39:21,960 Speaker 5: if you lived a comfortable middle- or upper-class life, 759 00:39:22,320 --> 00:39:24,520 Speaker 5: it seemed like the most important thing in the world 760 00:39:24,520 --> 00:39:28,520 Speaker 5: in the late twentieth century, right? And it wasn't really 761 00:39:28,560 --> 00:39:32,279 Speaker 5: like anything that you'd seen before. It was easy to think, oh, 762 00:39:32,360 --> 00:39:37,000 Speaker 5: this is just going to continue. So Ray Kurzweil is 763 00:39:37,040 --> 00:39:40,759 Speaker 5: this inventor and futurist who, you know, made like 764 00:39:40,920 --> 00:39:46,600 Speaker 5: real serious contributions to text-to-speech technology and like 765 00:39:46,640 --> 00:39:50,280 Speaker 5: assistive devices for the visually impaired, and I think hearing 766 00:39:50,280 --> 00:39:53,400 Speaker 5: impaired as well. You know, he made serious contributions to 767 00:39:53,520 --> 00:39:58,000 Speaker 5: the field of electronic instruments, like, you know, musical instruments.
768 00:39:58,640 --> 00:40:00,480 Speaker 3: But he is best known as a futurist. 769 00:40:00,560 --> 00:40:02,200 Speaker 5: He is best known as somebody who, you know, makes 770 00:40:02,239 --> 00:40:04,640 Speaker 5: these forecasts about what the future is going to be like. 771 00:40:04,800 --> 00:40:08,120 Speaker 1: So he's a retired electrical engineer, you're saying, essentially. Yeah, 772 00:40:08,160 --> 00:40:09,640 Speaker 1: I have a lot of those in my inbox. 773 00:40:10,280 --> 00:40:12,080 Speaker 3: Yeah, me too, man. 774 00:40:14,120 --> 00:40:16,839 Speaker 5: I'm pretty sure that if you put anywhere on the 775 00:40:16,840 --> 00:40:19,640 Speaker 5: Internet that you have a PhD in physics, you get 776 00:40:19,640 --> 00:40:21,840 Speaker 5: a lot of retired electrical engineers in your inbox. 777 00:40:22,800 --> 00:40:25,680 Speaker 2: Guys, my inbox has pictures of feces from people who 778 00:40:25,719 --> 00:40:28,600 Speaker 2: want to know about their parasite infections. I'm feeling pretty low 779 00:40:28,719 --> 00:40:30,160 Speaker 2: on sympathy right now. 780 00:40:30,400 --> 00:40:34,080 Speaker 5: Oh, but I once got... so sorry for you guys. 781 00:40:34,200 --> 00:40:35,279 Speaker 3: Yeah, no, we should. 782 00:40:35,360 --> 00:40:37,600 Speaker 5: We should have a separate episode just talking about what's 783 00:40:37,600 --> 00:40:42,120 Speaker 5: in our inboxes, because I have some crazy stuff. In 784 00:40:42,239 --> 00:40:43,120 Speaker 5: any event. 785 00:40:42,920 --> 00:40:45,440 Speaker 1: All right. So you're telling us how Ray Kurzweil was 786 00:40:45,480 --> 00:40:49,880 Speaker 1: thinking about how Moore's law is transforming technology, and that's 787 00:40:50,000 --> 00:40:54,040 Speaker 1: the engine of transformation of society, and predicting the future 788 00:40:54,080 --> 00:40:55,840 Speaker 1: of society more broadly. Exactly. 789 00:40:55,960 --> 00:41:01,320 Speaker 5: Yeah. And like, Kurzweil extends Moore's law in his forecasts 790 00:41:01,360 --> 00:41:04,239 Speaker 5: of the future and says, oh, this is part of 791 00:41:04,239 --> 00:41:08,160 Speaker 5: a more general trend in the history of technology and 792 00:41:08,360 --> 00:41:12,320 Speaker 5: the history of, you know, even life in the universe. 793 00:41:12,600 --> 00:41:15,239 Speaker 5: And he calls it the law of accelerating returns, where 794 00:41:15,239 --> 00:41:18,239 Speaker 5: he says, you know, once you have better technology, it's 795 00:41:18,280 --> 00:41:20,279 Speaker 5: going to allow you to make the technology that you've 796 00:41:20,280 --> 00:41:22,960 Speaker 5: already got even better, and then that'll just be a 797 00:41:23,000 --> 00:41:26,520 Speaker 5: self-reinforcing cycle that leads to this exponential trend.
And 798 00:41:26,640 --> 00:41:30,000 Speaker 5: Moore's law is just one manifestation of that trend, and 799 00:41:30,040 --> 00:41:33,080 Speaker 5: it's going to, you know, he says, it's something that 800 00:41:33,160 --> 00:41:35,440 Speaker 5: you can see if you look back through the entire 801 00:41:35,560 --> 00:41:39,560 Speaker 5: history not just of human technology, but the evolution of life 802 00:41:39,560 --> 00:41:41,839 Speaker 5: on Earth, because you see the same thing with biological 803 00:41:42,160 --> 00:41:45,480 Speaker 5: quote-unquote technology. And he says, you know, this is 804 00:41:45,520 --> 00:41:48,399 Speaker 5: going to continue, and in short order we are going 805 00:41:48,440 --> 00:41:50,959 Speaker 5: to reach this point that he calls the singularity, which 806 00:41:51,000 --> 00:41:54,440 Speaker 5: is where you've got, you know, technology that has developed 807 00:41:54,480 --> 00:41:56,799 Speaker 5: to such an advanced degree that it gives us, you know, 808 00:41:56,920 --> 00:42:00,560 Speaker 5: godlike powers of creation and destruction and transformation, and just 809 00:42:00,680 --> 00:42:04,799 Speaker 5: changes the fundamental nature of life on Earth and in 810 00:42:04,840 --> 00:42:05,400 Speaker 5: the universe. 811 00:42:05,880 --> 00:42:08,160 Speaker 1: Well, a lot of what you said sounds reasonable, right? 812 00:42:08,400 --> 00:42:11,520 Speaker 1: There is evolution, and there is transformation, and things are 813 00:42:11,640 --> 00:42:14,560 Speaker 1: changing more rapidly. But from reading your book and from 814 00:42:14,600 --> 00:42:17,800 Speaker 1: your tone, I'm guessing you don't agree with Kurzweil about the 815 00:42:18,000 --> 00:42:20,440 Speaker 1: singularity and how we're all going to be techno gods 816 00:42:20,440 --> 00:42:22,960 Speaker 1: in the future. Why not? Why will Daniel not be 817 00:42:23,000 --> 00:42:23,759 Speaker 1: a techno god? 818 00:42:24,160 --> 00:42:24,560 Speaker 3: Yeah. 819 00:42:24,600 --> 00:42:27,200 Speaker 2: I mean, look, Daniel in particular? Yes. 820 00:42:26,960 --> 00:42:28,960 Speaker 1: Yeah, I have a personal stake in this question. 821 00:42:29,040 --> 00:42:32,719 Speaker 5: Now, yes, Daniel in particular. Yeah, you are not going 822 00:42:32,760 --> 00:42:36,759 Speaker 5: to be a techno god, Daniel, because, you know, by 823 00:42:36,840 --> 00:42:39,520 Speaker 5: having me on this podcast, Ray Kurzweil is going to 824 00:42:39,600 --> 00:42:41,440 Speaker 5: put you on his list, and then, you know, he 825 00:42:41,520 --> 00:42:43,960 Speaker 5: won't allow you to ascend to godhood. 826 00:42:44,920 --> 00:42:46,080 Speaker 2: I knew this was a mistake. 827 00:42:46,200 --> 00:42:51,640 Speaker 1: Yeah, exactly, worse than trying to fight a land war in Asia. Huh? Yes, exactly. 828 00:42:51,760 --> 00:42:52,839 Speaker 3: Yeah, no, that's number two. 829 00:42:52,840 --> 00:42:56,080 Speaker 5: Now number one is inviting Adam Becker onto your podcast. 830 00:42:57,440 --> 00:43:01,040 Speaker 5: But I mean, look, Kurzweil is taking this exponential trend 831 00:43:01,040 --> 00:43:03,200 Speaker 5: and just extending it out into the future and saying 832 00:43:03,239 --> 00:43:06,600 Speaker 5: it's going to last forever. And the one thing that's 833 00:43:06,719 --> 00:43:10,000 Speaker 5: always true about exponential trends is that they end. 834 00:43:10,800 --> 00:43:10,960 Speaker 1: Right.
835 00:43:11,239 --> 00:43:14,560 Speaker 5: If you see any sort of exponential trend in nature 836 00:43:14,920 --> 00:43:18,520 Speaker 5: or in, you know, technology or whatever, your first thought 837 00:43:18,560 --> 00:43:23,520 Speaker 5: should be, oh, that can't last, because it just doesn't. 838 00:43:23,560 --> 00:43:27,640 Speaker 5: There are not enough resources, there's not enough space, there's 839 00:43:27,719 --> 00:43:31,840 Speaker 5: not enough anything to allow exponential trends in general to 840 00:43:31,960 --> 00:43:36,239 Speaker 5: continue forever. One of the examples that Kurzweil gives in 841 00:43:36,280 --> 00:43:38,600 Speaker 5: his book The Singularity Is Near, which is probably his 842 00:43:38,640 --> 00:43:41,280 Speaker 5: most famous book, from about two thousand and five. 843 00:43:41,520 --> 00:43:43,799 Speaker 1: Doesn't he have a few books, like The Singularity Is Near, 844 00:43:43,960 --> 00:43:46,759 Speaker 1: The Singularity Is Nearer, The Singularity Is Nearish? 845 00:43:46,920 --> 00:43:47,600 Speaker 3: Yeah, yeah, yeah. 846 00:43:47,600 --> 00:43:50,760 Speaker 5: The Singularity Is Nearer came out last year, and 847 00:43:50,800 --> 00:43:52,920 Speaker 5: when I tell people that that's the title, they usually 848 00:43:52,960 --> 00:43:53,640 Speaker 5: don't believe me. 849 00:43:53,719 --> 00:43:55,920 Speaker 3: But that is actually the title. He wrote a book 850 00:43:55,960 --> 00:43:58,359 Speaker 3: called The Singularity Is Nearer. 851 00:43:58,400 --> 00:44:00,600 Speaker 1: Next year, The Singularity Is Near Your Ear. 852 00:44:01,160 --> 00:44:03,200 Speaker 3: Yeah, Neariest. 853 00:44:04,360 --> 00:44:11,719 Speaker 5: But the classic example in biology of exponential growth is 854 00:44:11,760 --> 00:44:15,279 Speaker 5: something like bacterial growth in a petri dish. And yeah, 855 00:44:15,320 --> 00:44:18,279 Speaker 5: if you chart the number of bacteria in this, you know, 856 00:44:18,400 --> 00:44:23,160 Speaker 5: nutrient-rich medium over time, yeah, it grows exponentially until 857 00:44:23,200 --> 00:44:25,720 Speaker 5: they fill the dish and eat all of the agar, 858 00:44:26,440 --> 00:44:28,319 Speaker 5: and then they die. 859 00:44:29,200 --> 00:44:31,120 Speaker 2: To try to play devil's advocate, so when I was 860 00:44:31,120 --> 00:44:33,880 Speaker 2: talking to space settlement folks, they would say something like, 861 00:44:33,960 --> 00:44:37,080 Speaker 2: you know, the reason we need to go into space 862 00:44:37,239 --> 00:44:40,080 Speaker 2: is because exponential growth does end at some point. But 863 00:44:40,200 --> 00:44:43,040 Speaker 2: our species is so amazing that we can see when 864 00:44:43,080 --> 00:44:46,400 Speaker 2: we're getting close to, like, the asymptote in the exponential curve, 865 00:44:46,840 --> 00:44:48,680 Speaker 2: and so we can go out to space and get 866 00:44:48,719 --> 00:44:51,480 Speaker 2: resources and we can be more proactive about it. What 867 00:44:52,000 --> 00:44:53,240 Speaker 2: is wrong about that argument? 868 00:44:53,719 --> 00:44:55,600 Speaker 3: Yeah. 869 00:44:55,640 --> 00:44:57,040 Speaker 2: I mean, where to start? 870 00:44:56,440 --> 00:44:56,520 Speaker 3: Yeah. 871 00:44:57,320 --> 00:44:58,200 Speaker 2: What? You've got to pick somewhere.
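Adam's petri dish is the textbook cautionary tale, and a tiny simulation shows how abruptly it ends. All the numbers here are made up; the only real ingredient is doubling against a finite food supply.

agar_units = 1_000_000   # finite food in the dish (made-up figure)
bacteria = 1.0
generation = 0
while agar_units > 0:
    bacteria *= 2              # unconstrained exponential doubling
    agar_units -= bacteria     # each new generation eats its share
    generation += 1
print(f"exponential growth for {generation} generations, "
      "then the agar is gone and the colony crashes")

With a million units of food, the boom lasts only about nineteen generations, and doubling the dish size buys exactly one more.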
872 00:44:58,280 --> 00:45:01,080 Speaker 5: Okay, I'm going to pick on, you know, I'm going 873 00:45:01,120 --> 00:45:02,840 Speaker 5: to do what we should all strive to do, or 874 00:45:02,880 --> 00:45:04,719 Speaker 5: what I strive to do, and punch up, right? I'm 875 00:45:04,719 --> 00:45:06,080 Speaker 5: going to pick on somebody bigger than me. 876 00:45:06,480 --> 00:45:09,040 Speaker 3: Jeff Bezos makes the same argument, right? Yeah. 877 00:45:09,040 --> 00:45:12,640 Speaker 5: Jeff Bezos says that we need to go out into 878 00:45:12,680 --> 00:45:16,560 Speaker 5: space because of exactly this. He says, you know, we 879 00:45:16,800 --> 00:45:20,839 Speaker 5: are using exponentially more energy as time goes on, and 880 00:45:20,960 --> 00:45:23,600 Speaker 5: if that trend continues as it has for decades, if 881 00:45:23,640 --> 00:45:27,520 Speaker 5: not centuries, then in about two, three hundred years, we're 882 00:45:27,560 --> 00:45:29,920 Speaker 5: going to be using all of the energy on Earth 883 00:45:30,000 --> 00:45:32,160 Speaker 5: that we get from the Sun and will have used 884 00:45:32,239 --> 00:45:35,359 Speaker 5: up all of the nonrenewable resources. And so at 885 00:45:35,360 --> 00:45:37,160 Speaker 5: that point we need to go out into space, if 886 00:45:37,160 --> 00:45:39,200 Speaker 5: not before then; otherwise we're going to have what he 887 00:45:39,239 --> 00:45:42,719 Speaker 5: calls a civilization of stasis and rationing. And you know, 888 00:45:42,800 --> 00:45:46,160 Speaker 5: he's not wrong about the first part. If somehow we 889 00:45:46,280 --> 00:45:49,359 Speaker 5: continue that exponential trend in energy usage, then yeah, and 890 00:45:49,400 --> 00:45:52,399 Speaker 5: I think it's in about three, four hundred years, we'd 891 00:45:52,400 --> 00:45:54,880 Speaker 5: be using all of the energy available to us on Earth. 892 00:45:55,080 --> 00:45:58,120 Speaker 5: And also we'd be using so much energy that like 893 00:45:58,239 --> 00:46:02,080 Speaker 5: the waste heat from our energy usage would like boil 894 00:46:02,120 --> 00:46:02,880 Speaker 5: off the oceans. 895 00:46:03,719 --> 00:46:07,080 Speaker 3: We can't, we can't do that, right? It's not possible. 896 00:46:07,600 --> 00:46:10,520 Speaker 5: I mean, putting aside that, you know, it's implausible 897 00:46:10,560 --> 00:46:13,319 Speaker 5: that that trend will continue, the problem is with the 898 00:46:13,320 --> 00:46:16,799 Speaker 5: second half. Because, yeah, okay, we get like three to 899 00:46:16,800 --> 00:46:19,000 Speaker 5: four hundred more years here on Earth if you continue 900 00:46:19,000 --> 00:46:21,759 Speaker 5: that trend. So Bezos says we have to go out 901 00:46:21,800 --> 00:46:23,920 Speaker 5: into space, and, you know, what he doesn't say is 902 00:46:24,280 --> 00:46:27,440 Speaker 5: where resources are unlimited, but you know, he implies it.
903 00:46:28,120 --> 00:46:31,640 Speaker 5: The problem is that if you really want exponential growth 904 00:46:31,719 --> 00:46:34,120 Speaker 5: to continue, going out into space doesn't actually help 905 00:46:34,160 --> 00:46:36,560 Speaker 5: you that much if you're looking on a timescale of centuries, 906 00:46:36,960 --> 00:46:41,800 Speaker 5: because if you do that, like, I think it's 907 00:46:41,840 --> 00:46:45,520 Speaker 5: like one thousand years after we hit that point of 908 00:46:45,640 --> 00:46:48,320 Speaker 5: using all of the sunlight that hits Earth, we get 909 00:46:48,600 --> 00:46:50,680 Speaker 5: to a point where we're just using the entire energy 910 00:46:50,680 --> 00:46:56,120 Speaker 5: output of the Sun. And then if we spot Bezos 911 00:46:56,160 --> 00:46:59,560 Speaker 5: and company a warp drive so they can go faster 912 00:46:59,640 --> 00:47:02,839 Speaker 5: than light to try to amass even more resources 913 00:47:02,920 --> 00:47:05,960 Speaker 5: very, very quickly outside of the Solar System, which we 914 00:47:06,000 --> 00:47:09,000 Speaker 5: shouldn't spot them a warp drive. There's no reason to 915 00:47:09,040 --> 00:47:10,560 Speaker 5: think that you can build a warp drive, and a 916 00:47:10,600 --> 00:47:12,680 Speaker 5: lot of reason to think that you can't. But if 917 00:47:12,719 --> 00:47:15,640 Speaker 5: we do spot them a warp drive, that only gets 918 00:47:15,680 --> 00:47:18,480 Speaker 5: you like about another two thousand years before you're using 919 00:47:18,640 --> 00:47:23,400 Speaker 5: all of the energy in the observable universe. Wow. So, 920 00:47:23,960 --> 00:47:28,520 Speaker 5: you know, there are limits, growth ends, and the fact 921 00:47:28,640 --> 00:47:31,520 Speaker 5: is that, you know, all of that is wildly implausible. 922 00:47:31,560 --> 00:47:34,160 Speaker 5: It's not like the lesson that I want people to 923 00:47:34,160 --> 00:47:36,960 Speaker 5: take away from all of this is, oh, well, we 924 00:47:37,080 --> 00:47:39,640 Speaker 5: better keep in mind that growth has to end at 925 00:47:39,719 --> 00:47:42,280 Speaker 5: some point in the next like three thousand years. 926 00:47:42,600 --> 00:47:44,719 Speaker 5: The answer is, oh, no, growth has to end a 927 00:47:44,760 --> 00:47:49,000 Speaker 5: lot sooner than that, because, you know, going out into 928 00:47:49,000 --> 00:47:51,839 Speaker 5: space has lots of problems. Even putting aside the lack 929 00:47:51,880 --> 00:47:54,799 Speaker 5: of warp drive, just living in the Solar System is 930 00:47:55,000 --> 00:47:59,400 Speaker 5: an extraordinarily difficult and dubious proposition. To give Bezos a 931 00:47:59,400 --> 00:48:01,959 Speaker 5: little bit of credit after ragging on him just now, 932 00:48:02,200 --> 00:48:04,319 Speaker 5: one of the things I like that Jeff Bezos has 933 00:48:04,360 --> 00:48:07,320 Speaker 5: said is he makes fun of Elon Musk for wanting 934 00:48:07,360 --> 00:48:09,200 Speaker 5: to go to Mars, because Mars sucks. 935 00:48:09,680 --> 00:48:11,759 Speaker 3: But Bezos's solution, 936 00:48:11,800 --> 00:48:14,640 Speaker 5: you know, for going out into space is not considerably better, 937 00:48:15,040 --> 00:48:18,640 Speaker 5: which is to build like hundreds of thousands or millions 938 00:48:18,680 --> 00:48:21,920 Speaker 5: of enormous city-size space stations and then have everybody 939 00:48:22,000 --> 00:48:22,719 Speaker 5: live inside of them.
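Those timescales are easy to reproduce with a toy compound-growth calculation. A sketch with assumed ballpark figures: the 2.3 percent annual growth rate and the wattages below are common order-of-magnitude estimates, not numbers from the episode, and the observable-universe entry is the roughest of all.

import math

world_W = 2e13      # ~20 terawatts: rough current human power use (assumed)
growth = 0.023      # assumed ~2.3% yearly growth, about 10x per century
targets = {
    "all sunlight hitting Earth": 1.7e17,   # watts intercepted by the planet
    "the Sun's entire output": 3.8e26,      # solar luminosity
    "every star we can see": 2e49,          # observable universe, very roughly
}

for label, target_W in targets.items():
    # Solve world_W * (1 + growth)**t = target_W for t.
    years = math.log(target_W / world_W) / math.log(1 + growth)
    print(f"{label}: ~{years:,.0f} years from now")

Under these assumptions the milestones arrive around year 400, year 1,300, and year 3,600: a few hundred years to outgrow Earth's sunlight, roughly a thousand more to the full Sun, and a couple of thousand more to the whole observable universe, matching the figures Adam quotes.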
940 00:48:22,800 --> 00:48:26,080 Speaker 3: Those stations are also not a great idea, for many, many reasons. 941 00:48:26,440 --> 00:48:29,160 Speaker 1: All right. So it's reasonable, I think, to make these 942 00:48:29,239 --> 00:48:33,080 Speaker 1: arguments against like the strongest version of those claims, you know, 943 00:48:33,400 --> 00:48:36,680 Speaker 1: exponential growth will last forever. Sure. And you're right, that's 944 00:48:36,719 --> 00:48:39,840 Speaker 1: obviously impractical because the universe is finite, or the observable 945 00:48:39,840 --> 00:48:41,160 Speaker 1: part of it is finite at least. 946 00:48:41,239 --> 00:48:41,399 Speaker 3: Yeah. 947 00:48:41,480 --> 00:48:43,560 Speaker 1: Yeah. But what if we just like water down those 948 00:48:43,560 --> 00:48:45,880 Speaker 1: claims a little bit, and we just say, you know, 949 00:48:46,080 --> 00:48:50,560 Speaker 1: technology is transforming society very rapidly. And even the future 950 00:48:50,640 --> 00:48:54,760 Speaker 1: you describe while refuting exponential growth, that sounds pretty awesome. 951 00:48:54,840 --> 00:48:57,480 Speaker 1: Like if in two thousand years we're tapping into all 952 00:48:57,480 --> 00:49:00,239 Speaker 1: the energy from the Sun and nearby stars and have 953 00:49:00,280 --> 00:49:03,759 Speaker 1: an incredible, you know, star-spanning civilization, a lot of 954 00:49:03,760 --> 00:49:06,279 Speaker 1: people out there, and be like, that sounds great. What's 955 00:49:06,320 --> 00:49:06,879 Speaker 1: wrong with that? 956 00:49:07,239 --> 00:49:10,720 Speaker 5: The prospect of large numbers of people living and working 957 00:49:10,760 --> 00:49:16,680 Speaker 5: in space has an enormous number of technological and social 958 00:49:16,719 --> 00:49:20,239 Speaker 5: and political questions tied to it that are very, very 959 00:49:20,320 --> 00:49:24,680 Speaker 5: difficult to solve and may not be solvable. And some 960 00:49:24,719 --> 00:49:28,359 Speaker 5: of those problems are sort of irreducibly time-consuming. You 961 00:49:28,440 --> 00:49:33,319 Speaker 5: can't solve them without doing like lengthy experiments involving things 962 00:49:33,360 --> 00:49:36,240 Speaker 5: like radiation exposure and low-gravity exposure 963 00:49:35,920 --> 00:49:37,760 Speaker 3: and things like that. And I see Kelly nodding. 964 00:49:38,200 --> 00:49:40,560 Speaker 5: And you know, Kelly may know more about this than 965 00:49:40,600 --> 00:49:42,640 Speaker 5: I do, because, you know, this is one of the 966 00:49:42,640 --> 00:49:45,920 Speaker 5: subjects in my book. Kelly and Zach wrote an entire 967 00:49:45,960 --> 00:49:49,600 Speaker 5: book about this, an excellent book that I really like. 968 00:49:49,920 --> 00:49:51,759 Speaker 2: I do always find a way to pull the conversation 969 00:49:51,840 --> 00:49:54,560 Speaker 2: back to space settlement. So sorry for derailing us, but 970 00:49:54,719 --> 00:49:56,600 Speaker 2: you do a great chapter on it in your book. 971 00:49:56,719 --> 00:49:59,040 Speaker 5: Yeah, thank you, and you have nothing to apologize for; 972 00:49:59,160 --> 00:50:00,120 Speaker 5: it's in my book. 973 00:50:00,239 --> 00:50:02,800 Speaker 1: But let me maybe highlight a difference between the takes 974 00:50:02,800 --> 00:50:05,880 Speaker 1: you guys have in your books.
Kelly and Zach say 975 00:50:05,960 --> 00:50:09,120 Speaker 1: that, you know, we're maybe not ready to settle space, 976 00:50:09,200 --> 00:50:12,240 Speaker 1: that we haven't done the necessary legwork, and we shouldn't 977 00:50:12,280 --> 00:50:15,520 Speaker 1: get overexcited and jump too fast and send people 978 00:50:15,560 --> 00:50:17,239 Speaker 1: to Mars now, because there's a lot of stuff we 979 00:50:17,239 --> 00:50:19,759 Speaker 1: need to figure out; but that it's possible, and if 980 00:50:19,800 --> 00:50:22,359 Speaker 1: we do it right, maybe we could figure this out. 981 00:50:22,360 --> 00:50:24,600 Speaker 1: We just aren't there yet. But I feel like your 982 00:50:24,600 --> 00:50:27,600 Speaker 1: book goes a step further and suggests that, you know, 983 00:50:27,640 --> 00:50:30,600 Speaker 1: it's dangerous to make these projections. You know, somebody out 984 00:50:30,600 --> 00:50:33,239 Speaker 1: there listening might say, all right, Adam, maybe we won't 985 00:50:33,280 --> 00:50:36,480 Speaker 1: get there, you know, as far as these guys project, 986 00:50:36,800 --> 00:50:39,319 Speaker 1: but however far we get will be great. What do 987 00:50:39,360 --> 00:50:41,319 Speaker 1: you say to that person? Is there a danger in 988 00:50:41,400 --> 00:50:42,360 Speaker 1: this kind of thinking? 989 00:50:42,760 --> 00:50:43,240 Speaker 3: Yeah. 990 00:50:43,280 --> 00:50:45,600 Speaker 5: I mean, this gets back sort of to the 991 00:50:45,640 --> 00:50:48,680 Speaker 5: last question that you asked me as well, because we 992 00:50:48,920 --> 00:50:52,960 Speaker 5: don't know that it's possible to have large numbers of 993 00:50:53,040 --> 00:50:57,040 Speaker 5: humans living off of Earth. Because it's very possible that 994 00:50:57,040 --> 00:51:00,640 Speaker 5: that's not, you know, something that we can do. We 995 00:51:00,719 --> 00:51:04,480 Speaker 5: need to find a way to live safely and healthily 996 00:51:04,560 --> 00:51:09,360 Speaker 5: within the limits imposed by Earth. We can't just assume 997 00:51:09,440 --> 00:51:12,000 Speaker 5: that we're going to be able to leave. The danger 998 00:51:12,239 --> 00:51:15,759 Speaker 5: is that this rhetoric of, oh, it's always going to 999 00:51:15,800 --> 00:51:18,400 Speaker 5: be possible to expand out into space and grow forever, 1000 00:51:18,880 --> 00:51:22,120 Speaker 5: can be used, and in fact it's not hypothetical. It 1001 00:51:22,239 --> 00:51:27,560 Speaker 5: is being used to justify this sort of logic of 1002 00:51:27,719 --> 00:51:31,720 Speaker 5: rapacious consumption that is not sustainable here on Earth. 1003 00:51:32,280 --> 00:51:34,520 Speaker 3: And because there's a very 1004 00:51:34,360 --> 00:51:38,720 Speaker 5: good chance that we cannot in any meaningful way leave Earth, 1005 00:51:39,680 --> 00:51:41,719 Speaker 5: we need to stop doing that and find a way 1006 00:51:41,719 --> 00:51:42,279 Speaker 5: to live here. 1007 00:51:42,840 --> 00:51:45,160 Speaker 3: That's not to say that we shouldn't explore space. I 1008 00:51:45,160 --> 00:51:47,200 Speaker 3: think robots in space are amazing. 1009 00:51:47,640 --> 00:51:51,399 Speaker 5: Like, the Voyager probes make me cry, you know. I'm 1010 00:51:51,440 --> 00:51:54,840 Speaker 5: a cosmologist by training. I think getting data from space 1011 00:51:54,960 --> 00:51:57,680 Speaker 5: is really important and interesting.
I'm not even saying that 1012 00:51:57,680 --> 00:52:01,360 Speaker 5: we shouldn't send people into space, too. You know, the 1013 00:52:01,400 --> 00:52:05,200 Speaker 5: Apollo missions were amazing and really interesting. They were, of course, 1014 00:52:05,239 --> 00:52:08,200 Speaker 5: you know, not primarily missions of scientific discovery. It was 1015 00:52:08,239 --> 00:52:10,759 Speaker 5: about the Cold War. But still, like, the fact that 1016 00:52:10,800 --> 00:52:14,279 Speaker 5: we did like a crewed sample return mission to the 1017 00:52:14,320 --> 00:52:19,600 Speaker 5: Moon several times and nobody died is amazing. But the 1018 00:52:19,719 --> 00:52:21,640 Speaker 5: visions that we have of the future are used to 1019 00:52:21,880 --> 00:52:24,840 Speaker 5: justify all sorts of things right here and now, and 1020 00:52:24,880 --> 00:52:28,879 Speaker 5: so we need to be careful about what we think 1021 00:52:28,960 --> 00:52:30,960 Speaker 5: the future is going to look like and whether that's 1022 00:52:31,040 --> 00:52:34,799 Speaker 5: remotely plausible. And I really think that the things that 1023 00:52:34,960 --> 00:52:38,000 Speaker 5: Musk and Bezos and these other tech billionaires are talking 1024 00:52:38,080 --> 00:52:41,680 Speaker 5: about are sort of like saying, you know, yeah, well, 1025 00:52:41,680 --> 00:52:43,640 Speaker 5: it's okay that we're doing what we're doing right now, 1026 00:52:43,680 --> 00:52:46,040 Speaker 5: because in the future we're all going to live in 1027 00:52:46,080 --> 00:52:49,440 Speaker 5: like Hogwarts and have broomsticks and magic wands, and like, 1028 00:52:49,719 --> 00:52:54,200 Speaker 5: it's roughly the same level of plausibility. 1029 00:52:54,480 --> 00:52:57,279 Speaker 2: And so to try to get us connecting Moore's law 1030 00:52:57,320 --> 00:52:59,480 Speaker 2: back with where we are in the conversation. To me, 1031 00:52:59,560 --> 00:53:02,520 Speaker 2: I see the connection being that you've got this thinking 1032 00:53:02,560 --> 00:53:05,480 Speaker 2: that we're going to have exponential growth in our ability 1033 00:53:05,520 --> 00:53:07,279 Speaker 2: to do everything. So like when I was talking to 1034 00:53:07,320 --> 00:53:09,799 Speaker 2: space settlement people, they'd be like, I'd talk about a 1035 00:53:09,800 --> 00:53:12,080 Speaker 2: problem and they'd say, well, AI is going to solve 1036 00:53:12,120 --> 00:53:15,600 Speaker 2: that, everything is expanding. Our ability to do anything related 1037 00:53:15,640 --> 00:53:19,239 Speaker 2: to technology keeps expanding exponentially. And so, you know, we've 1038 00:53:19,239 --> 00:53:22,160 Speaker 2: talked about how we have limits, and so you can't 1039 00:53:22,200 --> 00:53:25,560 Speaker 2: expect exponential trends to go on forever. Do you connect, 1040 00:53:25,640 --> 00:53:29,879 Speaker 2: then, this kind of Moore's law thinking with techno-optimism 1041 00:53:30,120 --> 00:53:33,520 Speaker 2: and these sorts of views of the future? Or 1042 00:53:33,640 --> 00:53:35,320 Speaker 2: have we just gotten off on a different topic? 1043 00:53:35,760 --> 00:53:37,880 Speaker 5: No, no, no, I think these things are connected, right? Like, 1044 00:53:37,920 --> 00:53:39,759 Speaker 5: there's a reason why all of these different things are
1045 00:53:39,800 --> 00:53:42,319 Speaker 5: in my book. One of the things that I like 1046 00:53:42,840 --> 00:53:45,480 Speaker 5: to remind people about when we're talking about Moore's law 1047 00:53:45,560 --> 00:53:48,560 Speaker 5: is that Moore's law, it's not just that it's an 1048 00:53:48,600 --> 00:53:52,839 Speaker 5: empirical observation rather than a law of nature. Moore's law 1049 00:53:52,920 --> 00:53:57,240 Speaker 5: was a decision. Moore's law is a choice that the leaders 1050 00:53:57,280 --> 00:54:01,280 Speaker 5: of the semiconductor industry made, and then they continued making 1051 00:54:01,320 --> 00:54:03,919 Speaker 5: it for decades, you know. And there was a road 1052 00:54:04,000 --> 00:54:07,200 Speaker 5: map and lots and lots of different, you know, plans 1053 00:54:07,400 --> 00:54:10,879 Speaker 5: made in order to ensure the continuation of Moore's law 1054 00:54:11,160 --> 00:54:15,360 Speaker 5: for as long as possible. There are massive, massive amounts 1055 00:54:15,360 --> 00:54:18,680 Speaker 5: of money and corporate resources poured into this. And in fact, 1056 00:54:18,960 --> 00:54:22,440 Speaker 5: Moore's law is not even an example of accelerating returns, 1057 00:54:22,239 --> 00:54:24,480 Speaker 5: as Kurzweil would have it, but in a sense it's 1058 00:54:24,480 --> 00:54:27,200 Speaker 5: an example of diminishing returns, because they got, you know, 1059 00:54:27,280 --> 00:54:31,520 Speaker 5: the semiconductor industry got less bang for their buck over time. 1060 00:54:31,600 --> 00:54:34,360 Speaker 5: They had to spend more and more money, even adjusting 1061 00:54:34,360 --> 00:54:37,759 Speaker 5: for inflation, just to get the same doubling of the 1062 00:54:37,840 --> 00:54:42,120 Speaker 5: number of transistors crammed into the same space. The techno- 1063 00:54:42,239 --> 00:54:47,239 Speaker 5: utopian sort of ideas that Kurzweil pushes, using Moore's law 1064 00:54:47,440 --> 00:54:50,799 Speaker 5: as, you know, sort of the justification, and this, you 1065 00:54:50,840 --> 00:54:53,839 Speaker 5: know, eternal expansion into space stuff that we've just been 1066 00:54:53,840 --> 00:54:57,760 Speaker 5: talking about, they all sort of traffic in the idea 1067 00:54:57,800 --> 00:55:01,799 Speaker 5: that the future of technology is not just eternal exponential 1068 00:55:01,840 --> 00:55:06,120 Speaker 5: growth and expansion, but that it's inevitable. Not that 1069 00:55:06,120 --> 00:55:08,160 Speaker 5: that's, you know, something that we could do, but that 1070 00:55:08,280 --> 00:55:10,759 Speaker 5: it's what we have to do. It's what is 1071 00:55:10,840 --> 00:55:14,040 Speaker 5: going to happen, and the only alternative, if there is one, 1072 00:55:14,360 --> 00:55:17,720 Speaker 5: is the extinction of the species. And, you know, again, 1073 00:55:17,920 --> 00:55:22,120 Speaker 5: Musk is extremely clear about this. Musk has said the 1074 00:55:22,200 --> 00:55:26,000 Speaker 5: only choice we have is eternal expansion out into the cosmos 1075 00:55:26,360 --> 00:55:29,880 Speaker 5: or extinction. And when he's pushed on this, he, you know, 1076 00:55:29,920 --> 00:55:31,800 Speaker 5: brings up the fact that, you know, in about 1077 00:55:31,920 --> 00:55:34,400 Speaker 5: half a billion or a billion years, it's going to 1078 00:55:34,440 --> 00:55:36,360 Speaker 5: get so hot on Earth, because of, you know, the 1079 00:55:36,400 --> 00:55:41,319 Speaker 5: Sun getting hotter, that the oceans will boil off.
And yeah, 1080 00:55:41,600 --> 00:55:45,560 Speaker 5: that's not wrong. But, you know, a lot's gonna happen 1081 00:55:45,640 --> 00:55:48,279 Speaker 5: between now and then. Not only is it not a 1082 00:55:48,280 --> 00:55:52,400 Speaker 5: particularly pressing problem, but it may not even end up 1083 00:55:52,400 --> 00:55:55,319 Speaker 5: being our problem at all, because there are many other 1084 00:55:55,480 --> 00:55:59,640 Speaker 5: things that could cause humanity to go extinct between now 1085 00:55:59,640 --> 00:56:03,240 Speaker 5: and then, like, say, civilizational collapse due to global warming, 1086 00:56:03,920 --> 00:56:08,239 Speaker 5: for example, a problem that tech oligarchs and other billionaires 1087 00:56:08,280 --> 00:56:10,120 Speaker 5: have done a lot of work to try to prevent 1088 00:56:10,239 --> 00:56:13,760 Speaker 5: humanity from solving. But instead Musk says that the solution 1089 00:56:13,840 --> 00:56:15,680 Speaker 5: is to leave Earth. And this is the sort of 1090 00:56:15,760 --> 00:56:17,760 Speaker 5: rhetoric that I was talking about, you know, in terms 1091 00:56:17,760 --> 00:56:20,759 Speaker 5: of like, this is what this eternal expansion idea gets you. 1092 00:56:21,239 --> 00:56:23,960 Speaker 5: But it's also, I think, part of the connection with 1093 00:56:24,920 --> 00:56:28,840 Speaker 5: the logic of taking Moore's law as this law of 1094 00:56:28,960 --> 00:56:33,040 Speaker 5: nature, that we can always count on these exponential trends, 1095 00:56:33,080 --> 00:56:36,560 Speaker 5: and we can always count on human ingenuity and technical 1096 00:56:36,600 --> 00:56:39,640 Speaker 5: knowledge and know-how to get us out of any problem. 1097 00:56:39,800 --> 00:56:40,919 Speaker 3: If you believe that, 1098 00:56:41,560 --> 00:56:43,880 Speaker 5: how do you account for all of the problems in the world today? Like, 1099 00:56:43,920 --> 00:56:46,880 Speaker 5: there's so many problems that we have that are not 1100 00:56:46,960 --> 00:56:50,000 Speaker 5: amenable to technological solutions, that people have tried to solve 1101 00:56:50,000 --> 00:56:53,680 Speaker 5: for a long time, that are fundamentally social in nature, 1102 00:56:54,239 --> 00:56:57,040 Speaker 5: or, you know, had a technological component but also have 1103 00:56:57,080 --> 00:57:00,840 Speaker 5: a social component, like climate change. Right? We have a lot, 1104 00:57:01,120 --> 00:57:03,200 Speaker 5: if not all, of the technology that we need to 1105 00:57:03,280 --> 00:57:06,760 Speaker 5: address climate change, but we haven't yet as a species, 1106 00:57:06,800 --> 00:57:09,560 Speaker 5: and that's primarily a social and political issue, not an 1107 00:57:09,560 --> 00:57:10,640 Speaker 5: issue of technology. 1108 00:57:11,040 --> 00:57:14,399 Speaker 1: So, to paraphrase your argument, I think you're saying it's 1109 00:57:14,440 --> 00:57:17,600 Speaker 1: not that computers won't get faster and that technology can't 1110 00:57:17,640 --> 00:57:20,280 Speaker 1: help us in the future. It's just that we can't 1111 00:57:20,560 --> 00:57:23,160 Speaker 1: rely on it always doing so to magically solve all 1112 00:57:23,200 --> 00:57:26,440 Speaker 1: of our problems, and in doing so distract ourselves from the 1113 00:57:26,480 --> 00:57:28,600 Speaker 1: real problems we face in the more immediate future. 1114 00:57:28,800 --> 00:57:32,240 Speaker 5: Yeah, yeah. I mean, also, Moore's law is over.
I mean, 1115 00:57:32,360 --> 00:57:36,840 Speaker 5: come on, we have the transistors down about as small 1116 00:57:36,840 --> 00:57:41,000 Speaker 5: as we can get them. You know, you can't make 1117 00:57:41,040 --> 00:57:44,000 Speaker 5: a silicon transistor smaller than an atom of silicon. 1118 00:57:44,120 --> 00:57:46,320 Speaker 1: But you do see a role for technology in shaping 1119 00:57:46,320 --> 00:57:48,479 Speaker 1: our future. I mean, it's not that you don't want 1120 00:57:48,720 --> 00:57:50,080 Speaker 1: ChatGPT to cure cancer. 1121 00:57:50,720 --> 00:57:52,720 Speaker 5: I definitely agree that there's a role for technology in 1122 00:57:52,760 --> 00:57:53,520 Speaker 5: shaping our future. 1123 00:57:53,560 --> 00:57:55,920 Speaker 3: Technology is a big part of how we shape our future. 1124 00:57:56,360 --> 00:57:58,120 Speaker 5: I'm going to just pretend that you didn't say the 1125 00:57:58,160 --> 00:58:01,840 Speaker 5: thing about ChatGPT curing cancer, though. God, there's this tweet, 1126 00:58:02,600 --> 00:58:05,720 Speaker 5: like one of my favorite tweets and responses ever, is 1127 00:58:05,760 --> 00:58:10,439 Speaker 5: where Sam Altman said something like, be me, build Chat 1128 00:58:10,520 --> 00:58:13,160 Speaker 5: GPT to cure cancer or whatever, and then people start 1129 00:58:13,160 --> 00:58:15,240 Speaker 5: criticizing you, and then he like goes on and on 1130 00:58:15,360 --> 00:58:17,400 Speaker 5: and like has a pity party for himself. And then 1131 00:58:17,440 --> 00:58:22,720 Speaker 5: somebody just responded with, did you cure cancer or whatever? 1132 00:58:23,000 --> 00:58:26,760 Speaker 5: But you know, there has been actually great progress made 1133 00:58:26,760 --> 00:58:29,280 Speaker 5: in treating cancer just in the last few years, right? 1134 00:58:29,360 --> 00:58:32,360 Speaker 5: You know, like these, I don't remember the names of 1135 00:58:32,360 --> 00:58:34,120 Speaker 5: the drugs, because like I'm not a cancer guy, but 1136 00:58:34,240 --> 00:58:39,080 Speaker 5: like these approaches of like getting cancer patients' own immune 1137 00:58:39,120 --> 00:58:41,640 Speaker 5: systems to properly recognize and attack the cancers in their 1138 00:58:41,640 --> 00:58:46,160 Speaker 5: own bodies have been like incredibly successful and are really 1139 00:58:46,160 --> 00:58:49,720 Speaker 5: promising for further developments. It's really amazing. And like there 1140 00:58:49,760 --> 00:58:52,200 Speaker 5: have been all sorts of really amazing biomedical advances that 1141 00:58:52,200 --> 00:58:55,000 Speaker 5: are currently being destroyed by RFK Junior and Trump. Like, 1142 00:58:55,160 --> 00:58:57,480 Speaker 5: mRNA vaccines are one of the great success stories of, 1143 00:58:57,560 --> 00:59:00,960 Speaker 5: you know, biomedical science in the last twenty years. And 1144 00:59:01,000 --> 00:59:04,080 Speaker 5: I think that's important, and I think in general, vaccines 1145 00:59:04,120 --> 00:59:07,080 Speaker 5: are great. You know, there's all sorts of really wonderful 1146 00:59:07,160 --> 00:59:10,920 Speaker 5: technology that we've created that has made the world generally 1147 00:59:11,040 --> 00:59:13,960 Speaker 5: a better place, or has at least enabled 1148 00:59:13,560 --> 00:59:15,080 Speaker 3: people to make the world a better place. 1149 00:59:15,160 --> 00:59:15,320 Speaker 1: Right.
1150 00:59:15,360 --> 00:59:17,840 Speaker 5: In general, technology is a tool, and there are questions 1151 00:59:17,880 --> 00:59:20,600 Speaker 5: about how you use it, right? You know, nuclear power 1152 00:59:20,640 --> 00:59:22,520 Speaker 5: can be used to build nuclear power plants, but it 1153 00:59:22,520 --> 00:59:25,200 Speaker 5: can also be used to make bombs. Yada, yada, yada. 1154 00:59:25,280 --> 00:59:27,480 Speaker 5: I think I just yada-yada'd the nuclear apocalypse. 1155 00:59:27,560 --> 00:59:31,040 Speaker 3: Yeah, you did. Yeah, whatever. I'm a physicist. Of 1156 00:59:31,080 --> 00:59:32,080 Speaker 3: course that's what I'm gonna do. 1157 00:59:32,160 --> 00:59:34,520 Speaker 5: But the point is, yeah, of course there's a role 1158 00:59:34,600 --> 00:59:36,760 Speaker 5: for technology to play in shaping our future. 1159 00:59:36,760 --> 00:59:38,640 Speaker 3: It's just, two things. 1160 00:59:39,080 --> 00:59:41,280 Speaker 5: Technology is not the only thing that shapes our future, 1161 00:59:41,640 --> 00:59:46,440 Speaker 5: and the development and future direction of technology is not inevitable. 1162 00:59:47,000 --> 00:59:51,680 Speaker 5: Technology is something that humans make, and the future development 1163 00:59:51,720 --> 00:59:55,880 Speaker 5: of technology is filled with contingency and human choice. It 1164 00:59:56,000 --> 00:59:58,840 Speaker 5: is not like we build every single technology that it 1165 00:59:58,880 --> 01:00:02,439 Speaker 5: is physically possible to build. It's not on rails. It's 1166 01:00:02,480 --> 01:00:04,760 Speaker 5: not like, it's, you know, the analogy I make in 1167 01:00:04,800 --> 01:00:07,840 Speaker 5: the book, it's not like a tech tree in Civilization, right, 1168 01:00:08,320 --> 01:00:10,360 Speaker 5: where like the future of technology is just sort of 1169 01:00:10,400 --> 01:00:12,320 Speaker 5: revealed to us and we just make a 1170 01:00:12,400 --> 01:00:14,680 Speaker 5: choice about which branch we're going to pursue first. 1171 01:00:15,160 --> 01:00:17,439 Speaker 3: That's not how anything works. 1172 01:00:17,440 --> 01:00:20,160 Speaker 1: All right. Well, thanks, Adam, for coming on, and let's 1173 01:00:20,160 --> 01:00:22,680 Speaker 1: hope that ChatGPT does cure your cancer before any of 1174 01:00:22,760 --> 01:00:23,200 Speaker 1: us get it. 1175 01:00:23,400 --> 01:00:32,920 Speaker 2: Thanks for being on the show, Adam. Absolutely. Daniel and 1176 01:00:33,000 --> 01:00:36,919 Speaker 2: Kelly's Extraordinary Universe is produced by iHeartRadio. We would love 1177 01:00:37,000 --> 01:00:38,920 Speaker 2: to hear from you. We really would. 1178 01:00:39,080 --> 01:00:41,840 Speaker 1: We want to know what questions you have about this 1179 01:00:42,040 --> 01:00:43,680 Speaker 1: Extraordinary Universe. 1180 01:00:43,760 --> 01:00:46,720 Speaker 2: We want to know your thoughts on recent shows, suggestions 1181 01:00:46,720 --> 01:00:49,720 Speaker 2: for future shows. If you contact us, we will get 1182 01:00:49,760 --> 01:00:50,160 Speaker 2: back to you. 1183 01:00:50,440 --> 01:00:51,440 Speaker 3: We really mean it. 1184 01:00:51,560 --> 01:00:56,400 Speaker 1: We answer every message. Email us at questions at Daniel and Kelly 1185 01:00:56,280 --> 01:00:58,360 Speaker 2: dot org, or you can find us on social media. 1186 01:00:58,440 --> 01:01:02,240 Speaker 2: We have accounts on X, Instagram, Blue Sky, and on 1187 01:01:02,320 --> 01:01:04,280 Speaker 2: all of those platforms.
You can find us at D 1188 01:01:04,720 --> 01:01:06,280 Speaker 2: and K Universe. 1189 01:01:06,440 --> 01:01:08,000 Speaker 1: Don't be shy, write to us