[00:00:05] Speaker 1: What is software code, and can it be thought of like a magic spell? Are we in the process of building a world so complex that we will lose the ability to understand it? Or has that already happened, long ago? And what does any of this have to do with SimCity, or knowledge that already exists but no one has thought to put together, or inventions that evolve beyond the grasp of their creators? And what will coding look like in the near future? Welcome to Inner Cosmos with me, David Eagleman. I'm a neuroscientist and author at Stanford, and in these episodes we look at the world inside us and around us to understand why and how our lives look the way they do.

Today's episode is about computer code. We live in a world increasingly built out of symbols. We've got strings of code and lines of logic and invisible layers of computation stacked very deeply in our lives, from traffic lights to financial markets to weather predictions to streaming recommendations to all of our apps and our AI, and on and on. We are surrounded by systems that hum quietly in the background and orchestrate everything about our modern lives. But only rarely do we stop to ask: what is code, really? It's a tool, but it's also a massive force that has taken over the world. So where does it come from? Where is it taking us? And what does it mean to live inside systems that we ourselves have written but can't possibly fully understand?

Today's guest invites us to see code as having an aspect of magic. We're going to talk with Samuel Arbesman. He's a complexity scientist and a writer who thinks about the evolving relationship between humans and the tools that we build, and how our creations can outpace our comprehension. He's just written a new book called The Magic of Code, in which he dives into the joy and the deeper nature of programming: not just an engineering discipline, but a hopeful and enchanted practice. I would describe this book as a blurring of the boundaries between science and art, between logic and myth, between our intentions in writing code and what can actually emerge. Because the paradox is that code is built from rigid languages with strict syntax and rules, yet it lets us create whole worlds from scratch. We can simulate galaxies and evolve virtual creatures and test new economies and reimagine cities, and it gives rise to things that are complex and unpredictable and often quite beautiful. So here's my interview with Sam Arbesman.

[00:03:34] Speaker 1: So, Sam, you're a scientist with very broad interests. What drew you to writing your latest book on the subject of code?

[00:03:42] Speaker 2: Yeah. One of the things I notice in how people talk about technology and computing and code is that right now there's almost a broken conversation in society. When we talk about code or computing or the world of tech, there's this adversarial stance toward it, or people are simply worried about it. Sometimes people are just ignorant about it and unwilling to learn more. And certainly some of the adversarial stuff is reasonable. But for me, when I think about my own experience with computers growing up, it wasn't adversarial. It was full of wonder and delight. And it didn't feel like computing was just a branch of engineering. It connected to lots of different things. It was almost a humanistic liberal art that drew in language and philosophy and biology and art and how we think, all these different areas. And so I wanted to try to explain how to think about code and computing as this almost-liberal-art that attracts all these different topics.

[00:04:44] Speaker 1: And in the book you compare this to philology, right? What can you tell us about philology?

[00:04:50] Speaker 2: Yeah. Philology is a branch of humanistic study devoted to understanding the origin of words and the history of language. But doing philology required knowing about archaeology and anthropology and history and all these different topics. Eventually philology fractured, and that's where we got a lot of the different domains within the humanities. I think that in some ways computing has at least some aspect of that: philology as a unifier of lots of different topics. And so my goal was to try to show how computing and code can actually be that connective tissue between all these different domains.

[00:05:40] Speaker 1: And so you describe code as being magical. Why?

[00:05:43] Speaker 2: When I say magical, I'm not saying it's magic in the sense of "this piece of software just works, like magic," although there is some of that. For me, it's this idea: as a society we've had a desire, for millennia, to coerce the world around us through our language and our texts and our speech, to make the world do our bidding. And only in the past seventy-five-odd years, since the advent of the modern digital computer, has this been a reality, where we can actually write text, write code, and it actually does things in the world. So for me there is this deep similarity between how we've thought about magic in the ancient or medieval days, or even in the stories we tell ourselves, and the reality of code.

Of course, this analogy and metaphor can only be taken so far before it breaks down. I certainly take it to the bending point, if not the breaking point. But there are a lot of deep similarities. For example, in our stories magic often requires a certain amount of training and knowledge. It's a craft. It's not just a thing that works; it requires you to learn certain things. So we have Hogwarts School of Witchcraft and Wizardry: you have to go there for seven years or whatever it is. And so too with code. It doesn't necessarily just work; you actually have to understand the nature of syntax and the details of code. That example, as well as others, I use to show the ways in which the analogy of magic can be a productive and useful one to help us better understand how code works.

[00:07:20] Speaker 1: So it's not just a set of instructions. It's like a spell in the sense that you put in a strange set of symbols and it does stuff in the world: it moves electrons, or launches rockets, or models pandemics, or creates simulations. And that's the sense in which it's got this magic to it. But you also point to the fact that there's often unpredictability in code, and the emergence of things we didn't expect. So can you give us an example of this dual nature of code?

[00:07:50] Speaker 2: Yeah. Certainly in the world of magic we have a lot of these stories. There's the story of the Sorcerer's Apprentice (I think there's the old version and then the Disney Mickey Mouse version), where some sort of magic has unanticipated consequences, and suddenly you have brooms walking around and flooding a basement. And the same kind of thing is true with code.
[00:08:11] Speaker 2: In code, people who might not be familiar with programming think of it as: I have this idea in my mind, and I'm going to instantiate it in a computer program. And there is that, but there's also a huge amount of debugging and frustration, because oftentimes when you write a program there's a gap between how you think it will work and how it actually does work. And often the reality of the program, and a better understanding of it, is only revealed through these bugs, through glitches and edge cases and things like that. There are many situations where we only see bizarre errors, and only then, based on those errors and unanticipated consequences, do we realize how the thing actually works. So there's a well-known story of someone who, I think, was a systems administrator for a university department. He was told by the chair of the department that their email could only be sent about five hundred miles away, and the sysadmin said, that's insane; that's not how email works. It turned out, by delving into it, he found that there was an older piece of software that hadn't been upgraded; the newer system didn't realize this and would time out, but only after a very small amount of time. And it turns out that, based on the speed of light and that small amount of time, it worked out to about five hundred miles. It was this weird, unanticipated consequence. And beyond that, there's the simple fact that when you stitch systems and pieces of software together, they all interact in unexpected ways. And that's also the kind of unanticipated consequence we see.
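The arithmetic behind that story is easy to check. In the widely circulated version, the misconfigured mail server gave up on connections after roughly 3 milliseconds, and a signal can only get so far in that time. A minimal sketch of the calculation, assuming that ~3 ms figure (it isn't stated in the episode):

    # Back-of-the-envelope check of the "500-mile email" story.
    # Assumption: a ~3 ms connection timeout, the figure from the
    # widely circulated telling of the incident.
    SPEED_OF_LIGHT_KM_PER_S = 299_792.458  # speed of light in vacuum
    KM_PER_MILE = 1.609344

    timeout_s = 0.003  # the server hung up after about 3 milliseconds

    # The farthest any signal could travel before the timeout fired:
    max_km = SPEED_OF_LIGHT_KM_PER_S * timeout_s
    max_miles = max_km / KM_PER_MILE

    print(f"{max_miles:.0f} miles")  # ~559 miles: right around five hundred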
[00:09:41] Speaker 2: Sometimes some weird little system fails to get upgraded, and suddenly all the airline systems go down for a certain amount of time, or whatever it is. So there is that kind of unanticipated consequence in lots of different ways. And we're seeing this, of course, even more so with AI.

[00:09:56] Speaker 1: Okay. So this is the thing that you and I both love: the emergence of complexity in the world. With code, it's a very specific, detailed set of instructions, and yet it can break, it can decay, it can be opaque. All kinds of things can happen. And sometimes when we try to model complexity, we actually unleash complexity. So tell us your take on how code can do things that we didn't expect it to do.

[00:10:21] Speaker 2: Yeah. When we think about engineered systems more broadly, and certainly when it comes to code, you think: it's designed by people, it sounds very logical, it's derived from mathematics, so it should be simple and straightforward. And very small bits of code are. But the truth is, it adds up. Through this combination of programs growing in size, becoming connected to various other bits of code out there, and just engaging with the messiness of the world around us, you end up with a certain amount of unexpectedness, as well as a reduced understanding. Part of this, and you mentioned the breaking and the growing over time, is the whole phenomenon of legacy code, where code has been around for a very, very long time. We have systems still in use that were developed decades ago; they might be running parts of the IRS, but they were first developed back in the Kennedy administration. There are all these crazy examples where the people who first made these things might be long retired, they might be dead, and we just don't fully understand the systems anymore. So it's this weird situation where we have to recognize that even systems of our own construction, when they become big enough, take on a qualitative difference: they become almost biological, organic, in their complexity. As a result we have a reduced understanding, and we have to take almost biological modes of studying these systems, whether it's like the naturalists of old going out and collecting bugs (in this case bugs and errors rather than insects), or just tinkering at the edges to better understand a system, because the thing overall is something you don't fully understand. So these systems are engineered, but they also demand a certain amount of humility from us in trying to understand them.

Part of that is because one of the other features of computing and software is the idea of abstraction: you can build things on top of other pieces, and those pieces are sophisticated units that you can use as standalone bits without having to worry about what's underneath them. That modularity is very, very powerful, but as a result there is a decreased amount of understanding. And sometimes not understanding what's going on under the hood, even when you yourself are programming those pieces, or programming the things that interact with them, means you can also get a certain amount of unanticipated consequences.
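To make the layering he's describing concrete, here is a minimal sketch (my own illustration, not an example from the book). Each function exposes a small interface and hides the layer beneath it, so whoever calls the top layer never touches, and never needs to understand, the bottom one:

    # A toy illustration of abstraction through layering.

    def checksum(data: bytes) -> int:
        """Bottom layer: a toy integrity check."""
        return sum(data) % 256

    def make_packet(payload: bytes) -> dict:
        """Middle layer: wraps a payload; callers never invoke checksum() themselves."""
        return {"length": len(payload), "checksum": checksum(payload), "body": payload}

    def send_message(text: str) -> list:
        """Top layer: splits text into packets; callers never see make_packet()'s dict."""
        chunks = [text[i:i + 8].encode() for i in range(0, len(text), 8)]
        return [make_packet(chunk) for chunk in chunks]

    # A user of send_message() can treat it as a standalone unit,
    # which is exactly the power, and the opacity, described above.
    packets = send_message("hello, layered world")
    print(len(packets))  # 3

Swap out the internals of checksum() and no caller of send_message() ever notices: powerful, and a little bit blinding.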
[00:13:12] Speaker 1: And so what are the key limitations that you see when we try to model very complex systems?

[00:13:18] Speaker 2: One of the things I think about with modeling complex systems is: what is the goal of the model? There are situations where we really want perfect fidelity to a real-world system. With weather prediction, you don't just want to understand how air moves around in general; you want to know whether or not it's going to rain tomorrow, or in an hour, or in two hours. For that you need a great deal of data and a great deal of complexity, and the resulting models might be very powerful and very sophisticated, but often with a reduced understanding of how they're doing what they're doing. On the other hand, if you just want to understand the features of a system, you can sometimes get away with a much simpler model, one that doesn't capture exactly how the real system works but at least captures some of the complexity, and the emergence you were talking about. For example, and this is a kind of trivial example: the computer game SimCity. It is not modeling an actual city. But for giving you an intuitive sense of how feedback operates, or how unanticipated consequences work, or just the fact that complex systems can bite back and do weird things you might not expect, SimCity is great for that kind of thing. It can also give you a sense of: when I do this, according to this model of how Will Wright, or whoever programmed it, thought cities work, this might be what happens. Whether that's actually how a city operates is an entirely different question. So I often think about the ultimate goal of the model. Is the goal to understand things? Then we have to recognize that our human minds are really limited when it comes to understanding complex and nonlinear systems, and we need these simplified models. If the goal is actually just prediction, then sometimes a really complex model can work, but at the cost of reduced understanding.
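A toy example of the kind of feedback he means (my own sketch, not drawn from the book or from SimCity's actual rules): a two-rule "city" in which population raises congestion and congestion suppresses growth. The same two rules settle smoothly or oscillate wildly depending on one coupling constant, which is exactly the sort of behavior that is hard to read off the rules themselves:

    # A deliberately tiny "city": population grows, congestion rises
    # with population, and congestion suppresses further growth.

    CAPACITY = 50_000.0  # population at which congestion halts growth

    def simulate(feedback_strength: float, years: int = 12) -> list:
        population = 10_000.0
        history = []
        for _ in range(years):
            congestion = population / CAPACITY            # 0.0 up to ~1.0
            population *= 1 + feedback_strength * (1 - congestion)
            history.append(population)
        return history

    for strength in (0.5, 2.5):  # gentle vs. strong feedback
        tail = simulate(strength)[-4:]
        print(f"strength {strength}: " + ", ".join(f"{p:,.0f}" for p in tail))

    # strength 0.5 glides up toward 50,000 and stays there;
    # strength 2.5 overshoots and keeps swinging between roughly
    # 30,000 and 60,000, above and below the same capacity.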
[00:15:15] Speaker 1: So as AI-generated code becomes more common, what does that do to our sense of authorship, and even understanding? Could we end up in a situation where we're surrounded by systems that are running our world that we don't understand at all?

[00:15:30] Speaker 2: To be honest, I think we're probably there already. It's just that many people are not aware of that fact. This is one of those situations where, as we build more and more complex systems and everyday users get more distant from them, we just don't realize the sheer complexity. When the Apple Watch first came out, years ago, there was an article in the Wall Street Journal, I think in the style section, about whether people were still going to use mechanical watches (the answer is they still are), and they interviewed one guy about whether he'd want a mechanical watch or a smartwatch. He said something to the effect of: of course I want a mechanical watch; a mechanical watch is so complex, as opposed to a smartwatch, which is just a chip. And the thing is, a chip is orders of magnitude more complex than a mechanical watch. But we've been shielded from that, and I think with AI-generated code we're going to have another level of shielding. I do think we need better mechanisms for interrogating these systems. On the one hand, it is very good that we can now generate code via AI and build simple tools and actually democratize software development; there are lots of interesting things there. But I still think that understanding code to a certain degree, being able to dive into the code that's being generated and tweak it, not only gives you a better understanding of what you're doing, it also really helps make sure the code is doing at least partly what you hoped for. That being said, our systems have always been imperfect, and this moment is just heightening that fact. Maybe it will give people a better awareness that these systems have always been enormously complex, enormously imperfect, made by humans at least at some level, and maybe a greater appreciation for building systems on top of them, perhaps AI-generated as well, that can help keep the unanticipated consequences as minimal as possible.

[00:17:27] Speaker 1: You know, right after I asked my question about whether we could end up living inside systems that are too complex for us to understand, I was thinking: obviously we live inside our biology, and we understand for certain only a fraction of what's going on inside us biologically. But we try to eat the right foods and get exercise, and we just ride along on top of this system. So we're actually quite used to living inside systems that are beyond our understanding.

[00:17:53] Speaker 2: Yeah, and that goes back to the biological nature of these massive, complex computational systems. The more we recognize that these systems are really complex and have this almost organic quality, the more we realize we need different ways of approaching them, the way we approach our bodies. Right? It's not that I have either total ignorance of how the system works or complete understanding. There's a lot in between, and rules of thumb and the like are very powerful. At the same time, we don't want to succumb to something like the biohacking trend, which five or ten years ago was a really big thing: if I can just find this one chemical to ingest, or this one cool trick, then I'll never need to sleep again, or I'll be healthy forever. We've come to realize that our bodies have evolved over millions of years and are optimizing a huge number of different things, and they're going to be imperfect and weird. Maybe we can find those tricks, but the odds that we will are very low. And that same kind of approach needs to be used when we think about these complex technologies we're building around us as well.

[00:19:01] Speaker 1: So let me ask you this: if we fast-forward one hundred years, are we still coding by using symbols and syntax, or is it a completely different sort of thing, where we are setting initial conditions and letting complexity evolve?

[00:19:15] Speaker 2: I would say it's very different. And I don't think we need to go even one hundred years into the future; it could be five or ten years until we're mostly managing these AI-generated systems. But the truth is, when I think about what coding is, it's always been a moving target. It's always been changing. The way I learned how to program, some of the languages I learned: they're not totally extinct, but they're not things I would ever consider using nowadays. And when I think about how people before me learned to program, that too was something very different: plugging in cables, flipping switches, writing things in binary or assembly code. I never did those kinds of things, nor do I really have a strong desire to. And that's okay, and I think it's going to continue changing. One of the ways I think about this is a story I tell in the book, from the Talmud, actually, describing a conversation between God and Moses. They're discussing who's going to be the greatest scholar in the future, and God says it's going to be a certain rabbi a thousand or two thousand years in the future, and Moses says, can you show him to me? So there's this weird time-travel moment where Moses is transported to the hall of study a thousand-plus years in the future, and he sits in the back listening to this illustrious scholar, and he realizes he doesn't understand anything the man is talking about. He's overwhelmed: I'm the one who received the law from Heaven, and I don't get it. Until, at the very last moment, the rabbi says, the way we understand all this is because of the law received from Moses at Sinai. And at that point Moses is calmed, because he realizes that even if he doesn't understand it, there's this clear, continuous line, a continuous tradition. I feel like with code the same thing is true. Coding has changed, and it will continue to change. It'll be much more like managing AI systems, or some other thing entirely. But ultimately it's all about taking some idea in our heads, finding some way of instantiating it in a machine, and actually getting the machine to do something, to do our bidding. What that looks like will always be changing, but as long as we recognize it's all part of this long tradition, then it's all coding, whether or not it's the syntax of Python or Perl or whatever. It will definitely not be that. But that's okay.
[00:21:29] Speaker 1: First of all, I love that story. I had no idea there was time travel in the Talmud. That's amazing. I want to come back to this point we've been talking about, that we're already living inside a system so complex we don't understand it. Because we program simulations, we program other things, but increasingly what it means is that we're really living inside this opaque simulation. And this will be even more true for our descendants, who'll be living inside a world of creation that they can't understand explicitly.

[00:22:00] Speaker 2: Danny Hillis, the computer scientist, has this great term. He talks about how we've moved from the Enlightenment, the age where we could take our minds and apply them to the world around us and really understand it, to the Entanglement, an era where everything is so hopelessly interconnected that we're never going to fully understand it. And I think we've been in the Entanglement for quite some time; it's really just a matter of becoming a little more aware of it. Going back to the analogy of biology and technology (and biology is a form of technology), someone once told me that the way he thought about it, the most complicated engineered system that humans have ever made is the domesticated dog. We made them; we evolved them, basically, through our artificial selection; and they're an enormously complicated system. And those kinds of approaches, tinkering with systems, evolving them, reckoning with these things as enormously complicated, might be the kinds of approaches that we need. That being said, one of the other things that gives me at least a certain amount of hope, though it's kind of weird when you think about it, is the extent to which humans are also really good at adapting to the world around us. Think about all the technological changes that have come over the past couple hundred years. These things were enormously destabilizing, but in many ways we now take them for granted, and not just the older ones like air travel or the Industrial Revolution, but the more modern ones around the internet too. My grandfather lived to the age of ninety-nine. He was a retired dentist, but he had also read science fiction since the modern dawn of the genre; he read it his entire life. I think he read Dune when it was serialized in a magazine, so no story could surprise him. And I remember when the iPhone first came out, I went with my grandfather, as well as my father, to the Apple Store to check it out. We're playing with one, looking at it, and at one point he goes: this is it. This is the object I've been reading about all these years. And yet we've moved from the iPhone as this object of wonder out of science fiction to complaining about camera resolution or battery life. We've adapted so quickly. On the one hand, that's good, and it gives me hope that we are going to figure out ways of adapting to new types of complexity. On the other hand, it means we don't necessarily retain that capacity for wonder. And when it comes to the complex systems around us, maybe we need a more critical stance, actually asking what kinds of systems we want to be embedded within, as opposed to just allowing them to wash over us. But I do think that adaptive capacity gives me a little bit of hope.

[00:24:49] Speaker 1: Yes. You probably know the routine from the comedian Louis C.K. where he's on an airplane and, for the first time, they announce there's Wi-Fi on the plane, and he's amazed; everyone on the plane is amazed; whoever heard of this? Then ten minutes into the flight the Wi-Fi breaks, it stops working, and the guy next to him starts complaining, and Louis C.K. says: ten minutes ago you didn't even know this existed, and now you're complaining about it? So yes, it is true that we adapt that quickly. Okay, so let me ask you something really random. Given the evolution of the complexity all around us, what is your opinion on whether we are already living in a simulation?

[00:25:25] Speaker 2: For me, if you take the simulation hypothesis not necessarily seriously, as a question of great importance, but as a question that leads you to think about more things around physics and computing, then I think it can actually be very productive. In the same vein as the simulation hypothesis, there are interesting aspects around breaking out of a computer program when you're inside it, or the high-resolution fidelity of computer games, or just the ways in which physical reality and computing intersect. Oftentimes when we think about computation, we think of it as this ephemeral information-stuff, and I think that's a really powerful way of thinking about it. But the truth is, computing and computers are deeply physical. The internet is not just information whizzing around. It is, to use the term from that senator of a number of years ago who was widely mocked, a series of tubes.
[00:26:27] Speaker 2: And actually there's a book called Tubes based on that, about the physical infrastructure of the internet. There is a lot of this physicality, and thinking about the physical nature of our computing can be really powerful. Sometimes the simulation hypothesis can heighten that, or can make you realize there are some interesting bugs worth thinking about. For example, there was a story I read where, in some hospital, people noticed that iPhones stopped working when they were near one MRI machine. It wasn't happening to Android phones; it was just Apple products. It turned out that some switch or other component within these Apple devices had a small enough gap that, when the MRI machine developed a helium leak, the helium atoms were just the right size to get inside, while Android devices and other things were unaffected. It was this wild thing that just brought home the deeply physical nature of computing. So when I think about the simulation hypothesis, I don't think of it as: oh no, I'm being controlled by aliens, or by humans in the future, or whatever it is. It's much more about how to think about breaking open computer games, or the deeply physical nature of bugs. That's the kind of stuff I find most interesting about the simulation hypothesis. I also think about it as almost a cry for myth in the tech world. The Silicon Valley world is deeply rational, deeply logical, but we still need some sort of myth, some organizing story, in our world. And the simulation hypothesis, and ideas around the singularity, and certain ideas around longevity or AI: many times they're based on technology, but when they get big enough, those ideas veer into myth and story-land. So when people take those ideas a little too seriously, I view it as fitting a certain amount of myth into that myth-shaped hole.

[00:28:49] Speaker 1: So let's return to your grandfather and the iPhone. How do you recommend, in your book and in your life, preserving our sense of magic around the technology that we have?

[00:29:00] Speaker 2: When I think about magic and wonder and delight in computing, it's never really been an either/or, where there's either corporate SaaS software or the fun, weird things. Some people might tell a story that there used to be more of the wondrous stuff, and now we're just locked into large social media sites, just using these large, bland, beige pieces of software. I think there is an element of that, but the truth is these two aspects of computing have always coexisted. Alongside the really big mainframes, the refrigerator-sized computers, there were people trying to build early computer games. Once we had personal computers, there were a lot of fun, weird things, people experimenting with fractals, but also people using spreadsheets in businesses. So it's not an either/or. Even on the web now, alongside the large websites, there are people talking about what's been called the poetic web: the more human-scale, fun, funkier, and weirder sort of websites.
And for me, it's 585 00:30:14,280 --> 00:30:18,560 Speaker 2: really just a matter of trying to actually discover 586 00:30:18,640 --> 00:30:20,400 Speaker 2: these kinds of things and realize that it's 587 00:30:20,440 --> 00:30:22,720 Speaker 2: always been out there, and it's really just a matter 588 00:30:22,920 --> 00:30:25,280 Speaker 2: of being able to find it. And so for me, 589 00:30:25,920 --> 00:30:27,560 Speaker 2: I kind of view some of the ideas in the 590 00:30:27,560 --> 00:30:30,320 Speaker 2: book almost as a proof of existence of, oh, 591 00:30:30,360 --> 00:30:32,280 Speaker 2: these things do exist out there. You don't 592 00:30:32,280 --> 00:30:34,160 Speaker 2: necessarily have to be as excited as I am by 593 00:30:34,200 --> 00:30:36,960 Speaker 2: some of the examples I give, but let that be 594 00:30:37,560 --> 00:30:40,640 Speaker 2: a guide to: okay, there are other things out there 595 00:30:40,720 --> 00:30:45,160 Speaker 2: that are just worth enjoying and experiencing and delighting in. 596 00:30:45,600 --> 00:30:48,280 Speaker 2: And I think part of that often is just 597 00:30:48,320 --> 00:30:52,440 Speaker 2: at the smaller scale. And I do actually think 598 00:30:53,120 --> 00:30:56,120 Speaker 2: that one of the exciting things about AI-generated code 599 00:30:56,440 --> 00:30:59,240 Speaker 2: is it really allows for this kind of democratization of 600 00:30:59,360 --> 00:31:01,880 Speaker 2: building software. And so people have talked about this 601 00:31:01,920 --> 00:31:03,160 Speaker 2: kind of thing for a very long time: 602 00:31:03,320 --> 00:31:05,680 Speaker 2: it shouldn't just be the domain of big companies 603 00:31:05,800 --> 00:31:08,080 Speaker 2: or serious software developers to build things that 604 00:31:08,120 --> 00:31:10,720 Speaker 2: are going to be used by millions or hundreds 605 00:31:10,720 --> 00:31:12,640 Speaker 2: of millions of people. There should be a way for 606 00:31:12,760 --> 00:31:15,120 Speaker 2: each individual user to build the bespoke thing 607 00:31:15,120 --> 00:31:18,200 Speaker 2: they want. And so the novelist Robin Sloan has this phrase, 608 00:31:18,600 --> 00:31:20,040 Speaker 2: I think it's "an app can be a home 609 00:31:20,040 --> 00:31:22,480 Speaker 2: cooked meal," this idea that you don't necessarily need 610 00:31:22,640 --> 00:31:24,840 Speaker 2: to build something for everyone. You can build a 611 00:31:24,880 --> 00:31:27,400 Speaker 2: little program for yourself or for your loved ones, and 612 00:31:27,440 --> 00:31:29,720 Speaker 2: that's fine, and that's great. In fact, the truth 613 00:31:29,720 --> 00:31:31,719 Speaker 2: is spreadsheets were actually a simple version of this kind 614 00:31:31,720 --> 00:31:33,760 Speaker 2: of thing, because you can actually program in very simple ways. 615 00:31:33,760 --> 00:31:35,520 Speaker 2: And then there was HyperCard on 616 00:31:35,520 --> 00:31:37,120 Speaker 2: some of the early Macintoshes, which was 617 00:31:37,120 --> 00:31:39,760 Speaker 2: this authoring program for building weird little 618 00:31:40,600 --> 00:31:43,040 Speaker 2: pseudo-website programs on your own computer. 619 00:31:43,200 --> 00:31:48,040 Speaker 2: But now, with AI-generated code, I really see this 620 00:31:48,120 --> 00:31:52,240 Speaker 2: democratization potential really blossoming.
And so for me, 621 00:31:52,680 --> 00:31:54,760 Speaker 2: that is the kind of thing that really can 622 00:31:54,800 --> 00:31:57,120 Speaker 2: hopefully induce a sense of wonder in people, where 623 00:31:57,200 --> 00:31:59,320 Speaker 2: they can now build all the programs that they want. 624 00:31:59,360 --> 00:32:01,840 Speaker 2: It used to be that if you 625 00:32:01,840 --> 00:32:03,680 Speaker 2: explored the world and went about your 626 00:32:03,760 --> 00:32:05,760 Speaker 2: day and looked at the world and noticed interesting problems 627 00:32:05,800 --> 00:32:08,280 Speaker 2: that could maybe be solved by software, if you weren't 628 00:32:08,280 --> 00:32:09,920 Speaker 2: a software developer, you would kind of have to shut 629 00:32:09,960 --> 00:32:12,000 Speaker 2: down that portion of your mind because you couldn't do 630 00:32:12,040 --> 00:32:14,360 Speaker 2: anything about it. But now you can turn it back 631 00:32:14,400 --> 00:32:16,320 Speaker 2: on, because now anyone can build those kinds of things. 632 00:32:16,360 --> 00:32:17,920 Speaker 2: And so I think that is actually a really interesting 633 00:32:17,960 --> 00:32:18,640 Speaker 2: source of wonder. 634 00:32:18,720 --> 00:32:21,080 Speaker 1: And you know, to my mind, there's the flip side 635 00:32:21,080 --> 00:32:24,560 Speaker 1: of that coin, not just for the individual, but for society. 636 00:32:24,600 --> 00:32:27,120 Speaker 1: What's going to come out of this? So tell us 637 00:32:27,120 --> 00:32:30,120 Speaker 1: about Don Swanson's paper from, what was that, I think 638 00:32:30,160 --> 00:32:33,360 Speaker 1: the eighties or something, about undiscovered public knowledge. Tell us 639 00:32:33,360 --> 00:32:33,720 Speaker 1: about that. 640 00:32:34,480 --> 00:32:37,120 Speaker 2: Yeah, so Don Swanson, he's this information scientist. In 641 00:32:37,120 --> 00:32:41,120 Speaker 2: the nineteen eighties, he wrote this paper called Undiscovered Public Knowledge, 642 00:32:41,160 --> 00:32:45,400 Speaker 2: and he begins 643 00:32:45,400 --> 00:32:47,400 Speaker 2: with a thought experiment. He says, okay, imagine 644 00:32:47,480 --> 00:32:50,440 Speaker 2: somewhere in the scientific literature there's a paper that says 645 00:32:50,600 --> 00:32:52,920 Speaker 2: A implies B, and then somewhere else in the literature, 646 00:32:52,960 --> 00:32:54,840 Speaker 2: could be in the same field, it could be an 647 00:32:54,960 --> 00:32:57,920 Speaker 2: entirely different field, there's another paper that says B implies C. 648 00:32:58,320 --> 00:33:00,400 Speaker 2: And so if you could connect them together, you 649 00:33:00,440 --> 00:33:03,040 Speaker 2: would say, oh, maybe in fact A implies C. But 650 00:33:03,120 --> 00:33:06,239 Speaker 2: because the scientific literature is so vast, no one has 651 00:33:06,240 --> 00:33:08,600 Speaker 2: actually read both of these papers, and so 652 00:33:09,160 --> 00:33:12,680 Speaker 2: that knowledge, the connection, was undiscovered, but it was 653 00:33:12,680 --> 00:33:14,479 Speaker 2: public because it was out there.
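Swanson's A-implies-B, B-implies-C idea is easy to make concrete in code. Here is a minimal sketch in Python, with invented placeholder claims standing in for Swanson's actual data:

from itertools import product

# Sketch of Don Swanson's "undiscovered public knowledge" idea:
# one body of literature reports A -> B, another reports B -> C,
# and the implicit link A -> C sits unnoticed until joined.
# The claim pairs below are invented placeholders for illustration.
literature_one = {("fish oil", "change in blood properties")}
literature_two = {("change in blood properties", "relief of a circulatory disorder")}

def undiscovered_links(claims_ab, claims_bc):
    """Return every A -> C link implied by an A -> B and a B -> C claim."""
    return {(a, c)
            for (a, b1), (b2, c) in product(claims_ab, claims_bc)
            if b1 == b2}  # the two claims share the intermediate term B

print(undiscovered_links(literature_one, literature_two))
# {('fish oil', 'relief of a circulatory disorder')}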
And so it's one 654 00:33:14,480 --> 00:33:17,000 Speaker 2: of these things where, if we actually had ways of 655 00:33:17,040 --> 00:33:20,480 Speaker 2: stitching together all the scientific knowledge that was out there, 656 00:33:20,560 --> 00:33:22,320 Speaker 2: we would actually be able to make new discoveries that 657 00:33:22,320 --> 00:33:24,680 Speaker 2: were just lying out there, ready for the taking. 658 00:33:24,960 --> 00:33:26,760 Speaker 2: The interesting thing with Swanson is that he was not 659 00:33:26,880 --> 00:33:29,240 Speaker 2: content with leaving this as a thought experiment. He actually 660 00:33:29,320 --> 00:33:31,720 Speaker 2: tried to test it in the real world, and he 661 00:33:31,880 --> 00:33:34,520 Speaker 2: used the then-cutting-edge technology, which was, I 662 00:33:34,560 --> 00:33:37,600 Speaker 2: think, keyword searches on the MEDLINE database. 663 00:33:37,680 --> 00:33:40,680 Speaker 2: And he actually found this relationship between, I 664 00:33:40,720 --> 00:33:43,320 Speaker 2: think it was, consuming fish oil and helping treat 665 00:33:43,360 --> 00:33:45,560 Speaker 2: some sort of circulatory disorder, and then I 666 00:33:45,560 --> 00:33:47,360 Speaker 2: think he was able to publish it in a medical 667 00:33:47,440 --> 00:33:49,880 Speaker 2: journal even though he himself had no medical training, which 668 00:33:49,920 --> 00:33:52,440 Speaker 2: was kind of wild. And so I think, with 669 00:33:53,000 --> 00:33:54,760 Speaker 2: a lot of these AI tools, 670 00:33:54,800 --> 00:33:56,360 Speaker 2: we are now going to be able to 671 00:33:56,400 --> 00:33:59,840 Speaker 2: stitch together lots of different ideas and navigate 672 00:33:59,880 --> 00:34:01,960 Speaker 2: them, the latent space of knowledge or however you 673 00:34:02,000 --> 00:34:04,120 Speaker 2: want to describe it, in a way that has 674 00:34:04,240 --> 00:34:06,160 Speaker 2: really never before been possible. 675 00:34:07,800 --> 00:34:10,840 Speaker 1: This is actually my highest hope with large language models: 676 00:34:11,320 --> 00:34:17,359 Speaker 1: tackling the biomedical data, putting facts together that 677 00:34:17,480 --> 00:34:20,320 Speaker 1: anyone could know, but nobody is going to, because they're 678 00:34:20,320 --> 00:34:23,840 Speaker 1: published in totally different journals. I wrote a paper a 679 00:34:23,880 --> 00:34:27,719 Speaker 1: couple of years ago now on a meaningful test for 680 00:34:27,840 --> 00:34:32,040 Speaker 1: intelligence in AI, and I think that what I just 681 00:34:32,080 --> 00:34:35,080 Speaker 1: described is going to be enormously helpful for science. But 682 00:34:35,480 --> 00:34:38,000 Speaker 1: the next level of intelligence, which I don't think LLMs 683 00:34:38,000 --> 00:34:43,160 Speaker 1: are at yet, is actually questioning whether something is true 684 00:34:43,480 --> 00:34:47,759 Speaker 1: and coming up with alternative models and then simulating those 685 00:34:47,800 --> 00:34:51,680 Speaker 1: and evaluating them. For example, you know, saying, hey, what 686 00:34:51,719 --> 00:34:53,600 Speaker 1: if I were riding on a photon of light, what 687 00:34:53,640 --> 00:34:57,080 Speaker 1: would that look like?
And then getting to the theory 688 00:34:57,080 --> 00:35:00,680 Speaker 1: of relativity and realizing the trajectory of Mercury can be 689 00:35:00,719 --> 00:35:03,200 Speaker 1: explained by that, and so on. So that's the sort 690 00:35:03,200 --> 00:35:06,520 Speaker 1: of thing that LLMs don't do now. But yes, I 691 00:35:06,520 --> 00:35:11,200 Speaker 1: think they're going to be enormously helpful in this discovery 692 00:35:11,520 --> 00:35:14,439 Speaker 1: process within the public knowledge that's already sitting out there. 693 00:35:14,520 --> 00:35:17,040 Speaker 2: People are already talking about AI scientists and things 694 00:35:17,040 --> 00:35:18,239 Speaker 2: like that, whether it's going to be 695 00:35:18,280 --> 00:35:22,560 Speaker 2: helping with stitching together the knowledge, or hypothesis generation. 696 00:35:22,719 --> 00:35:25,719 Speaker 2: Maybe eventually even, yeah, this kind of thought experiment, 697 00:35:25,800 --> 00:35:28,000 Speaker 2: and then examining what the implications of the 698 00:35:28,000 --> 00:35:30,719 Speaker 2: thought experiments are. But I do think, yeah, even if they're 699 00:35:30,719 --> 00:35:33,040 Speaker 2: not necessarily able to do everything on their own, 700 00:35:33,640 --> 00:35:36,200 Speaker 2: the potential for this kind of human 701 00:35:36,239 --> 00:35:40,160 Speaker 2: scientist and machine partnership will hopefully unlock a lot of 702 00:35:40,200 --> 00:35:42,080 Speaker 2: information and knowledge that is already out there, but we 703 00:35:42,120 --> 00:35:43,000 Speaker 2: just don't even realize it. 704 00:35:43,280 --> 00:35:46,880 Speaker 1: What's one thing that you wish more people understood about 705 00:35:46,920 --> 00:35:48,960 Speaker 1: the coded systems that surround them? 706 00:35:49,040 --> 00:35:51,839 Speaker 2: One aspect of code is the extent to which 707 00:35:51,880 --> 00:35:55,160 Speaker 2: there's a craft and a style and almost an 708 00:35:55,239 --> 00:35:57,640 Speaker 2: art to it. When people think about programming 709 00:35:57,680 --> 00:35:59,520 Speaker 2: languages, or which language they want to program in, 710 00:35:59,600 --> 00:36:03,880 Speaker 2: the truth is there's a lot of personal 711 00:36:03,960 --> 00:36:07,040 Speaker 2: choice and a lot of opinion, very 712 00:36:07,040 --> 00:36:10,520 Speaker 2: strong opinions, about what kinds of languages work, but also 713 00:36:10,560 --> 00:36:12,480 Speaker 2: even the way in which you program. And 714 00:36:12,480 --> 00:36:16,440 Speaker 2: so for example, there's actually this book called If Hemingway 715 00:36:16,480 --> 00:36:18,719 Speaker 2: Wrote JavaScript, where it actually takes, I think, the 716 00:36:18,719 --> 00:36:22,400 Speaker 2: same coding task and then programs it in different ways according to 717 00:36:22,480 --> 00:36:25,759 Speaker 2: different authorial styles, to show that 718 00:36:26,000 --> 00:36:28,920 Speaker 2: it is a deeply human kind of thing.
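To make the point about authorial style concrete, here is the same trivial task written in two voices; a toy sketch in Python rather than the book's JavaScript, with both snippets invented for illustration rather than taken from the book:

# The same task, summing the squares of 1..n, in two "voices",
# a toy nod to the idea behind If Hemingway Wrote JavaScript.

# Terse, expression-oriented voice: one line, no intermediate names.
def sum_of_squares_terse(n: int) -> int:
    return sum(i * i for i in range(1, n + 1))

# Spelled-out, didactic voice: every step named and visible.
def sum_of_squares_plain(n: int) -> int:
    total = 0
    for i in range(1, n + 1):
        square = i * i
        total += square
    return total

# Same behavior, recognizably different authorship.
assert sum_of_squares_terse(10) == sum_of_squares_plain(10) == 385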
Now, of course, 719 00:36:29,360 --> 00:36:33,360 Speaker 2: many people compare it to writing or fiction or 720 00:36:33,400 --> 00:36:36,040 Speaker 2: poetry and things like that, and there are aspects of that, 721 00:36:36,080 --> 00:36:38,160 Speaker 2: and I certainly see it, but I don't want to push it 722 00:36:38,200 --> 00:36:40,319 Speaker 2: too far, because the code still has to do something, 723 00:36:40,360 --> 00:36:43,320 Speaker 2: it still has to operate. But there really are many 724 00:36:43,719 --> 00:36:47,480 Speaker 2: almost artistic aspects to code, and I think that 725 00:36:47,560 --> 00:36:52,640 Speaker 2: interesting combination of extreme logic and practicality and 726 00:36:52,680 --> 00:36:58,160 Speaker 2: efficacy, combined with style and art and problem solving, I 727 00:36:58,200 --> 00:36:59,960 Speaker 2: think is something that maybe people who are 728 00:37:00,040 --> 00:37:02,000 Speaker 2: outside of the world of code just don't realize. 729 00:37:02,080 --> 00:37:06,600 Speaker 1: So here's a random question: the relationship between software code 730 00:37:06,680 --> 00:37:09,719 Speaker 1: and, let's say, biological code like DNA. Is this just 731 00:37:09,760 --> 00:37:12,080 Speaker 1: a metaphorical thing, or is there something deeper there? 732 00:37:12,160 --> 00:37:15,640 Speaker 2: We are wet, squishy, messy things, and down at 733 00:37:15,680 --> 00:37:21,080 Speaker 2: the cellular or subcellular level, it's incredibly stochastic and random, 734 00:37:21,080 --> 00:37:24,480 Speaker 2: and there are things all vibrating around, and it's wildly 735 00:37:24,560 --> 00:37:27,280 Speaker 2: different from the way in which we think about coding. 736 00:37:27,680 --> 00:37:31,319 Speaker 2: But the one exciting aspect of this is that 737 00:37:31,480 --> 00:37:33,560 Speaker 2: Mike Levin and some of his collaborators have 738 00:37:33,560 --> 00:37:38,640 Speaker 2: talked about this idea that traditional computation is really 739 00:37:38,680 --> 00:37:42,280 Speaker 2: just a subset of information processing as a whole, 740 00:37:42,600 --> 00:37:45,399 Speaker 2: and biology is just another mode of doing that kind 741 00:37:45,400 --> 00:37:47,560 Speaker 2: of thing. So I think looking at what biology 742 00:37:47,600 --> 00:37:50,560 Speaker 2: is doing and comparing it to how code operates and 743 00:37:50,560 --> 00:37:53,280 Speaker 2: how computers operate, where they are similar and where they're different, 744 00:37:54,239 --> 00:37:58,239 Speaker 2: just shows you the sheer number of different ways that 745 00:37:58,360 --> 00:38:00,759 Speaker 2: computing can be done, and when it comes to 746 00:38:00,800 --> 00:38:03,640 Speaker 2: more engineered, traditional computing, we are still only beginning 747 00:38:03,640 --> 00:38:06,160 Speaker 2: to scratch the surface. So I think, in that way, 748 00:38:06,360 --> 00:38:10,000 Speaker 2: comparing and contrasting the ways biology and computation are similar and 749 00:38:10,000 --> 00:38:11,640 Speaker 2: different can be enormously valuable. 750 00:38:11,880 --> 00:38:14,719 Speaker 1: So let's end with telling us what your message is 751 00:38:14,800 --> 00:38:17,520 Speaker 1: at the heart of The Magic of Code, your new book. 752 00:38:17,600 --> 00:38:20,000 Speaker 1: What do you hope that people will take away from 753 00:38:20,000 --> 00:38:22,040 Speaker 1: it in seeing the world around them?
754 00:38:22,360 --> 00:38:25,400 Speaker 2: Steve Jobs had this idea that computers are the 755 00:38:25,400 --> 00:38:28,000 Speaker 2: bicycle for the mind. And the idea behind this was 756 00:38:28,239 --> 00:38:30,200 Speaker 2: that he was reading, I think, some old Scientific American 757 00:38:30,280 --> 00:38:31,960 Speaker 2: article where there was a chart of the 758 00:38:32,080 --> 00:38:36,080 Speaker 2: energy efficiency of different organisms, and humans were kind of mediocre, 759 00:38:36,120 --> 00:38:37,920 Speaker 2: and maybe some birds were much better, but then 760 00:38:37,960 --> 00:38:39,759 Speaker 2: everything changed when a human got on a bicycle, because 761 00:38:39,760 --> 00:38:42,400 Speaker 2: suddenly they were much more efficient. And his idea was 762 00:38:42,440 --> 00:38:44,359 Speaker 2: that computers should be this bicycle for the mind, 763 00:38:44,400 --> 00:38:47,640 Speaker 2: for helping accelerate how we think, how we interact and engage 764 00:38:47,680 --> 00:38:51,080 Speaker 2: with the world. And that's really ultimately what it's all about. 765 00:38:51,120 --> 00:38:54,360 Speaker 2: And so for me, whether I'm thinking about trends 766 00:38:54,440 --> 00:38:58,360 Speaker 2: in super-powerful AI or certain other things around 767 00:38:59,000 --> 00:39:01,799 Speaker 2: the Internet or whatever it is, we all have to 768 00:39:01,800 --> 00:39:04,720 Speaker 2: be thinking, not just saying, oh, these are interesting trends, 769 00:39:04,760 --> 00:39:06,400 Speaker 2: I wonder what things are going to look like in 770 00:39:06,400 --> 00:39:10,239 Speaker 2: the future, but more: no, these tools are for me. 771 00:39:10,680 --> 00:39:12,359 Speaker 2: What is the future that I want to live in? 772 00:39:12,440 --> 00:39:14,840 Speaker 2: And how can I make that human-centered 773 00:39:14,880 --> 00:39:17,920 Speaker 2: future with technology that much more possible? And so the 774 00:39:17,960 --> 00:39:19,919 Speaker 2: book is kind of a guide to thinking 775 00:39:20,000 --> 00:39:23,279 Speaker 2: about all these different sorts of human-centered ideas, to 776 00:39:23,280 --> 00:39:25,440 Speaker 2: hopefully provide a guide for that sense of wonder and 777 00:39:25,520 --> 00:39:28,560 Speaker 2: delight and the humane aspects of computing. 778 00:39:33,200 --> 00:39:36,960 Speaker 1: That was my interview with complexity scientist and lover of code, 779 00:39:37,320 --> 00:39:40,520 Speaker 1: Sam Arbesman. We are right now at the beginning of 780 00:39:40,600 --> 00:39:46,279 Speaker 1: a centuries-long experiment in computation. For the first time 781 00:39:46,320 --> 00:39:50,360 Speaker 1: in history, we can build these dynamic worlds that evolve 782 00:39:50,440 --> 00:39:55,239 Speaker 1: and adapt. We can simulate climate futures, or economic collapses, 783 00:39:55,440 --> 00:40:00,960 Speaker 1: or entire societies that rise and fall in silico. But 784 00:40:01,080 --> 00:40:04,360 Speaker 1: Sam talks about code not just as a set of instructions, 785 00:40:04,440 --> 00:40:08,200 Speaker 1: but more generally like a kind of spell, a system 786 00:40:08,239 --> 00:40:12,320 Speaker 1: of symbols that does something real in the outside world.
787 00:40:12,719 --> 00:40:14,919 Speaker 1: And part of what I find most amazing about 788 00:40:14,920 --> 00:40:18,160 Speaker 1: our current moment in time is the way that code 789 00:40:18,520 --> 00:40:23,880 Speaker 1: can evolve, and has evolved, past the understanding of its creators. 790 00:40:24,200 --> 00:40:27,120 Speaker 1: And that's the paradox we're sitting with, and in some 791 00:40:27,160 --> 00:40:29,600 Speaker 1: sense have been sitting with for centuries now: 792 00:40:29,800 --> 00:40:33,759 Speaker 1: that we are building systems more powerful than our ability 793 00:40:34,040 --> 00:40:36,920 Speaker 1: to fully understand them. There's one more thing I just 794 00:40:36,960 --> 00:40:40,320 Speaker 1: want to touch on from Sam's book: the idea that code, 795 00:40:40,960 --> 00:40:44,520 Speaker 1: like language or like myth, offers us a kind of mirror. 796 00:40:45,160 --> 00:40:49,439 Speaker 1: Code can reflect our values and our metaphors, and our 797 00:40:49,719 --> 00:40:54,000 Speaker 1: hopes for control, and our particular curiosities about how the 798 00:40:54,040 --> 00:40:57,200 Speaker 1: world works. I suspect that someday there are going to 799 00:40:57,239 --> 00:41:01,600 Speaker 1: be code anthropologists who look back on the kinds of 800 00:41:01,680 --> 00:41:05,839 Speaker 1: programs written by different civilizations at different time points, and 801 00:41:05,880 --> 00:41:09,640 Speaker 1: it will tell them as much about those civilizations as 802 00:41:09,960 --> 00:41:13,360 Speaker 1: their books and plays and religious practices, because our code 803 00:41:13,400 --> 00:41:16,600 Speaker 1: reflects the assumptions that we build into it and the 804 00:41:16,680 --> 00:41:20,879 Speaker 1: blind spots that we forget to consider. Every simulation has 805 00:41:20,880 --> 00:41:24,839 Speaker 1: some of us in there. So with the passing of decades, 806 00:41:25,200 --> 00:41:27,560 Speaker 1: we're going to go beyond how do we code the 807 00:41:27,560 --> 00:41:31,359 Speaker 1: world to questions about who's doing the coding, and what 808 00:41:31,400 --> 00:41:33,400 Speaker 1: do we make sure is in there, and what do 809 00:41:33,440 --> 00:41:36,200 Speaker 1: we choose to leave out? And what are the limits 810 00:41:36,280 --> 00:41:38,440 Speaker 1: of what we can simulate? And when does it matter 811 00:41:38,480 --> 00:41:43,399 Speaker 1: that those limits shape our conclusions? In any case, as 812 00:41:43,400 --> 00:41:46,080 Speaker 1: I think about what Sam and I talked about, I 813 00:41:46,120 --> 00:41:50,200 Speaker 1: come to something like this conclusion. As we look into 814 00:41:50,239 --> 00:41:54,839 Speaker 1: the deep future, we may find ourselves less in control 815 00:41:54,960 --> 00:41:58,920 Speaker 1: than we thought, but also more creative than we ever 816 00:41:59,000 --> 00:42:03,719 Speaker 1: thought possible. Our minds will be riding bicycles, and eventually 817 00:42:03,960 --> 00:42:07,640 Speaker 1: motorcycles and jets. And that is the sense in which 818 00:42:07,920 --> 00:42:11,440 Speaker 1: even the most logical systems we've ever built have a 819 00:42:11,520 --> 00:42:20,000 Speaker 1: healthy dose of magic. Go to eagleman dot com slash 820 00:42:20,040 --> 00:42:23,840 Speaker 1: podcast for more information and to find further reading.
Join the 821 00:42:23,840 --> 00:42:27,680 Speaker 1: weekly discussions on my Substack, and check out and subscribe 822 00:42:27,719 --> 00:42:30,760 Speaker 1: to Inner Cosmos on YouTube for videos of each episode 823 00:42:30,800 --> 00:42:35,720 Speaker 1: and to leave comments. Until next time, I'm David Eagleman, 824 00:42:35,880 --> 00:42:37,759 Speaker 1: and this is Inner Cosmos.