Speaker 1: Bloomberg Audio Studios, podcasts, radio, news.

Speaker 2: This is Masters in Business with Barry Ritholtz on Bloomberg Radio.

Speaker 1: This week on the podcast, I have a fascinating guest. His name is Brian Klaas. He teaches at University College London, where he focuses on global politics, and he has written a book that I have just plowed through the first half of and found absolutely fascinating: Fluke: Chance, Chaos, and Why Everything We Do Matters. He just really explains why our understanding of cause and effect is so flawed, that we think that A naturally leads to B, which leads to C, and instead the world is far more random and complex, and little things that happened years ago, sometimes thousands or millions of years ago, have a giant impact on what happens today. It really turns your view on causation upside down, and it makes you rethink just how random everything is. I found the book fascinating and I found our conversation fascinating, and I think you will also. With no further ado, my conversation with the author of Fluke, Brian Klaas.

Speaker 2: It's a pleasure to be here. Thanks for having me on the show.

Speaker 1: So this book is nothing more than just all confirmation bias for me. We'll jump into this in a bit. I'm about halfway through it and really, really enjoying it. But I have to start out with a story you tell in the introduction to the book. You're twenty years old. Your father pulls you aside, shows you a newspaper clipping from nineteen oh five, and the headline is "Terrible Act of Insane Woman." Tell us about that woman, Clara Magdalene Jansen, and what she did.
Speaker 2: Yeah, so this story is from a place called Keeler, Wisconsin, a little rural farmhouse in nineteen oh five, and she's got four young children, and she probably has what we would term postpartum depression, but of course they don't know what that is in nineteen oh five. And she has a mental breakdown, and so she ends up tragically killing all of her kids and then taking her own life. And her husband comes home to the farmhouse and finds his whole family dead, and you could just imagine the horror of this. And the reason I put this in the introduction to Fluke is because this is my great-grandfather's first wife, and so one of the things that was really extraordinary for me was that I went through my first twenty-odd years of life not knowing about this dark chapter in my family history. But after I saw this newspaper headline, you know, you sort of get over the shock of knowing this about your own family, but then you realize that you don't exist unless this had happened to me, right? So you wouldn't be listening to my voice unless these children had died.

Speaker 1: So following that tragedy, your great-grandfather moves on with his life, exactly, eventually remarries the woman who becomes your great-grandmother. So but for this random, horrible event, we're not here having this conversation, exactly.

Speaker 2: And this is where, you know, this is why I started getting interested in applying things like chaos theory to human society and also to our own lives. Because, of course, you know, Clara, when she decided to do this horrible thing to her children and also take her own life, she had no way of knowing that one hundred and nineteen years later, you know, you and I would be talking on Bloomberg. But that's the way it is, right? That's the way the world works.
And so I think this is the kind of stuff where we tend to imagine that there are just sort of these, you know, big building blocks of life, like the really obvious variables that create outcomes. And the argument I'm making is, actually, you know, it's sort of heretical to the, you know, look for the signal, not the noise, because I am a byproduct of the noise.

Speaker 1: So the rational cause and effect, A leads to B. So that's one individual, and obviously one individual can change a future set of bloodlines. Let's take this a little bigger. Let's talk about mister and missus Stimson, who go on vacation in Kyoto, Japan in nineteen twenty-six. How significant can that vacation possibly be?

Speaker 2: Yeah. So this is a couple, mister and missus H. L. Stimson. They go to Kyoto, Japan on a holiday, on a vacation, in nineteen twenty-six, and they just fall in love with the city. It's an experience that a lot of us have, where you go on vacation, you get a soft spot for wherever you've gone to relax and so on, and they just find it utterly charming. Now, nineteen years later, this turns out to matter quite a lot, because the husband in the couple, Henry Stimson, ends up as America's Secretary of War, and the Target Committee approaches him with their recommendations of where to drop the first atomic bomb in nineteen forty-five, and top of the list, unequivocally: Kyoto.

Speaker 1: Now, it's not Tokyo, which has already been demolished.

Speaker 2: Tokyo's basically been destroyed. There's an argument here that Kyoto's just opened up a warplane factory. It's a former imperial capital, so it has sort of propaganda value for, you know, reducing Japanese morale.
So all the generals say, look, this is a good idea, this is where we should drop the bomb. And, you know, Stimson basically springs to action, because the generals started calling it his pet city, because he kept talking about it, and he twice met with President Truman in person, we have records of the meetings and so on, and basically said, you have to take this off the list. And eventually Truman relents, and so the first bomb gets dropped on Hiroshima instead. Now, the second bomb is supposed to go to a place called Kokura, and as the bomber gets to Kokura, there's briefly cloud cover, and they don't want to accidentally drop the bomb somewhere that's not the city, because of course that would not have the same effect. So they decide to go to the secondary target, which is Nagasaki.

Speaker 1: They literally do a loop to see, hey, maybe it clears up. It doesn't, yeah, and on to Nagasaki, exactly.

Speaker 2: They actually, I think, do loops until they're running low on fuel, and they're starting to think, okay, we're not going to make it to the secondary target. So they finally, you know, pull the plug on Kokura, drop the bomb on Nagasaki. So hundreds of thousands of people live or die in these cities based on a nineteen-year-old vacation and a cloud. And the point that I think is important to realize here is that, you know, if you were modeling this, if you're trying to say, how is the US government going to determine where to drop the atomic bomb, you would not put in your model the vacation histories of American government officials, or, like, cloud cover, right? You would come up with these very obvious big things, like where are the places that have strategic importance or propaganda value. And if you did that, you probably would put Kyoto on top of the list, and you'd get the wrong answer. And you wouldn't get the wrong answer because you were stupid.
You'd get the wrong answer because sometimes things that don't seem to be important actually end up being the most important factor in an outcome.

Speaker 1: And the Japanese actually have an expression, Kokura's luck. Tell us what that means to the Japanese.

Speaker 2: Yeah, I think this is a very useful thing to think about. Kokura's luck refers to when you unknowingly escape disaster. So it was a long time before the US government acknowledged that they were planning to drop the bomb on Kokura. So, you know, hundreds of thousands of people in that city had no idea there was an airplane over them that, but for a cloud, would have incinerated the entire city and killed most of them. And so I think this is the kind of thing where, you know, one of the ideas that is central to the argument in Fluke is that these sorts of things, this Kokura's luck, is happening to us all the time, right? We're completely oblivious to the diversions in our lives and our societies, the alternative possible histories, simply because we can only experience one reality. And what we do is we then stitch a narrative back where it's A to B, this makes complete sense, here are the five reasons why this happened. And in fact, I think this is a way that we end up deluding ourselves into a neater and tidier version of the real world.

Speaker 1: So you describe why we can't know what matters most, because we can't see the alternative universes. I love this quote: "We ignore the invisible pivots, the moments that we will never realize were consequential, the near misses and near hits that are unknown to us because we've never seen, and will never see, our alternative possible lives." That's really very chilling, to know that we're just walking through life unaware that, hey, atomic bomb over our head, better hope the clouds don't clear up.

Speaker 2: Yeah.
I have this saying that I refer to a lot in the book, which is that we control nothing, but we influence everything. And when you think about this in our own lives, I think this is something where you realize that there are these diversions happening constantly. There's a film in the nineteen nineties with Gwyneth Paltrow called Sliding Doors, and it has this idea, and I sort of riff on that with this concept I coined called the snooze button effect, where you imagine that, you know, it's Tuesday morning, you're a little bit groggy, you wake up, the snooze button beckons to you, you slap it, and you get delayed by five minutes. Now imagine your life rewinds by thirty seconds and you say, no, I won't hit the snooze button, I'll get out of bed now. I think that has changed your life. Now, the question is how much has it changed your life? And on some short time scales, maybe things sort of get ironed out in the end. But you're going to have different conversations that day, you're going to talk to different people, you might get in a car accident on some days, right? I mean, these are the kinds of things that we sort of are oblivious to. And I think when you think about them with social change, it's happening all the time too. I mean, there's just so many ways that the world could have unfolded differently but for a few small changes. I mean, you know, you think about even nine-eleven. We think about all the variables that go into nine-eleven. One of them that people don't talk about was the weather. It was an incredibly blue, blue-sky day, crisp. Yeah, and if you had had a, you know, very, very cloudy day or a storm, some of the planes wouldn't have taken off on time. They might have had a chance to foil some of the plots.
Or if you had had a different slate of passengers on flight ninety-three. So if it had gone September tenth or September twelfth, maybe those passengers don't take down the plane, maybe the White House or the Capitol is destroyed, and then the world's different. I mean, you know, can you imagine how it would change America or geopolitics if there was no White House anymore? So I think these are the kinds of things where, you know, you just imagine that there's this straight line of cause and effect, and of course when we experience the world, we then explain it. But, you know, these small changes could really reshape the future. Some of them are going to be more consequential, like that Kyoto story; others are going to, you know, be a little bit less consequential, at least on human time scales. But the point is we can't know, and I think that's something that is bewildering to think about.

Speaker 1: So can we actually identify cause and effect? We tell ourselves stories. We have not only narrative fallacy in everything we do, because we love a good plot line, but there's also hindsight bias, where we imagine, oh, I knew this was coming all along. And, you know, can we really, truly know the impact of how A leads to B, or how something that we think is completely meaningless actually has deep significance?

Speaker 2: Yeah. So I very much subscribe to this view that all models are wrong, but some are useful. I don't think... yes, exactly. But I think that one of the things that has been lost on us is, I think there's so much of the world that runs on models that we sometimes forget that they are extremely simplified abstractions of reality, and that we actually don't understand how the causation works. And I think that creates hubris that's dangerous. So, you know, when you think about why the atomic bomb ended up getting dropped on Hiroshima,
there are an infinite number of causes, and they're things that we would not think about, right? Geological forces forging uranium millions of years ago are part of that story. Einstein being born is part of that story. The Battle of Midway pivoting on a fluke event, where the US wins because they just happen to stumble upon the Japanese fleet at the right moment, right? I mean, if any of these things had been different, and there's an almost infinite number of them where a little tweak would have been different, a different outcome would have happened. Now, for the useful navigation of society, we have to simplify reality, because we can't build a model that has nine hundred thousand variables, right? So what you instead do is you sort of say, okay, this is a crude version of reality. And I think, like, you know, one of the things that is really useful about some models, like Google Maps, for example, is that we know that's not the world, right? We know the map is not the territory. You look at Google Maps and you're not like, oh, well, I imagine that that's what the real world looks like. It's a clear abstraction. I think when we start to get into forecasting and other modeling of social change, I think we lose sight of the fact that we have a Google Maps distortion, and that we're actually looking at something that is potentially useful to navigate but is very, very different from the real world.

Speaker 1: Huh, really interesting. So let's talk about the way different schools of thought perceive and manage these philosophical differences. You point out Eastern and Western thinking have a very different set of precepts because of just the nature of each society. In the Bible, in Genesis, God proclaims, "Let us make man in our image, after our likeness," and let them have dominion over the fishes, the fowl, the cattle, et cetera. Eastern culture takes a whole lot more of a collectivist approach, where you're part of a group, not made in God's image.
Tell us a little bit about how this schism developed, and what is the relationship of chaos theory to each?

Speaker 2: Yeah, so this is a speculative theory, but it's a theory that suggests that the reason why Eastern cultures have much more relational concepts of interconnectivity between humans and the rest of the world, and human society as well, is derived from the differences, or proximity rather, that humans have to primates, for example, in their own cultures. So there's lots of monkey gods and so on, and there's also, of course, lots of monkeys in many of these cultures that are developing. And the idea, the hypothesis, is that this meant that people could not avoid the commonality that we have with the rest of the world, right? Whereas if you think about, like, biblical societies, if you look at animals and you see camels, you think, like, hey, we are super different, we are separate from the rest of the world. So the argument is that over the long stretch of civilization, this created a slightly different mentality that then manifests in what was called relational versus atomistic thinking. And Western society is atomistic thinking on steroids, which is to say, you know, I mean, the American dream is very atomistic, individualist. It's like, you know, if you just want to succeed, then you have to do everything, whereas the relational concepts are much more about the interconnections that people have. And so I think that also tells you how you think about society, right? Social change is either driven by individuals or it's driven by systems. And I think that there is a way in which Western culture, I think, can learn to actually appreciate some of the complexity of social change more, with a healthy, increased dose of relational thinking.

Speaker 1: And you kind of bring the Eastern and Western philosophies together where you discuss the overview effect. And it really begins with the United States.
Western society sends astronauts to the Moon, sends astronauts around the Earth. And these astronauts are chosen out of the military, out of the Air Force. They're pilots, they're logical, they're unfeeling, they're supposed to be essentially soldiers. And yet all of them have this impact when they see the blue-green Earth in its entirety from space. They all describe it as being overwhelmed by a life-shattering epiphany on the interconnection of everything. That doesn't sound very Western, that sounds more like an Eastern philosophy. But this has been time and time again, lots of astronauts have had this.

Speaker 2: Yeah, it's funny, because there have been, like, nine thousand five hundred generations of modern humans, and nine thousand four hundred and ninety-seven of them have not seen the Earth, right? So when people do see the Earth, they have this profound epiphany. And as you say, you know, they were worried about sending up, you know, philosophers and poets, because they figured they'd be overwhelmed by the sort of existential awe and, like, you know, would forget to hit the right buttons or whatever. So they pick these people who are supposed to be robots, effectively, in their personality, and all of them still have this incredible sort of epiphany about the interconnection of the world, because you look at the single planet and you think, okay, this is one structure. This is not something where I'm this distinct bit. You're like, this is all together, right? Now, I think what's really striking about that is that those worldviews do shape your thinking around social change. And I think when you start to think that you are in control rather than an agent of influence, you have a different worldview. When you start to think that you're individual rather than relational, you have a different worldview. And all these things feed into the ways that we set up models, that we sort of interact with our conceptions of social change
and so on, and also the degree to which we have hubris that we can control things. And I think this is where the danger comes in, right? It's not that you shouldn't model, it's not that you shouldn't have abstractions of systems. It's that when you start to get hubristic about it and think you have top-down, individualist control, you start to get overconfident in ways that you try to tame something that I think is untamable. And this is where we get shocks more often, because you try to impose this sort of control on a system that is so complex that it resists control. And so, you know, there's some of these things where I think the insights, the philosophy behind this, it's sort of lurking there invisibly, where no one says this when they build a model, but it's obviously shaping the way they think about it and their sort of assumptions before they go into trying to determine how to navigate risk and uncertainty.

Speaker 1: Along those lines, you have a great quote in the book: "God may have created the clock, but it was Newton's laws that kept it ticking." So how do you resolve that inherent tension between big forces driving things or random elements affecting it? Or is there no resolving them, they both matter?

Speaker 2: Yeah, so I think it's a question of time scales. And I think one of the big problems, and this is something that, you know, it's such a nuanced concept that it's sometimes difficult to explain, but I think there's a really important point about whether ideas that have held for a long time seem to be validated by what goes on, the patterns that we see, right, whether you can actually falsify a theory when you're talking about social change.
So my favorite example of this is the Arab Spring. In political science, my own realm, there is a lot of stuff written in sort of two thousand and eight, two thousand and nine, even into twenty ten, that says, here's why Middle Eastern dictatorships are extremely resilient, and there's all this data showing the longevity, et cetera, et cetera. And then, like, within six months of some of these books coming out, you know, all of them are on fire. I mean, I saw a political risk map when I was in grad school where, like, every single country that was on fire was green on the political risk map from the previous year, right? Now, there's two ways of thinking about that. The first way is to say the theory has been falsified, they were wrong, right? The second way of thinking about it is, hold on, maybe the world changed. Maybe the patterns of cause and effect have actually shifted, right? And I think this is something that people don't appreciate that much: they assume that the patterns of the past are going to be predictive of the patterns of the future. I mean, David Hume came up with this idea hundreds of years ago, but it is something that I think is particularly important for our world, because the patterns of the past being indicative of the patterns of the future has never before been as flawed an assumption, because our world is changing faster than ever before. So I think one of the issues that we have is when we think about these sort of clockwork models where we say, oh yes, you know, these are the ways that things have worked in the past. Our world is very, very different year to year, and that didn't used to happen. I mean, I was talking before about these, you know, nine thousand five hundred generations of humans.
If you think about the sort of entirety of human history as a twenty-four-hour day, twenty-three hours and, like, ten minutes is the hunter-gatherer period, right? And then you get into farming, which is another, like, thirty minutes, and then you've got, you know, a few minutes for the Industrial Revolution, and you get to the information age, which we're in now, which is like eleven seconds, right, on this one-day clock. And I think the point that's important here is that if we base almost all of our decision making and almost all of our models on causal inference from past patterns of behavior, but the world is changing year to year, then the assumptions we're making are becoming more and more short-lived. And I think that's where we're embedding risk into our thinking, because we have no other way of inferring cause and effect other than past patterns. There's no alternative. That's what Hume says. He's like, the only way we can understand the world is to look at what happened in the past. We can't look into the future. So I think this is something that I do worry about when I see a lot of decision making built on this sort of mentality of the clockwork model, that, like, oh yes, well, it's just going to keep ticking along. And, you know, there's a lot of very smart thinkers who have thought about black swans and so on. I just think that we've made a system where the black swans are actually going to be more frequent. I think we've designed a system that's more prone to systemic risks than before.

Speaker 1: Especially given not only does information move faster than ever, but we're more interconnected, we're more related, and it becomes increasingly difficult, if not impossible, to figure out what are the unanticipated results, consequences, side effects of anything that we do.
Speaker 2: Yeah, and this is, you know, this is one of those things where I think there are some pretty good examples from history of when somebody tries to control a system that is uncontrollable and it backfires catastrophically. And my favorite example, I shouldn't say favorite, it's a horrible tragedy, but the best illustration of this is Mao in communist China. He has this idea. He says, we're gonna eradicate disease, and the way we're gonna do this is massive Four Pests campaigns, so we're gonna kill all these pests. So he basically tells everyone, just go out and, you know, kill all these various things that potentially are vectors of disease. And what it ultimately does is it leads to one of the worst famines in human history, because they've disrupted the ecosystem. And they figure, oh, you know, as long as we just get rid of these pests, it will be fine. What they actually have done is they've made it so the crops fail. And so, you know, this is the kind of stuff where I think that's the parable that warns us of, you know, assuming that simply because we either have had some success in the past, or because our model seems to guide us in this way, that we can therefore insert ourselves into a system and not worry about the unintended consequences. I think that's the kind of thing where, you know, a lot of the people who are the doomers in AI are talking about this. There are some things where, you know, when you have AI-based decision making, it is, you know, the training data is the past. So there are some things that I think are getting worse on this front, and we are also, as you said, the interconnectivity. I mean, one of my favorite examples of this is the Suez Canal boat, the infamous Suez Canal boat, right?
I mean, you have a gust of wind that hits a boat and twists it sideways, it gets lodged in the canal, and the best estimate I've seen is that it created fifty-four billion dollars of economic damage. And they said it was, you know, something like point two to point four percent of global GDP could have been wiped off by this one boat. Now, the question is, is there ever another moment in human history where one boat could do that, right? And I think the answer is quite clearly no. So, I mean, maybe the one that brought the plague, right, right. But I mean, this is the kind of stuff where I think one of the lessons that I think is important is that there's a trade-off very often between optimization and resilience. And I think, you know, we're told all the time that efficiency and optimization are, you know, the guiding principles of so many of our systems, but they come at a cost. They do create less resilience. And I think there are some things where the long-term planning that we can do is to put a little bit more into resilience and a little bit less into optimization. It will cost us money in the short term, but it will probably save us a hell of a lot of money in the long term.

Speaker 1: Huh, really, really interesting. So I found the book fascinating, and I really enjoyed where you go down the evolutionary biology rabbit hole, starting with convergence, which is the "everything happens for a reason" school of evolutionary biology. Contingency is, in the G-rated version, the "stuff happens" theory. Explain the difference between the two.

Speaker 2: Yes. So I think that evolutionary biology has a lot to teach us about understanding change. It's a historical science, and they're trying to understand, you know, the origin story of species, and they're thinking about cause and effect just as people in economics and politics are as well. And so these two ideas, they're very simple to understand with two examples.
The first example, of contingency, is the asteroid that wipes out the dinosaurs. Now, this asteroid, by the way, was produced by an oscillation in a place called the Oort cloud, in the distant reaches of space.

Speaker 1: The absolute outer ring of assorted detritus that surrounds the entire Solar System beyond Pluto.

Speaker 2: Yeah. So this oscillation flings this space rock towards Earth, and it hits in the most destructive way possible. It hits in the ocean in a way that brings up a lot of toxic gas and effectively incinerates the dinosaurs, because the surface temperature went up to about the same level as a broiled chicken. I mean, it was deadly, right? Now, the reason this is important is because if it had hit a slightly different place on the Earth, the dinosaurs probably wouldn't have died out.

Speaker 1: And let me just point out, and you mentioned this in the book, it's not like it has to hit a different continent. Five seconds earlier, five seconds later, it completely misses that sulfur-rich area in the Yucatan Peninsula.

Speaker 2: Yeah. So, I mean, you know, this is the kind of stuff where you think about it and it is very unsettling, because you can imagine everything that humans have done, right? I mean, you have a second's difference in this asteroid, there's no humans, because the extinction of the dinosaurs is what led to the rise of mammals and eventually the evolution of us. And so this is contingency. It's where this small change could radically reshape the future. Now, convergence is the alternative hypothesis, and they both exist, right, the sort of order and disorder. And convergence says, okay, yeah, there's a lot of noise, there's a lot of fluctuations and flukes, but eventually things that work win, right?
So my favorite example of this is that if you were to take out a human eye and you were to look at it and compare it next to an octopus's eye, they're actually extremely similar, which is bizarre, because there's about six hundred million years of separate evolutionary pathways for the two branches of life. And the reason this happened isn't because, you know, we just got super lucky. It's because evolution came up with a strategy, by random experimentation, that simply worked. It made the species navigate the world effectively long enough to survive to have offspring, which is the engine of evolution, right? So this is the kind of stuff where, yeah, there were, like, a lot of very profound differences, I mean, we do not look like octopuses, thank goodness, but it's something where, as a result of that, the eye is basically the same. And so the question here, I think, is can we apply these frameworks to our own change, right, in our own societies? And so what I try to say is, okay, there's some stuff that is ordered. There's lots of regularity, there's lots of patterns in our lives. That's the convergence stuff. At some point, you know, you go on the highway, there might be an accident sometimes, but, like, most of the time, you know, the cars drive around the same speed, they have space between them that's about the same distance, right? And, like, there's all these patterns, but every so often there's a car accident, and that's contingency, right? So this is the kind of stuff where what I say is that the way that social change happens, and also the way our lives unfold, is what I call contingent convergence. Not the most beautiful phrase, but it's, I think, very accurate in saying, okay, there's these contingencies that change the path you're on, and then once you're on that path, the sort of forces of order do constrain the outcomes that are possible. They say, look, this stuff's gonna work.
That stuff's not gonna work, and the sort of survivorship bias produces the stuff that does work. So I think this is a useful framework that I'm borrowing from evolutionary biology to help us better understand social change.

Speaker 1: So before I get to contingent convergence, I want to stay with the difference between contingency, which is the meteor killing the dinosaurs and allowing the mammals to rise, and convergence. A couple of other examples that you give in the book of convergence: crab-like bodies keep evolving time and again. There are five separate instances where that shape somehow seems to provide a useful, adaptive way of navigating the world.

Speaker 2: Yeah, so this is, I mean, this is one of those things that evolutionary biologists joke about, and they always say, you know, eventually we're gonna have pincers, like we're all gonna end up as crabs, because, like, evolution... And some of them say, if there is a god, he really likes crabs.

Speaker 1: I actually heard that about beetles, but there's actually a word for this: carcinization is the process of evolving towards a crab-like shape. Similarly, flight. I never thought about this until I read it in the book. Flight evolved four separate times: in insects, in bats, in birds, and in pterosaurs. That's amazing.

Speaker 2: Yeah, I mean, this is the stuff where, you know, evolution is a really powerful lesson in the value of undirected experimentation, because every strange thing that we see around us, every, you know, organism, every plant, et cetera, is just the byproduct of this undirected experimentation navigating uncertainty, right? I mean, the world is changing all the time. There are different concentrations of oxygen, they sometimes have to be in the ocean, sometimes they have to be on land, and, you know, this sort of diverse array of life is just undirected experimentation. But the thing is that these forces do end up constraining the possibilities. Now, when we talk about carcinization, there's a really interesting thing that I don't go into in much depth in the book. It's called the Burgess Shale, up in Canada, in the Canadian Rockies, and it's basically, like, this fossilized museum of all these really wild body plans that used to exist hundreds of millions of years ago, before a mass extinction event. And what happened is they all got obliterated. So you can't have any sort of convergence from those body plans, because they don't exist anymore, whereas the ones that survived, all of us are derived from them, right? So the contingency is, like, okay, which body plans exist, which sort of ways could you set up life, you know, with spines or not spines, whatever it is. And then once you have that contingent event where there's the extinction, within that there's this sort of constrained evolution, which is, okay, well, when this happens the animal dies, so it doesn't exist very long, and when this happens the animal survives, so it does exist. And this is where carcinization, you know, you need to have a term, because the crabs are very much survivors.

Speaker 1: And it turns out that unless you were on the other side of the planet from where the meteor hit, if you're a burrower, if you get underground, you could survive those fires and that heat and then come out and continue the evolutionary process.

Speaker 2: Yeah, I mean, this is the thing. I find this really fascinating to think about, but also unsettling, is that, you know, all the life that exists now is basically offspring of either something that could dig when the asteroid hit or that lived in the ocean, and that's it, right? Because everything else died. Now, the really strange thing to think about as well is that, you know, I told the story about my great-grandfather's first wife, and then there's this murder and so on.
But you keep tracing 633 00:30:46,560 --> 00:30:49,479 Speaker 2: these things back, right, So my great grandfather's ancestors had 634 00:30:49,520 --> 00:30:51,440 Speaker 2: to meet in just the right way and their great grandfather, 635 00:30:51,520 --> 00:30:53,320 Speaker 2: you know, they had to meet. But you go back 636 00:30:53,360 --> 00:30:55,840 Speaker 2: then six million years, this chimpanzee like creature had to 637 00:30:55,840 --> 00:30:58,400 Speaker 2: meet another chimpanzee like creature, and the two of them 638 00:30:58,440 --> 00:31:01,280 Speaker 2: mating is part of the story of human existence. You 639 00:31:01,320 --> 00:31:03,440 Speaker 2: go back further, you know, there's a worm like creature 640 00:31:03,520 --> 00:31:06,040 Speaker 2: hundreds of millions of years ago, it dies, we probably 641 00:31:06,040 --> 00:31:08,360 Speaker 2: don't exist. Or my favorite example I think in the 642 00:31:08,360 --> 00:31:11,640 Speaker 2: book is, and this is a finding from modern science 643 00:31:11,640 --> 00:31:14,640 Speaker 2: about a year ago, was they found out that the 644 00:31:14,680 --> 00:31:18,160 Speaker 2: reason why mammals don't lay eggs, right, why we don't 645 00:31:18,200 --> 00:31:21,400 Speaker 2: have eggs and we instead have live births, is they believed, 646 00:31:21,440 --> 00:31:24,760 Speaker 2: based on genetic testing, that a single shrew like creature 647 00:31:24,800 --> 00:31:27,280 Speaker 2: got infected by a virus one hundred million years ago, 648 00:31:27,560 --> 00:31:30,320 Speaker 2: which caused a mutation, which led to placenta and the 649 00:31:30,400 --> 00:31:32,760 Speaker 2: rise of mammals. And you think of I mean, to me, 650 00:31:32,880 --> 00:31:36,280 Speaker 2: that is just so utterly bizarre to imagine that our existence, 651 00:31:36,320 --> 00:31:39,040 Speaker 2: like everything in humans, you know, ancient Rome, all this stuff, 652 00:31:39,040 --> 00:31:41,000 Speaker 2: you know, Donald Trump, whatever it is, all of it 653 00:31:41,080 --> 00:31:43,600 Speaker 2: is completely contingent on a shrew like creature one hundred 654 00:31:43,640 --> 00:31:46,440 Speaker 2: million years ago getting sick. He's like, when you think 655 00:31:46,440 --> 00:31:48,560 Speaker 2: about this stuff, I think evolutionary biology tell you know, 656 00:31:48,560 --> 00:31:51,400 Speaker 2: they have encountered black swans throughout hundreds of millions of years. 657 00:31:51,440 --> 00:31:54,440 Speaker 2: It's basically the origin story of complex life. 658 00:31:54,480 --> 00:31:57,520 Speaker 1: So let's talk about one of those black swans and 659 00:31:58,200 --> 00:32:04,000 Speaker 1: the specific concept of contingent convergence. I love the example 660 00:32:04,080 --> 00:32:08,280 Speaker 1: you use of the long term evolution experiment using E 661 00:32:08,480 --> 00:32:17,479 Speaker 1: coli twelve identical flasks of ecoli and in separate separate environment. 662 00:32:17,640 --> 00:32:22,680 Speaker 1: Separate but identical environments run ten million years worth of 663 00:32:22,720 --> 00:32:25,680 Speaker 1: human evolution through it. What's the results of that? 664 00:32:25,880 --> 00:32:28,920 Speaker 2: Yeah, this one, this one. Making ecoli sexy in a 665 00:32:28,920 --> 00:32:31,160 Speaker 2: book is pretty hard, I must say, but I think 666 00:32:31,200 --> 00:32:33,600 Speaker 2: this is such a powerful lesson for change, so I 667 00:32:33,880 --> 00:32:35,720 Speaker 2: had to include it. 
I flew out to Michigan State 668 00:32:36,080 --> 00:32:38,640 Speaker 2: to meet with the people running the Long term Evolution Experiment, 669 00:32:38,680 --> 00:32:41,120 Speaker 2: and the simple idea they had, the genius idea, was 670 00:32:41,120 --> 00:32:44,360 Speaker 2: they said, let's see what happens if we take twelve 671 00:32:44,400 --> 00:32:48,200 Speaker 2: identical populations of ecoli, so they're genetically identical, we put 672 00:32:48,200 --> 00:32:52,160 Speaker 2: them in twelve flasks, and we just evolve them for decades. Right, 673 00:32:52,240 --> 00:32:55,120 Speaker 2: And because ecoi life cycles are so short, it's basically 674 00:32:55,120 --> 00:32:57,440 Speaker 2: the equivalent of millions of years of human evolution. 675 00:32:57,280 --> 00:33:01,080 Speaker 1: Like multiple life spans a day exactly, general per day exactly. 676 00:33:01,120 --> 00:33:02,600 Speaker 2: So it's like it's the equivalent of it if you 677 00:33:02,680 --> 00:33:06,600 Speaker 2: went through like great great great grandparents each day. Right now, 678 00:33:06,640 --> 00:33:08,800 Speaker 2: the beauty of the experiment is they controlled everything. So 679 00:33:08,840 --> 00:33:12,760 Speaker 2: there's nothing in these flasks except for a glucose and 680 00:33:12,840 --> 00:33:16,000 Speaker 2: citrate mix because the glucose is food for the ecoi 681 00:33:16,040 --> 00:33:19,040 Speaker 2: and the citrate is like a stabilizer. Okay, now what 682 00:33:19,160 --> 00:33:22,000 Speaker 2: happens is. They figured, okay, let's test contingents to your convergence. 683 00:33:22,360 --> 00:33:24,320 Speaker 2: And for like the first fifteen years or so of 684 00:33:24,360 --> 00:33:28,160 Speaker 2: the experiment, the lesson was, okay, it's it's convergence. Because 685 00:33:28,280 --> 00:33:31,000 Speaker 2: all twelve of the lines were evolving in slightly different ways. 686 00:33:31,040 --> 00:33:33,360 Speaker 2: There's noise, right, there's little differences. The genome is not 687 00:33:33,360 --> 00:33:37,480 Speaker 2: the same. But they're basically all getting fitter at eating glucose, 688 00:33:37,520 --> 00:33:40,960 Speaker 2: so they're getting better at surviving. And then one day 689 00:33:41,040 --> 00:33:44,200 Speaker 2: a researcher comes in and one of the flasks is cloudy, 690 00:33:44,320 --> 00:33:45,960 Speaker 2: and this is not supposed to be the way it is. 691 00:33:46,000 --> 00:33:47,320 Speaker 2: It looks like a little bit of milk has been 692 00:33:47,400 --> 00:33:50,280 Speaker 2: dropped into it instead of this really clear substance that 693 00:33:50,320 --> 00:33:52,480 Speaker 2: the rest of the other eleven are. So they sort 694 00:33:52,520 --> 00:33:54,360 Speaker 2: of think, oh, this is a mistake. Can they throw 695 00:33:54,360 --> 00:33:57,240 Speaker 2: it out? They restart because they frozen the equali so 696 00:33:57,240 --> 00:33:58,160 Speaker 2: they can restart. 697 00:33:57,920 --> 00:34:01,280 Speaker 1: Raise it like the equivalent of every five hundred years. Yeah, 698 00:34:01,600 --> 00:34:04,080 Speaker 1: so they could reset the clock anytime they want, exactly 699 00:34:04,400 --> 00:34:05,480 Speaker 1: twelve flass. 700 00:34:05,200 --> 00:34:06,960 Speaker 2: Yes, so they're all frozen. They all this sort of 701 00:34:06,960 --> 00:34:09,080 Speaker 2: fossil record. 
They can restart it at any point, So 702 00:34:09,080 --> 00:34:11,719 Speaker 2: they restart the experiment in this flask, just backing up 703 00:34:11,719 --> 00:34:13,839 Speaker 2: a little bit, and about two weeks later, I think 704 00:34:13,880 --> 00:34:17,319 Speaker 2: it is something like that, the flask trends cloudy again, 705 00:34:17,600 --> 00:34:19,279 Speaker 2: and like, okay, this was not an accident. There's something 706 00:34:19,320 --> 00:34:21,680 Speaker 2: going on here. So they actually pay to sequence the genome. 707 00:34:21,760 --> 00:34:24,319 Speaker 2: Very expensive at the time, a lot cheaper today, but 708 00:34:24,840 --> 00:34:27,960 Speaker 2: they paid a sequence it and the amazing finding. And 709 00:34:28,000 --> 00:34:29,560 Speaker 2: this is the thing. When I read this, I was like, 710 00:34:29,600 --> 00:34:32,960 Speaker 2: this is a central way of capturing. My idea is 711 00:34:33,320 --> 00:34:36,600 Speaker 2: that when they looked at the genome, there were four 712 00:34:36,840 --> 00:34:40,160 Speaker 2: totally random mutations that did not matter at all for 713 00:34:40,280 --> 00:34:44,360 Speaker 2: the survivability of the ecoli that proceeded in just the 714 00:34:44,440 --> 00:34:47,759 Speaker 2: right chain. That when the fifth mutation happened, all of 715 00:34:47,800 --> 00:34:50,799 Speaker 2: a sudden, that population could now eat the citrate, which 716 00:34:50,880 --> 00:34:52,920 Speaker 2: was not supposed to happen, right. It was supposed to 717 00:34:52,920 --> 00:34:55,200 Speaker 2: only eat the glucose. The citrate was there as a stabilizer. 718 00:34:55,719 --> 00:34:58,120 Speaker 2: But as a result of this, they became way more fit, 719 00:34:58,239 --> 00:35:01,279 Speaker 2: way more survivable than the other populations because they could 720 00:35:01,280 --> 00:35:04,399 Speaker 2: eat something the others couldn't. Right, And what happened then 721 00:35:04,480 --> 00:35:06,600 Speaker 2: is that since then, this has now been going on 722 00:35:06,680 --> 00:35:10,240 Speaker 2: for twenty plus years or so, since then, the citrate 723 00:35:10,280 --> 00:35:13,520 Speaker 2: population has an advantage over all of the other eleven, 724 00:35:13,560 --> 00:35:15,600 Speaker 2: and none of the others have developed that mutation because 725 00:35:15,600 --> 00:35:17,040 Speaker 2: it's sort of like a house of cards. You had 726 00:35:17,040 --> 00:35:21,040 Speaker 2: to have these exact four accidents in exactly the right order. 727 00:35:21,080 --> 00:35:22,720 Speaker 2: If they'd reach if they changed the order, it once't 728 00:35:22,680 --> 00:35:24,560 Speaker 2: have happened. And then they had to find me. On 729 00:35:24,600 --> 00:35:26,480 Speaker 2: top of that those four accidents, they had to have 730 00:35:26,520 --> 00:35:28,800 Speaker 2: the fifth accident, which gives them the ability to eat sitratee. 731 00:35:29,200 --> 00:35:31,880 Speaker 2: And so this is the idea of contingent convergence. Right. 732 00:35:31,880 --> 00:35:34,680 Speaker 2: It's like for that population that evolved the ability to 733 00:35:34,719 --> 00:35:39,840 Speaker 2: eat sitrade, that one mutation has changed everything forever. It 734 00:35:39,840 --> 00:35:41,879 Speaker 2: will never go back to eating glucose the same way 735 00:35:41,880 --> 00:35:45,120 Speaker 2: as the others. 
But for the others that didn't develop 736 00:35:45,200 --> 00:35:48,759 Speaker 2: that change, they are all still evolving in relatively predictable ways. 737 00:35:48,800 --> 00:35:51,960 Speaker 2: So I think this is the capturing of the of 738 00:35:52,000 --> 00:35:54,719 Speaker 2: the sort of paradox of our lives is that we 739 00:35:55,160 --> 00:35:59,120 Speaker 2: exist somewhere between order and disorder. Complete disorder would destroy humans, 740 00:35:59,360 --> 00:36:02,640 Speaker 2: we couldn't exist to our society's confunction. Complete order also 741 00:36:02,680 --> 00:36:05,240 Speaker 2: wouldn't work because there'd be no change, there'd be no innovation, 742 00:36:05,320 --> 00:36:07,080 Speaker 2: and so on and so I think this is where 743 00:36:07,120 --> 00:36:10,120 Speaker 2: contingent convergence really really shines. But I will admit that 744 00:36:10,440 --> 00:36:12,200 Speaker 2: trying to do a sound bite version of the long 745 00:36:12,280 --> 00:36:15,320 Speaker 2: term evolution experiment is something that in writing the book 746 00:36:15,880 --> 00:36:18,800 Speaker 2: was probably the greatest challenge of making something about bacteria interesting. 747 00:36:18,960 --> 00:36:21,319 Speaker 1: But it's really fascinating because if you stop and think 748 00:36:21,360 --> 00:36:24,279 Speaker 1: about that, first of all, the genius of doing this 749 00:36:24,360 --> 00:36:26,799 Speaker 1: over twenty years when you have no idea what the 750 00:36:26,840 --> 00:36:29,560 Speaker 1: outcome is, and hey, maybe we're wasting our lives in 751 00:36:29,600 --> 00:36:32,360 Speaker 1: our career doing this number one, but number two, you 752 00:36:32,400 --> 00:36:35,520 Speaker 1: come out and you see that it's cloudy, is it? 753 00:36:35,680 --> 00:36:39,480 Speaker 1: I'm assuming it's cloudy coast. They're reproducing in greater numbers, 754 00:36:39,719 --> 00:36:43,360 Speaker 1: they're processing the citrate, a whole bunch of different stuff 755 00:36:43,400 --> 00:36:47,600 Speaker 1: is going on than the other eleven environments. And one 756 00:36:47,800 --> 00:36:51,200 Speaker 1: has to imagine that if this wasn't taking place in 757 00:36:51,200 --> 00:36:57,520 Speaker 1: an experiment, but this was a big natural scenario, the 758 00:36:57,560 --> 00:37:02,000 Speaker 1: citrate consuming E. Coli would eventually take over the population 759 00:37:02,080 --> 00:37:05,560 Speaker 1: because they have twice as much food available, were more 760 00:37:06,080 --> 00:37:08,799 Speaker 1: than just the plain old glucose eating equal life. 761 00:37:08,920 --> 00:37:10,359 Speaker 2: Yeah, and this is I mean when I was talking 762 00:37:10,440 --> 00:37:12,040 Speaker 2: to so one of the one of the researchers named 763 00:37:12,080 --> 00:37:14,640 Speaker 2: Richard Lensky, the other one Zach Blount, and I was 764 00:37:14,640 --> 00:37:16,640 Speaker 2: talking to them about this and they said, look, we 765 00:37:16,640 --> 00:37:18,919 Speaker 2: tried to control everything. We tried to control every single 766 00:37:18,920 --> 00:37:21,840 Speaker 2: you know, you pipett the exact same amount of solution 767 00:37:21,960 --> 00:37:23,960 Speaker 2: into the you know, into the beakers each day and 768 00:37:24,000 --> 00:37:26,239 Speaker 2: so on. 
But what they said was that, you know, 769 00:37:26,440 --> 00:37:29,200 Speaker 2: well what if one day, you know, when we were 770 00:37:29,360 --> 00:37:33,240 Speaker 2: washing the flask, just a tiny microscopic amount of soap 771 00:37:33,760 --> 00:37:36,920 Speaker 2: stayed on there, right, that could affect the evolution. And 772 00:37:36,960 --> 00:37:38,920 Speaker 2: so there's no I mean, even even in this experiment, 773 00:37:38,960 --> 00:37:41,080 Speaker 2: there's contingency they couldn't control, which is I mean, it's 774 00:37:41,120 --> 00:37:43,799 Speaker 2: the most controlled evolutionary experiment that's ever been done, but 775 00:37:43,840 --> 00:37:45,600 Speaker 2: it's still like, you know, these little tiny bits. If 776 00:37:45,600 --> 00:37:48,080 Speaker 2: you just have you know, a microscopic bit of soap, 777 00:37:48,400 --> 00:37:50,319 Speaker 2: well that's going to kill some of the bacteria, and 778 00:37:50,360 --> 00:37:53,000 Speaker 2: then the evolutionary pathway is going to be slightly changed. 779 00:37:53,239 --> 00:37:54,880 Speaker 2: And I think this is the stuff where, you know, 780 00:37:55,360 --> 00:37:58,040 Speaker 2: had they been a different researcher, had a grant run out, 781 00:37:58,320 --> 00:38:00,160 Speaker 2: they might have just said, okay, we've solved it. It's 782 00:38:00,160 --> 00:38:02,600 Speaker 2: all convergence. Because they could have shut down the experiment 783 00:38:02,640 --> 00:38:04,880 Speaker 2: after fifteen years. So there's just all these things that 784 00:38:04,880 --> 00:38:06,760 Speaker 2: are like layered on top of each other. And I think, 785 00:38:07,120 --> 00:38:08,960 Speaker 2: you know, a lot of scientists, especially in the world 786 00:38:08,960 --> 00:38:12,680 Speaker 2: of evolutionary biology, understands that this is something that we 787 00:38:13,320 --> 00:38:16,759 Speaker 2: really have to take seriously. And I think the way 788 00:38:16,800 --> 00:38:19,799 Speaker 2: that we are set up in human society is to 789 00:38:19,880 --> 00:38:22,719 Speaker 2: ignore the contingency because those are not useful things to 790 00:38:22,760 --> 00:38:26,360 Speaker 2: think about. They're the noise, They're the aberrations of the outliers, 791 00:38:26,360 --> 00:38:28,400 Speaker 2: you know, you delete them from the data whatever. And 792 00:38:28,440 --> 00:38:30,440 Speaker 2: I think this is the kind of stuff where the 793 00:38:30,560 --> 00:38:32,759 Speaker 2: lesson here is that those are actually central to the 794 00:38:32,840 --> 00:38:34,200 Speaker 2: question of how change happens. 795 00:38:34,760 --> 00:38:37,200 Speaker 1: I love this quote from the book. I began to 796 00:38:37,280 --> 00:38:40,080 Speaker 1: wonder whether the history of humanity is just an endless 797 00:38:40,120 --> 00:38:45,240 Speaker 1: but feudal struggle to impose order, certainty, and rationality onto 798 00:38:45,320 --> 00:38:49,920 Speaker 1: a world defined by disorder, chance, and chaos. 799 00:38:51,080 --> 00:38:52,880 Speaker 2: Yeah. 
I mean, I think this is where I became 800 00:38:53,080 --> 00:38:55,440 Speaker 2: a bit of a disillusioned social scientist, to be honest, 801 00:38:55,560 --> 00:38:57,880 Speaker 2: was that I think that the way that I was 802 00:38:57,960 --> 00:39:01,600 Speaker 2: taught to present change to people was to come up 803 00:39:01,600 --> 00:39:04,120 Speaker 2: with a really elegant model, you know, a really beautiful 804 00:39:04,160 --> 00:39:08,000 Speaker 2: equation and that has statistical significance and has like the 805 00:39:08,000 --> 00:39:10,759 Speaker 2: smallest number of variables possible to explain the entire world. 806 00:39:11,440 --> 00:39:13,879 Speaker 2: And the reason that I ended up, you know, having 807 00:39:13,960 --> 00:39:16,640 Speaker 2: that mentality that I think we're trying to cram complexity 808 00:39:16,680 --> 00:39:18,720 Speaker 2: into these neat and tidy, sort of straight jack models 809 00:39:18,760 --> 00:39:22,359 Speaker 2: is because my PhD dissertation so on, I was looking 810 00:39:22,400 --> 00:39:25,799 Speaker 2: at the origin story of coups and civil wars. That 811 00:39:25,880 --> 00:39:29,440 Speaker 2: was part of my research. And these are black Swan events. 812 00:39:29,480 --> 00:39:31,640 Speaker 2: I mean, you know, there's only a few coup attempts 813 00:39:31,640 --> 00:39:34,880 Speaker 2: that happen every year, and they're so hard to predict. 814 00:39:35,000 --> 00:39:37,320 Speaker 2: I mean because you know, one of the coup plots 815 00:39:37,360 --> 00:39:40,719 Speaker 2: that I studied was where this guy, you know, who's 816 00:39:40,719 --> 00:39:42,799 Speaker 2: a sort of mid level officer in the army, just 817 00:39:42,840 --> 00:39:44,880 Speaker 2: on a whim, decides to try to overthrow the government. 818 00:39:45,560 --> 00:39:47,840 Speaker 2: And he's got like fifty guys in his command. This 819 00:39:47,920 --> 00:39:51,239 Speaker 2: is in nineteen ninety seven in Zambia, and you know, 820 00:39:51,320 --> 00:39:54,040 Speaker 2: his plan is to kidnap the army commander and force 821 00:39:54,080 --> 00:39:56,000 Speaker 2: the army commander to announce the coup on the radio. 822 00:39:56,040 --> 00:39:57,839 Speaker 2: It's not a stupid plan, it's actually it probably would 823 00:39:57,840 --> 00:40:00,520 Speaker 2: have worked, but the group of so soldiers that were 824 00:40:00,560 --> 00:40:03,279 Speaker 2: dispatched to the house. I interviewed some of them when 825 00:40:03,320 --> 00:40:05,680 Speaker 2: I went to Zambia, and they said, look, you know 826 00:40:05,719 --> 00:40:07,640 Speaker 2: we ran in the army commanders in his pajamas. He 827 00:40:07,680 --> 00:40:09,520 Speaker 2: runs out the back because he sees these soldiers coming 828 00:40:09,520 --> 00:40:12,759 Speaker 2: to kidnap him, and he climbs up the compound wall 829 00:40:13,200 --> 00:40:14,719 Speaker 2: and you know, it's like in a film where like 830 00:40:14,760 --> 00:40:17,600 Speaker 2: they grab his pant leg he's pulling up, they're pulling down, 831 00:40:17,960 --> 00:40:20,560 Speaker 2: and they just he slips through their fingers and he 832 00:40:20,600 --> 00:40:23,680 Speaker 2: then goes to the government HQ and announces that there's 833 00:40:23,680 --> 00:40:25,719 Speaker 2: a coup under a coup plot underway, and so the 834 00:40:25,760 --> 00:40:28,200 Speaker 2: soldiers go to the radio station. 
They capture the coup 835 00:40:28,280 --> 00:40:31,239 Speaker 2: ring leader, who's at this point literally hiding in a 836 00:40:31,280 --> 00:40:34,279 Speaker 2: trash can. Okay, three hours after the coup plot has 837 00:40:34,719 --> 00:40:36,799 Speaker 2: been hashed. Now, the problem is, I was reading all 838 00:40:36,800 --> 00:40:39,239 Speaker 2: this stuff about like Zambia's democracy, and it was oh, 839 00:40:39,320 --> 00:40:42,200 Speaker 2: Zambia is a resilient democracy. It's one of the beacons 840 00:40:42,200 --> 00:40:44,879 Speaker 2: of African democracy in the nineteen nineties. And I'm trying 841 00:40:44,920 --> 00:40:48,080 Speaker 2: to reconcile this with the fact that in my own research, 842 00:40:48,120 --> 00:40:50,160 Speaker 2: I'm finding this story where the soldier says like, yeah, 843 00:40:50,160 --> 00:40:52,360 Speaker 2: I think if I was like one second faster, I 844 00:40:52,440 --> 00:40:55,640 Speaker 2: probably would have gotten the government overthrown. And on top 845 00:40:55,680 --> 00:40:58,879 Speaker 2: of this, the other contingency was they didn't chase him. 846 00:40:58,880 --> 00:41:00,560 Speaker 2: And I said, why didn't you chase them? We said, well, 847 00:41:01,320 --> 00:41:04,600 Speaker 2: the army commander's wife was really attractive and we wanted 848 00:41:04,600 --> 00:41:06,880 Speaker 2: to talk to her. And also we opened the fridge 849 00:41:07,400 --> 00:41:10,400 Speaker 2: and there's Namibian import beer in the fridge and we 850 00:41:10,480 --> 00:41:12,759 Speaker 2: hadn't had Namibian beer for a long time, so we said, 851 00:41:12,880 --> 00:41:14,920 Speaker 2: you know, screw this, We're gonna We're gonna drink some 852 00:41:14,960 --> 00:41:17,160 Speaker 2: beer and talk to the wife. And I'm thinking, you know, 853 00:41:17,280 --> 00:41:19,200 Speaker 2: like like how do I put this in my model? 854 00:41:19,360 --> 00:41:20,960 Speaker 2: Like you know, I mean, like like what is my 855 00:41:21,040 --> 00:41:23,640 Speaker 2: quantitative analysis going to show me about this? And I 856 00:41:23,640 --> 00:41:27,000 Speaker 2: think that's the stuff where those little pivot points and 857 00:41:27,080 --> 00:41:29,880 Speaker 2: studying really rare events that are highly consequential makes you 858 00:41:29,920 --> 00:41:32,319 Speaker 2: think differently about the nature of social change. And I 859 00:41:32,320 --> 00:41:34,960 Speaker 2: would go to these like political science conferences and I 860 00:41:35,000 --> 00:41:36,920 Speaker 2: was just like, I don't I don't believe this is 861 00:41:36,920 --> 00:41:38,799 Speaker 2: how the world works. I think there are times where 862 00:41:38,800 --> 00:41:41,239 Speaker 2: these can be useful models, but I don't think we're 863 00:41:41,239 --> 00:41:43,640 Speaker 2: capturing reality accurately, and that's where, you know, some of 864 00:41:43,680 --> 00:41:47,040 Speaker 2: the origin story professionally of the book comes from you. 865 00:41:47,000 --> 00:41:50,480 Speaker 1: Have to build in attractive women and imported beer exactly 866 00:41:50,520 --> 00:41:55,360 Speaker 1: into your models, or or more accurately, just completely random 867 00:41:55,400 --> 00:41:59,560 Speaker 1: events there. 
There's a research note in the book from 868 00:41:59,560 --> 00:42:03,920 Speaker 1: an evolutionary biologists seventy eight percent of new species were 869 00:42:03,960 --> 00:42:09,640 Speaker 1: triggered by a single event, typically a random mistake or 870 00:42:09,680 --> 00:42:10,520 Speaker 1: genetic error. 871 00:42:11,040 --> 00:42:14,240 Speaker 2: Yeah. My favorite example this is something called the bottleneck effect, 872 00:42:14,280 --> 00:42:16,160 Speaker 2: and it's actually I think it's actually an important idea 873 00:42:16,320 --> 00:42:19,880 Speaker 2: for economics as well. So I'll start with the biology. 874 00:42:19,920 --> 00:42:23,279 Speaker 2: The bottleneck is where a population arbitrarily gets reduced to 875 00:42:23,320 --> 00:42:26,160 Speaker 2: a very small number. And the number of people in 876 00:42:26,200 --> 00:42:27,839 Speaker 2: that population could be you know, it could be ten, 877 00:42:27,880 --> 00:42:30,040 Speaker 2: it could be one hundred, whatever it is. But who 878 00:42:30,080 --> 00:42:33,160 Speaker 2: those ten or one hundred people are really really matters. 879 00:42:33,200 --> 00:42:35,960 Speaker 2: So there's one island, for example, where half the population 880 00:42:36,040 --> 00:42:39,880 Speaker 2: as asthma because it was populated initially by this bottleneck 881 00:42:39,880 --> 00:42:42,319 Speaker 2: of a very small number of people who disproportionately had 882 00:42:42,360 --> 00:42:45,520 Speaker 2: more asthma than the rest of the population. There's elephant seals, 883 00:42:45,520 --> 00:42:48,440 Speaker 2: for example, who got whittled down through hunting and so 884 00:42:48,480 --> 00:42:51,440 Speaker 2: on to something like I think it's fifty breeding pairs 885 00:42:51,480 --> 00:42:54,880 Speaker 2: or something like that, but which exact seals lived or 886 00:42:54,920 --> 00:42:58,319 Speaker 2: died completely changed the trajectory of that species. Now, I 887 00:42:58,360 --> 00:43:01,719 Speaker 2: sort of say this because human society has had bottlenecks 888 00:43:01,680 --> 00:43:04,160 Speaker 2: at various times. We don't know exactly how small they've been, 889 00:43:04,160 --> 00:43:07,480 Speaker 2: but the hypothesis is perhaps that it may have been 890 00:43:07,520 --> 00:43:10,239 Speaker 2: as few as a few thousand humans at one point, 891 00:43:10,480 --> 00:43:14,000 Speaker 2: and which humans were in that group that determined everything 892 00:43:14,040 --> 00:43:16,799 Speaker 2: for who's allowed now, right, So if you swap out, 893 00:43:16,840 --> 00:43:19,439 Speaker 2: you know, one person for a different person, you've changed 894 00:43:19,440 --> 00:43:20,719 Speaker 2: the trajectory of the species. 895 00:43:20,920 --> 00:43:21,040 Speaker 1: Now. 896 00:43:21,040 --> 00:43:23,680 Speaker 2: I think this is also true when you think about economics, 897 00:43:23,680 --> 00:43:26,640 Speaker 2: you think about innovation. Every so often shocks go through 898 00:43:26,640 --> 00:43:30,520 Speaker 2: industries and they whittle down the competition, and who survives 899 00:43:30,560 --> 00:43:33,120 Speaker 2: in that moment is potentially somewhat arbitrary. 
It could be 900 00:43:33,120 --> 00:43:35,520 Speaker 2: based on some pressures, it could be a smart CEO, 901 00:43:35,640 --> 00:43:39,160 Speaker 2: whatever it is, But the sort of survivors in that bottleneck, 902 00:43:39,440 --> 00:43:41,840 Speaker 2: then we'll dictate how the industry might unfold in the future. 903 00:43:41,840 --> 00:43:44,160 Speaker 2: I mean, you know, Apple has this outsized effect on 904 00:43:44,200 --> 00:43:46,960 Speaker 2: the tech industry, but you know, maybe the time means 905 00:43:46,960 --> 00:43:48,440 Speaker 2: a little bit different in Apple dice. 906 00:43:48,760 --> 00:43:51,399 Speaker 1: I mean, it's not implausible, but for Microsoft giving them 907 00:43:51,400 --> 00:43:54,440 Speaker 1: alone and what was it ninety eight eighty, but for 908 00:43:55,040 --> 00:43:58,520 Speaker 1: any trust case which gave Microsoft an incentive to have 909 00:43:58,600 --> 00:44:01,040 Speaker 1: another survival ble operating system. 910 00:44:01,080 --> 00:44:02,799 Speaker 2: Who knows. Yeah, And so this you know, when you 911 00:44:02,800 --> 00:44:05,480 Speaker 2: think about I think bottlenecks are a useful way of 912 00:44:05,480 --> 00:44:08,880 Speaker 2: thinking about this, partly because they affect trajectories very very profoundly, 913 00:44:09,360 --> 00:44:11,520 Speaker 2: but also because they can be arbitrary. And I think 914 00:44:11,560 --> 00:44:14,800 Speaker 2: this is something where what we do in human society 915 00:44:14,880 --> 00:44:17,359 Speaker 2: is we write history backwards, so we look at who 916 00:44:17,440 --> 00:44:19,840 Speaker 2: is successful and we say, I mean hindsight bias. You 917 00:44:19,840 --> 00:44:21,360 Speaker 2: know many people, I'm sure I've talked to you about this, 918 00:44:21,400 --> 00:44:24,759 Speaker 2: but it's very important to underline that, like when these 919 00:44:24,840 --> 00:44:28,719 Speaker 2: arbitrary things happen, if you then infer causality, that's a 920 00:44:28,760 --> 00:44:32,360 Speaker 2: neat and tidy story, you actually are learning exactly the 921 00:44:32,400 --> 00:44:35,680 Speaker 2: wrong lesson. I mean, the reason these particular elephant seals 922 00:44:35,680 --> 00:44:38,400 Speaker 2: survived is probably arbitrary. It just happened to depend on 923 00:44:38,440 --> 00:44:42,160 Speaker 2: who the people who are proaching them, you know, happen 924 00:44:42,239 --> 00:44:45,040 Speaker 2: to stumble upon. And then of course the evolutionary history 925 00:44:45,040 --> 00:44:48,000 Speaker 2: of that animal is completely changed. So I think that 926 00:44:48,560 --> 00:44:50,799 Speaker 2: lesson is that, you know, sometimes when bottlenecks happen, it 927 00:44:50,840 --> 00:44:53,319 Speaker 2: reshapes the trajectory of the future, but it also is 928 00:44:54,400 --> 00:44:58,120 Speaker 2: inescapably arbitrary at times, and we don't like that. I mean, 929 00:44:58,160 --> 00:45:00,480 Speaker 2: the entire world of self help and the entire world 930 00:45:00,560 --> 00:45:05,080 Speaker 2: of sort of business advice is, oh, these people were successful, 931 00:45:05,120 --> 00:45:07,640 Speaker 2: here's how you replicate it. And the replication is always 932 00:45:07,920 --> 00:45:09,600 Speaker 2: just do what they did, right. But I mean, of 933 00:45:09,600 --> 00:45:11,239 Speaker 2: course the world's different now. 
I mean, if you do 934 00:45:11,360 --> 00:45:14,239 Speaker 2: what they did, you're just making something that's not truly innovative. Right. 935 00:45:14,280 --> 00:45:19,040 Speaker 1: You can't invent an iPhone today exactly. So it's fascinating 936 00:45:19,040 --> 00:45:21,759 Speaker 1: when you talk about bottlenecks. I read a book some 937 00:45:21,880 --> 00:45:25,440 Speaker 1: years ago called Last Ape stand In, and it talks 938 00:45:25,480 --> 00:45:31,719 Speaker 1: about all the various proto human species, from Chromagnum to 939 00:45:31,760 --> 00:45:36,799 Speaker 1: Neanderthal to Homo sapiens. And the theory is that in 940 00:45:36,840 --> 00:45:40,680 Speaker 1: the last ice Age, maybe it's twenty or forty thousand 941 00:45:40,800 --> 00:45:44,640 Speaker 1: years ago, we were down to a few thousand humans. 942 00:45:46,160 --> 00:45:50,680 Speaker 1: But for the Ice Age ending when it did another 943 00:45:50,760 --> 00:45:54,279 Speaker 1: year again, we may not be having this conversation. There 944 00:45:54,280 --> 00:45:55,440 Speaker 1: may be no humans around. 945 00:45:55,719 --> 00:45:57,800 Speaker 2: Yeah, I mean this is the This is the stuff 946 00:45:57,800 --> 00:46:00,600 Speaker 2: also where I think that the sort of predictable patterns 947 00:46:00,600 --> 00:46:03,560 Speaker 2: that people try to impose on the world are also 948 00:46:03,840 --> 00:46:07,880 Speaker 2: subject to whims of timing. Right, And your example is 949 00:46:08,280 --> 00:46:10,600 Speaker 2: completely apt, and I think it's a very important one. 950 00:46:10,640 --> 00:46:12,399 Speaker 2: And I think it also speaks to the question when 951 00:46:12,440 --> 00:46:14,240 Speaker 2: you say when the ice age ends, right, the timing 952 00:46:14,280 --> 00:46:17,640 Speaker 2: issue is so important. Now, one of my you know, 953 00:46:17,719 --> 00:46:19,960 Speaker 2: examples of this that I think is so fascinating is 954 00:46:20,440 --> 00:46:23,400 Speaker 2: you think about, like our daily lives, and our daily 955 00:46:23,440 --> 00:46:27,160 Speaker 2: lives are you know, basically set up in groups of seven. Okay, 956 00:46:27,239 --> 00:46:29,239 Speaker 2: we've got a seven day week. Why is that? So 957 00:46:29,280 --> 00:46:32,360 Speaker 2: I start looking into this and effectively what happens is 958 00:46:32,400 --> 00:46:35,920 Speaker 2: there's this period in ancient Rome where they have this 959 00:46:36,000 --> 00:46:38,960 Speaker 2: superstition that says the planets are really important for being 960 00:46:39,000 --> 00:46:41,760 Speaker 2: you know, auspicious and so on, and they can see 961 00:46:41,800 --> 00:46:44,640 Speaker 2: because they don't have telescopes, five planets with the naked 962 00:46:44,680 --> 00:46:47,160 Speaker 2: eye and the Sun and the moon. You add them up, 963 00:46:47,239 --> 00:46:49,799 Speaker 2: that's seven. They set up a seven day week because 964 00:46:49,800 --> 00:46:51,800 Speaker 2: of that. That's why we divide our lives in seven. 965 00:46:52,080 --> 00:46:53,960 Speaker 2: And it's because of this lock, this this this thing 966 00:46:53,960 --> 00:46:55,440 Speaker 2: that I also talk about in Fluke, which is this 967 00:46:55,480 --> 00:46:58,120 Speaker 2: concept of lock in where an arbitrary thing can happen 968 00:46:58,440 --> 00:47:00,960 Speaker 2: and then sometimes it persists and sometimes it doesn't, and 969 00:47:00,960 --> 00:47:03,719 Speaker 2: that's often very random. 
So my other example of this 970 00:47:03,880 --> 00:47:06,680 Speaker 2: is everything that we write, everything that we say, is 971 00:47:06,680 --> 00:47:09,240 Speaker 2: derived from English being locked in when the printing press 972 00:47:09,320 --> 00:47:11,960 Speaker 2: was invented. If the printing press had been invented, you know, 973 00:47:12,040 --> 00:47:14,200 Speaker 2: six decades earlier, six decades later, there'd be a different 974 00:47:14,239 --> 00:47:16,560 Speaker 2: language because the language was in flux, and all of 975 00:47:16,560 --> 00:47:19,400 Speaker 2: a sudden it became really important to have a standardized system. 976 00:47:19,600 --> 00:47:20,839 Speaker 2: So a lot of people used to write the word 977 00:47:20,920 --> 00:47:25,719 Speaker 2: had hadd Now that was expensive because they figured, okay, 978 00:47:25,719 --> 00:47:27,600 Speaker 2: we've got a typeset this with a bunch of letters. 979 00:47:27,680 --> 00:47:29,880 Speaker 2: Why don't we just do had and I'll boom, all 980 00:47:29,920 --> 00:47:32,120 Speaker 2: of a sudden the language changes. Right, So there's a 981 00:47:32,200 --> 00:47:34,319 Speaker 2: series of things that happen really really quickly, but they 982 00:47:34,320 --> 00:47:36,879 Speaker 2: basically produce modern English. And so I think this sort 983 00:47:36,920 --> 00:47:40,319 Speaker 2: of concept of the arbitrary experimentation and you know, superstition 984 00:47:40,360 --> 00:47:42,560 Speaker 2: of the Romans and then getting locked in and the 985 00:47:42,560 --> 00:47:44,520 Speaker 2: empire sort of sets it up and then it spreads 986 00:47:44,520 --> 00:47:46,279 Speaker 2: and all that, and then you think, okay, why do 987 00:47:46,280 --> 00:47:47,640 Speaker 2: we have a five day working way. I mean, it's 988 00:47:47,640 --> 00:47:51,759 Speaker 2: partly tied to, you know, this superstition about the auspicious 989 00:47:51,840 --> 00:47:55,240 Speaker 2: nature of the visible planets, which themselves are an arbitrary 990 00:47:55,239 --> 00:47:57,719 Speaker 2: byproduct of how our eyes evolved. So I mean, it's 991 00:47:57,760 --> 00:47:59,680 Speaker 2: just sort of an everything you think about has got 992 00:47:59,680 --> 00:48:02,200 Speaker 2: these sort of tentacles where they could have been slightly 993 00:48:02,239 --> 00:48:04,760 Speaker 2: different and then our lives would be radically changed. 994 00:48:05,120 --> 00:48:07,759 Speaker 1: One of the things that's so fascinating with us as 995 00:48:08,120 --> 00:48:11,920 Speaker 1: narrative storytellers, Right, we think about, Okay, we've had the 996 00:48:11,960 --> 00:48:16,080 Speaker 1: spoken language for tens of thousands of years, maybe one 997 00:48:16,160 --> 00:48:19,440 Speaker 1: hundred thousand years, and we think about the cuneiform and 998 00:48:19,480 --> 00:48:23,120 Speaker 1: the written language going back to the Egyptians and the Greeks. 999 00:48:23,719 --> 00:48:28,760 Speaker 1: But that's history, and ninety nine percent of the people 1000 00:48:28,800 --> 00:48:34,520 Speaker 1: who lived during that period were illiterate. In fact, species 1001 00:48:34,560 --> 00:48:38,279 Speaker 1: wide literacy, which we arguably still don't have but are 1002 00:48:38,320 --> 00:48:41,920 Speaker 1: closer to. 
This is like a century old, like for 1003 00:48:41,960 --> 00:48:46,120 Speaker 1: a hundred years people could read and write and meaning 1004 00:48:46,200 --> 00:48:50,160 Speaker 1: most people, but go back beyond the century, and the 1005 00:48:50,320 --> 00:48:54,120 Speaker 1: vast majority of people either couldn't read, couldn't write, never 1006 00:48:54,160 --> 00:48:57,000 Speaker 1: went to school. They had to get up and work 1007 00:48:57,040 --> 00:48:59,520 Speaker 1: in the land. They didn't have time to mess around 1008 00:48:59,520 --> 00:49:00,680 Speaker 1: with this stuff. 1009 00:49:01,440 --> 00:49:03,560 Speaker 2: Yeah, you know, I think there's a lot of things 1010 00:49:03,600 --> 00:49:06,680 Speaker 2: where we are blinded to the fact that we have 1011 00:49:07,000 --> 00:49:09,600 Speaker 2: lives that are unlike any humans who have come before us, right, 1012 00:49:10,040 --> 00:49:12,640 Speaker 2: And I think there's some really big superstructure events that 1013 00:49:12,719 --> 00:49:15,200 Speaker 2: are related to this that that really do affect our lives. 1014 00:49:15,239 --> 00:49:18,440 Speaker 2: So my favorite way of thinking about this is that 1015 00:49:18,520 --> 00:49:21,400 Speaker 2: I think that every human who came before the modern period, 1016 00:49:21,440 --> 00:49:23,239 Speaker 2: most you know, at least you know, maybe the last 1017 00:49:23,239 --> 00:49:26,320 Speaker 2: two hundred years or so, what they experienced was uncertainty 1018 00:49:26,360 --> 00:49:28,560 Speaker 2: in their day to day life. There was almost no regularity, 1019 00:49:28,600 --> 00:49:30,160 Speaker 2: no patterns in their day to day life. They didn't 1020 00:49:30,160 --> 00:49:31,880 Speaker 2: know where their next meal would come from. They didn't know, 1021 00:49:32,200 --> 00:49:33,920 Speaker 2: you know, whether they would get eaten by an animal, 1022 00:49:34,239 --> 00:49:36,440 Speaker 2: et cetera, the crops might fail, you know, et cetera. 1023 00:49:37,200 --> 00:49:39,719 Speaker 2: But they had what I call global stability, which is 1024 00:49:39,760 --> 00:49:41,719 Speaker 2: to say, like the parents and the children lived in 1025 00:49:41,719 --> 00:49:43,319 Speaker 2: the same kind of world. You're a hunter gather, your 1026 00:49:43,400 --> 00:49:45,719 Speaker 2: kids a hunter gather, you know, And this means that 1027 00:49:45,719 --> 00:49:48,520 Speaker 2: the parents teach the kids how to use technology. There's 1028 00:49:48,560 --> 00:49:51,319 Speaker 2: basically regularity from generation to generation. 1029 00:49:51,080 --> 00:49:52,040 Speaker 1: For thousands of years. 1030 00:49:52,120 --> 00:49:54,279 Speaker 2: Yeah, we have flipped that right. So what we have 1031 00:49:54,440 --> 00:49:57,960 Speaker 2: is local stability and global instability. 
So we have extreme 1032 00:49:58,040 --> 00:50:01,000 Speaker 2: regularity like no human has ever experienced before, where we 1033 00:50:01,040 --> 00:50:03,640 Speaker 2: can know to almost the minute when something we order 1034 00:50:03,680 --> 00:50:05,359 Speaker 2: off the internet is going to arrive at our house, 1035 00:50:05,920 --> 00:50:07,800 Speaker 2: and we go to Starbucks anywhere in the world and 1036 00:50:07,840 --> 00:50:09,120 Speaker 2: we can have the same drink and it's going to 1037 00:50:09,120 --> 00:50:11,239 Speaker 2: taste basically the same thing, and we're really angry if 1038 00:50:11,280 --> 00:50:13,879 Speaker 2: somebody messes up, you know, in order, because that that 1039 00:50:14,040 --> 00:50:17,840 Speaker 2: expectation of regularity is so high. But we have global instability. 1040 00:50:17,880 --> 00:50:19,200 Speaker 2: I mean, you know, I grew up in a world 1041 00:50:19,200 --> 00:50:22,239 Speaker 2: where the Internet didn't exist really for ordinary people, and 1042 00:50:22,280 --> 00:50:24,840 Speaker 2: now it's impossible to live without it. You know. You 1043 00:50:24,840 --> 00:50:26,960 Speaker 2: think about the ways that children teach parents how to 1044 00:50:27,040 --> 00:50:29,920 Speaker 2: use technology that's never been possible before. And on top 1045 00:50:29,960 --> 00:50:31,680 Speaker 2: of this, you have the sort of AI you know, 1046 00:50:32,640 --> 00:50:35,360 Speaker 2: rise where the world's going to profoundly change in a 1047 00:50:35,400 --> 00:50:39,319 Speaker 2: very short period of time. There has never been a 1048 00:50:39,360 --> 00:50:43,640 Speaker 2: generation of our species. We're not just the global dynamics 1049 00:50:43,640 --> 00:50:47,480 Speaker 2: have changed generation to generation, but within generations. I mean, 1050 00:50:47,480 --> 00:50:49,600 Speaker 2: we're going to live in a world where, you know, 1051 00:50:49,680 --> 00:50:53,080 Speaker 2: the way that we understand and navigate systems and our 1052 00:50:53,120 --> 00:50:56,160 Speaker 2: lives is going to change multiple times in one lifetime. 1053 00:50:56,520 --> 00:50:59,120 Speaker 2: And you think about you know, Hunter gathers that the 1054 00:50:59,320 --> 00:51:02,799 Speaker 2: average human generations about twenty six point nine years. In 1055 00:51:02,840 --> 00:51:05,279 Speaker 2: the long stretch of our species, you can go twenty 1056 00:51:05,280 --> 00:51:07,960 Speaker 2: seven years over and over and over. It's pretty much 1057 00:51:07,960 --> 00:51:10,680 Speaker 2: the same world for pretty much the entirety of our 1058 00:51:10,719 --> 00:51:12,719 Speaker 2: species until I would say the last you know, maybe 1059 00:51:12,719 --> 00:51:14,759 Speaker 2: one hundred years or so. And that's the thing, you know, 1060 00:51:15,400 --> 00:51:17,080 Speaker 2: you think about this. The more you think about this, 1061 00:51:17,160 --> 00:51:18,839 Speaker 2: the more of these examples you find. I mean, one 1062 00:51:18,880 --> 00:51:20,560 Speaker 2: of them is, you know, jet leg I flew in 1063 00:51:20,600 --> 00:51:23,560 Speaker 2: from London and there's been three generations of people who 1064 00:51:23,600 --> 00:51:26,600 Speaker 2: could ever move fast enough to knock out their biology 1065 00:51:26,600 --> 00:51:28,680 Speaker 2: in a way that they have jet legs. 
So, I mean, 1066 00:51:28,680 --> 00:51:31,640 Speaker 2: there's just a million things that we experience as routine 1067 00:51:32,080 --> 00:51:33,879 Speaker 2: that no humans before us have ever brought. 1068 00:51:34,040 --> 00:51:38,000 Speaker 1: You can never outrun your circadian rhythms until you could 1069 00:51:38,239 --> 00:51:41,080 Speaker 1: travel at a few hundred miles an hour and go 1070 00:51:41,120 --> 00:51:44,920 Speaker 1: from from country to country change You couldn't even change 1071 00:51:44,960 --> 00:51:48,239 Speaker 1: time zones until what is it, seventy five years ago. 1072 00:51:48,560 --> 00:51:51,200 Speaker 2: Yeah, I mean, there's an amazing map. I don't know 1073 00:51:51,239 --> 00:51:53,799 Speaker 2: the exact name of it. I think it's an isochrome 1074 00:51:53,800 --> 00:51:55,920 Speaker 2: map or something like that, but it's a map of 1075 00:51:55,960 --> 00:51:59,680 Speaker 2: London from one hundred plus years ago, and it's showing 1076 00:51:59,719 --> 00:52:02,200 Speaker 2: the war based on how long it takes you to 1077 00:52:02,239 --> 00:52:06,080 Speaker 2: get anywhere. And you see that like western Europe is 1078 00:52:06,280 --> 00:52:08,640 Speaker 2: you know, the closest, and it's like five plus days 1079 00:52:08,719 --> 00:52:11,920 Speaker 2: or whatever. Right now, somebody made a renewed version of 1080 00:52:11,920 --> 00:52:14,360 Speaker 2: that map a couple of years ago, and the furthest 1081 00:52:14,400 --> 00:52:16,400 Speaker 2: reach you can go is like thirty six plus hours, 1082 00:52:16,400 --> 00:52:19,160 Speaker 2: where in the old map it was like three plus months. 1083 00:52:19,760 --> 00:52:21,520 Speaker 2: And you know that's the stuff as well, where we 1084 00:52:21,920 --> 00:52:23,880 Speaker 2: just we've sped up the world so much. And I 1085 00:52:23,920 --> 00:52:26,480 Speaker 2: think this is embedded a lot of the dynamics where 1086 00:52:26,520 --> 00:52:29,600 Speaker 2: flukes and sort of chance events become more common. 1087 00:52:29,520 --> 00:52:31,839 Speaker 1: Thirty six hours. I think you get to the moon 1088 00:52:31,880 --> 00:52:34,520 Speaker 1: in thirty six hours, right, It's true, And that's how 1089 00:52:34,600 --> 00:52:37,279 Speaker 1: much it's changed. Yeah, So let's let's play a little 1090 00:52:37,280 --> 00:52:41,920 Speaker 1: bit of a game called convergence or contingency. We talked 1091 00:52:41,960 --> 00:52:48,640 Speaker 1: before about sometimes hey, multiple evolutionary paths lead to flight 1092 00:52:49,120 --> 00:52:52,520 Speaker 1: in very different ways, and sometimes it's just a random 1093 00:52:52,719 --> 00:52:56,560 Speaker 1: meteor wiping out the dinosaurs. So once convergence. The other 1094 00:52:56,760 --> 00:53:00,840 Speaker 1: is contingency. And since you're in from London, Brexit was 1095 00:53:00,880 --> 00:53:04,239 Speaker 1: that a function of random elements or was that a 1096 00:53:04,280 --> 00:53:06,440 Speaker 1: convergence that was a long time in the make. 1097 00:53:07,040 --> 00:53:09,000 Speaker 2: Well, like most things, is both. I mean, I think 1098 00:53:09,040 --> 00:53:11,040 Speaker 2: there are factors around the Brexit vote that could have 1099 00:53:11,160 --> 00:53:13,200 Speaker 2: very quick clearly gone the other way. 
I mean, there 1100 00:53:13,239 --> 00:53:15,720 Speaker 2: are the timing of the vote could have been different, 1101 00:53:16,520 --> 00:53:18,920 Speaker 2: the ways that the polls were presented could have been different. 1102 00:53:19,600 --> 00:53:22,279 Speaker 2: And also I think some of the dynamics of how 1103 00:53:22,280 --> 00:53:25,200 Speaker 2: the EU behave could have been slightly different. So I mean, yeah, 1104 00:53:25,560 --> 00:53:30,040 Speaker 2: anytime you have a close outcome, it produces you know, 1105 00:53:30,080 --> 00:53:32,160 Speaker 2: I think contingency where it could have it could have 1106 00:53:32,360 --> 00:53:34,920 Speaker 2: gone the other way. But there are trends as well, right, 1107 00:53:34,960 --> 00:53:37,960 Speaker 2: I mean, there's these are the things where I'm even 1108 00:53:38,000 --> 00:53:41,360 Speaker 2: though I believe that Flukes changed the world profoundly regularly, 1109 00:53:41,840 --> 00:53:44,960 Speaker 2: I also completely accept the idea that there are sort 1110 00:53:44,960 --> 00:53:47,960 Speaker 2: of long term forces that yield something like Brexit. And 1111 00:53:48,000 --> 00:53:51,360 Speaker 2: there was a long sort of bubbling antagonism to immigration 1112 00:53:51,520 --> 00:53:54,120 Speaker 2: levels and anger at Brussels and all these sorts of 1113 00:53:54,160 --> 00:53:57,640 Speaker 2: things which politicians capitalized on and leads to Brexit. I mean, 1114 00:53:57,680 --> 00:54:00,239 Speaker 2: I think one of the things that will be interesten 1115 00:54:00,280 --> 00:54:03,759 Speaker 2: about this, and perhaps the biggest convergence, is the conversation 1116 00:54:03,840 --> 00:54:07,000 Speaker 2: which David Cameron decided to hold the referendum. That would 1117 00:54:07,000 --> 00:54:10,040 Speaker 2: be the biggest con contingency for me because he thought, 1118 00:54:11,160 --> 00:54:13,359 Speaker 2: at least as has been reported, he thought that he 1119 00:54:13,400 --> 00:54:15,400 Speaker 2: was going to put to bed the challenge from the 1120 00:54:15,440 --> 00:54:18,279 Speaker 2: right in the Conservative Party by holding the referendum, that 1121 00:54:18,320 --> 00:54:19,800 Speaker 2: he would win and that he would have to stop 1122 00:54:19,800 --> 00:54:22,880 Speaker 2: dealing with questions about Brexit, and of course it backfired 1123 00:54:22,920 --> 00:54:25,160 Speaker 2: on him. He didn't really believe in Brexit, but he 1124 00:54:25,200 --> 00:54:28,120 Speaker 2: figured this was a political ploy that would know basically 1125 00:54:28,560 --> 00:54:31,680 Speaker 2: cut off the pivot to the right. So that's one 1126 00:54:31,719 --> 00:54:33,120 Speaker 2: of those things where you know, if a different set 1127 00:54:33,160 --> 00:54:34,799 Speaker 2: of people had been in the room with Cameron, then 1128 00:54:35,239 --> 00:54:37,160 Speaker 2: maybe they don't hold a referendum, and then that's a 1129 00:54:37,239 --> 00:54:38,279 Speaker 2: very different world we live in. 1130 00:54:38,360 --> 00:54:41,040 Speaker 1: Huh. So you're over in the UK looking at the 1131 00:54:41,200 --> 00:54:46,240 Speaker 1: United States as a political science, the election of Donald 1132 00:54:46,280 --> 00:54:50,959 Speaker 1: Trump in twenty sixteen by forty or fifty thousand votes 1133 00:54:51,000 --> 00:54:55,120 Speaker 1: in a handful of swing states. 
Fascinating question was that 1134 00:54:55,680 --> 00:54:59,960 Speaker 1: a random contingency or was the convergence and the art 1135 00:55:00,080 --> 00:55:04,360 Speaker 1: ark of history moving towards a populist in the United States. 1136 00:55:04,680 --> 00:55:07,839 Speaker 2: Yes, so there's sort of precursor factors that Trump tacked 1137 00:55:07,840 --> 00:55:09,440 Speaker 2: into and this is the convergence, right, this is the 1138 00:55:09,440 --> 00:55:11,840 Speaker 2: stuff that's the trends. I do think there's some pretty 1139 00:55:11,840 --> 00:55:14,520 Speaker 2: big contingencies around Trump. I mean, there's there's one hypothesis 1140 00:55:14,520 --> 00:55:16,479 Speaker 2: which I you know, I can't I don't know Donald 1141 00:55:16,480 --> 00:55:19,520 Speaker 2: Trump's thinking, but there's speculation by people who are close 1142 00:55:19,560 --> 00:55:22,040 Speaker 2: to him that the moment he decided he would definitely 1143 00:55:22,120 --> 00:55:24,640 Speaker 2: run for the twenty sixteen race was in twenty eleven 1144 00:55:24,719 --> 00:55:27,000 Speaker 2: when there was the White House Correspondence dinner and he 1145 00:55:27,160 --> 00:55:30,120 Speaker 2: was exactly and he was publicly humiliated by Barack Obama 1146 00:55:30,160 --> 00:55:33,080 Speaker 2: with a joke that basically said something to the effect of, 1147 00:55:33,520 --> 00:55:36,000 Speaker 2: I really sympathize with you, Donald because I couldn't handle 1148 00:55:36,040 --> 00:55:38,560 Speaker 2: the hard choices that you have to make on Celebrity Apprentice, 1149 00:55:38,840 --> 00:55:40,800 Speaker 2: whereas I, you know, have to make the easy choices 1150 00:55:40,800 --> 00:55:43,000 Speaker 2: in the situation room. And everyone's sort of laughing at 1151 00:55:43,000 --> 00:55:45,480 Speaker 2: Donald Trump and so on. And the question is, you know, 1152 00:55:45,520 --> 00:55:47,560 Speaker 2: if the joke writer had not come up with that 1153 00:55:47,600 --> 00:55:50,239 Speaker 2: idea or Obama said at let's just can that joke. 1154 00:55:50,640 --> 00:55:53,879 Speaker 2: Does Trump run? I mean, that's question one. Then there's 1155 00:55:53,880 --> 00:55:56,800 Speaker 2: the questions around the election, right, And this is something where, 1156 00:55:57,600 --> 00:56:00,480 Speaker 2: you know, without going into too much detail, the reopening 1157 00:56:00,520 --> 00:56:03,080 Speaker 2: of the FBI investigation, which happens because of a congressman 1158 00:56:03,120 --> 00:56:05,960 Speaker 2: in New York and his inability to sort of control himself. 1159 00:56:06,120 --> 00:56:11,120 Speaker 1: Right, you know that send naked genital pictures to underage women. 1160 00:56:11,239 --> 00:56:12,759 Speaker 2: Thank you for saying it for me. So there's a 1161 00:56:12,840 --> 00:56:15,080 Speaker 2: you know, this is the thing where this causes the 1162 00:56:15,080 --> 00:56:17,440 Speaker 2: reopening the FBI investigation. Did this cause a shift in 1163 00:56:17,520 --> 00:56:19,960 Speaker 2: votes in those three critical states? I don't know, but possibly, 1164 00:56:20,040 --> 00:56:22,640 Speaker 2: right could be. And on top of that, you have 1165 00:56:23,560 --> 00:56:25,000 Speaker 2: one of my things that I do talk about in 1166 00:56:25,000 --> 00:56:27,440 Speaker 2: the book. I have a chapter on called the Lottery 1167 00:56:27,440 --> 00:56:30,279 Speaker 2: of Earth. 
And this is the strangest example of US 1168 00:56:30,280 --> 00:56:33,360 Speaker 2: politics with a fluke. Around the time of the dinosaurs, 1169 00:56:33,360 --> 00:56:35,840 Speaker 2: there was an ancient inland sea in America and it 1170 00:56:35,920 --> 00:56:37,799 Speaker 2: basically had a coastline that, if you were going 1171 00:56:37,800 --> 00:56:39,319 Speaker 2: to chart it today, would be like a little 1172 00:56:39,320 --> 00:56:43,000 Speaker 2: crescent shape, a sort of swoop across Mississippi, Alabama, and Georgia. 1173 00:56:43,680 --> 00:56:46,240 Speaker 2: Now what happens is on the coastline there are these phytoplankton 1174 00:56:46,280 --> 00:56:50,239 Speaker 2: that live in this shallow sea and when they die, 1175 00:56:50,360 --> 00:56:52,920 Speaker 2: their bodies eventually get turned into these really, really rich 1176 00:56:52,960 --> 00:56:55,120 Speaker 2: soils when the sea ends. Now, I promise this makes 1177 00:56:55,120 --> 00:56:59,160 Speaker 2: sense for how it links to Trump. This produces extremely 1178 00:56:59,200 --> 00:57:01,439 Speaker 2: fertile soil and what's called the Black Belt. And when 1179 00:57:01,680 --> 00:57:06,239 Speaker 2: slavery was developed, the plantations, you can map them 1180 00:57:06,520 --> 00:57:09,160 Speaker 2: exactly onto where the ancient inland sea was. That's where 1181 00:57:09,160 --> 00:57:11,520 Speaker 2: they go. So this means that there's all these enslaved 1182 00:57:11,560 --> 00:57:13,759 Speaker 2: people brought to the southern United States according to this 1183 00:57:13,800 --> 00:57:17,160 Speaker 2: ancient coastline, and a lot of the people who were 1184 00:57:17,360 --> 00:57:20,600 Speaker 2: freed then settled there. And so the demographics of those 1185 00:57:21,040 --> 00:57:24,280 Speaker 2: counties are overwhelmingly African American. And when you look 1186 00:57:24,280 --> 00:57:27,000 Speaker 2: at the election results for the twenty twenty election, where 1187 00:57:27,040 --> 00:57:29,479 Speaker 2: Georgia becomes this pivotal state and also is the reason 1188 00:57:29,520 --> 00:57:32,200 Speaker 2: why the Democrats hold onto the Senate, if you map 1189 00:57:32,240 --> 00:57:35,200 Speaker 2: the county level election results, you will see the swoop 1190 00:57:35,560 --> 00:57:38,240 Speaker 2: of the ancient inland sea and it's exactly where the 1191 00:57:38,240 --> 00:57:41,120 Speaker 2: Democrats carried the state because it's where the black population, 1192 00:57:41,160 --> 00:57:44,840 Speaker 2: which is disproportionately likely to vote for Democrats, lives. And 1193 00:57:44,880 --> 00:57:46,120 Speaker 2: so you know this is the kind of stuff where, 1194 00:57:46,120 --> 00:57:47,880 Speaker 2: of course, this is the long stretch of history, but 1195 00:57:47,880 --> 00:57:50,360 Speaker 2: it's something where I think we don't think about geological 1196 00:57:50,440 --> 00:57:54,320 Speaker 2: or geographical forces, and they do affect our politics. It's 1197 00:57:54,360 --> 00:57:56,560 Speaker 2: just that we're completely oblivious to them, and they're not 1198 00:57:56,600 --> 00:57:59,200 Speaker 2: changing that much from election to election, so we're not fixating 1199 00:57:59,200 --> 00:58:00,400 Speaker 2: on them in punditry. 1200 00:58:00,160 --> 00:58:04,920 Speaker 1: So your book forced me, as I was prepping for this, 1201 00:58:05,560 --> 00:58:10,880 Speaker 1: to go back in time and rethink what's contingent, what's convergent.
1202 00:58:11,480 --> 00:58:13,720 Speaker 1: And as I was prepping this, I'm going to ask 1203 00:58:13,760 --> 00:58:16,280 Speaker 1: you about January sixth and Ukraine and Gaza. But before 1204 00:58:16,320 --> 00:58:18,560 Speaker 1: I get to those questions, I want to stay with 1205 00:58:18,680 --> 00:58:22,200 Speaker 1: Trump in twenty sixteen and Trump in twenty twenty. As 1206 00:58:22,240 --> 00:58:27,040 Speaker 1: I was reading your language about the long fabric of 1207 00:58:27,280 --> 00:58:35,600 Speaker 1: threads in history, an earlier, unrelated conversation had touched on Iraq 1208 00:58:35,680 --> 00:58:39,200 Speaker 1: in two thousand and three, and as I'm plowing through 1209 00:58:39,200 --> 00:58:42,600 Speaker 1: the book, it sort of dawns on me the changes 1210 00:58:42,640 --> 00:58:46,360 Speaker 1: that are put into place under the Bush administration with 1211 00:58:46,440 --> 00:58:52,480 Speaker 1: Dick Cheney after nine eleven, which essentially comes out 1212 00:58:52,520 --> 00:58:56,680 Speaker 1: of Afghanistan. Iraq had nothing to do with this. The 1213 00:58:56,720 --> 00:58:59,520 Speaker 1: idea that we're going to use this to invade a 1214 00:58:59,560 --> 00:59:03,040 Speaker 1: country that's not related to nine eleven and just 1215 00:59:03,320 --> 00:59:07,200 Speaker 1: the ginned up weapons of mass destruction and all the 1216 00:59:07,240 --> 00:59:10,120 Speaker 1: evidence that turned out to be no evidence at all. 1217 00:59:10,800 --> 00:59:13,720 Speaker 1: That, at the time, felt like a radical change. 1218 00:59:13,720 --> 00:59:16,840 Speaker 1: That the government was not just lying to us about 1219 00:59:16,880 --> 00:59:19,920 Speaker 1: little things we weren't paying attention to. They were like 1220 00:59:20,160 --> 00:59:24,400 Speaker 1: clearly not telling the truth, which most of us either 1221 00:59:24,400 --> 00:59:26,640 Speaker 1: didn't believe or didn't want to believe at the time. 1222 00:59:27,160 --> 00:59:29,560 Speaker 1: Of course, there's got to be some reason to invade 1223 00:59:29,560 --> 00:59:32,640 Speaker 1: a country. The government's not just gonna make that up. 1224 00:59:32,680 --> 00:59:35,480 Speaker 1: And I'm wondering, is that a contingency? Is 1225 00:59:35,480 --> 00:59:43,720 Speaker 1: that a convergence? Because following the Bush Cheney administration, Donald 1226 00:59:43,760 --> 00:59:48,080 Speaker 1: Trump was kind of radical. But for that, I think 1227 00:59:48,120 --> 00:59:51,760 Speaker 1: if the Iraq War doesn't happen, and if the presentation 1228 00:59:52,360 --> 00:59:55,040 Speaker 1: by Colin Powell at the UN doesn't happen and the 1229 00:59:55,080 --> 00:59:58,959 Speaker 1: whole thing turns out to be bs afterwards, I think 1230 00:59:58,960 --> 01:00:02,560 Speaker 1: that kind of made people a little cynical and Trump 1231 01:00:02,720 --> 01:00:05,920 Speaker 1: was a modest step from that, whereas if that doesn't happen, 1232 01:00:06,000 --> 01:00:07,760 Speaker 1: Trump is a radical leap from that. 1233 01:00:07,960 --> 01:00:10,280 Speaker 2: Yeah. Yeah, So the Iraq War is a great example 1234 01:00:10,280 --> 01:00:12,440 Speaker 2: of this because I would go back even further to 1235 01:00:12,480 --> 01:00:16,600 Speaker 2: the First Gulf War in nineteen ninety one.
Yes, exactly, yes, 1236 01:00:16,680 --> 01:00:18,600 Speaker 2: And I think this is an important part of the 1237 01:00:18,640 --> 01:00:22,600 Speaker 2: story that leads to Bush Junior going into Iraq in 1238 01:00:22,600 --> 01:00:26,320 Speaker 2: two thousand and three. So when Saddam Hussein was thinking 1239 01:00:26,320 --> 01:00:29,160 Speaker 2: about invading Kuwait in the early nineteen nineties, the US 1240 01:00:29,240 --> 01:00:32,000 Speaker 2: government wanted to tell him that if he 1241 01:00:32,040 --> 01:00:34,360 Speaker 2: did this, they would attack him. But there were two 1242 01:00:34,360 --> 01:00:36,920 Speaker 2: messages sent through diplomatic channels. One was called the Glaspie 1243 01:00:36,960 --> 01:00:39,240 Speaker 2: memo and the other one was a sort of official communiqué, 1244 01:00:40,240 --> 01:00:42,920 Speaker 2: and one of them was a little bit more lenient 1245 01:00:42,960 --> 01:00:45,720 Speaker 2: than the other. It sort of sounded like, we will 1246 01:00:45,720 --> 01:00:48,280 Speaker 2: disapprove of this, but you know, we won't attack you. 1247 01:00:48,640 --> 01:00:50,600 Speaker 2: That was the sort of subtext of it, whereas the 1248 01:00:50,600 --> 01:00:52,920 Speaker 2: other one was like, we will attack you. And what 1249 01:00:53,000 --> 01:00:56,440 Speaker 2: happened was because there were these two signals, Saddam Hussein 1250 01:00:56,480 --> 01:00:58,080 Speaker 2: picked the one that he thought was correct, and the 1251 01:00:58,120 --> 01:00:59,520 Speaker 2: one that he thought was correct was they're not going 1252 01:00:59,560 --> 01:01:02,120 Speaker 2: to do anything. So when you look at the reason 1253 01:01:02,160 --> 01:01:03,920 Speaker 2: why he invades and then gets wiped out, I mean 1254 01:01:03,920 --> 01:01:07,040 Speaker 2: you can look at the casualty numbers, it's like, so ridiculously lopsided. 1255 01:01:07,080 --> 01:01:09,360 Speaker 2: It's probably the most lopsided conflict in modern history. 1256 01:01:09,480 --> 01:01:09,640 Speaker 1: Right. 1257 01:01:11,440 --> 01:01:15,800 Speaker 2: This origin story goes back to a misinterpretation of two 1258 01:01:15,880 --> 01:01:19,840 Speaker 2: conflicting signals that the US government sent. He basically miscalculated based 1259 01:01:19,840 --> 01:01:23,680 Speaker 2: on a misinterpretation of a diplomatic signal. If that doesn't happen, 1260 01:01:23,760 --> 01:01:25,800 Speaker 2: you know, then you don't have the Bush connection to Iraq. 1261 01:01:26,000 --> 01:01:28,800 Speaker 2: You know, there's all these questions of what will happen now. 1262 01:01:29,440 --> 01:01:31,120 Speaker 2: I think there's a bigger point that I wanted 1263 01:01:31,160 --> 01:01:33,560 Speaker 2: to get into here, which I think is where I 1264 01:01:33,600 --> 01:01:35,400 Speaker 2: think about this differently from some other people. And I 1265 01:01:35,440 --> 01:01:37,200 Speaker 2: realized this when I was talking about the book. So 1266 01:01:37,200 --> 01:01:40,240 Speaker 2: I told a historian friend of mine the story of Kyoto, right, 1267 01:01:40,560 --> 01:01:42,480 Speaker 2: and how Kyoto doesn't get blown up by the atomic 1268 01:01:42,480 --> 01:01:45,960 Speaker 2: bomb because of this vacation. And he says, okay, but hold on, like, 1269 01:01:46,040 --> 01:01:47,720 Speaker 2: the US is still going to win the war, right, 1270 01:01:47,760 --> 01:01:48,960 Speaker 2: Like it doesn't.
Like, I mean, at the end of 1271 01:01:49,000 --> 01:01:50,520 Speaker 2: the day, if they drop the bomb on Kyoto or they 1272 01:01:50,560 --> 01:01:51,960 Speaker 2: drop the bomb on Hiroshima, they're still going to win 1273 01:01:52,000 --> 01:01:54,480 Speaker 2: the war. I'm like, yes, that's true. The mistake I 1274 01:01:54,520 --> 01:01:56,080 Speaker 2: think we make when we think about these things is 1275 01:01:56,120 --> 01:01:59,280 Speaker 2: we impose categories that don't really exist because there's a 1276 01:01:59,280 --> 01:02:01,320 Speaker 2: binary of whether you win the war or not. But 1277 01:02:01,360 --> 01:02:04,840 Speaker 2: the question is does Japan develop in the same way 1278 01:02:04,920 --> 01:02:08,160 Speaker 2: if you swap out Kyoto for Hiroshima. I don't think so, right, 1279 01:02:08,160 --> 01:02:10,880 Speaker 2: there's totally different people who live and die. And also 1280 01:02:10,920 --> 01:02:13,120 Speaker 2: one of the people who's one of the founding, you know, 1281 01:02:13,200 --> 01:02:17,080 Speaker 2: scientists of modern meteorology was in Kyoto, so like he 1282 01:02:17,120 --> 01:02:19,200 Speaker 2: would have probably died. And this is a lot of 1283 01:02:19,200 --> 01:02:22,320 Speaker 2: the stuff that ends up helping us basically detect major storms. 1284 01:02:22,560 --> 01:02:24,320 Speaker 2: So, I mean, even that's just a 1285 01:02:24,360 --> 01:02:26,400 Speaker 2: small ripple effect where we can imagine that, okay, maybe 1286 01:02:26,480 --> 01:02:29,160 Speaker 2: meteorology goes a little bit differently. So you know, what 1287 01:02:29,240 --> 01:02:30,720 Speaker 2: I think about with some of this stuff is like, 1288 01:02:31,080 --> 01:02:33,040 Speaker 2: you know, do we end up invading Iraq or not? 1289 01:02:33,160 --> 01:02:35,560 Speaker 2: Maybe we still do. Maybe that's the convergence. Maybe there's 1290 01:02:35,560 --> 01:02:38,600 Speaker 2: still a war, but the way it happens matters. And 1291 01:02:38,640 --> 01:02:41,040 Speaker 2: I think, you know, the way the conflict unfolds, the 1292 01:02:41,040 --> 01:02:43,960 Speaker 2: way that the losses accrue, the way that, you know, 1293 01:02:44,800 --> 01:02:47,640 Speaker 2: the way the US had relationships with Osama bin Laden 1294 01:02:47,680 --> 01:02:49,520 Speaker 2: when he was a, you know, a fighter in Afghanistan, 1295 01:02:49,560 --> 01:02:52,600 Speaker 2: I mean, all this stuff matters, and I 1296 01:02:52,600 --> 01:02:54,280 Speaker 2: think the thing that we tend to do is we 1297 01:02:54,320 --> 01:02:56,200 Speaker 2: tend to just say, well, it would have been the same, 1298 01:02:56,240 --> 01:03:00,120 Speaker 2: because in our category, which is a fake construction 1299 01:03:00,160 --> 01:03:01,960 Speaker 2: of the way we think about the world, it's the 1300 01:03:02,000 --> 01:03:04,960 Speaker 2: same binary outcome, right, you win the war, you don't. 1301 01:03:05,440 --> 01:03:07,720 Speaker 2: But the way you win the war actually affects the future. 1302 01:03:07,760 --> 01:03:09,240 Speaker 2: And so that's the kind of stuff I think.
I'm 1303 01:03:09,280 --> 01:03:11,000 Speaker 2: sure that people in business understand this as well, where 1304 01:03:11,040 --> 01:03:13,800 Speaker 2: it's like, you know, the way that a product launches. Yeah, 1305 01:03:13,840 --> 01:03:16,480 Speaker 2: it's a success, but if it's five percent more of 1306 01:03:16,520 --> 01:03:18,520 Speaker 2: a success, that might affect the way that you behave 1307 01:03:18,560 --> 01:03:20,200 Speaker 2: in your future investments, and then that's going to have 1308 01:03:20,280 --> 01:03:21,320 Speaker 2: ripple effects in the future. 1309 01:03:21,640 --> 01:03:23,920 Speaker 1: The way you win the war or not is the 1310 01:03:23,920 --> 01:03:29,080 Speaker 1: theme of Liaquat Ahamed's book Lords of Finance. The conditions 1311 01:03:29,080 --> 01:03:33,440 Speaker 1: that were imposed after World War One, yep, pretty directly 1312 01:03:33,560 --> 01:03:36,840 Speaker 1: lead to Germany and World War Two. But for those 1313 01:03:37,280 --> 01:03:41,680 Speaker 1: very stringent conditions that lead to Germany being broken and 1314 01:03:41,680 --> 01:03:44,560 Speaker 1: then the rise of the hyperinflation in the Weimar Republic, 1315 01:03:45,520 --> 01:03:49,640 Speaker 1: that was a series of choices, and he very brilliantly 1316 01:03:49,680 --> 01:03:54,400 Speaker 1: tells the story of how this was absolutely not convergent. It 1317 01:03:54,440 --> 01:03:55,640 Speaker 1: didn't have to happen that way. 1318 01:03:55,720 --> 01:03:58,120 Speaker 2: Well, the story that is famous about World War One 1319 01:03:58,160 --> 01:04:01,120 Speaker 2: is how Archduke Franz Ferdinand's car breaks down right in 1320 01:04:01,120 --> 01:04:04,120 Speaker 2: front of the assassin who kills him. It's a complete accident, right. 1321 01:04:04,480 --> 01:04:06,560 Speaker 2: I actually found a different contingency that I think is 1322 01:04:06,560 --> 01:04:09,080 Speaker 2: even more bewildering, which is that Franz Ferdinand, the 1323 01:04:09,160 --> 01:04:12,840 Speaker 2: Archduke, goes to England about, I think, several months before 1324 01:04:12,880 --> 01:04:16,800 Speaker 2: he's actually killed in Sarajevo, and he ends up on 1325 01:04:16,840 --> 01:04:19,920 Speaker 2: a hunting expedition at this place called Welbeck Abbey, and 1326 01:04:20,080 --> 01:04:23,360 Speaker 2: the person who's loading the shotguns slips because there's just 1327 01:04:23,360 --> 01:04:26,040 Speaker 2: been a snowstorm, and the gun goes off and a 1328 01:04:26,080 --> 01:04:29,040 Speaker 2: bullet goes right over the shoulder of the Archduke and 1329 01:04:29,120 --> 01:04:32,160 Speaker 2: misses him by like three inches. And you think to yourself, okay, 1330 01:04:32,160 --> 01:04:34,120 Speaker 2: so if this guy slips in a slightly different way 1331 01:04:34,160 --> 01:04:36,640 Speaker 2: and hits him in the head, and the man whose assassination 1332 01:04:36,720 --> 01:04:40,240 Speaker 2: triggers World War One is instead dead already in Welbeck Abbey, 1333 01:04:40,360 --> 01:04:42,160 Speaker 2: does World War One happen? Now, this is a debate 1334 01:04:42,160 --> 01:04:45,560 Speaker 2: that historians really can't answer, and there's lots of people 1335 01:04:45,560 --> 01:04:47,440 Speaker 2: on both sides of the argument.
And I think the 1336 01:04:47,920 --> 01:04:51,000 Speaker 2: point is maybe World War One still happens, but if 1337 01:04:51,000 --> 01:04:53,439 Speaker 2: it's not triggered by this assassination, the way the war 1338 01:04:53,800 --> 01:04:55,560 Speaker 2: is going to unfold is going to be different. Does 1339 01:04:55,600 --> 01:04:57,200 Speaker 2: it lead to Nazi Germany the same? I mean, these 1340 01:04:57,200 --> 01:04:59,240 Speaker 2: are the things where I think what we do is 1341 01:04:59,280 --> 01:05:01,800 Speaker 2: we just pretend these things don't matter that much because 1342 01:05:01,840 --> 01:05:04,440 Speaker 2: it's so overwhelming. I mean, the idea that somebody 1343 01:05:04,560 --> 01:05:06,640 Speaker 2: slipping is, you know, sort of the 1344 01:05:06,760 --> 01:05:09,720 Speaker 2: proximate cause of millions of deaths and then the rise 1345 01:05:09,760 --> 01:05:12,479 Speaker 2: of Nazism. I mean, this is the kind of stuff 1346 01:05:12,480 --> 01:05:14,800 Speaker 2: where it's just so overwhelming that your mind is blown. 1347 01:05:14,920 --> 01:05:18,600 Speaker 1: Yeah, so let me throw some more at you, again on your political science. 1348 01:05:18,960 --> 01:05:22,800 Speaker 1: Let's talk about some recent political actions that are 1349 01:05:22,840 --> 01:05:26,520 Speaker 1: kind of fascinating and ask the question, is this convergence 1350 01:05:26,600 --> 01:05:30,040 Speaker 1: or contingency: the Russian invasion of Ukraine? 1351 01:05:30,640 --> 01:05:33,480 Speaker 2: Yeah, you know, I think this is, uh, they're 1352 01:05:33,480 --> 01:05:36,240 Speaker 2: always both. But the convergence of this is the sort 1353 01:05:36,280 --> 01:05:40,400 Speaker 2: of long standing humiliation of Russia that Vladimir Putin has 1354 01:05:40,440 --> 01:05:43,240 Speaker 2: a very big chip on his shoulder about, you know, 1355 01:05:43,280 --> 01:05:45,560 Speaker 2: sort of the fact that he has this predisposition to 1356 01:05:45,640 --> 01:05:47,760 Speaker 2: view Russia as a major global power because he was 1357 01:05:47,760 --> 01:05:49,960 Speaker 2: in the KGB and so on. You know, that I 1358 01:05:49,960 --> 01:05:52,120 Speaker 2: think is a long term trend. And like Trump, sorry, 1359 01:05:52,160 --> 01:05:55,480 Speaker 2: Putin was always very, very keen on reestablishing Russian dominance. 1360 01:05:56,000 --> 01:05:57,760 Speaker 2: But I think there was some stuff where there were 1361 01:05:57,800 --> 01:06:00,640 Speaker 2: some serious miscalculations going on, and this is where the 1362 01:06:00,800 --> 01:06:03,760 Speaker 2: contingencies I think could have cropped up. So I wrote 1363 01:06:03,760 --> 01:06:05,760 Speaker 2: a piece for The Atlantic in twenty twenty two right 1364 01:06:05,760 --> 01:06:10,600 Speaker 2: after the invasion happened, where it was like, look, what 1365 01:06:10,640 --> 01:06:14,080 Speaker 2: happens with dictators is they purge all the people who 1366 01:06:14,160 --> 01:06:17,240 Speaker 2: challenge them and tell them the truth, nothing but yes men, exactly. 1367 01:06:17,240 --> 01:06:19,760 Speaker 2: And this happens over decades. So the fact that Putin 1368 01:06:19,800 --> 01:06:22,000 Speaker 2: stayed in power for so long, he probably got some 1369 01:06:22,040 --> 01:06:24,080 Speaker 2: really bad information that told him, look, it's going to 1370 01:06:24,080 --> 01:06:26,880 Speaker 2: be a three-day war, and then he miscalculates based on this.
1371 01:06:26,840 --> 01:06:31,120 Speaker 1: Well, look back at the annexation of Crimea; that kind 1372 01:06:31,120 --> 01:06:32,520 Speaker 1: of was a three-day war exactly. 1373 01:06:32,600 --> 01:06:36,400 Speaker 2: And this is where I think the aspects of contingency 1374 01:06:36,880 --> 01:06:40,280 Speaker 2: are tied to the personality traits of leaders sometimes, and 1375 01:06:40,360 --> 01:06:42,160 Speaker 2: if you have a different Russian president, maybe he doesn't 1376 01:06:42,160 --> 01:06:43,400 Speaker 2: do the same thing, right. And I think this is 1377 01:06:43,400 --> 01:06:46,880 Speaker 2: the kind of stuff where political science, you know, this 1378 01:06:46,920 --> 01:06:48,800 Speaker 2: is a little bit of inside baseball, political science is 1379 01:06:48,840 --> 01:06:52,480 Speaker 2: obsessed with institutions. We try to explain things through institutions, and 1380 01:06:52,480 --> 01:06:54,520 Speaker 2: there was a long standing viewpoint. This speaks to, you know, 1381 01:06:54,600 --> 01:06:57,000 Speaker 2: January sixth and Trump and all these other things, that 1382 01:06:57,160 --> 01:06:59,920 Speaker 2: the institution of the president matters, not the president themselves. 1383 01:07:00,640 --> 01:07:03,760 Speaker 2: And I think Trump obliterated this mentality. Putin also obliterates 1384 01:07:03,760 --> 01:07:05,600 Speaker 2: this mentality. Nobody thinks the world would be the same 1385 01:07:05,720 --> 01:07:07,120 Speaker 2: if Hillary Clinton had won in twenty. 1386 01:07:06,920 --> 01:07:09,240 Speaker 1: Sixteen. Clearly very different, and you could say the same 1387 01:07:09,280 --> 01:07:12,320 Speaker 1: thing about Bush versus Gore completely. I think the world, 1388 01:07:12,480 --> 01:07:15,680 Speaker 1: it feels like we took a different track following the 1389 01:07:15,680 --> 01:07:17,040 Speaker 1: two thousand election as well. 1390 01:07:17,160 --> 01:07:18,800 Speaker 2: Yeah. And I think this is where we make the mistake. 1391 01:07:18,800 --> 01:07:21,840 Speaker 2: I mean, contingency is obviously amplified for people in power. 1392 01:07:22,000 --> 01:07:26,240 Speaker 2: Hierarchies make contingency more influential and on shorter 1393 01:07:26,320 --> 01:07:29,480 Speaker 2: time scales. But everyone is affecting the world in some way, right? 1394 01:07:29,520 --> 01:07:31,080 Speaker 2: I mean, like we all have. As I say, we 1395 01:07:31,120 --> 01:07:34,000 Speaker 2: control nothing but influence everything. I mean that for ordinary people. 1396 01:07:33,760 --> 01:07:36,520 Speaker 1: Say that again, we control nothing but influence everything. 1397 01:07:36,600 --> 01:07:38,400 Speaker 2: Yeah. And I think that what this means is that 1398 01:07:38,440 --> 01:07:41,800 Speaker 2: we cannot control anything. There's nothing that we have absolute 1399 01:07:41,840 --> 01:07:44,200 Speaker 2: control over, but everything that we do has ripple effects. 1400 01:07:44,240 --> 01:07:46,960 Speaker 2: Every single action we make has ripple effects. The question 1401 01:07:47,080 --> 01:07:50,520 Speaker 2: is on what timescale are those important and how much 1402 01:07:50,560 --> 01:07:53,000 Speaker 2: are they affecting people around the world. So when Joe 1403 01:07:53,040 --> 01:07:56,840 Speaker 2: Biden does something, the contingency of that means it is highly probable 1404 01:07:56,880 --> 01:07:59,440 Speaker 2: that it will affect lots and lots of people.
Whereas 1405 01:07:59,480 --> 01:08:01,560 Speaker 2: if you're somebody who's a hermit living in the forest, 1406 01:08:01,880 --> 01:08:03,840 Speaker 2: it's not going to affect that many people right away. 1407 01:08:04,120 --> 01:08:06,400 Speaker 2: Is it going to affect nobody? No, because if that 1408 01:08:06,480 --> 01:08:08,720 Speaker 2: hermit went and met somebody else, they would have a baby, 1409 01:08:08,720 --> 01:08:10,920 Speaker 2: and that baby might, you know, rise up to, you know, 1410 01:08:11,080 --> 01:08:13,200 Speaker 2: change the world, and who knows. So I think the 1411 01:08:13,320 --> 01:08:16,080 Speaker 2: idea is that everyone is influencing the future all the time. 1412 01:08:16,520 --> 01:08:18,880 Speaker 2: The question is just on what timescale and how many 1413 01:08:18,920 --> 01:08:20,639 Speaker 2: people will be affected in a way that we think 1414 01:08:20,720 --> 01:08:21,480 Speaker 2: is consequential. 1415 01:08:21,520 --> 01:08:25,840 Speaker 1: So you mentioned January sixth. That feels more like it's 1416 01:08:25,880 --> 01:08:30,679 Speaker 1: a contingency. But you're implying a lot of these things 1417 01:08:30,760 --> 01:08:34,360 Speaker 1: are convergent and might have happened given all the events 1418 01:08:34,360 --> 01:08:35,560 Speaker 1: that took place beforehand. 1419 01:08:35,760 --> 01:08:37,880 Speaker 2: Yeah, so I think the build up to January sixth 1420 01:08:38,000 --> 01:08:40,960 Speaker 2: was, I think, relatively predictable. I wrote a column actually 1421 01:08:41,240 --> 01:08:42,840 Speaker 2: about six months before it, where I said, look, I 1422 01:08:42,840 --> 01:08:44,600 Speaker 2: think there's going to be violence between the election and 1423 01:08:44,600 --> 01:08:47,560 Speaker 2: the inauguration, significant political violence between the election and the inauguration. 1424 01:08:47,880 --> 01:08:50,519 Speaker 2: And it wasn't like, it wasn't something that was completely 1425 01:08:50,560 --> 01:08:52,679 Speaker 2: out of left field. It was possible that these forces 1426 01:08:52,720 --> 01:08:53,960 Speaker 2: were amassing. 1427 01:08:54,520 --> 01:08:54,720 Speaker 1: You know. 1428 01:08:54,760 --> 01:08:57,000 Speaker 2: I think the contingency is there were a few of 1429 01:08:57,000 --> 01:08:59,759 Speaker 2: the people in the group that took over the Capitol 1430 01:08:59,840 --> 01:09:03,600 Speaker 2: that had zip ties and were trying to kidnap politicians, 1431 01:09:03,640 --> 01:09:06,759 Speaker 2: hang Mike Pence. Yeah, and you know there are videos 1432 01:09:06,760 --> 01:09:08,720 Speaker 2: you can see in the CCTV 1433 01:09:08,840 --> 01:09:11,960 Speaker 2: where they were close, and you know, how 1434 01:09:12,000 --> 01:09:14,720 Speaker 2: does American politics unfold if somebody actually gets killed in that? 1435 01:09:15,320 --> 01:09:17,880 Speaker 2: I mean, there's a lot of things where, you know, 1436 01:09:18,000 --> 01:09:19,880 Speaker 2: they kill a senior politician or something, and that's going 1437 01:09:19,920 --> 01:09:22,080 Speaker 2: to change the dynamics of the country.
I think that 1438 01:09:22,120 --> 01:09:24,240 Speaker 2: if, you know, if 1439 01:09:24,280 --> 01:09:27,080 Speaker 2: the outcome of January sixth had been worse in that regard, 1440 01:09:27,200 --> 01:09:29,599 Speaker 2: if there had been a senior politician murdered by somebody 1441 01:09:29,600 --> 01:09:32,960 Speaker 2: in the group, you know, that 1442 01:09:33,040 --> 01:09:35,960 Speaker 2: would have been harder for Trump to recover from politically. 1443 01:09:35,600 --> 01:09:39,479 Speaker 1: I think I was surprised how quickly he recovered trust. 1444 01:09:40,080 --> 01:09:43,240 Speaker 1: What looked like, you know, from my perspective, the game 1445 01:09:43,320 --> 01:09:48,000 Speaker 1: theory was, hey, I'm a conservative Republican and I'm against 1446 01:09:48,040 --> 01:09:50,879 Speaker 1: abortion and in favor of tax cuts. I got everything 1447 01:09:50,920 --> 01:09:53,800 Speaker 1: I want from Trump. Let's throw him under the bus 1448 01:09:53,800 --> 01:09:56,760 Speaker 1: and move on. We could retake our party. I was 1449 01:09:56,840 --> 01:10:03,000 Speaker 1: shocked that that principle didn't permeate the Republican right, 1450 01:10:03,280 --> 01:10:06,599 Speaker 1: because it looked like, in real time, hey, you guys 1451 01:10:06,640 --> 01:10:08,880 Speaker 1: don't need this guy anymore. He just did you a 1452 01:10:08,920 --> 01:10:09,559 Speaker 1: huge favor. 1453 01:10:10,000 --> 01:10:11,600 Speaker 2: Yeah. Well, and this is also where, you know, the 1454 01:10:11,680 --> 01:10:15,280 Speaker 2: dynamics of contingency play into this in a huge way, 1455 01:10:15,320 --> 01:10:17,360 Speaker 2: because part of the anger that I think exists on 1456 01:10:17,360 --> 01:10:20,200 Speaker 2: the political right is the backlash to policies during the 1457 01:10:20,240 --> 01:10:22,960 Speaker 2: pandemic and some of the information that people in the 1458 01:10:23,000 --> 01:10:26,519 Speaker 2: Republican Party shared about the pandemic and so on. And 1459 01:10:26,760 --> 01:10:29,400 Speaker 2: that is a single person in China getting infected by 1460 01:10:29,400 --> 01:10:31,000 Speaker 2: a mutation of a virus, you know what I mean. 1461 01:10:31,240 --> 01:10:33,040 Speaker 2: So like, you know, you think about the twenty twenty race. 1462 01:10:33,080 --> 01:10:35,880 Speaker 2: I mean, it is affected profoundly by one person getting sick. 1463 01:10:36,120 --> 01:10:39,120 Speaker 1: Right. My argument has long been that but for the 1464 01:10:39,120 --> 01:10:44,040 Speaker 1: mishandling of COVID, he would have easily cruised to reelection. Yeah. 1465 01:10:44,040 --> 01:10:46,639 Speaker 1: I mean, he was fine pre-COVID, and people 1466 01:10:46,680 --> 01:10:48,080 Speaker 1: tend to vote their pocketbook. Yeah. 1467 01:10:48,320 --> 01:10:49,760 Speaker 2: And this is the stuff where I think we just 1468 01:10:49,840 --> 01:10:52,840 Speaker 2: can never know, but my point is 1469 01:10:52,840 --> 01:10:56,879 Speaker 2: that when you accept that these things are so fragile, 1470 01:10:57,600 --> 01:11:00,439 Speaker 2: the hubris that comes with it is reduced, because you 1471 01:11:00,479 --> 01:11:03,640 Speaker 2: start to think, okay, A, this is not inevitable. B, 1472 01:11:03,840 --> 01:11:06,640 Speaker 2: I didn't control this completely.
And C, because it's so 1473 01:11:07,600 --> 01:11:13,400 Speaker 2: derived from contingency, maybe I shouldn't overconfidently try to 1474 01:11:13,439 --> 01:11:15,840 Speaker 2: manipulate the system. I think these are the things where, 1475 01:11:15,880 --> 01:11:18,000 Speaker 2: like, you know, some people listening 1476 01:11:18,040 --> 01:11:19,000 Speaker 2: to me will think and say, oh, this is a bit of 1477 01:11:19,040 --> 01:11:20,960 Speaker 2: a parlor game. These are all thought experiments, et cetera. 1478 01:11:21,120 --> 01:11:23,120 Speaker 2: I think the lesson, the important lesson, is that when 1479 01:11:23,160 --> 01:11:26,479 Speaker 2: you accept these strange happenstance events, the way chaos theory 1480 01:11:26,479 --> 01:11:31,120 Speaker 2: actually works in social systems, you have an appreciation for 1481 01:11:31,160 --> 01:11:34,160 Speaker 2: the fact that you simply cannot control anything. And when 1482 01:11:34,240 --> 01:11:36,920 Speaker 2: you accept that, you live in a world where you 1483 01:11:36,920 --> 01:11:39,280 Speaker 2: are more likely to focus on resilience and less likely 1484 01:11:39,320 --> 01:11:41,720 Speaker 2: to focus on optimization to the absolute max. 1485 01:11:41,920 --> 01:11:47,240 Speaker 1: So, last two random examples I want to ask about. First, 1486 01:11:47,560 --> 01:11:50,719 Speaker 1: I love the example you give of Keith Jarrett live 1487 01:11:50,760 --> 01:11:53,920 Speaker 1: at the Opera House in Germany. He's supposed to come 1488 01:11:53,960 --> 01:11:59,160 Speaker 1: in and play on a beautiful, you know, concert piano. 1489 01:11:59,280 --> 01:12:02,799 Speaker 1: Instead he shows up, there's an old, rickety, out-of-tune piano, 1490 01:12:03,360 --> 01:12:08,280 Speaker 1: and he has to improvise around broken keys and out-of-tune notes. 1491 01:12:08,800 --> 01:12:12,200 Speaker 1: This becomes the best selling solo jazz album in history. 1492 01:12:12,479 --> 01:12:15,160 Speaker 2: Yeah, so this is the lesson of how sometimes forced 1493 01:12:15,200 --> 01:12:18,479 Speaker 2: experimentation can be really good for innovation. So you know, 1494 01:12:18,520 --> 01:12:21,400 Speaker 2: this guy, basically, you know, plays a crappy piano and 1495 01:12:21,520 --> 01:12:24,559 Speaker 2: ends up producing something incredible. He never would have chosen 1496 01:12:24,600 --> 01:12:26,519 Speaker 2: to do that. It was forced on him, right, it 1497 01:12:26,520 --> 01:12:28,760 Speaker 2: was an accident. Now, one of my favorite studies that's 1498 01:12:28,800 --> 01:12:30,960 Speaker 2: around that section of the book is a study about 1499 01:12:31,160 --> 01:12:34,200 Speaker 2: a tube strike in London where they geolocated all the 1500 01:12:34,280 --> 01:12:36,479 Speaker 2: data of the commuters and they look at these anonymous 1501 01:12:36,479 --> 01:12:39,680 Speaker 2: cell phone data pathways to work, and everybody has to 1502 01:12:39,680 --> 01:12:41,479 Speaker 2: find a different way to work because the subway system 1503 01:12:41,520 --> 01:12:43,479 Speaker 2: has just been shut down by these drivers on strike.
1504 01:12:43,840 --> 01:12:45,919 Speaker 2: What they found is that five percent of the commuters 1505 01:12:46,000 --> 01:12:48,759 Speaker 2: stuck with the new pathway to work after the strike 1506 01:12:49,000 --> 01:12:51,080 Speaker 2: because they were forced to sort of try something new 1507 01:12:51,120 --> 01:12:53,720 Speaker 2: and they realized they liked the new alternative. And I 1508 01:12:53,760 --> 01:12:56,320 Speaker 2: think this is something where, because of optimization in our lives, 1509 01:12:56,320 --> 01:12:58,320 Speaker 2: you know, we're always looking for the TripAdvisor recommendation 1510 01:12:58,360 --> 01:13:01,679 Speaker 2: or, you know, the perfect route on maps, you experiment less. 1511 01:13:01,840 --> 01:13:04,320 Speaker 2: And when you experiment less, you actually find that you 1512 01:13:04,600 --> 01:13:06,640 Speaker 2: don't navigate uncertainty as well. And I think this is 1513 01:13:06,640 --> 01:13:09,160 Speaker 2: the lesson again. It brings us back to evolution. The 1514 01:13:09,160 --> 01:13:12,519 Speaker 2: wisdom of evolution is experimentation through uncertainty, and I think 1515 01:13:12,520 --> 01:13:17,000 Speaker 2: that's where humans, when they have hubris, experiment less and 1516 01:13:17,040 --> 01:13:19,000 Speaker 2: become less resilient. And I think it's a very important 1517 01:13:19,040 --> 01:13:19,559 Speaker 2: lesson for us. 1518 01:13:19,640 --> 01:13:22,120 Speaker 1: All right, so now I'm gonna get super wonky on you, 1519 01:13:22,600 --> 01:13:27,559 Speaker 1: and you use the thought experiment of Laplace's demon. You 1520 01:13:27,640 --> 01:13:32,599 Speaker 1: have a demon that has perfect knowledge of every atom 1521 01:13:32,800 --> 01:13:36,400 Speaker 1: in the universe, and because of that precise detail, 1522 01:13:36,600 --> 01:13:39,120 Speaker 1: they know everything that's happened, they know everything that's going 1523 01:13:39,160 --> 01:13:41,960 Speaker 1: on right now, and they know everything that's gonna happen. 1524 01:13:42,120 --> 01:13:45,960 Speaker 1: Let me throw a curveball at you. The latest findings 1525 01:13:46,120 --> 01:13:52,360 Speaker 1: from quantum research and physics are that, well, you can't 1526 01:13:52,520 --> 01:13:55,519 Speaker 1: know everything. You could know the location of an electron or 1527 01:13:55,640 --> 01:14:00,800 Speaker 1: its spin and handedness, but not both. So that kind 1528 01:14:00,800 --> 01:14:04,880 Speaker 1: of raises the question: even in Laplace's thought experiment with the demon, 1529 01:14:05,920 --> 01:14:09,400 Speaker 1: there's too much randomness for even an all-knowing 1530 01:14:10,400 --> 01:14:12,360 Speaker 1: demon to be able to predict the future.
1537 01:14:30,080 --> 01:14:32,719 Speaker 2: And so this is where things get very trippy, very quickly, 1538 01:14:32,720 --> 01:14:36,080 Speaker 2: because the many-worlds interpretation of quantum mechanics, where an 1539 01:14:36,120 --> 01:14:38,519 Speaker 2: infinite number of things that can happen do happen, and 1540 01:14:38,600 --> 01:14:41,880 Speaker 2: there's an infinite copy of you in infinite universes, right, 1541 01:14:42,600 --> 01:14:45,439 Speaker 2: that is still a deterministic universe where Laplace's demon could 1542 01:14:45,479 --> 01:14:48,960 Speaker 2: theoretically be true, right, because then, you know, you 1543 01:14:49,040 --> 01:14:50,720 Speaker 2: just wouldn't know which universe you were in, but 1544 01:14:50,760 --> 01:14:53,280 Speaker 2: it would be all the universes are happening all the time, right. 1545 01:14:53,560 --> 01:14:56,840 Speaker 2: Whereas if you take the standard interpretation of quantum mechanics, 1546 01:14:56,920 --> 01:14:59,600 Speaker 2: the Copenhagen interpretation, then yes, you have irreducible 1547 01:14:59,680 --> 01:15:04,400 Speaker 2: randomness, indeterminism is correct, and therefore Laplace's demon is nonsensical. 1548 01:15:04,840 --> 01:15:06,360 Speaker 2: So you know, I mean, there's lots of reasons why 1549 01:15:06,360 --> 01:15:08,280 Speaker 2: Laplace's demon probably wouldn't work anyway, which a lot 1550 01:15:08,280 --> 01:15:12,160 Speaker 2: of philosophers have objections to, but this is 1551 01:15:12,160 --> 01:15:14,960 Speaker 2: one of those fascinating questions, I think, you know: 1552 01:15:15,040 --> 01:15:18,120 Speaker 2: we have this world where we believe we have more 1553 01:15:18,240 --> 01:15:22,240 Speaker 2: understanding than any, you know, human ever alive, but the 1554 01:15:22,280 --> 01:15:24,680 Speaker 2: big questions are still completely uncertain to us. We don't 1555 01:15:24,760 --> 01:15:28,080 Speaker 2: understand consciousness, we have no idea what produces it, and 1556 01:15:28,120 --> 01:15:31,439 Speaker 2: we also don't understand anything about quantum mechanics in terms 1557 01:15:31,520 --> 01:15:33,720 Speaker 2: of what it actually means. And these are like the 1558 01:15:33,760 --> 01:15:36,519 Speaker 2: building blocks of our world, you know. I think that's 1559 01:15:36,560 --> 01:15:39,280 Speaker 2: pretty amazing to imagine, and it gives us a 1560 01:15:39,280 --> 01:15:42,439 Speaker 2: healthy dose of, sort of, you know, a bit of 1561 01:15:42,840 --> 01:15:46,479 Speaker 2: humility, because there's just so much we still don't understand. 1562 01:15:46,520 --> 01:15:48,960 Speaker 1: Throw free will in that also, whether or not, 1563 01:15:49,120 --> 01:15:53,240 Speaker 1: right at the intersection of quantum mechanics and consciousness, you know, 1564 01:15:53,320 --> 01:15:57,479 Speaker 1: do we really control even our own agency? Forget the 1565 01:15:57,520 --> 01:16:00,280 Speaker 1: rest of the world. It's even more complicated. 1566 01:16:00,280 --> 01:16:03,120 Speaker 1: So I only have you for a handful of minutes, 1567 01:16:03,160 --> 01:16:06,120 Speaker 1: and I want to jump to my favorite questions that 1568 01:16:06,200 --> 01:16:11,560 Speaker 1: I ask all of my guests, starting with: tell us, uh, 1569 01:16:11,640 --> 01:16:14,040 Speaker 1: what you've been streaming these days? What are you watching 1570 01:16:14,160 --> 01:16:15,800 Speaker 1: or listening to? Yeah?
1571 01:16:16,080 --> 01:16:18,120 Speaker 2: My favorite show that I've been watching recently is called 1572 01:16:18,160 --> 01:16:22,240 Speaker 2: Slow Horses, the great TV show, yeah, the great spy drama. 1573 01:16:22,400 --> 01:16:24,360 Speaker 2: And I've read all the books too, which I highly 1574 01:16:24,400 --> 01:16:27,640 Speaker 2: recommend, by Mick Herron. You know, 1575 01:16:28,680 --> 01:16:30,600 Speaker 2: in terms of podcasts, if people are 1576 01:16:30,640 --> 01:16:32,639 Speaker 2: interested in some of the ideas that I've been talking about, 1577 01:16:33,280 --> 01:16:36,280 Speaker 2: there's a podcast called Mindscape by a physicist named Sean 1578 01:16:36,280 --> 01:16:37,960 Speaker 2: Carroll, who's one of the main proponents of the many 1579 01:16:37,960 --> 01:16:41,240 Speaker 2: worlds hypothesis. It's nerdy, I'm not gonna lie, you know, 1580 01:16:41,280 --> 01:16:43,479 Speaker 2: it's a brainy podcast, but it's something where 1581 01:16:43,800 --> 01:16:46,959 Speaker 2: he brings on really smart people and asks them questions 1582 01:16:47,000 --> 01:16:49,240 Speaker 2: that only Sean Carroll could come up with as a 1583 01:16:49,360 --> 01:16:53,559 Speaker 2: highly, highly informed quantum mechanics researcher, but about all sorts 1584 01:16:53,600 --> 01:16:57,479 Speaker 2: of things: politics, economics, life, philosophy, et cetera. So I 1585 01:16:57,560 --> 01:16:59,120 Speaker 2: highly recommend the Mindscape podcast. 1586 01:16:59,479 --> 01:17:02,400 Speaker 1: Tell us about your mentors who helped shape your career. 1587 01:17:03,439 --> 01:17:05,479 Speaker 2: Yeah, you know, I mean I think my mom was 1588 01:17:05,520 --> 01:17:07,800 Speaker 2: one of them. She decided to run for school board 1589 01:17:07,800 --> 01:17:09,800 Speaker 2: and that's probably the reason why I ended up interested 1590 01:17:09,800 --> 01:17:11,559 Speaker 2: in politics, when I was eight years old and she 1591 01:17:11,600 --> 01:17:14,120 Speaker 2: decided to run for the local school board. You know, 1592 01:17:14,120 --> 01:17:15,720 Speaker 2: there's a lot of teachers I had. 1593 01:17:15,760 --> 01:17:19,160 Speaker 2: I think my main one, though, is my PhD advisor, 1594 01:17:19,680 --> 01:17:22,640 Speaker 2: Nick Cheeseman is his name. He's a professor, previously at 1595 01:17:22,640 --> 01:17:25,400 Speaker 2: Oxford, now at the University of Birmingham. We co-wrote 1596 01:17:25,400 --> 01:17:27,040 Speaker 2: a book together called How to Rig an Election. And 1597 01:17:27,080 --> 01:17:29,920 Speaker 2: you know, I mean, what year was that? This came 1598 01:17:29,960 --> 01:17:34,320 Speaker 2: out in twenty eighteen, so it's all about election rigging 1599 01:17:34,320 --> 01:17:37,040 Speaker 2: around the world, but, you know, he was one 1600 01:17:37,080 --> 01:17:38,840 Speaker 2: of these people who just, like, really taught me how 1601 01:17:38,880 --> 01:17:41,840 Speaker 2: to think about change in a very detailed and complex way. 1602 01:17:42,640 --> 01:17:44,080 Speaker 2: And I owe a lot of my career to him, 1603 01:17:44,120 --> 01:17:44,400 Speaker 2: I think.
1607 01:17:54,920 --> 01:17:58,400 Speaker 2: There's a nonfiction book I highly recommend called Beyond Measure 1608 01:17:58,479 --> 01:18:00,680 Speaker 2: by James Vincent, and it really does dovetail with some 1609 01:18:00,720 --> 01:18:03,360 Speaker 2: of the ideas we've been talking about. It's a history 1610 01:18:03,400 --> 01:18:06,120 Speaker 2: of measurement, and this is a perfect example of what 1611 01:18:06,120 --> 01:18:08,040 Speaker 2: I talked about with lock-in, because the sort of 1612 01:18:08,040 --> 01:18:12,240 Speaker 2: way that we subdivide the world is often completely arbitrary. 1613 01:18:12,320 --> 01:18:14,720 Speaker 2: So much of America, by the way, is arranged the 1614 01:18:14,720 --> 01:18:16,760 Speaker 2: way it is because of a thing called the Gunter's chain, 1615 01:18:17,040 --> 01:18:19,200 Speaker 2: which is why city blocks are arranged the way they are. 1616 01:18:19,240 --> 01:18:22,559 Speaker 2: It's this arbitrary measure to try to subdivide land in 1617 01:18:22,600 --> 01:18:25,679 Speaker 2: a way that was standardized. So yeah, Beyond Measure is very good. 1618 01:18:26,360 --> 01:18:29,719 Speaker 2: I love Kurt Vonnegut as a novelist. His books Cat's 1619 01:18:29,720 --> 01:18:32,360 Speaker 2: Cradle and The Sirens of Titan are my two favorite novels, 1620 01:18:32,360 --> 01:18:35,120 Speaker 2: along with Douglas Adams's work, The Hitchhiker's Guide to the Galaxy. 1621 01:18:35,160 --> 01:18:36,599 Speaker 2: So I can't recommend all of those enough. 1622 01:18:37,960 --> 01:18:40,880 Speaker 1: It's funny, because when you're talking about the various things 1623 01:18:40,880 --> 01:18:45,000 Speaker 1: that change history, I'm normally not a big fan of 1624 01:18:45,040 --> 01:18:49,000 Speaker 1: the revisionist history. But The Man in the High Castle by 1625 01:18:49,040 --> 01:18:52,240 Speaker 1: Philip K. Dick, what happens if the US loses World 1626 01:18:52,280 --> 01:18:54,880 Speaker 1: War Two and Japan and Germany take over the world? 1627 01:18:55,560 --> 01:19:01,720 Speaker 1: Fascinating book, along those similar concepts. And our final two questions: 1628 01:19:02,040 --> 01:19:05,000 Speaker 1: What sort of advice would you give a recent college 1629 01:19:05,080 --> 01:19:10,679 Speaker 1: grad interested in a career in either political science or writing? 1630 01:19:11,360 --> 01:19:14,200 Speaker 2: It's funny, I do give advice to people who are 1631 01:19:14,200 --> 01:19:16,400 Speaker 2: about to graduate all the time, and what I always 1632 01:19:16,400 --> 01:19:19,000 Speaker 2: tell them is to try things out. I mean, the 1633 01:19:19,040 --> 01:19:24,280 Speaker 2: period of exploration in the twenties is one where I 1634 01:19:24,360 --> 01:19:27,240 Speaker 2: think people end up much happier if they sort of 1635 01:19:27,240 --> 01:19:29,640 Speaker 2: do a trial and error approach, realize what works for 1636 01:19:29,680 --> 01:19:32,320 Speaker 2: them and what doesn't work for them. My brother always used 1637 01:19:32,360 --> 01:19:34,519 Speaker 2: to say that the most important internship he ever had 1638 01:19:34,560 --> 01:19:37,560 Speaker 2: was the one he hated the most, because he realized, 1639 01:19:37,960 --> 01:19:39,599 Speaker 2: he thought he wanted to be a geneticist. He got 1640 01:19:39,640 --> 01:19:43,000 Speaker 2: this, like, plum post as a researcher on fig wasps 1641 01:19:43,040 --> 01:19:46,559 Speaker 2: of all things, hated every minute of it.
Now he's 1642 01:19:46,560 --> 01:19:49,000 Speaker 2: a doctor and loves it. But it was because he 1643 01:19:49,080 --> 01:19:51,519 Speaker 2: listened to that feedback in his own experience and said, 1644 01:19:51,560 --> 01:19:53,760 Speaker 2: you know, this is not for me. So, you know, 1645 01:19:54,000 --> 01:19:57,800 Speaker 2: really go out, try things and take notes about what 1646 01:19:57,840 --> 01:19:59,000 Speaker 2: you like and what you don't like, and then that 1647 01:19:59,000 --> 01:19:59,960 Speaker 2: will help you make better decisions. 1648 01:20:00,479 --> 01:20:03,040 Speaker 1: And our final question, what do you know about the 1649 01:20:03,080 --> 01:20:09,160 Speaker 1: world of chaos theory, causation, the butterfly effect today that you 1650 01:20:09,200 --> 01:20:11,080 Speaker 1: wish you knew twenty or so years ago? 1651 01:20:12,120 --> 01:20:13,920 Speaker 2: Well, you know, one of the things is 1652 01:20:13,960 --> 01:20:16,000 Speaker 2: that I'm derived from mass murder, because I didn't know 1653 01:20:16,040 --> 01:20:18,880 Speaker 2: that previously. But I will say that, you know, I 1654 01:20:18,880 --> 01:20:21,160 Speaker 2: think that navigating uncertainty is one of those things that 1655 01:20:21,200 --> 01:20:24,800 Speaker 2: I used to think was only something that we should 1656 01:20:24,800 --> 01:20:28,040 Speaker 2: try to slay and tame. What I like to appreciate now, 1657 01:20:28,080 --> 01:20:29,640 Speaker 2: and I write about some of the philosophy of this 1658 01:20:29,720 --> 01:20:31,960 Speaker 2: in Fluke, is that I actually think uncertainty can be a 1659 01:20:31,960 --> 01:20:35,559 Speaker 2: really wonderful thing, and you just have to sometimes accept 1660 01:20:35,600 --> 01:20:38,880 Speaker 2: it and then navigate based on the understanding that there 1661 01:20:38,960 --> 01:20:42,880 Speaker 2: is radical uncertainty that we can't eliminate, and that is 1662 01:20:42,920 --> 01:20:45,879 Speaker 2: where some of the best flukes in life come from. 1663 01:20:45,920 --> 01:20:49,200 Speaker 1: Really very fascinating. Thank you, Brian, for being so generous 1664 01:20:49,200 --> 01:20:52,479 Speaker 1: with your time. We have been speaking with Brian Klass, 1665 01:20:52,720 --> 01:20:56,839 Speaker 1: Professor of Global Politics at University College London and author 1666 01:20:56,920 --> 01:21:00,439 Speaker 1: of the new book Fluke, Chance, Chaos and Why Everything 1667 01:21:00,479 --> 01:21:04,280 Speaker 1: We Do Matters. If you enjoy this conversation, well, be 1668 01:21:04,320 --> 01:21:07,120 Speaker 1: sure and check out any of the five hundred previous 1669 01:21:07,160 --> 01:21:10,000 Speaker 1: discussions we've had over the past ten years. You can 1670 01:21:10,040 --> 01:21:14,200 Speaker 1: find those at iTunes, Spotify, YouTube, wherever you find your 1671 01:21:14,240 --> 01:21:18,479 Speaker 1: favorite podcasts. Check out my new podcast, At the Money: 1672 01:21:18,960 --> 01:21:21,680 Speaker 1: once a week, a quick discussion with an expert on 1673 01:21:21,720 --> 01:21:25,719 Speaker 1: a subject that matters to investors. You can find those 1674 01:21:25,800 --> 01:21:28,840 Speaker 1: in the Masters in Business feed. Sign up for my 1675 01:21:28,960 --> 01:21:31,360 Speaker 1: daily reading list at ritholtz dot com. Follow me on 1676 01:21:31,400 --> 01:21:35,280 Speaker 1: Twitter at ritholtz.
Follow the full family of Bloomberg 1677 01:21:35,520 --> 01:21:39,680 Speaker 1: podcasts at podcasts. I would be remiss if I did 1678 01:21:39,680 --> 01:21:42,360 Speaker 1: not thank the crack team that puts these conversations together 1679 01:21:42,439 --> 01:21:45,840 Speaker 1: each week. Kaylie Lapara is my audio engineer, Atika 1680 01:21:45,880 --> 01:21:49,960 Speaker 1: Valbrun is my project manager, Sean Russo is my researcher, 1681 01:21:50,080 --> 01:21:53,720 Speaker 1: and Anna Luke is my producer. I'm Barry Ritholtz. You've 1682 01:21:53,720 --> 01:21:58,840 Speaker 1: been listening to Masters in Business on Bloomberg Radio.