Speaker 1: Pushkin. From Pushkin Industries, this is Deep Background, the show where we explore the stories behind the stories in the news. I'm Noah Feldman. Today we have a remarkable guest and a remarkable conversation to share with you here on Deep Background.

I was fortunate enough to have a wide-ranging conversation with Vitalik Buterin. Vitalik is the co-founder of Ethereum, which you may have heard of as the other extraordinarily popular cryptocurrency, and a broader blockchain: a complex distributed computer network on which it is possible to build all sorts of different applications. Vitalik has contributed centrally to creating an enormous infrastructure and a tremendous amount of wealth, and unsurprisingly that's made him into a kind of folk hero of the crypto world. Oh, and I may have forgotten to mention this: Vitalik is all of twenty-seven years old.

Vitalik is a kind of public intellectual of the cryptocurrency and blockchain space. He has a blog where his posts are more like fully crafted, well-thought-out essays. This blog, which you can find at vitalik.ca, turns out to be one of the most extraordinary resources you can imagine for anyone who, like me, is trying to make sense of this new and complex set of developments in the world around blockchain. What inspired me to ask Vitalik to come on the show was a particular post, or essay, on his blog about the nature of legitimacy. Legitimacy is a central concept in government, a central concept in constitutions, a central concept in property law. And I was stunned and amazed to see how central Vitalik himself made it to the whole structure of the blockchain and of cryptocurrency. I really wanted to delve deeper into his idea and into how it relates to the entire system of crypto, and Vitalik graciously agreed to come and talk about exactly that topic.

Vitalik, thank you so much for being here. I really appreciate your work, because I see you as somebody who is simultaneously a creator, an inventor of things, and also a public intellectual of those same things. And I was extremely struck by one of the many fascinating posts that you put on your blog (they're really, I mean, most people would publish them as essays in magazines; you put them up on your blog, so that makes them easily accessible) about legitimacy, and in particular about the idea that legitimacy is a scarce resource and something that we need to think very seriously about in the context of the blockchain. It grabbed my attention because my day job is being a constitutional law professor, so what I do is think about legitimacy twenty-four-seven, three-sixty-five. And I was fascinated to see the ways that you were engaging with the topic. So I wonder if we could start, maybe, with your working definition of legitimacy, and then from there I want you to talk to us about your leading and really fascinating example that you give in the essay, which is the great parable of Steem and Hive. So maybe start with what you mean by legitimacy.
Speaker 2: Sure. So the thing that I use the word legitimacy to refer to is this idea that there are these collective patterns of behavior where people all act in a particular way, so they play a part in enacting and participating in some particular outcome, basically because there exists this kind of collective idea in everyone's heads that this outcome, like it or not, is a thing that's happening, and everyone is okay with going along with it. Right? So basically it's this sort of self-referential concept. There are a lot of different situations in the world where the thing that is in everyone's interest is to just play along with the same game that everyone else is playing. So, like, if everyone else is driving on the right side of the road, it makes a lot of sense for you to drive on the right side of the road. If everyone else is driving on the left side of the road, it makes sense for you to drive on the left side of the road. Even if there weren't any police around enforcing the rules, it would still mostly work. And, you know, even if you personally have some slight preference for one direction or the other, it's still in your interest to just go along with the strategy that everyone else is going along with.

And I think that this is something that goes beyond very simple examples like that. I think it's an extremely important concept in social relations in general. And I gave a few examples of this, both on the blockchain and off. So, starting with a couple of off-the-blockchain examples: language is one good example. You know, we could argue about whether or not a dolphin is a fish. A version of English where a dolphin is a fish would work totally fine, and a version of English where a dolphin isn't a fish would also work totally fine. But even more important than getting the right answer is the fact that we agree on an answer.
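The driving-side example is a textbook pure coordination game, and its logic can be sketched in a few lines of code. The payoff numbers and function names here are illustrative, not anything from the conversation:

```python
# A minimal sketch of the driving-side coordination game: you get a payoff
# of 1 if you match the other driver's side, 0 (a crash) if you don't.
from itertools import product

SIDES = ["left", "right"]

def payoff(mine: str, theirs: str) -> int:
    """A driver is safe (payoff 1) only when both pick the same side."""
    return 1 if mine == theirs else 0

def is_equilibrium(a: str, b: str) -> bool:
    """Neither driver can do better by unilaterally switching sides."""
    a_stable = all(payoff(a, b) >= payoff(alt, b) for alt in SIDES)
    b_stable = all(payoff(b, a) >= payoff(alt, a) for alt in SIDES)
    return a_stable and b_stable

equilibria = [(a, b) for a, b in product(SIDES, SIDES) if is_equilibrium(a, b)]
print(equilibria)  # [('left', 'left'), ('right', 'right')]
```

Both all-left and all-right come out as stable, and nothing inside the game selects between them; that selection is exactly the work done by the shared sense that "this is the thing we all do."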
National borders are another example of legitimacy. If aliens came in tomorrow and edited all our brains to make us all think that, say, the border between the US and Canada is five kilometers north of where it is today, well, the world would just keep on going, but the border between the US and Canada actually would be five kilometers north of where it is today. So governments have legitimacy, systems of property rights have legitimacy, and systems of rules inside of blockchains can have legitimacy as well.

Speaker 1: One pushback question before we get to the big parable. In my world, when we talk about legitimacy, we usually build into it a specific component that's in some of your examples, but maybe not in all of them, and you did allude to it. And that is not only that this is the game that's happening right now and it's in my interest to play in this game, but also the component that, I think to use your words, was "and I'm okay with that." Right?
So in other words, that's the part of legitimacy that is sometimes called not just descriptive but normative: the "am I okay with that" part of it. And as you mentioned, it's a bit of a self-referential concept, because we can ask whether a system has legitimacy by asking whether the people in that system feel okay with the way things are working. The reason I bring this up, even though it's a teeny bit abstract, is that in your examples of pure coordination games, like everyone drive on the right side of the road or everyone drive on the left side of the road, it literally doesn't matter which side we choose. In the UK and in Japan they drive on the left; in other countries they drive on the right. And you do it because everyone else is doing it, but you also do it because if you don't, you're going to smash into another car. So you're okay with it, but you're only okay with it in the sense that you had to pick one, and it's arbitrary which one you pick.

In contrast, if you think about examples like borders, governments, and property rights, saying you're okay with it means something a little more than that. It doesn't just mean, in a government of absolute arbitrary power, "well, I'm going to go along with what the government says, because I'm going to be killed otherwise." Such a government might be legitimate if everyone starts saying "and I'm okay with that." But the government might be illegitimate, even if people are following the rules, if people say to themselves, "I'm not okay with this, but what am I going to do? I have no choice. And if I knew that no one were watching, I would break the rules." So I guess, before we dive into the concrete example, I wonder: how important is it to you to give a definition of legitimacy that could be satisfied without our focusing on how okay with it people are?
Speaker 2: No, it's definitely a very good question and a very important point, right, because people, I think, do use the word while equivocating between the positive way of looking at things and the normative way of looking at things. So on the way that I defined legitimacy in the post, you know, the government of North Korea is the legitimate government of North Korea, right? Because, exactly as you say, even if lots of people are really unhappy with it, North Koreans all act like it's the government of North Korea, America acts like it's the government of North Korea, and South Korea acts like it's the government of North Korea.

But then, at the same time, later on in my post I bring up these different theories of legitimacy, basically ways in which, or reasons by which, some outcome could be perceived as legitimate. And number one on the list I put brute force, right? But aside from number one, I had all of these others, and the other theories of legitimacy have to do with things like fairness and participation and continuity, and these are things that we would be much more morally okay with. And if I had any point on that question, I would say that descriptive legitimacy and a kind of moral acceptability are both real concepts, and I do think that they have some interaction even in the real world. Something being morally acceptable to the people in a context definitely contributes to that thing being legitimate in a descriptive sense, but it's also not the only thing.
Speaker 1: Let's turn now to what I'm calling a parable. Like a lot of great essays, your essay is built around a central example, and I wonder if you would describe for us the parable of Steem and Hive.

Speaker 2: Sure. So there was a blockchain called Steem, right? It was trying to be a decentralized social media thing: you could have some system for built-in tipping, there are incentives, there's, like, decentralized content curation of some kind. You know, it was a very interesting experiment. But there was also this kind of political intrigue inside of the Steem ecosystem that happened, where basically at the beginning there was Steem the blockchain, and then there was Steem the company. And that happens everywhere, exactly: you know, there's the Ethereum Foundation, there's the Zcash Foundation, there's all these foundations. But the big difference between a blockchain and a traditional centralized service is that in the centralized context, Twitter the company basically has the technical ability to do whatever they want to Twitter the service, right?

But with Steem the company versus Steem the blockchain, there's actually a bit more of a difference, and most of the time in crypto land we don't really think about this. What happened here was that the owner of Steem the company decided to sell Steem the company. Now, it's not actually possible to sell Steem the blockchain, because Steem the blockchain is a thing that only exists because you have these thousands of independent, coordinating participants, and they, you know, ultimately have the right to run whatever code they want. But ultimately, what code gets run is constrained by the fact that people have to agree to all run the same code. And if you run different code from everyone else, then even if your code is better, you're just kind of off in your own little universe and you're not talking to anyone else. The blockchain has kind of forked, and if you're on the minority fork, that's a lonely place to be.
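Vitalik's point that a blockchain only exists as the set of participants running identical rules can be illustrated with a toy sketch. The node and rule names here are hypothetical:

```python
# Toy illustration: nodes that run different consensus rules stop agreeing
# with each other, so the network partitions into forks, one per rule set.
from collections import Counter

node_rules = {
    "node_a": "rules_v1",
    "node_b": "rules_v1",
    "node_c": "rules_v1",
    "node_d": "rules_v2",  # runs "better" code, but runs it alone
}

# Each fork is simply the group of nodes sharing identical rules.
forks = Counter(node_rules.values())
print(forks.most_common())  # [('rules_v1', 3), ('rules_v2', 1)]
```

Here node_d is the lonely minority fork: its chain is internally valid, but no one else is talking to it.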
So Steem the company got sold, and the new buyer, the new owner of Steem the company, is this character named Justin Sun. And Justin Sun is this crypto entrepreneur who is, you know, very controversial and is widely known for doing a lot of things that I personally consider unethical. Like, you know, his white paper plagiarized IPFS, and there are other examples. So there were reasons for the community to be kind of very suspicious of him. So overnight, basically, this sale was announced, and I don't think the community was really consulted about it.

Speaker 1: To clarify, by the way, because I think it's useful for people who, like me, come from other worlds: the things that he's done that you consider unethical are unethical within the ethical framework shared by the people who operate in and live in the blockchain world. They're not illegal in some sense of the term. Or do you think maybe they were illegal?

Speaker 2: In some cases it's possible that he did illegal stuff; it's possible he didn't. I don't know. But plagiarism, that's one example, right? That's not illegal, but it is considered unethical in a context much broader than the blockchain space. And his white paper did, you know, copy a bunch of pages of, I think it was the IPFS white paper, pretty directly.

Speaker 1: Okay, thanks. So, close parentheses; go back to your story.

Speaker 2: Okay. So basically overnight, right, the Steem community woke up and realized that Steem the company, this thing that was supposed to be a steward of Steem the blockchain, and that up until then they had kind of trusted without really thinking about it, was suddenly controlled by a potentially hostile actor. And, you know, they were obviously very scared by this. Before, there had been this sort of informal gentlemen's agreement about the funds that were controlled by Steem the company.
Steem the company had twenty percent of all the Steem tokens, and those were, like, sort of held in trust, right? They're intended for the ongoing development of the Steem ecosystem; they're not just, you know, the personal play money of the founder. But when Justin Sun came in, they did not trust that this agreement would be honored, and so they decided to make a move on the Steem blockchain: using the voting mechanism of the delegated proof of stake system that's built into Steem to just add a rule that says those accounts cannot be used to vote. Because if those accounts could be used to vote, right, that's one account holding twenty percent of all the Steem tokens, and that would just give Justin Sun essentially, like, unilateral rule-setting power.

Speaker 1: So, just to pause for clarification purposes: the other holders of the tokens exercised their voting power to disenfranchise whoever would be the owner of it; in this case, it was a particular person, the owner of twenty percent of the tokens that were out there. So, as it were, the eighty percent.

Speaker 2: I mean, it probably wasn't all of them, but a majority of all the voters voted to disenfranchise the person who held twenty percent, right.

Speaker 1: Okay, go on. That's fascinating in its own right, because of the way the parable is headed.

Speaker 2: It is. Yeah. So Justin Sun struck back, right, and what he did was make another proposal, the eventual result of which was to basically, like, kick out the delegates, the kind of elected participants that participated in the original move against him, and basically secure majority control over the blockchain, to the point where he could not be dislodged anymore. And he did this using not just funds that he held directly; he also talked to exchanges, right, because there are a lot of users that just have their Steem tokens held by exchanges, and the exchanges, like, they have the private key, so they theoretically have the ability to vote.
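The dynamic described here, a stake-weighted vote first freezing a large bloc and then being reversed with custodial stake, can be sketched roughly as follows. The fractions and account names are hypothetical, and this is a simplification, not Steem's actual delegated-proof-of-stake mechanics:

```python
# Illustrative stake-weighted voting, loosely modeled on the dispute above.
# Stakes are expressed as fractions of the total token supply.

def tally(votes: dict[str, float]) -> float:
    """Total stake backing a proposal."""
    return sum(votes.values())

# First move: community stake votes to freeze the 20% bloc.
freeze_vote = {"community_members": 0.45}
print(tally(freeze_vote) > 0.20)  # True: the freeze outweighs the bloc

# Counter-move: the frozen bloc plus custodial stake held by exchanges.
counter_vote = {"disputed_bloc": 0.20, "exchange_custodial_stake": 0.30}
print(tally(counter_vote) > tally(freeze_vote))  # True: the result flips
```

The pivot in the sketch is the custodial stake: tokens that users parked on exchanges could, at least in principle, be voted by whoever holds the private keys.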
And so he went to a bunch of these big exchanges and convinced them to make this kind of big move: to basically make a counter-vote, reverse those results, and just secure power for himself.

Speaker 1: Now, again, a clarification question. When you say "secure power for himself": if you asked him, if we had him on the show, wouldn't he say, "What do you mean, secure power for myself? They just disenfranchised me, and so all I was trying to do was get back my voting power"? I mean, leaving aside whether that would be true or not, is that what he would say?

Speaker 2: I think so.

Speaker 1: And what do you make of it?

Speaker 2: I mean, well, there are multiple levels to this, right, because this is one of those situations where there was this sort of informal agreement that the coins owned by the company should not be used to vote, and before the handover they were not being used to vote. But then, of course, that agreement ended up never being set in stone. So you can blame the community for not being careful as well. You know, you can blame the original owners of Steem for not being transparent and not consulting with the community about the handover. I think, ultimately, the thing that caused the community to do this is just that they, you know, did not want to participate in a blockchain where one participant had that amount of power. That just goes against what the very purpose of a blockchain is supposed to be.

Speaker 1: So then what happened when he got back the power that had been taken from him?

Speaker 2: So when he got that power back, at that point he had power secured, in the sense that he had all of these delegates and he had a lot of coins. And so the only move that the community had available was to make this sort of extra-protocol appeal, right: they announced that they would make a hard fork.
They would just release a new version of 311 00:17:24,236 --> 00:17:27,796 Speaker 1: the software, and that new version of the software would 312 00:17:28,716 --> 00:17:32,236 Speaker 1: basically remove the right to vote from and I think 313 00:17:32,356 --> 00:17:37,076 Speaker 1: even burned the coins of anyone who participated on justin 314 00:17:37,156 --> 00:17:39,316 Speaker 1: sun side during the votes, right, so they made this 315 00:17:39,356 --> 00:17:42,716 Speaker 1: protocol keeping while keeping the value of their own coins, 316 00:17:42,756 --> 00:17:45,796 Speaker 1: I guess, transposing it precisely. So they decided this is 317 00:17:45,916 --> 00:17:48,996 Speaker 1: totally fascinating, and it's it's where your story starts to 318 00:17:49,036 --> 00:17:51,156 Speaker 1: look really different from what could have happened in a country. 319 00:17:51,236 --> 00:17:53,716 Speaker 1: Let's say they said, we're going to go and make 320 00:17:53,756 --> 00:17:58,676 Speaker 1: our own new ecosystem copied from the old ecosystem, excluding 321 00:17:58,796 --> 00:18:00,956 Speaker 1: the wealth and the power of the people whom we 322 00:18:01,556 --> 00:18:04,996 Speaker 1: now see as the bad guys, replicating our own wealth 323 00:18:04,996 --> 00:18:07,876 Speaker 1: and power, but in theory in precise proportion to how 324 00:18:07,876 --> 00:18:11,476 Speaker 1: it's allocated among us right now, minus the parts that 325 00:18:11,516 --> 00:18:16,796 Speaker 1: we burned, exactly. And they did this right, and for 326 00:18:17,316 --> 00:18:20,316 Speaker 1: trademark reasons, they had to rename it, so instead of 327 00:18:20,316 --> 00:18:22,676 Speaker 1: being Steam, it was called Hive, and a huge part 328 00:18:22,716 --> 00:18:25,356 Speaker 1: of the ecosystem actually did migrate, right. 
Speaker 2: So I think since then the market caps of the two have actually been pretty neck and neck with each other.

Speaker 1: So what is your takeaway from this whole parable? I mean, many influential people in history, religious leaders among them, like to speak in parables. It's a good way to reach people, but it's sometimes helpful to explain what the parable means. So what does this parable mean to you?

Speaker 2: Right. So the conclusion that I made in the post is that this was proof that Justin Sun never actually owned the coins. When we talk about ownership, there's this concept of usus, fructus, and abusus (I think that's the Latin phrase): you can use the property, you can enjoy it, or you can abuse it. Those are the rights of ownership. If you own something, you can basically do whatever you want with it, with the exception of regulations that apply regardless of what property you're using to do things.
Speaker 2: But clearly, in this sense, Justin used the coins that were in his possession, but he ended up not being able to use them successfully, because the ecosystem forked off and he was not able to fully implement the thing that he was entitled to implement according to the formal rules of the system. And what that shows me is that Justin Sun, and even the Steem company, never actually owned those coins. The coins, in some sense, were owned not just by the holder of a cryptographic key; they were also owned directly by a community conception of legitimacy. They were directly owned, sort of, by this metaphysical object consisting of a set of principles that existed, maybe unstated, maybe informal and vague, up until that point in the Steem community's heads. And that's a really powerful thing.

Speaker 1: It's tremendously powerful.
Speaker 1: So now, if I may, let me just say a quick word and ask your reaction about how these same issues are thought about in the context of property law and constitutional law, which, in this analogy, think of themselves this way: constitutional law thinks it's the law of the blockchain, and property law thinks it's the law of the tokens. So in that world, we would say everything that you described could happen in a real existing community or government. It's usually called a revolution. You have some background set of principles that people think of as the allocation of power: who's in charge, who are the courts, who do you have to listen to, who are the police. And then that system uses some set of rules to decide who owns what. Property law governs the stuff you own. Voting rights law, which used to be thought of many, many years ago as a species of property law, because they're closely connected, governs how you get to participate in decision making.
Speaker 1: But if someone exercises, or some group of people exercise, a lot of power in a way that makes other people in the system (usually a majority of them, but it doesn't have to be a majority) angry enough to break the system, you can get a revolution. And in that revolution, people break the existing structures of power. They take away people's property rights. They can take away physical objects that people have, or they can take away their abstract rights, like the abstract idea that they own a piece of land, or the shares that they in theory own. That kind of value can be destroyed. And because we know that that can happen, we tend to think that property law and the whole constitutional order are contingent, contingent on there not being a revolution. And when we're looking for a word to describe the world where there's no revolution, sometimes we use the exact same word that you used.
Speaker 1: We say the system has the property of legitimacy, meaning that right now no one is starting a revolution who has enough support to make it into a very successful revolution. And we think that all the time we're operating against that backdrop. So in my world, your story makes perfect sense, and we also always assume that it's true, even though we might not always talk about it, except when there's revolution in the air. As you were saying earlier, in peacetime you don't have to think about these things. So I guess the first question I want to ask is: when you hear me say what I just said, do you think, duh, that's obvious? And if so, why did it seem, or why does it seem, surprising in the context of the blockchain for similar dynamics to be in play?

Speaker 2: Again, I'm definitely not surprised that constitutional law has similar vocabulary for these kinds of things. The thing that surprised me and fascinated me about this happening in blockchain land
Speaker 2: is that the way a lot of us think about the world, or at least the way philosophers, kind of legal philosophers, think about the world, is that you have the laws of physics, and the laws of physics are layer zero. The laws of physics, you know, guns and bombs and soldiers: that's what ends up, in the extreme of extremes, ultimately deciding things. Then you have layer one, and layer one is basically the legal norms that are run by countries. Sometimes that layer one collapses, but most of the time it doesn't. Then everything above that layer one is some layer two. And the way layer two works is that most of the time it runs fine, but in the portion of the time when it does not run fine, what you have is some kind of appeal, a descent to layer one. If a business transaction goes badly and one party sues the other party, layer one, the government legal system, ultimately ends up deciding who's right.
Speaker 2: If, say, you have a dispute where the employees of Twitter decide that they're going to take over Twitter's servers because they have different opinions about which accounts should be banned, they might be able to achieve some victories for a few hours, but eventually the police get called in, and then the courts get called in. And this layer one, which presents this abstraction of always giving you some kind of concrete final resolution that has universal agreement, is what ends up being the final decider. But in the blockchain space, in this particular context, we don't have that. There's no recourse to the nation-state layer-one legal system. Instead, we have something that feels more like a layer one that is the blockchain community.
Speaker 2: And then that ends up descending to something that feels like a different layer zero, which is basically not quite the laws of physics, but at least the laws of computing, and the practical rules of, well, who actually is running the software, who actually is running the blockchain, how quickly it is possible to convince people to update software. Things that feel like aspects of nature, as opposed to aspects of a human-made construction.

Speaker 1: We'll be right back.

Speaker 1: Let's bring now your big claim about legitimacy to bear on the big picture of the blockchain.

Speaker 2: Sure.

Speaker 1: To those on the outside who are, like me, eagerly learning about the ideology of the blockchain and its functionalities and its goals, and are really interested in the question of its transformative capacities, a crucial question is: why will lots and lots and lots of people in the future engage on the blockchain? Why will they prefer it to other, more centralized options?
Speaker 1: How much of the answer, for you, is connected to this legitimacy question? How much does it connect to the idea that the blockchain can offer forms of legitimacy for the kinds of things that it does well, or that it enables people to do well, that are different from, and maybe more attractive than, the kinds of legitimacy that we're accustomed to seeing in more centralized deployments?

Speaker 2: I think that's a good question, and it depends quite a lot on, you know, legitimacy, and I think that's true on multiple levels. There are sort of two different levels that I can talk about. The first level is that I think the decentralized architecture of blockchains is something that can directly give users a feeling of safety: a feeling that the functionality the blockchain provides is not going to just change on them in unexpected and unacceptable ways.
Speaker 2: And one very practical example of this: if you look at things like Twitter and Facebook, at the very beginning they had very open APIs, and there were lots of startups that built themselves relying on the assumption that those APIs would continue to exist as they had existed. But then what happened, in a lot of cases, is that Twitter and Facebook basically pulled the rug on them; they just made the APIs no longer work. Regardless of what the reasons were, the result was that these entrepreneurs had spent years building their projects, and then Twitter just makes a decision with a few clicks and, boom, their entire business model is gone. With a blockchain, you can't really do that, because even if you don't want to talk to a blockchain directly and you're making a service that works through some API, like from Etherscan or one of these kinds of service providers, if Etherscan decides to change its terms of service, the blockchain is open and there are other alternatives.
Speaker 2: You can just go over to Etherchain and they'll start doing the same thing. And then if someone suggests that the rules of the blockchain themselves change, that can only happen through this long, open process, and you can potentially go in and raise your objections to it, and it's much, much harder and much less likely for the thing that you were doing to just suddenly become impossible on you. Another example is even just the twenty-one-million limit in bitcoin. I think bitcoiners sometimes overstate this a bit, and they say that the twenty-one-million limit is backed by math. Obviously it's not, right? Obviously, if everyone agrees that tomorrow we're going to run a new version of the client, and that client now says that there are thirty-one million bitcoins, then for all practical intents and purposes there are thirty-one million bitcoins. But in practice, there are these very strong forces of legitimacy that go against that.
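The point that the limit is social rather than mathematical is easy to see from how small the underlying rule is: the twenty-one-million figure is just the sum of a halving subsidy schedule, a few lines any client could change. A rough sketch of that schedule (this mirrors Bitcoin's published issuance rules, not any particular client's source code):

```python
# Sum Bitcoin's block-subsidy schedule: 50 BTC per block, halving every
# 210,000 blocks, with amounts tracked in integer satoshis (1 BTC = 1e8).
COIN = 100_000_000
HALVING_INTERVAL = 210_000

def total_supply_sats() -> int:
    subsidy = 50 * COIN
    total = 0
    while subsidy > 0:
        total += HALVING_INTERVAL * subsidy
        subsidy //= 2  # integer halving, so the series terminates
    return total

print(total_supply_sats() / COIN)  # just under 21 million BTC
```

The "thirty-one million" scenario is just a different constant in a loop like this one; what keeps the constant at twenty-one million is the community, not the arithmetic.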
Speaker 2: There's this entire community that basically says, you know, twenty-one million is the raison d'être of bitcoin, or, without that property, this thing is not bitcoin. And there are enough people who just really fervently believe in this that, in practice, breaking the twenty-one-million limit is just going to be extremely hard or impossible. So that's one answer.

Speaker 1: Okay, good, let's talk about that answer, and then I'm sure you have more to say about it. But this answer is really fascinating and incredibly generative, so I want to say a couple of different things about it in question form. The first is, let's start with this question: is the argument basically, look, distributed power (democracy would be the analogy in ordinary constitutional design) has some real advantages over concentrated power, like autocracy or certain forms of monarchy, and so in the long run many people will prefer the blockchain model to the more centralized model?
Speaker 1: Sort of like the reasons that people prefer democracies to autocracies, which include the possibility, or the probability, that for a democracy to make changes, you need a lot of people to agree, and so you need a certain amount of decentralization there. Whereas in a pure autocracy, the emperor decides tomorrow, or Xi Jinping decides tomorrow, that things are going to be different, and then they are different. So my first question is: is there a kind of loose analogy there, such that when people are making the argument for the appeal of the blockchain and its advantages, it has a kind of family resemblance to the argument for the advantages of decentralized governance, namely democracy, over centralized government, namely autocracy?

Speaker 2: I think that's definitely a good analogy.

Speaker 1: So if it is, and for me that's very clarifying, and I apologize for translating your world into my world, but for at least some listeners, your world is certainly newer than mine.
Speaker 1: If that's true, then what about the counter-concern, which is that there are a lot of people who prefer big centralized systems of governance, and part of the reason for that is that big decentralized groups of people are susceptible to their own weird changes over time. They can collectively decide that they're going to go out and do something completely different, and we call that a revolution, a revolution that establishes and achieves legitimacy. And it doesn't necessarily do a better job of protecting settled expectations than do the older systems, like monarchies and autocracies.

Speaker 2: Yeah, so I think there are two answers there. One is that I do think that blockchains have more of a do-nothing bias, built inherently into this layer zero that we talked about, than systems of governance do. If the social layer goes to hell in a country, then often a strongman does get in power and do whatever they want, whereas if the social layer goes to hell in something like bitcoin or ethereum, then often just nothing happens.
Speaker 2: Now, of course, you can disagree, and there might still be the possibility that the social layer goes to hell in such a way that makes it easy for crazy things to happen in a blockchain. But it feels like there are these sort of layer-zero pressures, just because of this coordination problem of getting people to update their nodes at the same time, that make it less likely. So I think the distinction here is that blockchains are only the bottom layer of an application, and blockchains don't even necessarily have to innovate much for applications to be able to innovate a lot on top. And going back to your point about what if people like centralization: one of the big reasons why people like centralization is just performance and the ability to rapidly pivot and make changes.
Speaker 2: But applications on a decentralized system can do this. There are plenty of applications on ethereum where those applications have internal governance, and that internal governance has the ability to, say, completely switch over the rules with a six-to-eight-day time delay, or sometimes completely switch over the rules with a zero-day time delay. You can build rules, and you can make things anywhere along the spectrum from full decentralization to full centralization on top of a decentralized system. Now, what you can't do is build a decentralized application on top of Twitter.

Speaker 1: Or you can, only insofar as Twitter will let you.

Speaker 2: Exactly right. So that's kind of the intuitive case for wanting things that are closer to being base layers to be more decentralized, but wanting things that are closer to the application layer to be more centralized.

Speaker 1: Although, you know, in the case of the modern state, the post-1648 state, the state actually has exactly that hybrid character that you just described.
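The governance pattern Vitalik describes, rule changes that only take effect after a public delay, can be sketched generically. This is an illustrative timelock outline, not any specific Ethereum contract; the names, the six-day delay, and the "fee" rule are all assumptions for the example:

```python
import time

class TimelockedGovernance:
    """Queue rule changes; they can only execute after a fixed public delay,
    giving participants time to object (or exit) before anything changes."""

    def __init__(self, delay_seconds: int):
        self.delay = delay_seconds
        self.queued = {}  # change_id -> (key, value, earliest_execution_time)
        self.rules = {}

    def propose(self, change_id: str, key: str, value, now: float = None):
        now = time.time() if now is None else now
        self.queued[change_id] = (key, value, now + self.delay)

    def execute(self, change_id: str, now: float = None):
        now = time.time() if now is None else now
        key, value, eta = self.queued[change_id]
        if now < eta:
            raise RuntimeError("timelock has not elapsed yet")
        self.rules[key] = value
        del self.queued[change_id]

# A hypothetical six-day delay; delay_seconds=0 recovers the fully
# centralized "change anytime" end of the spectrum.
gov = TimelockedGovernance(delay_seconds=6 * 24 * 3600)
gov.propose("fee-update", "fee_bps", 30, now=0)
try:
    gov.execute("fee-update", now=3600)       # too early: rejected
except RuntimeError:
    pass
gov.execute("fee-update", now=7 * 24 * 3600)  # after the delay: applied
assert gov.rules["fee_bps"] == 30
```

The delay parameter is exactly the dial that moves an application along the centralization spectrum while the base layer underneath stays decentralized.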
Speaker 1: The state is Twitter, right? It maintains control, albeit with a governance mechanism behind it; although so does Twitter, technically: it has the law of corporations behind it. But then, by enabling a free market, the state is in a sense allowing people to build all kinds of applications on the centralized platform. And the state has really hugely benefited in the modern era from this combination. So it is possible. And similarly, presumably, if you were a major platform, it might be the smartest move to allow the building of all kinds of decentralized things on top of your platform, because it actually encourages everybody to be invested in that platform. That might be the winning strategy from a certain standpoint.

Speaker 2: That's very possible, but there are also limits to the extent to which a centralized state can commit to trustlessness.
Speaker 2: One very practical example: if we look at, say, centralized US tech companies, back about ten or fifteen years ago they were fairly trusted internationally, and there was this kind of legal order where people could sort of put their heads in the sand and pretend that these Silicon Valley companies were basically living inside of an anarchy, and they would just be able to keep on doing whatever was in their business interests. But what's happened over the last ten years is that a lot of countries, both governments and people in countries, and especially countries that are not very friendly to the US, started being much more suspicious of US tech companies. Now, some of that, I think, is because they want to promote local alternatives so they can have control themselves. But some of it is also because they are actually afraid that if the US and this other country come into conflict, then the US government is actually going to rug-pull them.
Speaker 2: Whereas, you know, the Ethereum blockchain, for example, is less capable of doing that, even if, say, an application was built by someone from the Bitcoin community and it ran on the ethereum blockchain.

Speaker 1: So now we're turning to the last topic I want to talk about, and it's very rich and fascinating to me. And that is what you might call the issue of pre-commitment. How well suited are different kinds of institutions to promising in advance that they won't mess things up relative to your settled expectations? You already mentioned that in the context of the people who say, well, bitcoin's foundational definition is that it's limited to twenty-one million, and a lot follows from that. Therefore, they say, it's a fixed supply; therefore, they say, it's analogous to gold, which is also a fixed supply. Although we may not have found all of that fixed supply, the fixed supply in principle exists. So there are two aspects of this that I'm interested in. One:
657 00:36:35,716 --> 00:36:38,956 Speaker 1: Why would we believe that one kind of institution is 658 00:36:38,996 --> 00:36:43,076 Speaker 1: better at pre-commitment than another if all pre-commitments 659 00:36:43,116 --> 00:36:45,196 Speaker 1: are just based on self-interest? Right? So if the 660 00:36:45,236 --> 00:36:49,116 Speaker 1: people who were holding the bitcoin voting power chose to 661 00:36:49,156 --> 00:36:51,396 Speaker 1: expand the number, and it was in their interest to 662 00:36:51,436 --> 00:36:54,436 Speaker 1: do so, they would do so, much in the same 663 00:36:54,516 --> 00:36:57,076 Speaker 1: way that when a government decides that it's in its 664 00:36:57,116 --> 00:37:00,236 Speaker 1: interests to expand its monetary supply, it has a governance 665 00:37:00,276 --> 00:37:03,756 Speaker 1: mechanism, or multiple complex governance mechanisms, and then it does so. 666 00:37:04,076 --> 00:37:06,516 Speaker 1: And yet one hears from people in the bitcoin 667 00:37:06,596 --> 00:37:09,996 Speaker 1: world things like, well, that can never happen here. We 668 00:37:10,036 --> 00:37:12,676 Speaker 1: can't trust states to be pre-committed, but of course 669 00:37:12,716 --> 00:37:16,716 Speaker 1: we can trust our institutions to be pre-committed. So 670 00:37:17,116 --> 00:37:18,996 Speaker 1: is there anything to that? I mean, I have trouble 671 00:37:19,036 --> 00:37:20,876 Speaker 1: seeing why we would be more trusting of one than 672 00:37:20,916 --> 00:37:25,356 Speaker 1: the other. Sure, so I think the important thing 673 00:37:25,436 --> 00:37:28,156 Speaker 1: is that bitcoin doesn't have a concept of voting power, right, 674 00:37:28,556 --> 00:37:31,316 Speaker 1: like it has miners.
But at the same time, 675 00:37:31,356 --> 00:37:33,956 Speaker 1: if a miner creates a block that violates the current rules, 676 00:37:34,156 --> 00:37:36,036 Speaker 1: like even if ninety percent of all the miners are 677 00:37:36,076 --> 00:37:38,276 Speaker 1: going along with them, the nodes that are run by 678 00:37:38,356 --> 00:37:41,596 Speaker 1: users are still going to reject them. Right. Although 679 00:37:41,636 --> 00:37:44,196 Speaker 1: your whole parable shows that they could just, 680 00:37:44,276 --> 00:37:48,236 Speaker 1: like Hive, fork off and replicate. They could, exactly. So that's 681 00:37:48,236 --> 00:37:51,276 Speaker 1: why it's such genius, right? Yeah, that's very true. But at 682 00:37:51,276 --> 00:37:53,876 Speaker 1: the same time, that action does have a cost, right? 683 00:37:54,156 --> 00:37:57,996 Speaker 1: Like if it were an actual voting system, then 684 00:37:58,396 --> 00:38:00,716 Speaker 1: you know, once the vote happened and fifty one 685 00:38:00,756 --> 00:38:03,476 Speaker 1: percent approved the motion and changed the source code, well 686 00:38:03,756 --> 00:38:06,036 Speaker 1: you're kind of screwed, right? Yeah. Well, or at 687 00:38:06,036 --> 00:38:08,476 Speaker 1: the very least, the shoe is on the other foot, 688 00:38:08,476 --> 00:38:11,716 Speaker 1: and the burden of overcoming the coordination problem, which 689 00:38:11,796 --> 00:38:14,476 Speaker 1: is a really, really high burden, basically falls on 690 00:38:14,516 --> 00:38:17,756 Speaker 1: the side of the dissenters. But in the case of bitcoin, 691 00:38:18,116 --> 00:38:21,636 Speaker 1: the burden of overcoming the coordination problem for anything controversial 692 00:38:21,996 --> 00:38:24,436 Speaker 1: falls on the side of the people that are trying 693 00:38:24,436 --> 00:38:26,836 Speaker 1: to make the change. Right.
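The point about user-run nodes can be sketched in a few lines of Python. This is a toy model, not Bitcoin's actual implementation: the supply-cap rule, the function name `node_accepts`, and the block dictionaries are all illustrative assumptions, chosen only to show that a node's validity check depends on the rules it enforces, not on how many miners back a block.

```python
# Toy model of the claim above: a full node enforces consensus rules
# itself, so even a block backed by ninety percent of miners is
# rejected if it violates those rules.

SUPPLY_CAP = 21_000_000  # illustrative fixed-supply rule

def node_accepts(block: dict) -> bool:
    """A user-run node validates a block against its own rules;
    miner support plays no role in the validity check."""
    return block["total_supply_after"] <= SUPPLY_CAP

honest_block = {"total_supply_after": 18_000_000, "miner_support": 0.10}
inflating_block = {"total_supply_after": 25_000_000, "miner_support": 0.90}

print(node_accepts(honest_block))     # True: within the rules
print(node_accepts(inflating_block))  # False, despite majority miner support
```

Note that `miner_support` is stored but never read by the validation logic; that asymmetry is the whole point of the passage above.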
That's true in a revolution 694 00:38:26,876 --> 00:38:28,676 Speaker 1: as well, right? The burden is on the people who 695 00:38:28,716 --> 00:38:32,516 Speaker 1: want to make the revolution, typically. Right, but there are 696 00:38:32,596 --> 00:38:36,196 Speaker 1: also ways to rapidly change policy in a state that 697 00:38:36,196 --> 00:38:39,036 Speaker 1: don't require revolution. May I just say about that: 698 00:38:39,396 --> 00:38:42,556 Speaker 1: people who study states (I had Francis Fukuyama on 699 00:38:42,556 --> 00:38:44,076 Speaker 1: the show the other day, and this is something that 700 00:38:44,276 --> 00:38:47,396 Speaker 1: he writes about extensively) often think that the capacity to make 701 00:38:47,476 --> 00:38:50,196 Speaker 1: changes legally is actually a crucial part of the anti- 702 00:38:50,276 --> 00:38:53,356 Speaker 1: fragility of states. In other words, it's thought to be 703 00:38:53,636 --> 00:38:55,636 Speaker 1: one of the best ways to avoid the decay of 704 00:38:55,636 --> 00:39:00,276 Speaker 1: a political order, to ensure the possibility of negotiating change 705 00:39:00,276 --> 00:39:01,796 Speaker 1: within the framework of law. And when you can't do 706 00:39:01,796 --> 00:39:04,716 Speaker 1: it anymore, you get rigid, and when you get rigid, 707 00:39:04,716 --> 00:39:07,956 Speaker 1: you tend towards decay. So that seems like a good 708 00:39:07,996 --> 00:39:11,836 Speaker 1: feature rather than a flaw. It's also that states 709 00:39:11,836 --> 00:39:15,356 Speaker 1: and blockchains do very different things, right? Like blockchains 710 00:39:15,396 --> 00:39:20,396 Speaker 1: are software constructions that are interacting with a 711 00:39:20,436 --> 00:39:23,076 Speaker 1: relatively simpler world.
You know, states have to deal with just 712 00:39:23,156 --> 00:39:26,516 Speaker 1: all sorts of problems, and they have to act strategically 713 00:39:26,516 --> 00:39:29,876 Speaker 1: in the face of external threats. So it's possible that 714 00:39:30,236 --> 00:39:34,116 Speaker 1: the correct balance between agility and the 715 00:39:34,756 --> 00:39:37,716 Speaker 1: ability to just kind of credibly stick in one place 716 00:39:37,836 --> 00:39:39,996 Speaker 1: is just different between the two contexts. And that's fine. 717 00:39:40,396 --> 00:39:42,636 Speaker 1: I mean, the question that I want to close with, then, 718 00:39:43,036 --> 00:39:45,156 Speaker 1: is one that really fascinates me, and I see 719 00:39:45,156 --> 00:39:46,716 Speaker 1: it again and again and again as I talk to 720 00:39:46,876 --> 00:39:50,636 Speaker 1: sophisticated people in your blockchain universe or in crypto land, 721 00:39:51,276 --> 00:39:55,076 Speaker 1: and that is, there seems to be this impulse towards 722 00:39:55,356 --> 00:39:58,556 Speaker 1: what I would call almost like a conservative preservationism, a 723 00:39:58,756 --> 00:40:02,436 Speaker 1: worry that the reason we can't trust, say, governments is 724 00:40:02,476 --> 00:40:06,436 Speaker 1: that they do stuff too frequently, you know, too recklessly, 725 00:40:06,916 --> 00:40:10,916 Speaker 1: in ways that facilitate the interests of voting majorities, and that we 726 00:40:10,956 --> 00:40:13,636 Speaker 1: need to draw a line against that. We need to 727 00:40:14,076 --> 00:40:16,116 Speaker 1: have principles and rules.
And in that sense, it sounds 728 00:40:16,116 --> 00:40:18,516 Speaker 1: like libertarianism, the idea that, you know, we have some 729 00:40:18,596 --> 00:40:20,676 Speaker 1: basic fundamental rights and we ought to stick to those, 730 00:40:20,716 --> 00:40:22,396 Speaker 1: and we don't want to live in a world where 731 00:40:22,636 --> 00:40:26,956 Speaker 1: the government can just wish away our rights. And that 732 00:40:26,996 --> 00:40:30,116 Speaker 1: makes sense to me. And here's the punchline. But if 733 00:40:30,156 --> 00:40:33,556 Speaker 1: your account of legitimacy is true, and I believe, you've 734 00:40:33,596 --> 00:40:37,276 Speaker 1: convinced me, that it is, then I don't see how 735 00:40:37,436 --> 00:40:42,676 Speaker 1: decentralization in the blockchain fulfills those libertarian aspirations, because it 736 00:40:42,756 --> 00:40:46,876 Speaker 1: just pushes them up one level, to the level of 737 00:40:46,956 --> 00:40:49,716 Speaker 1: the voting members, the people who structure the governance, and 738 00:40:49,756 --> 00:40:51,356 Speaker 1: they don't literally have to vote; they just have to 739 00:40:51,396 --> 00:40:53,076 Speaker 1: have the capacity to do what you said in the 740 00:40:53,516 --> 00:40:57,516 Speaker 1: Steem-Hive story of just forking off and replicating. And 741 00:40:57,556 --> 00:41:00,396 Speaker 1: so if that's the case, then this world is no 742 00:41:00,516 --> 00:41:04,556 Speaker 1: worse than, but not obviously better than, a world 743 00:41:04,556 --> 00:41:08,836 Speaker 1: of centralized platforms, including governments, when 744 00:41:08,836 --> 00:41:11,636 Speaker 1: it comes to these kinds of pre-commitments to certain 745 00:41:11,676 --> 00:41:17,196 Speaker 1: core principles. I think the main part of my answer 746 00:41:17,236 --> 00:41:20,156 Speaker 1: to that is that I'm a marginalist and not an absolutist.
747 00:41:20,436 --> 00:41:23,436 Speaker 1: Like I think, even if you can't get one 748 00:41:23,516 --> 00:41:27,556 Speaker 1: hundred percent of the way towards satisfying some principle 749 00:41:27,996 --> 00:41:29,996 Speaker 1: or idea or system of values, you could still make 750 00:41:29,996 --> 00:41:32,396 Speaker 1: a huge gain by getting thirty percent or fifty percent 751 00:41:32,436 --> 00:41:35,476 Speaker 1: of the way there. And I do think that 752 00:41:35,756 --> 00:41:39,036 Speaker 1: what I call the laws of physics that exist in 753 00:41:39,076 --> 00:41:42,676 Speaker 1: blockchain land, and the way that the kind 754 00:41:42,676 --> 00:41:46,396 Speaker 1: of coordination problems tilt against someone wanting to 755 00:41:46,436 --> 00:41:49,476 Speaker 1: make radical changes, even though radical changes are still possible 756 00:41:49,556 --> 00:41:52,876 Speaker 1: if they can overcome those coordination problems. Just the fact 757 00:41:52,876 --> 00:41:54,996 Speaker 1: that users run nodes means that you have to 758 00:41:55,276 --> 00:41:57,556 Speaker 1: consult users at all. I do think that those 759 00:41:57,556 --> 00:42:02,476 Speaker 1: things are improvements. I definitely don't claim that blockchains are, 760 00:42:02,836 --> 00:42:06,036 Speaker 1: or can be, one hundred percent immutable systems, no matter 761 00:42:06,036 --> 00:42:09,316 Speaker 1: how hard someone tries. I think, just going 762 00:42:09,356 --> 00:42:12,036 Speaker 1: back a bit to your earlier analogy with 763 00:42:12,116 --> 00:42:14,516 Speaker 1: tech companies and how tech companies 764 00:42:14,516 --> 00:42:17,476 Speaker 1: do end up changing a lot.
I think my answer 765 00:42:17,596 --> 00:42:21,116 Speaker 1: there is just that, if we look 766 00:42:21,156 --> 00:42:24,276 Speaker 1: at the tech universe, there are already things that 767 00:42:24,356 --> 00:42:26,796 Speaker 1: are more immutable than tech companies, right, and that is 768 00:42:26,796 --> 00:42:29,916 Speaker 1: programming languages, right, like Python. You know, it took decades 769 00:42:29,956 --> 00:42:32,796 Speaker 1: to move from Python 2 to Python 3. C++, 770 00:42:32,836 --> 00:42:35,116 Speaker 1: I think, is still backwards compatible with what was there 771 00:42:35,156 --> 00:42:39,676 Speaker 1: like tens of years ago. So there are also things 772 00:42:39,716 --> 00:42:42,676 Speaker 1: that are decentralized in that sense, right, and 773 00:42:42,716 --> 00:42:46,836 Speaker 1: programming languages are a good example. So the way that I 774 00:42:47,116 --> 00:42:49,556 Speaker 1: kind of, I guess, view the way things should be 775 00:42:49,836 --> 00:42:53,036 Speaker 1: is that, intuitively, things that are more general purpose and 776 00:42:53,116 --> 00:42:56,356 Speaker 1: that are more abstract, it's less likely that they are 777 00:42:56,396 --> 00:42:59,076 Speaker 1: going to need to change rapidly, because the goal of 778 00:42:59,116 --> 00:43:01,356 Speaker 1: them is to be general purpose, right? Like with a 779 00:43:01,356 --> 00:43:03,756 Speaker 1: programming language like Python, you can build anything on 780 00:43:03,796 --> 00:43:05,876 Speaker 1: top of it. Like Python did not even have to 781 00:43:05,956 --> 00:43:08,556 Speaker 1: change in response to, say, the emergence of the 782 00:43:08,556 --> 00:43:11,756 Speaker 1: blockchain world. But things that are closer to the application layer, 783 00:43:11,996 --> 00:43:14,716 Speaker 1: they do need to adapt. There's more need for centralization.
784 00:43:15,036 --> 00:43:18,476 Speaker 1: There is potentially need for kind of bounded centralization, where 785 00:43:18,676 --> 00:43:21,836 Speaker 1: you include some decentralized components, but they're there more to keep 786 00:43:21,876 --> 00:43:24,596 Speaker 1: the centralized party honest than to kind of shackle them. 787 00:43:26,556 --> 00:43:28,356 Speaker 1: So things that are closer to the user can be 788 00:43:28,396 --> 00:43:31,676 Speaker 1: more centralized, and that's fine. Ethereum, I guess, I view 789 00:43:31,756 --> 00:43:35,116 Speaker 1: as being a little bit more like a programming 790 00:43:35,156 --> 00:43:37,556 Speaker 1: language, or taking the role of something like a 791 00:43:37,596 --> 00:43:41,036 Speaker 1: programming language, than taking the role of something like Twitter. 792 00:43:41,116 --> 00:43:44,396 Speaker 1: But at the same time, blockchains are their own category, right? 793 00:43:44,476 --> 00:43:46,836 Speaker 1: Like I think, you know, blockchains take some 794 00:43:46,916 --> 00:43:49,636 Speaker 1: of the properties of languages, some of the properties 795 00:43:49,636 --> 00:43:52,756 Speaker 1: of companies, you could say, some of the properties of states, 796 00:43:52,796 --> 00:43:55,716 Speaker 1: but they are not any of 797 00:43:55,756 --> 00:43:59,076 Speaker 1: those things individually, right? They are, I think, this 798 00:43:59,716 --> 00:44:03,236 Speaker 1: new construction that's only really possible now that we have 799 00:44:03,276 --> 00:44:06,796 Speaker 1: all of these advancements in technology and cryptography, 800 00:44:07,236 --> 00:44:10,236 Speaker 1: and there's a lot of opportunity to build interesting 801 00:44:10,276 --> 00:44:13,196 Speaker 1: things on top of them, and the legitimacy that 802 00:44:13,556 --> 00:44:16,396 Speaker 1: blockchains provide by being what I call this kind of 803 00:44:16,396 -->
00:44:19,476 Speaker 1: credibly neutral environment. Like I think there's a lot of 804 00:44:19,476 --> 00:44:23,196 Speaker 1: potential for people to build even centralized services that talk 805 00:44:23,236 --> 00:44:27,036 Speaker 1: to each other, that sit on the same blockchain. There's 806 00:44:27,156 --> 00:44:30,596 Speaker 1: value that I think can come out of those things. 807 00:44:30,876 --> 00:44:34,156 Speaker 1: But at the same time, I'm definitely not a 808 00:44:34,436 --> 00:44:37,996 Speaker 1: proponent of the idea that blockchain-like designs 809 00:44:38,036 --> 00:44:41,196 Speaker 1: are something that's intended to take over every sector of 810 00:44:41,196 --> 00:44:43,236 Speaker 1: how the world operates either. Like I think they're just 811 00:44:43,516 --> 00:44:46,036 Speaker 1: a new thing that we've invented, and we'll see 812 00:44:46,076 --> 00:44:48,716 Speaker 1: how they interact with everything else that we've created in 813 00:44:48,716 --> 00:44:52,236 Speaker 1: the world. Vitalik, you don't need me to thank 814 00:44:52,276 --> 00:44:55,476 Speaker 1: you for the tremendous value that your creativity has created in 815 00:44:55,596 --> 00:44:57,436 Speaker 1: lots of spaces. But what I do want to thank 816 00:44:57,476 --> 00:45:02,116 Speaker 1: you for is writing and talking about the things that 817 00:45:02,196 --> 00:45:04,476 Speaker 1: you do and the worlds that you're creating in such 818 00:45:04,476 --> 00:45:08,076 Speaker 1: an accessible way. I've learned an enormous amount from reading 819 00:45:08,116 --> 00:45:10,836 Speaker 1: your blog posts, your work, and also just for engaging 820 00:45:11,396 --> 00:45:14,956 Speaker 1: in this sort of open-minded, engaged, analogical way. I 821 00:45:14,996 --> 00:45:18,036 Speaker 1: think it's tremendously valuable for those of us who are 822 00:45:18,076 --> 00:45:21,556 Speaker 1: not crypto natives.
And sometime in the future we 823 00:45:21,596 --> 00:45:24,076 Speaker 1: can talk about language. I mean, I think linguistics as 824 00:45:24,076 --> 00:45:26,796 Speaker 1: a backdrop here is fascinating, because sort of what you 825 00:45:26,796 --> 00:45:28,796 Speaker 1: were just saying at the end there, the relationship between 826 00:45:29,476 --> 00:45:32,716 Speaker 1: language and governance. You know, we do all of our 827 00:45:32,756 --> 00:45:37,796 Speaker 1: governance through language, right, but language itself is radically decentralized. 828 00:45:37,956 --> 00:45:40,196 Speaker 1: There are efforts to centralize it; they never go very well. 829 00:45:40,596 --> 00:45:43,956 Speaker 1: You know, there's something very rich there. Yeah, no, 830 00:45:44,076 --> 00:45:47,236 Speaker 1: some do. Yeah, you know, language is definitely fascinating. Like 831 00:45:47,276 --> 00:45:49,636 Speaker 1: I think in some ways that might even be one 832 00:45:49,676 --> 00:45:52,076 Speaker 1: of the kind of closest things to a blockchain in 833 00:45:52,236 --> 00:45:56,156 Speaker 1: a historical context, right? Absolutely, yeah. Open source, 834 00:45:56,356 --> 00:45:59,076 Speaker 1: but with fixed units, with some connection, depending on if 835 00:45:59,076 --> 00:46:01,036 Speaker 1: you're a Chomskyan or not, to some 836 00:46:01,116 --> 00:46:04,996 Speaker 1: underlying biological principles and rules. There's a deep divide within the field 837 00:46:05,036 --> 00:46:07,596 Speaker 1: about whether these are primary, like there's a language unit 838 00:46:07,596 --> 00:46:10,436 Speaker 1: in the brain, or whether they are alternatively socially constructed.
I mean, 839 00:46:10,876 --> 00:46:13,676 Speaker 1: you know, it's totally rich for everything that we were 840 00:46:13,716 --> 00:46:15,356 Speaker 1: talking about, and you should write about it, and we 841 00:46:15,356 --> 00:46:16,916 Speaker 1: should talk about it some other time. I mean, I 842 00:46:16,916 --> 00:46:19,796 Speaker 1: would love to talk. Absolutely, I would too. I learned 843 00:46:19,796 --> 00:46:22,076 Speaker 1: a huge amount from this conversation, so I really want 844 00:46:22,076 --> 00:46:23,836 Speaker 1: to thank you for that. No, thank you very much. 845 00:46:23,836 --> 00:46:33,916 Speaker 1: I learned a lot too. We're in our third season 846 00:46:33,916 --> 00:46:37,516 Speaker 1: of Deep Background, and according to my producer Mo LaBorde, we've 847 00:46:37,516 --> 00:46:40,756 Speaker 1: interviewed more than a hundred people on the show. I 848 00:46:40,876 --> 00:46:42,876 Speaker 1: have to say that although it would be a terrible 849 00:46:42,956 --> 00:46:45,796 Speaker 1: idea to play favorites among them, I was about as 850 00:46:45,836 --> 00:46:49,476 Speaker 1: excited by my conversation with Vitalik as I have been 851 00:46:49,716 --> 00:46:52,556 Speaker 1: at any time in all of the interviews that I've done.
852 00:46:53,156 --> 00:46:56,716 Speaker 1: What excited me was to see Vitalik's mind at work, 853 00:46:56,996 --> 00:46:59,796 Speaker 1: to see its speed and its creativity, and to have 854 00:46:59,836 --> 00:47:02,476 Speaker 1: the opportunity to play a little bit with him in 855 00:47:02,596 --> 00:47:05,836 Speaker 1: banging around ideas that have been the focus of much 856 00:47:05,876 --> 00:47:09,236 Speaker 1: of my own work and that I developed in a 857 00:47:09,276 --> 00:47:13,916 Speaker 1: context completely unrelated to crypto and the blockchain, and to 858 00:47:14,036 --> 00:47:17,676 Speaker 1: see that in some potential way, these ideas of legitimacy 859 00:47:17,836 --> 00:47:20,876 Speaker 1: may in fact be central to the way that this 860 00:47:20,916 --> 00:47:26,196 Speaker 1: new world is developing. I was deeply struck by Vitalik's 861 00:47:26,436 --> 00:47:31,076 Speaker 1: parable of Steem and Hive and his core takeaway, namely 862 00:47:31,116 --> 00:47:34,836 Speaker 1: that on the blockchain and in the crypto space, everything 863 00:47:34,876 --> 00:47:40,476 Speaker 1: depends on the acquiescence of the participants in the collective undertaking, 864 00:47:40,676 --> 00:47:42,956 Speaker 1: and if they get up and decide to change it, 865 00:47:43,236 --> 00:47:47,316 Speaker 1: eliminate it, or imitate it and recreate it, they can 866 00:47:47,516 --> 00:47:51,836 Speaker 1: do that. This parable opens up some deep and fundamental 867 00:47:51,916 --> 00:47:56,156 Speaker 1: questions about what crypto and blockchain are good for and 868 00:47:56,196 --> 00:47:58,556 Speaker 1: what they might or might not be able to contribute 869 00:47:58,596 --> 00:48:01,916 Speaker 1: to the broader world.
I myself am only beginning to 870 00:48:01,956 --> 00:48:05,196 Speaker 1: develop my thoughts on this topic, but listening to Vitalik 871 00:48:05,556 --> 00:48:09,116 Speaker 1: made me suspect that the benefits that people describe 872 00:48:09,556 --> 00:48:12,436 Speaker 1: as the benefits of the blockchain have a lot to 873 00:48:12,476 --> 00:48:16,236 Speaker 1: do with the benefits of decentralized governance that we associate 874 00:48:16,276 --> 00:48:21,116 Speaker 1: with democracy, as opposed to the historically significant benefits of 875 00:48:21,156 --> 00:48:24,996 Speaker 1: autocracy or monarchy, which are more closely associated in the 876 00:48:24,996 --> 00:48:30,516 Speaker 1: blockchain world with the dominance of central, powerful platforms controlled 877 00:48:30,636 --> 00:48:34,116 Speaker 1: by a single corporation and often headed by a single 878 00:48:34,196 --> 00:48:38,556 Speaker 1: powerful founder CEO. If this analogy turns out to be 879 00:48:38,636 --> 00:48:41,716 Speaker 1: valuable and useful, and Vitalik seemed to think that perhaps 880 00:48:41,796 --> 00:48:44,556 Speaker 1: it could be, it will help us to see that 881 00:48:44,836 --> 00:48:48,556 Speaker 1: some of the advantages of the distributed blockchain might be 882 00:48:48,796 --> 00:48:52,476 Speaker 1: real and significant, because there are some kinds of decision 883 00:48:52,476 --> 00:48:55,916 Speaker 1: making and some kinds of development of businesses and opportunities 884 00:48:56,076 --> 00:49:00,796 Speaker 1: that might work better in decentralized ways. Yet simultaneously, there 885 00:49:00,836 --> 00:49:05,396 Speaker 1: may also be circumstances where centralized governance is preferable, even 886 00:49:05,436 --> 00:49:09,836 Speaker 1: to the users, to decentralized governance.
And what's more, there 887 00:49:09,836 --> 00:49:13,396 Speaker 1: could be hybrid models, much like modern governments, where there 888 00:49:13,476 --> 00:49:16,796 Speaker 1: is a central national government, but that government then enables 889 00:49:16,836 --> 00:49:20,996 Speaker 1: people to create all kinds of different sorts of applications, 890 00:49:20,996 --> 00:49:24,836 Speaker 1: like private companies, against the backdrop of the state's ordinary 891 00:49:24,916 --> 00:49:30,876 Speaker 1: centralized operations. Most fundamentally, however, if it's the case, as 892 00:49:30,956 --> 00:49:34,516 Speaker 1: Vitalik says, that legitimacy is the key concept on the blockchain, 893 00:49:34,796 --> 00:49:37,316 Speaker 1: then we always have to keep our eye on what 894 00:49:37,396 --> 00:49:43,276 Speaker 1: happens when legitimacy erodes, is recreated, and is reshaped, events 895 00:49:43,316 --> 00:49:47,116 Speaker 1: that occur in governments, sometimes through gradual changes and 896 00:49:47,236 --> 00:49:52,276 Speaker 1: also through revolutionary changes. Those revolutions may be rare, but 897 00:49:52,316 --> 00:49:55,636 Speaker 1: when they happen, they move around a lot of wealth, 898 00:49:55,956 --> 00:49:59,036 Speaker 1: they unsettle a lot of settled expectations, and they can 899 00:49:59,116 --> 00:50:04,996 Speaker 1: lead to tremendous improvement as well as tremendous disorder.
It's 900 00:50:05,036 --> 00:50:09,196 Speaker 1: far too soon to draw conclusions about any of these subjects, 901 00:50:09,196 --> 00:50:11,596 Speaker 1: partly because in the world of governments, where people have 902 00:50:11,676 --> 00:50:14,876 Speaker 1: been talking about concepts like legitimacy for at least twenty 903 00:50:14,876 --> 00:50:19,076 Speaker 1: five hundred years, we still don't have definitive answers. But 904 00:50:19,156 --> 00:50:21,796 Speaker 1: the idea that there may be some points of contact 905 00:50:21,996 --> 00:50:24,956 Speaker 1: between the worlds that we have known and understand and 906 00:50:25,036 --> 00:50:27,316 Speaker 1: the new worlds that are being created by people like 907 00:50:27,396 --> 00:50:33,356 Speaker 1: Vitalik is, to me, hopeful, interesting, and potentially of 908 00:50:33,396 --> 00:50:36,756 Speaker 1: tremendous value in trying to figure out the world around us. 909 00:50:37,956 --> 00:50:40,276 Speaker 1: I hope you had as much fun in this conversation 910 00:50:40,436 --> 00:50:43,436 Speaker 1: as I did. Until the next time I speak to you, 911 00:50:43,956 --> 00:50:48,516 Speaker 1: be well, think deep thoughts like Vitalik does, and have 912 00:50:48,636 --> 00:50:52,036 Speaker 1: a little fun. Deep Background is brought to you by 913 00:50:52,076 --> 00:50:56,156 Speaker 1: Pushkin Industries. Our producer is Mo LaBorde, our engineer is 914 00:50:56,196 --> 00:51:00,436 Speaker 1: Ben Tolliday, and our showrunner is Sophie Crane McKibben. Editorial 915 00:51:00,476 --> 00:51:05,036 Speaker 1: support from Noam Osband. Theme music by Luis Guerra at Pushkin. 916 00:51:05,196 --> 00:51:08,956 Speaker 1: Thanks to Mia Lobell, Julia Barton, Lidia Jean Kott, Heather Fain, 917 00:51:09,316 --> 00:51:14,116 Speaker 1: Carly Migliori, Maggie Taylor, Eric Sandler, and Jacob Weisberg. You 918 00:51:14,116 --> 00:51:16,676 Speaker 1: can find me on Twitter at Noah R. Feldman.
I 919 00:51:16,756 --> 00:51:19,156 Speaker 1: also write a column for Bloomberg Opinion, which you can 920 00:51:19,196 --> 00:51:23,236 Speaker 1: find at Bloomberg dot com slash Feldman. To discover Bloomberg's 921 00:51:23,276 --> 00:51:27,276 Speaker 1: original slate of podcasts, go to Bloomberg dot com slash podcasts, 922 00:51:27,596 --> 00:51:30,076 Speaker 1: and if you liked what you heard today, please write 923 00:51:30,116 --> 00:51:33,836 Speaker 1: a review or tell a friend. This is Deep Background.