Speaker 1: Pushkin. There are these moments when people make huge technical advances, and it happens all of a sudden, or at least it feels like it happens all of a sudden. You know, this is happening right now, most obviously with AI, with artificial intelligence. It happened not too long ago with rockets, when SpaceX dramatically lowered the cost of getting to space. Maybe it's happening now with crypto; I'd say it's probably too soon to say on that one. In any case, you can look at technological breakthroughs in different fields and at different times, and you can ask, what can we learn from these? You can ask, can we abstract certain qualities, certain tendencies, that seem to drive toward these bursts of technological progress? There's a recent book called Boom that asks this question, and it comes up with an interesting answer. According to the book, one thing that's really helpful if you want to make a wild technological leap is a bubble.

I'm Jacob Goldstein, and this is What's Your Problem. My guest today is Byrne Hobart. He's the author of a finance newsletter called The Diff, and he's also the co-author of a book called Boom: Bubbles and the End of Stagnation. When Byrne talks about bubbles, he isn't just talking about financial bubbles, where investors drive prices through the roof. When he says bubble, largely he means social bubbles, filter bubbles: little groups of people who share some wild belief. He really gets at what he means in this one sentence, where he and his co-author write, quote, "transformative progress arises from small groups with a unified vision, vast funding, and surprisingly poor accountability," end quote. Basically, living the dream.

Later in the conversation, Byrne and I discussed the modern space industry and cryptocurrency and AI. But to start, we talked about two case studies from the US in the twentieth century. Byrne writes about them in the book, and he argues that these two moments hold broader lessons for how technological progress works.
Speaker 1: The two case studies we talk about are the Manhattan Project and the Apollo missions. So let's start with the Manhattan Project, and maybe one place to start is with this famous nineteen thirty-nine letter from Albert Einstein and other scientists to FDR, the president, about the possibility of the Nazis building an atomic bomb.

Speaker 2: Right. So that letter, it feels like good material for maybe not a complete musical comedy, but at least an act of a musical comedy, because it's kind of...

Speaker 1: A "Springtime for Hitler" in Germany.

Speaker 2: There is this whole thing where you have this brilliant physicist, but he is just kind of the stereotypical professor, you know, crazy hair, very absent-minded, always talking about these things where no one, no normal person, can really understand what he's talking about. And suddenly, instead of talking about space-time and, you know, energy and matter and their relationship, suddenly he's saying someone could build a really, really big bomb, and that someone will probably be a German, and that has some very bad implications for the rest of the world.

Speaker 1: So now here we are: the president decides, okay, we need to build a bomb, we need to spend a wild amount of money on it. And this is a thing that you describe as a bubble, which is interesting, right, because it's not a bubble in the sense of market prices. It's the federal government and the military. But it has these other bubble-like characteristics in your telling, right, maybe other meanings of bubble, the way we talk about a social bubble or a filter bubble. Tell me about that. Like, why is the Manhattan Project a kind of bubble?
Speaker 2: Why is it a bubble? Because there's that feedback loop: people take the idea of actually building the bomb more seriously as other people take it more seriously, and the more you have people like Oppenheimer actually dedicating time to the project, the more other people think the project will actually happen, that this is actually worth doing. So you have this group of people who start taking the idea of building a bomb more seriously. They treat it as a thing that will actually happen, rather than a thing that is hypothetically possible if this particular equation is right, if these measurements are right, et cetera. And then they start actually designing it.

Speaker 1: The Manhattan Project seems like this sort of point that gets a bunch of really smart people to coalesce in one place, on one project, at one time, right? It sort of solves the coordination problem, the way, whatever you might say, AI today is doing that. Like, just brilliant people suddenly are all in one place working on the same thing in a way that they absolutely would not otherwise be.

Speaker 2: That is true. And this was both within the US academic community and then within the global academic community, because you had a lot of people who were in Central Europe or Eastern Europe who realized that that is just not a great place for them to be and tried to get to the UK, the US, or other Allied countries as quickly as possible. And so there was just this massive intellectual dividend: a lot of the most brilliant people in Germany and in Eastern Europe and in Hungary, et cetera, they were all fleeing and all ended up in the same country.
Speaker 2: So, yeah, you have just this serendipity machine where, if you were a physicist, it was an incredible place for overhearing really novel ideas and putting those ideas, putting your own ideas, to the test, because you had all the smartest people in the world, pretty much, in this one little town in New Mexico.

Speaker 1: Right. So the Los Alamos piece is the famous part, you know, it's the part one has heard of with respect to the Manhattan Project. There's a less famous part that's really interesting, and that also seems to hold some broader lessons as well, right? And that is, basically, the manufacturing part: how do we enrich enough uranium to build the bomb, if the physicists figure out how to design it? Talk about that piece and the lessons there.

Speaker 2: Yes, so that one, you're right, it is often underemphasized in the history. It was more of an engineering project than a research project, though there was a lot of research involved. The purpose was: get enough enriched uranium, the isotope that is actually prone to these chain reactions, get it isolated, and then be able to incorporate that into a bomb. They were also working on other fissile materials, because there were multiple plausible bomb designs. Some used different triggering mechanisms, some used different materials, and there were also multiple plausible ways to enrich enough of the fissile material to actually build a bomb. And so one version of the story is, you just go down the list and you pick the one that you think is the most cost-effective, most likely, and so we choose one way to get U-235 and we have one way to build the bomb.

Speaker 1: U-235 is the enriched uranium. Yes. And that, by the way, is the way normal businesses do things in normal times. You're like, well, we got to do this really expensive thing.
Speaker 1: We got to build a factory, and we don't even know if it's going to work. Let's choose the version that's most likely to work. Like, that is the kind of standard move, right? And then the problem, though, is that if you try that and you just got unlucky, you picked the wrong bomb design and the right fissile material, or the right material and the wrong bomb design, you've done a lot of work which has zero payoff, and you've lost time. Right? Like, crucially, there is a huge sense of urgency present at this moment that is driving the whole thing, really, right?

Speaker 2: We could also do more than one of them in parallel, and that is what we did. And on the manufacturing side that was actually just murderously expensive. If you are building a factory and you build the wrong kind of factory, then you've wasted a lot of money and effort and time. So they just did more than one. They did several different processes for enriching uranium and for producing plutonium.

Speaker 1: All at the same time, right? And they knew they weren't going to use all of them; they just didn't know which one was going to work. So it's like, well, let's try all of them at the same time and hopefully one of them will work. Yes. Like, that is super bubbly, right? That is wild and expensive. That is just throwing wild amounts of money at something in a great amount of haste.

Speaker 2: Yes. Yeah. And so if you believe that there's this pretty linear payoff, then every additional investment you make, you know, it doesn't qualitatively change things. It just means you're doing a little bit more of it.
Speaker 2: But if you believe there's some kind of nonlinear payoff, where either this facility basically doesn't work at all or it works really, really well, then when you diversify a little bit, you do actually get this better risk-adjusted return, even though you're objectively taking more risks.

Speaker 1: Interesting. Right, so in this instance, it's: if the Nazis have the bomb before we do, it's the end of the world as we know it, and so we better take a lot of risk. And that's actually rational. It reminds me a little bit of aspects of Operation Warp Speed. I remember talking to Susan Athey, a Stanford economist, early in the pandemic, who was making the case to do exactly this with vaccine manufacturing in, like, you know, early twenty twenty. We didn't know if any vaccine was going to work, and it takes a long time to build a factory to make a vaccine, basically a tailor-made factory. And she was like, just make a bunch of factories to make vaccines, because if one of them works, we want to be able to start making it that day. Like, that seems quite similar to this, in your telling.

Speaker 2: Yeah, yeah, I think that's absolutely true, that, you know, the higher the stakes are, the more you want to be running everything that can plausibly help in parallel. And depending on the exact nature of what you're doing, there can be some spillover effects. You know, it's possible that you build a factory for manufacturing vaccine A, and vaccine A doesn't work out, but you can retrofit that factory and start doing vaccine B. And, you know, there are little ways to shuffle things around a bit. But you often want to go into this basically telling yourself: if we didn't waste money and we still got a good outcome, it's because we got very, very lucky, and we only know we're being serious if we did, in fact, waste a lot of money.
Speaker 2: And yeah, I think that kind of inverting your view of risk is often a really good way to think about these big transformative changes. And this is actually another case where the financial metaphors do give useful information about real-world behaviors, because at hedge funds this is actually something that risk teams will sometimes tell portfolio managers: you are making money on too high a percentage of your trades. This means that you are not making all the trades that you could, and if you took your hit rate from fifty-five percent down to fifty-three percent, we'd be able to allocate more capital to you, even though you'd be annoyed that you were losing money on more trades.

Speaker 1: Interesting. Because overall you would likely have a more profitable outcome by taking bigger risks and incurring a few more losses, but the wins would be bigger and make up for the losses.

Speaker 2: Yes. And this kind of thing, you know, it's very easy if you're the one sitting behind the desk just talking about these relative trade-offs. It's a lot harder if you are the first person working with uranium in the factory and we don't quite know what the risks of that are. But it is just a generally true thing about trade-offs and risk, that there is an optimal amount of risk to take, and that optimal amount is sometimes dependent on what the downside risk of inaction is. And so sometimes, if you're too successful, you realize that you are actually messing something up.

Speaker 1: Yeah, you're not taking enough risk.
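A rough way to see the arithmetic behind both points above, the parallel bets with nonlinear payoffs and the hedge-fund hit-rate trade-off, is the minimal Python sketch below. The probabilities, trade counts, and payoffs are invented for illustration; they are not figures from the conversation or from the book.

# Illustrative sketch only: the numbers below are made up, not taken from the conversation or from Boom.

def chance_at_least_one_works(success_probs):
    """Probability that at least one of several independent parallel approaches succeeds."""
    p_all_fail = 1.0
    for p in success_probs:
        p_all_fail *= (1.0 - p)
    return 1.0 - p_all_fail

# Three parallel approaches, each more likely to fail than to succeed on its own.
paths = [0.4, 0.35, 0.3]
print(chance_at_least_one_works(paths[:1]))  # 0.4   -- "pick the single best bet"
print(chance_at_least_one_works(paths))      # ~0.73 -- fund all of them in parallel

def expected_profit(n_trades, hit_rate, avg_win, avg_loss):
    """Expected total profit if each trade wins avg_win with probability hit_rate, else loses avg_loss."""
    return n_trades * (hit_rate * avg_win - (1.0 - hit_rate) * avg_loss)

# The hedge-fund point: a 55% hit rate on 100 trades can be beaten by a 53% hit
# rate on 200 trades, even though the second book loses money more often.
print(expected_profit(100, 0.55, avg_win=1.0, avg_loss=1.0))  # ~10
print(expected_profit(200, 0.53, avg_win=1.0, avg_loss=1.0))  # ~12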
Speaker 1: So we all know how the Manhattan Project ends. It worked. I mean, it is a little bit of a weird one to start with. You know, the basic idea is, like, technological progress is good, risks are good, and we're talking about building the atomic bomb and dropping it on two cities. And it's, you know, it's morally a much easier question if you think it's the Nazis. Sorry, but the Nazis are absolutely the worst, and I definitely don't want them to have a bomb first. You know, there is the argument that more people would have died in a conventional invasion without the bomb. I don't know. I mean, what do you make of it? Like, obviously the book is very pro technological progress, this show is basically pro technological progress. But, like, the bomb isn't a happy one to start with. What do you make of it, ultimately?

Speaker 2: Yeah, it's one of those things where it does make me wish that we could run the same simulation, you know, a couple of million times and see what the net lost and saved lives are in different scenarios. But one thing about the bomb, I guess from, like, a purely utilitarian standpoint: I suspect that there have been net lives saved because of less use of coal for electricity generation and more use of nuclear power. And that is directly downstream of the bomb. Once you can build these, you know, by-design uncontrollable releases of atomic energy, you can also build more controllable ones, and otherwise getting the funding for that would have been a lot harder.

Speaker 1: And presumably we got nuclear power much sooner than we otherwise would have because of the incredibly rapid progress of the Manhattan Project. Is that fair?

Speaker 2: Yes, that's fair. Which, I don't know, if you let me push the button where I drop an atomic bomb on a civilian population in exchange for fewer people dying of respiratory diseases over the next couple of decades, you know, I would have to give it a lot of thought.

Speaker 1: I'm not going to push that button, but I'm never going to have a job where I have to decide, because I can't deal.
Speaker 1: Okay. Then, you mentioned something in the book, kind of in passing, that was really interesting and surprising to me, which was that nuclear power today accounts for eighteen percent of electric power generation in the US. Eighteen percent. Like, that is so much higher than I would have thought, given sort of how little you hear about existing nuclear power plants, right? Like, that is a lot.

Speaker 2: Yeah, yeah, it is. It is a surprisingly high number. But also, nuclear power is one of the most annoying technologies to talk about, in the sense that it doesn't do anything really, really exciting other than provide essentially unlimited power with minimal risk.

Speaker 1: And some amount of scary tail risk, right? Like, I mean, that is what is actually interesting to talk about, sort of unfortunately for the world, given that it has a lot of benefits. There is this tail risk, and once in a while something goes horribly wrong, even though on the whole it seems to be clearly less risky than, say, a coal-fired power plant.

Speaker 2: Right. And the industry is aware of those risks, and nobody wants to be responsible for that kind of thing, and nobody wants to be testifying before Congress about ever having cut any corner whatsoever in the event that a disaster happens. So they do actually take that incredibly seriously. So nuclear power does end up being, in practice, much safer than other power sources. And then you add in the externality that it doesn't really produce emissions, and uranium exists in some quantities just about everywhere.

Speaker 1: No climate change, no local air pollution. It has a lot going for it. Always on. Okay, let's go to the moon. So you write also about the Apollo missions, us going to the moon. It's the early sixties, right? Was it sixty-one? Kennedy says we're going to go to the moon by the end of the decade. There's the Cold War context, yes. Kennedy announces this goal.
Speaker 1: What's the response in the US when Kennedy says this?

Speaker 2: Yeah, so a lot of the response, you know, at first people are somewhat hypothetically excited. As they start realizing how much it will cost, they go from not especially excited to actually pretty deeply opposed. And, you know, this shows up in, there was, someone coined the term moondoggle.

Speaker 1: Yeah, moondoggle. I loved moondoggle; I learned that from the book. It was Norbert Wiener, like, a famous technologist, not a crank, right, somebody who knew what he was talking about, who was like, this is a crazy idea. It's a moondoggle.

Speaker 2: Right. And, you know, this really works its way into popular culture. Like, if you go on Spotify and listen to the Tom Lehrer song "Wernher von Braun," the recording that Spotify has opens with a monologue that is talking about how stupid the idea of the Apollo program is. And, you know, this is again someone who is in academia, who's a very, very sharp guy, and who just feels like he completely sees through this political giveaway program to big defense contractors and knows that there's no point in doing this.

Speaker 1: You write that NASA's own analysis found a ninety percent chance of failing to reach the moon by the end of the decade. Like, it wasn't just outside people being critical; it was NASA itself that didn't think...

Speaker 2: It would work.

Speaker 1: There's a phrase you use in the book to talk about these sort of bubble-like environments that are of interest to you, and I found it really interesting, and I think we can talk about it in the context of Apollo. That phrase is "definite optimism." Tell me about that phrase.
Speaker 2: Yes. So definite optimism is the view that the future can and will be better in some very specific way, that there is something we cannot do now that we will be able to do in the future, and it will be good that we can do it.

Speaker 1: And why is it important? Like, it's a big deal in your telling, in an interesting way. Why is it so important?

Speaker 2: It's important because that is what allows you to actually marshal those resources, whether those are the people or the capital or the political pull, to put them all in some specific direction and say, we're going to build this thing, so we need to actually go step by step and figure out, okay, what specific things have to be done, what discoveries have to be made, what laws have to be passed in order for this to happen. And so it's definite optimism in the sense that you're saying there is a specific thing we're going to build. It's the kind of thing that can keep you going when you encounter temporary setbacks, and that's where the optimism part comes in. Because if you have a less definitely optimistic view about that project, you might say the goal of the Apollo program is to figure out if we can put a person on the moon. But I think what that leaves you open to is the temptation to give up at any point, because at any point you can have, you know, a botched launch or an accident, or you're designing some component and the math just doesn't pencil out: you know it's going to weigh too much to actually make it onto the craft. And you could say, okay, well, that's how we figured out that we're not actually doing this. But if you do just have this kind of delusional view that, no, if there's a mistake, it's a mistake in my analysis, not in the ultimate plan here, and that it is physically possible, we just have to figure out all the details.
Speaker 2: Then I think that does set up a different kind of motivation, because at that point you can view every mistake as just exhausting the set of possibilities and letting you narrow things down to what is the correct approach. What you sort of needed was this very localized definite optimism, where you could imagine a researcher or an engineer or someone throughout the project thinking to themselves: okay, this will probably not work overall, but the specific thing I'm working on, whether it is designing a spacesuit or designing this rocket or programming the guidance computer, that one, I can tell that my part is actually going to work, or at least I believe that I can make it work. And two, this is my only chance to work with these really cool toys, so if the money is going to be wasted at some point, let that money be wasted on me. And I think that kind of attitude, of, you know, you have one shot to actually do something really interesting and you will not get a second chance, if everyone believes that, it does become a coordinating mechanism, where now they're working extremely hard, they all recognize that the success of what they are doing is very much up to them, and then that ends up contributing to this group success.

Speaker 1: So it's like: if I'm going to do this, I got to do it now. Everybody's doing it now, we got the money now, this is our one shot. We better get it right; we better do everything we can to make it work.

Speaker 2: Yes, fear of missing out.

Speaker 1: Yeah, FOMO, right. So FOMO, it's funny, people talk about that as, like, a dumb investment thesis, basically, right? It's like a meme-stock idea. But you talk about it in these more interesting contexts, basically, right? More meaningful, I would say, yes, yeah.
Speaker 2: So, in the purely straightforward way, the idea is that there are sometimes these very time-limited opportunities to do something, and if you're capable of doing that thing, this may be your only chance, and so missing out is actually something you should be afraid of. So, you know, if you actually have a really clever idea for an AI company, this is actually a time when you can at least attempt it. So yeah, we do argue that missing out is something you should absolutely fear.

Speaker 1: So what happens with the Apollo project? Just even briefly, like, talk about just how big it is and how risky it is. Like, it's striking, right?

Speaker 2: Right. Yeah, so it is. The expenses were running at, like, a low single-digit percentage of GDP for a while.

Speaker 1: So a couple percent of the value of everything everybody in the country does is going into the Apollo mission. Just this one plainly unnecessary thing that the government has decided to do.

Speaker 2: Right. And this is one of the cases where there were very powerful spillover effects, because the Apollo Guidance Computer needed the most lightweight and least power-consuming and most reliable components possible. And if you were building a computer conventionally at that time and you had a budget, you would probably build it out of vacuum tubes. And you knew that the vacuum tubes, they're bulky, they consume a lot of power, they throw off a lot of heat, they burn out all the time, but they are fairly cheap. But in this case, there was an alternative technology. It was extremely expensive, but it was lightweight, didn't use a lot of power, and did not have moving parts. And that's the integrated circuit, so transistor-based computing.

Speaker 1: The chip, what we know today as the chip.

Speaker 2: Yes, the chip.

Speaker 1: You write that in nineteen sixty-three, NASA bought sixty percent of the chips made in the United States. Just NASA, not the whole government, just NASA. Sixty percent.
Speaker 2: They actually bought more chips than they needed, because they recognized that the chip companies were run by, you know, very, very nice electrical engineering nerds who just love designing tiny, tiny things, and that these people just don't know how to run a business. And so they were worried that Fairchild Semiconductor would just run out of cash at some point, and then NASA would have half of a computer and no way to build the rest of it. So they actually over-ordered, and they used integrated circuits for a few applications that actually were not so dependent on the power consumption and weight and things. So that critique of the Apollo program was directionally correct: it was money being splashed out to defense contractors who were favored by the government. But in this case it was being done in a more strategic and thoughtful way, and it kind of kept the industry going.

Speaker 1: So you talk a fair bit in the book about the sort of religious and quasi-religious aspects of these little groups of people that come together in these bubble-like moments to do these big things, and that's really present in the Apollo section. Like, talk about the sort of religious ideas associated with the Apollo mission that the people working on the mission had.

Speaker 2: Yeah, I mean, you name it after a Greek god and you're already starting a little bit religious. So there were people who worked on these missions who felt like it is part of mankind's destiny to explore the stars, and that there's this whole universe created by God, and it would be kind of weird, you know, we can't second-guess the divine, but it's a little weird for God to create all of these astronomical bodies that just kind of look good from the ground and that you're not actually meant to go visit.
Speaker 1: You talk about somewhat similar things in other, kind of, less obviously spiritual dimensions of people coming together and having it be kind of more than rational. You use this word thymos, from the Greek, meaning spirit. Like, what's going on there, more broadly? Why is that important, more generally, for technological progress?

Speaker 2: Because thymos is part of this tripartite model of the soul, where you have your appetites and your reason and then your thymos, like your longing for glory and honor and this kind of transcendent achievement. And logos, reasoning, it only gets you so far. You can reason your way into some pretty interesting things, but at some point you do decide that the reasonable thing is probably to take it a little bit easier and not take certain risks. And it still is just this pursuit of something greater, you know, something beyond the ordinary, something really beyond the logos, right, like beyond what you could get to just by reasoning one step at a time. And I think that is just a deeply attractive proposition to many people. And it's also a scary one, because at that point, you know, if you're doing things that are beyond what is the rational thing to do, then of course you have no rational explanation for what you did wrong if you mess up. Yeah, and you are sort of betting on some historical contingencies.

Speaker 1: That's the definite optimism part, right? Betting on historical contingencies is another way of saying definite optimism. Right. So, back to the moon. So we get to the moon, in fact. Against all odds, we make it. And there's this moment where it's like, you know, today the moon, tomorrow the solar system. But in fact it was: today the moon, tomorrow not even the moon. Right? Like, what happened?
Speaker 2: Well, you know, you had asked about what these megaprojects have in common with financial bubbles, and one of the things they have in common is that sometimes there's a bust, and sometimes that bust is actually an overreaction in the opposite direction. And people take everything they believed in, say, nineteen sixty-nine about humanity's future in the stars, and they say, okay, this is exactly the opposite of where things will actually go and the exact opposite of what we should care about: we have plenty of problems here on Earth, and why would we, you know, do we really want to turn Mars into just another planet that also has problems of racism and poverty and nuclear war and all that stuff? So maybe we should stay home and fix our stuff. Then, in public policy, you'd actually need for there to be some kind of resurgence in belief in space. You need some kind of charismatic story, and perhaps, to an extent, we have that right now. Yes, maybe Elon's not the perfect front man for all of this, but he is certainly someone who demonstrates that space travel can be done, it can be improved, and that it's just objectively cool. That it is just hard to watch a SpaceX launch video and not feel something.

Speaker 1: Yes, so, so good. I want to talk more about space in a minute. So, it's interesting: these two stories that are kind of in the middle of your book, they're kind of the core of the book, right, these two interesting moments that are non-financial bubbles, when you have this incredible technological innovation in a short amount of time, an unrealistically, you know, fast, impressive outcome. And they're both pure government projects. They're both, you know, command-and-control economy. It is not the private sector, it is not capitalism. What do you make of that?

Speaker 2: I would say there's a very strong indirect link, for a couple of reasons.
Speaker 2: One is just the practical, kind of practical-enough reason that personnel is policy, and that in the nineteen thirties the US government was hiring and the private sector mostly wasn't, and so basically all the ambitious people in the country tried to get government jobs. And that is usually not the case, and there are certainly circumstances where that's a really bad sign, but in this case it was great. It meant that there were a lot of New Deal projects that were staffed by the people who would have been rising up the ranks at RCA or General Electric or something a decade earlier. Now they're running New Deal projects instead, and they're again rising up the ranks really fast, having a very large real-world impact early in their careers. And those people had been working together for a while, and, you know, they knew each other. There was a lot of just institutional knowledge about how to get big things done within the US government, and a lot of that institutional knowledge could then be redirected. So you have the New Deal and then the war effort, and then you have the postwar economy, where it still, you know, takes a while for the government to fully relax its control, and then very soon we're into the Korean War. So yeah, there was just a large increase in state capacity and just in the quality of people making decisions within the US government in that period.

Speaker 1: We'll be back in a minute to talk about bubble-esque things happening right now, namely rockets, cryptocurrency, and AI.

Okay, now to space today. Byrne and I talked about SpaceX in particular because, you know, it really is the company that launched the modern space industry, and there's this one key trait that SpaceX shares with the other projects Byrne wrote about in the book: it brought together people who share a wild dream.
Speaker 1: If you go to work at SpaceX, it's probably because you believe in getting humanity to Mars.

Speaker 2: Yeah, it's not just that you believe in the dream. But when you get the job, you're suddenly in an environment where everyone believes in the dream. And if you're working at one of those organizations, you're probably not working nine to five, which means you have very few hours in your day or week where you are not completely surrounded by people who believe that people will be living on Mars and that this is the organization that will make it happen. And that just has to really mess with your mind. Like, what is normal to an engineer working at SpaceX in two thousand and six is completely abnormal to ninety-nine point nine percent of the human population. And, you know, most of the exceptions are, like, six-year-old boys who just watched Star Wars for the first time.

Speaker 1: Mars is crazy. Yeah, I mean, really, as I went through the book, I was like, oh, really, the bubble you're talking about is a social bubble, like, the meaningful bubble. Like, maybe there's a financial bubble attached, maybe there isn't. But what really matters is you're in this weird little social bubble that believes some wild thing together, that believes it is not wild, that believes it is going to happen. Like, that's the thing. Yeah. And has the money to act on their wild belief, yes. And so, you know, getting the money does mean interacting with the normie sphere, interacting with people who don't quite buy into all of it.

Speaker 2: But when you have these really ambitious plans and you're taking them seriously, you're doing them step by step, and some of those steps do have other practical applications. And so that is the space story. It was not just a straight shot of, we are going to invest all the money Elon got from PayPal into going to Mars and hopefully we get to Mars before we run out.
Yeah, it was, you know: we're 614 00:31:46,036 --> 00:31:49,036 Speaker 2: going to build these prototypes, we're going to build reusable rockets. 615 00:31:49,036 --> 00:31:52,516 Speaker 2: We're going to use those for existing use cases, and 616 00:31:52,556 --> 00:31:54,996 Speaker 2: we will probably find new use cases. And then once 617 00:31:55,036 --> 00:31:57,356 Speaker 2: we get really really good at launching things cheaply, well, 618 00:31:57,476 --> 00:31:59,796 Speaker 2: there are a lot of satellites out there, and perhaps 619 00:31:59,836 --> 00:32:01,796 Speaker 2: we should have some of our own, and if we can 620 00:32:01,796 --> 00:32:03,756 Speaker 2: do it at sufficient scale, then maybe we can just 621 00:32:03,956 --> 00:32:06,436 Speaker 2: throw a global communications network up there in the sky 622 00:32:06,556 --> 00:32:09,596 Speaker 2: and see what happens next. So yeah, those are, you know, 623 00:32:09,636 --> 00:32:13,796 Speaker 2: the intermediate steps. Each one is basically taking the, 624 00:32:13,836 --> 00:32:16,436 Speaker 2: like, the spiritual, you know, here's our grand vision of 625 00:32:16,516 --> 00:32:19,076 Speaker 2: the future, and you know, here's my destiny and I 626 00:32:19,116 --> 00:32:20,836 Speaker 2: was put on earth to do this, and saying okay, well, 627 00:32:21,156 --> 00:32:23,236 Speaker 2: the next step is to have enough money to pay rent 628 00:32:23,236 --> 00:32:23,916 Speaker 2: next month, right. 629 00:32:24,796 --> 00:32:29,476 Speaker 1: Tomorrow, to get to Mars. So is there a space 630 00:32:29,996 --> 00:32:33,796 Speaker 1: bubble right now? I think so. 631 00:32:34,116 --> 00:32:36,876 Speaker 2: I think there is. I think there are people who 632 00:32:37,236 --> 00:32:40,956 Speaker 2: look at SpaceX and say this is achievable, and that 633 00:32:41,036 --> 00:32:43,196 Speaker 2: more is achievable. They also look at SpaceX and say 634 00:32:43,396 --> 00:32:46,716 Speaker 2: this is a kind of infrastructure, that there are things 635 00:32:46,796 --> 00:32:50,636 Speaker 2: like doing manufacturing in orbit or doing manufacturing on the Moon, 636 00:32:50,796 --> 00:32:53,636 Speaker 2: where in some cases that is actually the best place 637 00:32:53,636 --> 00:32:54,596 Speaker 2: to build something. 638 00:32:54,916 --> 00:32:58,236 Speaker 1: Basically, because SpaceX has driven down the cost so much 639 00:32:58,276 --> 00:33:02,116 Speaker 1: of getting stuff into orbit, new ideas that would have 640 00:33:02,116 --> 00:33:06,596 Speaker 1: been economically absurd twenty years ago, like manufacturing in space, 641 00:33:06,636 --> 00:33:08,516 Speaker 1: are now plausible. And so this is the sort of 642 00:33:08,516 --> 00:33:11,236 Speaker 1: bubble building on itself. And like, why is it not 643 00:33:11,356 --> 00:33:13,156 Speaker 1: just an industry now? Why is it a bubble, in your telling? 644 00:33:13,196 --> 00:33:17,836 Speaker 2: It is the feedback loop where what 645 00:33:17,916 --> 00:33:20,756 Speaker 2: SpaceX does makes more sense if they believe that there 646 00:33:20,756 --> 00:33:24,076 Speaker 2: will be a lot of demand to move physical things 647 00:33:24,156 --> 00:33:27,516 Speaker 2: off of Earth and into orbit, and perhaps further out; 648 00:33:27,836 --> 00:33:30,356 Speaker 2: that if they believe that there's more demand for that, 649 00:33:30,396 --> 00:33:32,556 Speaker 2: they should be investing more in R and D.
They 650 00:33:32,556 --> 00:33:35,076 Speaker 2: should be building bigger and better rockets, and 651 00:33:35,276 --> 00:33:38,436 Speaker 2: they should be doing the, you know, big fixed cost 652 00:33:38,436 --> 00:33:41,356 Speaker 2: investment that incrementally reduces the cost of launches and only 653 00:33:41,396 --> 00:33:42,956 Speaker 2: pays for itself if you do a lot of them. And 654 00:33:42,996 --> 00:33:46,156 Speaker 2: then if they're doing that and you have your dream 655 00:33:46,396 --> 00:33:49,276 Speaker 2: of we're going to manufacture drugs in space, and, 656 00:33:49,276 --> 00:33:51,756 Speaker 2: you know, like the marginal cost is low 657 00:33:51,836 --> 00:33:53,916 Speaker 2: once you get stuff up there, well, that dream is 658 00:33:53,916 --> 00:33:56,276 Speaker 2: a little bit more plausible if you can actually plot 659 00:33:56,356 --> 00:33:57,996 Speaker 2: that curve of how much does it cost to get 660 00:33:57,996 --> 00:34:00,836 Speaker 2: a kilogram into space and say, you know, there 661 00:34:00,956 --> 00:34:03,876 Speaker 2: is a specific year at which point we would actually 662 00:34:03,916 --> 00:34:06,716 Speaker 2: have the cost advantage versus terrestrial manufacturing. 663 00:34:06,756 --> 00:34:09,556 Speaker 1: So it's this sort of coordinating mechanism that, like, 664 00:34:09,596 --> 00:34:12,316 Speaker 1: you also write about with Microsoft and Intel in the 665 00:34:12,676 --> 00:34:15,156 Speaker 1: eighties and nineties, where it's like, oh, they're building better chips, 666 00:34:15,196 --> 00:34:18,196 Speaker 1: so we will build better software, and then because they're 667 00:34:18,196 --> 00:34:21,516 Speaker 1: building better software, they'll build better chips. So this is 668 00:34:21,556 --> 00:34:24,796 Speaker 1: like a more exciting version of that, right, because it's 669 00:34:24,836 --> 00:34:28,036 Speaker 1: going to get even cheaper to send stuff to space. 670 00:34:28,076 --> 00:34:31,596 Speaker 1: We can build this crazy factory to exist in space, 671 00:34:31,636 --> 00:34:33,996 Speaker 1: and then that tells SpaceX, oh, we can in fact 672 00:34:34,956 --> 00:34:37,316 Speaker 1: keep building, keep innovating, keep spending money. 673 00:34:37,916 --> 00:34:41,436 Speaker 2: Yes. And so someone has to do just half of that, 674 00:34:41,516 --> 00:34:43,796 Speaker 2: like the half of that that makes no sense whatsoever. 675 00:34:43,916 --> 00:34:46,716 Speaker 1: That was SpaceX at the beginning, right, that was like, yes, 676 00:34:46,876 --> 00:34:50,116 Speaker 1: just a guy with a lot of money and a crazy dream. 677 00:34:49,636 --> 00:34:51,956 Speaker 2: Yeah, it just really helps to have someone who's eccentric 678 00:34:52,036 --> 00:34:54,156 Speaker 2: and has a lot of money and is willing 679 00:34:54,156 --> 00:34:55,676 Speaker 2: to throw it at a lot of different things. Like Musk. 680 00:34:56,476 --> 00:34:59,916 Speaker 2: He spent some substantial fraction of his net worth right after 681 00:34:59,956 --> 00:35:03,236 Speaker 2: the PayPal sale on a really nice sports car and 682 00:35:03,276 --> 00:35:05,996 Speaker 2: then immediately took it for a drive and wrecked it. He 683 00:35:06,196 --> 00:35:08,396 Speaker 2: had no insurance and was not wearing a seat belt.
684 00:35:08,436 --> 00:35:10,756 Speaker 2: So the Elon Musk story could have just been 685 00:35:10,756 --> 00:35:14,916 Speaker 2: this proverb about dot-com excess and what happened when 686 00:35:14,956 --> 00:35:17,596 Speaker 2: you finally gave these people money is they immediately bought 687 00:35:17,636 --> 00:35:20,916 Speaker 2: sports cars and wrecked them. Instead, it's a story about a different 688 00:35:20,956 --> 00:35:23,756 Speaker 2: kind of success, but still, I guess you know what 689 00:35:23,836 --> 00:35:24,916 Speaker 2: that illustrates: really risky. 690 00:35:25,476 --> 00:35:25,956 Speaker 1: Yeah. 691 00:35:26,036 --> 00:35:30,836 Speaker 2: Yeah, there's a risk level where you are 692 00:35:31,796 --> 00:35:33,476 Speaker 2: going for a joy ride in your two 693 00:35:33,476 --> 00:35:35,516 Speaker 2: million dollar car and you haven't bothered to fill out 694 00:35:35,516 --> 00:35:37,876 Speaker 2: all the paperwork or buy the insurance, and that is 695 00:35:37,916 --> 00:35:40,756 Speaker 2: the risk tolerance of someone who starts a company like SpaceX. 696 00:35:41,396 --> 00:35:46,716 Speaker 1: Okay, enough about space. Let's talk about crypto, formerly known 697 00:35:46,756 --> 00:35:51,476 Speaker 1: as cryptocurrency. Let's talk about bitcoin, and let's talk about 698 00:35:51,516 --> 00:35:55,276 Speaker 1: bitcoin especially at the beginning, right, before it was number 699 00:35:55,356 --> 00:35:58,996 Speaker 1: go up, when it really was true believers, right? 700 00:35:59,116 --> 00:36:01,796 Speaker 1: It was people who had a crazy worldview like you're 701 00:36:01,836 --> 00:36:03,956 Speaker 1: talking about in these other contexts. 702 00:36:04,996 --> 00:36:08,676 Speaker 2: Yes, so we still don't know for sure who 703 00:36:08,756 --> 00:36:10,956 Speaker 2: Satoshi Nakamoto was, and I think everyone in 704 00:36:10,996 --> 00:36:14,116 Speaker 2: crypto has at least one guess, sometimes many guesses. But 705 00:36:14,396 --> 00:36:16,796 Speaker 2: whoever Satoshi was, whoever they were. 706 00:36:16,796 --> 00:36:18,956 Speaker 1: This is the creator of bitcoin, for the one person 707 00:36:18,996 --> 00:36:19,476 Speaker 1: who doesn't know. 708 00:36:19,596 --> 00:36:24,516 Speaker 2: Yeah, they had this view that one of the fundamental 709 00:36:24,596 --> 00:36:28,236 Speaker 2: problems in the world today is that if you are 710 00:36:28,316 --> 00:36:30,956 Speaker 2: going to transfer value from one party to another, you 711 00:36:31,036 --> 00:36:32,716 Speaker 2: need some trusted intermediary. 712 00:36:32,996 --> 00:36:37,316 Speaker 1: You need a trusted intermediary like a government and a bank. Right. 713 00:36:37,396 --> 00:36:40,076 Speaker 1: Typically with money you need both governments and banks, the 714 00:36:40,156 --> 00:36:42,596 Speaker 1: way it works in the world today, right? Yes. And 715 00:36:42,716 --> 00:36:45,396 Speaker 1: Satoshi happened to publish the Bitcoin white paper in October 716 00:36:45,436 --> 00:36:47,916 Speaker 1: two thousand and eight, which was a great moment to 717 00:36:47,956 --> 00:36:50,716 Speaker 1: find people who really didn't want to have to deal 718 00:36:50,796 --> 00:36:53,276 Speaker 1: with governments and banks when they were dealing with money, 719 00:36:53,676 --> 00:36:55,716 Speaker 1: in the financial crisis, right, right in the teeth of 720 00:36:55,756 --> 00:36:56,596 Speaker 1: the financial crisis.
721 00:36:57,316 --> 00:36:59,716 Speaker 2: Yes. So it is in one sense just 722 00:36:59,796 --> 00:37:03,356 Speaker 2: this technically clever thing, and then in another sense 723 00:37:03,396 --> 00:37:06,676 Speaker 2: it's this very ideological project where he doesn't like central banks, 724 00:37:06,876 --> 00:37:09,596 Speaker 2: he doesn't like regular banks. He feels like all of 725 00:37:09,636 --> 00:37:12,476 Speaker 2: these institutions are corrupt, and you know, your money is 726 00:37:12,596 --> 00:37:15,476 Speaker 2: just an entry in somebody's database, and they can update 727 00:37:15,516 --> 00:37:18,356 Speaker 2: that database tomorrow and either change how much you have 728 00:37:18,636 --> 00:37:20,356 Speaker 2: or change what it's worth, and we need to just 729 00:37:20,756 --> 00:37:23,476 Speaker 2: build something new from a clean slate. And there's also, 730 00:37:23,676 --> 00:37:25,476 Speaker 2: I think, this tendency among a lot of tech 731 00:37:25,556 --> 00:37:29,676 Speaker 2: people too. When you look at any kind of communications 732 00:37:29,716 --> 00:37:32,436 Speaker 2: technology, and money broadly defined is a communications technology, you're 733 00:37:32,436 --> 00:37:34,756 Speaker 2: always looking at something that has evolved from something simple, 734 00:37:34,996 --> 00:37:37,796 Speaker 2: and it has just been patched and altered and edited 735 00:37:37,956 --> 00:37:40,636 Speaker 2: and tweaked and so on until it works the way 736 00:37:40,796 --> 00:37:43,476 Speaker 2: that it works. But that always means that you can 737 00:37:43,596 --> 00:37:46,116 Speaker 2: easily come up with some first principles view that's a 738 00:37:46,156 --> 00:37:49,636 Speaker 2: whole lot cleaner, easier to reason about, omits some mistakes, 739 00:37:49,676 --> 00:37:52,036 Speaker 2: and then you often find that, okay, you omitted all 740 00:37:52,076 --> 00:37:53,916 Speaker 2: the mistakes that are really really salient about fiat, but 741 00:37:53,996 --> 00:37:56,716 Speaker 2: then you added some brand new mistakes, or added mistakes 742 00:37:56,716 --> 00:37:59,036 Speaker 2: that we haven't made in hundreds of years. So 743 00:37:59,716 --> 00:38:00,436 Speaker 2: it's full of trade-offs. 744 00:38:00,436 --> 00:38:02,676 Speaker 1: It gets complicated. But at the beginning, right, so the 745 00:38:02,756 --> 00:38:06,796 Speaker 1: white paper comes out, and you know, I 746 00:38:06,876 --> 00:38:08,876 Speaker 1: did a story about bitcoin in twenty eleven, which was 747 00:38:08,876 --> 00:38:12,796 Speaker 1: still quite early. You know, we were shocked that it had 748 00:38:12,836 --> 00:38:15,476 Speaker 1: gone from ten dollars a bitcoin to twenty dollars a bitcoin. 749 00:38:15,516 --> 00:38:17,756 Speaker 1: We thought we were reading it wrong. And at that time, 750 00:38:17,836 --> 00:38:20,196 Speaker 1: like, I talked to Gavin Andresen, who was very early 751 00:38:21,156 --> 00:38:23,596 Speaker 1: in the bitcoin universe. Like, he was not in it 752 00:38:23,716 --> 00:38:26,356 Speaker 1: to get rich, right? Like he really believed, he really 753 00:38:26,396 --> 00:38:29,756 Speaker 1: believed in it, and that was the vibe then, and 754 00:38:29,956 --> 00:38:32,996 Speaker 1: like he thought it was gonna be money, right. The 755 00:38:33,116 --> 00:38:36,556 Speaker 1: dream was people will use this to buy stuff.
And 756 00:38:37,396 --> 00:38:40,276 Speaker 1: one thing that is interesting to me is, yeah, some 757 00:38:40,436 --> 00:38:43,996 Speaker 1: people sort of use it to buy stuff, but basically 758 00:38:44,476 --> 00:38:48,356 Speaker 1: not, right? Like, that it would go from twenty dollars 759 00:38:48,396 --> 00:38:50,916 Speaker 1: a bitcoin to one hundred thousand dollars a bitcoin without 760 00:38:51,036 --> 00:38:54,396 Speaker 1: some crazy killer app, without becoming the web, without becoming 761 00:38:54,436 --> 00:38:57,556 Speaker 1: something that everybody uses whether they care about it or not. 762 00:38:57,996 --> 00:39:00,436 Speaker 1: That I would not have guessed. And it seems weird. 763 00:39:01,076 --> 00:39:03,356 Speaker 1: And plainly now crypto is full of some people who 764 00:39:03,396 --> 00:39:04,796 Speaker 1: are true believers and a lot of people who just 765 00:39:04,996 --> 00:39:07,676 Speaker 1: want to get rich, and some of them are pretty scammy. 766 00:39:08,356 --> 00:39:11,396 Speaker 2: Yeah. Yeah, there's like, the grifter coefficient always goes up 767 00:39:11,436 --> 00:39:14,236 Speaker 2: with the price, and then you know, the true believers 768 00:39:14,236 --> 00:39:16,756 Speaker 2: are still there during the next eighty percent drawdown. 769 00:39:16,796 --> 00:39:18,636 Speaker 2: And I'm sure there will be a drawdown something like 770 00:39:18,716 --> 00:39:20,396 Speaker 2: that at some point in the future. It's just that 771 00:39:20,596 --> 00:39:22,916 Speaker 2: that's kind of the nature of these kinds of assets. 772 00:39:24,196 --> 00:39:27,596 Speaker 2: Bitcoin was originally conceived as more of a currency, 773 00:39:27,876 --> 00:39:30,956 Speaker 2: and Satoshi talked about some hypothetical products you 774 00:39:30,996 --> 00:39:35,036 Speaker 2: could buy with it, and then, like, the first 775 00:39:35,076 --> 00:39:37,316 Speaker 2: bitcoin killer app, to be fair, was e-commerce. It 776 00:39:37,436 --> 00:39:40,636 Speaker 2: was specifically drugs. Yes, it is. 777 00:39:40,756 --> 00:39:41,636 Speaker 1: It is a very, very. 778 00:39:41,556 --> 00:39:44,036 Speaker 2: Libertarian product in that way. So it 779 00:39:44,076 --> 00:39:46,756 Speaker 2: doesn't work very well as a dollar substitute for many reasons, 780 00:39:46,956 --> 00:39:49,836 Speaker 2: you know, most of the obvious reasons. But it is 781 00:39:50,076 --> 00:39:52,716 Speaker 2: interesting as a gold substitute, where part of the point 782 00:39:52,756 --> 00:39:55,676 Speaker 2: of gold is that it is very divisible and your 783 00:39:55,756 --> 00:39:58,636 Speaker 2: gold is the same as my gold, and we've all 784 00:39:58,956 --> 00:40:01,916 Speaker 2: kind of collectively agreed that gold is worth more than 785 00:40:02,316 --> 00:40:05,196 Speaker 2: its value as just an industrial product. 786 00:40:05,636 --> 00:40:09,476 Speaker 2: And the neat thing about gold is it's 787 00:40:09,516 --> 00:40:13,356 Speaker 2: really hard to dig up any more. Gold supply is extremely inelastic. 788 00:40:12,916 --> 00:40:16,596 Speaker 1: And bitcoin is designed to have a finite supply, right? Yeah, 789 00:40:16,636 --> 00:40:21,316 Speaker 1: important analogy. Yeah, yes. More generally, like, it's 790 00:40:21,356 --> 00:40:25,196 Speaker 1: been a long time now, it's, you know, seventeen years 791 00:40:25,316 --> 00:40:30,036 Speaker 1: or something since the white paper.
What do you make 792 00:40:30,116 --> 00:40:34,476 Speaker 1: of the sort of costs and benefits of cryptocurrency so far? 793 00:40:34,756 --> 00:40:36,956 Speaker 1: The costs are more obvious to me. Like, there's a 794 00:40:36,996 --> 00:40:41,436 Speaker 1: lot of grift. It's, you know, by design, very energy intensive. 795 00:40:41,676 --> 00:40:45,236 Speaker 1: Like, I'm open to, like, better payment systems. There's lots 796 00:40:45,236 --> 00:40:48,276 Speaker 1: of just, like, boring efficiency gains you would think we 797 00:40:48,356 --> 00:40:51,596 Speaker 1: could get that we haven't gotten, right? Yeah, what do 798 00:40:51,636 --> 00:40:53,636 Speaker 1: you think about the cost versus the benefits so far? 799 00:40:54,316 --> 00:40:56,996 Speaker 2: I think in terms of the present value 800 00:40:57,116 --> 00:40:59,716 Speaker 2: of future gains, probably better off. I think in terms 801 00:40:59,756 --> 00:41:02,796 Speaker 2: of, yeah, realized gains so far, worse off. Huh, so 802 00:41:03,116 --> 00:41:05,276 Speaker 2: basically worse off so far, but in the long run 803 00:41:05,396 --> 00:41:07,596 Speaker 2: we'll be better off. We just haven't gotten the payoffs yet. 804 00:41:08,196 --> 00:41:10,876 Speaker 2: This is actually a feature of general purpose technologies: 805 00:41:10,916 --> 00:41:13,556 Speaker 2: there's often a 806 00:41:13,636 --> 00:41:15,516 Speaker 2: point early in their history where the net benefit has 807 00:41:15,596 --> 00:41:16,236 Speaker 2: been negative. 808 00:41:16,756 --> 00:41:20,476 Speaker 1: What would make it clearly positive? Like, what's 809 00:41:20,516 --> 00:41:23,916 Speaker 1: the killer return you're hoping to see from cryptocurrency? 810 00:41:24,276 --> 00:41:26,036 Speaker 2: Yeah, so I think the killer return would 811 00:41:26,036 --> 00:41:29,436 Speaker 2: be if there is a financial system that is 812 00:41:29,636 --> 00:41:32,836 Speaker 2: open in the sense that starting a financial institution, 813 00:41:32,876 --> 00:41:34,676 Speaker 2: starting a bank or an insurance company or something, is 814 00:41:34,756 --> 00:41:38,116 Speaker 2: basically: you write some code and you click the deploy 815 00:41:38,236 --> 00:41:41,276 Speaker 2: button and your code is running. You have capitalized your 816 00:41:41,356 --> 00:41:44,556 Speaker 2: little entity, and now you can provide whatever it is. 817 00:41:44,716 --> 00:41:47,436 Speaker 2: Like mean-tweet insurance: you're selling people, for a dollar 818 00:41:47,436 --> 00:41:48,876 Speaker 2: a day, you will pay them one hundred dollars if 819 00:41:48,876 --> 00:41:51,556 Speaker 2: there's a mean tweet that makes them cry. You know, 820 00:41:53,076 --> 00:41:56,116 Speaker 2: in your insurance business, yes, you know, you get to 821 00:41:56,156 --> 00:41:58,156 Speaker 2: speed-run all kinds of financial history. I'm sure you 822 00:41:58,276 --> 00:42:00,716 Speaker 2: learn all about adverse selection. But, like, a financial system 823 00:42:00,716 --> 00:42:03,876 Speaker 2: where anything can be plugged into something else and 824 00:42:04,436 --> 00:42:07,116 Speaker 2: basically everything is an API call away is just a 825 00:42:07,196 --> 00:42:10,676 Speaker 2: really interesting concept. And the fiat system is moving 826 00:42:10,716 --> 00:42:12,276 Speaker 2: in that direction, but slowly, and just.
827 00:42:12,316 --> 00:42:15,436 Speaker 1: To be clear, like, why is that 828 00:42:16,476 --> 00:42:18,796 Speaker 1: better on balance? For it to 829 00:42:18,876 --> 00:42:21,276 Speaker 1: be net positive, it has to be not only interesting, 830 00:42:21,356 --> 00:42:24,436 Speaker 1: but has to, like, lead to more human flourishing 831 00:42:24,476 --> 00:42:26,636 Speaker 1: and less suffering than we would have in its absence. 832 00:42:26,756 --> 00:42:32,396 Speaker 2: Right. Yeah, markets provide large positive externalities. There's a lot 833 00:42:32,436 --> 00:42:35,556 Speaker 2: of effort in those markets that feels wasted. But it 834 00:42:35,756 --> 00:42:39,916 Speaker 2: is like, markets transmit information better than basically anything else, 835 00:42:39,956 --> 00:42:42,436 Speaker 2: because what they're always transmitting is the information you actually 836 00:42:42,516 --> 00:42:46,316 Speaker 2: care about. So like, oil prices: you don't have to 837 00:42:46,516 --> 00:42:50,036 Speaker 2: know that oil prices are up because there was a 838 00:42:50,116 --> 00:42:52,796 Speaker 2: terrorist attack or because someone drilled a dry hole or whatever. 839 00:42:53,436 --> 00:42:56,516 Speaker 2: What you respond to is just: gas is more expensive, 840 00:42:56,716 --> 00:42:59,916 Speaker 2: and therefore I will drive less; or, you know, energy 841 00:42:59,996 --> 00:43:01,716 Speaker 2: is cheaper or more expensive, and so I need to 842 00:43:01,876 --> 00:43:04,636 Speaker 2: change my behavior. So it's always transmitting the actually useful 843 00:43:04,676 --> 00:43:06,516 Speaker 2: information to the people who would want to use it. 844 00:43:06,916 --> 00:43:10,836 Speaker 2: And the more complete markets are, and the more things 845 00:43:10,876 --> 00:43:13,556 Speaker 2: there are where that information can be instantaneously transmitted to 846 00:43:13,556 --> 00:43:15,596 Speaker 2: the people who want to respond to it, the more 847 00:43:15,836 --> 00:43:19,596 Speaker 2: everyone's real world behavior actually reflects whatever the underlying material 848 00:43:19,636 --> 00:43:21,556 Speaker 2: constraints are on doing what we want to do. 849 00:43:21,836 --> 00:43:25,916 Speaker 1: The sort of crypto dream there is just more 850 00:43:26,076 --> 00:43:32,836 Speaker 1: finance, more markets, more feedback, more market feedback, better financial services 851 00:43:32,916 --> 00:43:35,716 Speaker 1: as a result. That's the basic view you're 852 00:43:35,996 --> 00:43:37,196 Speaker 1: arguing for, and it's just. 853 00:43:37,396 --> 00:43:41,156 Speaker 2: A really interesting way to build up new financial products 854 00:43:41,196 --> 00:43:43,676 Speaker 2: from first principles. And sometimes you learn whether those first 855 00:43:43,716 --> 00:43:46,516 Speaker 2: principles are wrong, but that itself is valuable. Like, 856 00:43:46,676 --> 00:43:50,636 Speaker 2: there is actual value in understanding something that is a 857 00:43:50,716 --> 00:43:53,356 Speaker 2: tradition or a norm and understanding why it works and 858 00:43:53,756 --> 00:43:56,196 Speaker 2: therefore deciding that that norm is actually a good norm. 859 00:43:56,636 --> 00:44:00,396 Speaker 1: Good. Last one? All right, you know what it's going 860 00:44:00,436 --> 00:44:02,036 Speaker 1: to be. You tell me what the last one is. 861 00:44:03,196 --> 00:44:04,556 Speaker 2: Is AI a bubble?
862 00:44:05,916 --> 00:44:08,596 Speaker 1: Yes! But you sound so sad about it. Of course 863 00:44:08,676 --> 00:44:10,756 Speaker 1: we've got to talk about AI, right? Are you sad 864 00:44:11,036 --> 00:44:14,876 Speaker 1: to talk about AI? Like, it's exactly what you're 865 00:44:14,916 --> 00:44:19,916 Speaker 1: writing about. Yeah. When you hear Sam Altman talk about 866 00:44:20,556 --> 00:44:26,676 Speaker 1: creating OpenAI, starting OpenAI, it's like, we basically said, 867 00:44:27,276 --> 00:44:31,716 Speaker 1: you know, we're going to make AGI, artificial general intelligence. 868 00:44:32,356 --> 00:44:34,436 Speaker 1: Come work with us. And when he talks about it, it's 869 00:44:34,476 --> 00:44:36,596 Speaker 1: like there was a universe of people who were, like, 870 00:44:36,916 --> 00:44:40,276 Speaker 1: the smartest people who really believed, oh, that's what they 871 00:44:40,356 --> 00:44:41,956 Speaker 1: wanted to do, so they came and worked with us, 872 00:44:42,116 --> 00:44:44,916 Speaker 1: which seems like exactly your story. 873 00:44:45,796 --> 00:44:47,876 Speaker 2: Yes, it turns out that a lot of people have 874 00:44:47,996 --> 00:44:50,156 Speaker 2: had that dream, and for a lot of people, 875 00:44:50,236 --> 00:44:53,036 Speaker 2: maybe it wasn't what they were studying in grad school, 876 00:44:53,076 --> 00:44:55,156 Speaker 2: but it was why they ended up being the kind 877 00:44:55,156 --> 00:44:56,796 Speaker 2: of person who majors in computer science and then tries to 878 00:44:56,836 --> 00:44:58,716 Speaker 2: get a PhD in it and, you know, would 879 00:44:59,316 --> 00:45:01,716 Speaker 2: go into a more researchy end of the software world. 880 00:45:02,116 --> 00:45:05,116 Speaker 2: So yeah, there were people for whom 881 00:45:05,276 --> 00:45:07,396 Speaker 2: it was incredibly refreshing to hear that someone actually wants 882 00:45:07,436 --> 00:45:08,076 Speaker 2: to build the thing. 883 00:45:08,236 --> 00:45:11,396 Speaker 1: So you have that kind of shared belief. I mean, 884 00:45:11,436 --> 00:45:13,876 Speaker 1: at this point, you have these other elements of what 885 00:45:13,996 --> 00:45:20,236 Speaker 1: you're talking about, right, like a sense of urgency, an 886 00:45:20,316 --> 00:45:27,916 Speaker 1: incredible amount of money, elements of spiritual or quasi-spiritual belief. 887 00:45:29,116 --> 00:45:32,436 Speaker 2: Yes, there are pseudonymous OpenAI employees on Twitter who 888 00:45:32,436 --> 00:45:35,916 Speaker 2: will tweet about things like building God. So yeah, 889 00:45:35,956 --> 00:45:39,276 Speaker 2: they're taking it in a weird spiritual direction. But 890 00:45:39,396 --> 00:45:42,236 Speaker 2: I think there is something, you know, it is 891 00:45:42,436 --> 00:45:44,716 Speaker 2: interesting that a feature of the natural world 892 00:45:45,076 --> 00:45:48,436 Speaker 2: is that if you 893 00:45:48,716 --> 00:45:51,876 Speaker 2: arrange refined sand and a 894 00:45:51,916 --> 00:45:54,556 Speaker 2: couple of metals in exactly the right way and type 895 00:45:54,596 --> 00:45:57,276 Speaker 2: in the right incantations and add a lot of power, 896 00:45:57,596 --> 00:46:00,356 Speaker 2: you get something that appears to think and that 897 00:46:00,636 --> 00:46:02,836 Speaker 2: can trick someone into thinking that it's a real human being.
898 00:46:03,556 --> 00:46:05,836 Speaker 1: The "is it good or is it bad?" question is 899 00:46:05,916 --> 00:46:10,076 Speaker 1: quite interesting here. Obviously too soon to tell, but 900 00:46:11,356 --> 00:46:13,956 Speaker 1: it's striking to me in the case of AI that the 901 00:46:13,956 --> 00:46:16,436 Speaker 1: people who seem most worried about it are the people 902 00:46:16,476 --> 00:46:19,316 Speaker 1: who know the most about it, which is not often 903 00:46:19,396 --> 00:46:22,236 Speaker 1: the case, right? Usually the people doing the work, building 904 00:46:22,316 --> 00:46:24,396 Speaker 1: the thing, just love it and think it's great. In 905 00:46:24,476 --> 00:46:26,196 Speaker 1: this case, it's kind of the opposite. 906 00:46:26,596 --> 00:46:30,356 Speaker 2: Yeah, I think the times when I am calmest about 907 00:46:30,396 --> 00:46:33,436 Speaker 2: AI and least worried about it taking my job are 908 00:46:33,716 --> 00:46:39,116 Speaker 2: times when I'm using AI products to slightly improve how 909 00:46:39,156 --> 00:46:41,916 Speaker 2: I do my job, that is, better natural language search, 910 00:46:42,116 --> 00:46:46,036 Speaker 2: or, actually, most of it is processing natural language. When 911 00:46:46,196 --> 00:46:48,396 Speaker 2: there are a lot of pages I need to read 912 00:46:48,556 --> 00:46:50,796 Speaker 2: which contain, you know, if it's like a thousand pages 913 00:46:50,916 --> 00:46:53,036 Speaker 2: of which five sentences matter to me, that is a 914 00:46:53,156 --> 00:46:55,036 Speaker 2: job for the API and not a job for me. 915 00:46:55,316 --> 00:46:57,156 Speaker 2: But it is now a job that the API and 916 00:46:57,236 --> 00:47:00,076 Speaker 2: I can actually get done. And my function is to 917 00:47:00,276 --> 00:47:03,356 Speaker 2: figure out what those five sentences are and figure out 918 00:47:03,356 --> 00:47:04,916 Speaker 2: a clever way to find them, and then the AI's 919 00:47:04,996 --> 00:47:07,236 Speaker 2: job is to do the grunt work of actually reading 920 00:47:07,276 --> 00:47:07,596 Speaker 2: through them. 921 00:47:07,836 --> 00:47:10,676 Speaker 1: That's AI as useful tool, right? That's the happy 922 00:47:10,796 --> 00:47:11,476 Speaker 1: AI story. 923 00:47:11,596 --> 00:47:14,956 Speaker 2: Yeah. And I actually think that preserving your own 924 00:47:14,996 --> 00:47:17,516 Speaker 2: agency is a pretty big deal in this context. So 925 00:47:17,676 --> 00:47:19,876 Speaker 2: I think that if you're making a decision, 926 00:47:20,036 --> 00:47:23,556 Speaker 2: it needs to be something where you have actually formalized 927 00:47:23,596 --> 00:47:25,036 Speaker 2: it to the extent that you can formalize it, and 928 00:47:25,396 --> 00:47:28,596 Speaker 2: then you have made the call. But for a 929 00:47:28,636 --> 00:47:31,236 Speaker 2: lot of the grunt work, AI is just a way 930 00:47:31,276 --> 00:47:34,276 Speaker 2: to massively parallelize having an intern. 931 00:47:34,516 --> 00:47:37,196 Speaker 1: Plainly, it's powerful, and you're talking about what it can 932 00:47:37,236 --> 00:47:41,596 Speaker 1: do right now. I mean, the smartest people are like, yes, 933 00:47:41,676 --> 00:47:43,876 Speaker 1: but we're gonna have AGI in two years, which I 934 00:47:43,956 --> 00:47:45,476 Speaker 1: don't know if that's right or not. I don't know 935 00:47:45,516 --> 00:47:48,196 Speaker 1: how to evaluate that claim. But it's a wild claim.
936 00:47:48,836 --> 00:47:53,156 Speaker 1: It's plainly not obviously wrong on its face, right, it's possible. 937 00:47:54,316 --> 00:47:56,676 Speaker 1: Can you even start to parse that? You're giving sort 938 00:47:56,676 --> 00:47:59,236 Speaker 1: of little things today, about, oh, here's a useful tool, 939 00:47:59,276 --> 00:48:00,436 Speaker 1: and here's the thing I don't use it for. But 940 00:48:00,476 --> 00:48:02,916 Speaker 1: there's a much bigger set of questions that seem imminent. 941 00:48:03,276 --> 00:48:05,236 Speaker 2: You know, there's a certain kind of radical uncertainty there. 942 00:48:05,756 --> 00:48:10,196 Speaker 2: You know, I think it increases wealth inequality, but also 943 00:48:10,716 --> 00:48:14,356 Speaker 2: means that intelligence is just more abundant and is available 944 00:48:14,396 --> 00:48:17,796 Speaker 2: on demand and is baked into more things. I think 945 00:48:17,996 --> 00:48:20,356 Speaker 2: that, you know, you can definitely sketch out really 946 00:48:20,396 --> 00:48:23,316 Speaker 2: really negative scenarios. You could sketch out, you know, not 947 00:48:23,596 --> 00:48:26,276 Speaker 2: end of the world, but maybe might as well be 948 00:48:26,396 --> 00:48:29,516 Speaker 2: for the average person, scenarios where every white collar job 949 00:48:29,556 --> 00:48:32,076 Speaker 2: gets eliminated and then a tiny handful of people have 950 00:48:32,156 --> 00:48:35,116 Speaker 2: just unimaginable wealth and, you know, rearrange the system to 951 00:48:35,156 --> 00:48:37,796 Speaker 2: make sure that doesn't change. But I think there are 952 00:48:37,796 --> 00:48:41,396 Speaker 2: a lot of intermediate stories that are closer to just 953 00:48:41,516 --> 00:48:44,276 Speaker 2: the story of, say, accountants after the rise of Excel, 954 00:48:44,476 --> 00:48:47,716 Speaker 2: where there were parts of their job that got much, 955 00:48:47,796 --> 00:48:49,636 Speaker 2: much easier and then the scope of what they could 956 00:48:49,636 --> 00:48:50,876 Speaker 2: do expanded. It was the 957 00:48:50,916 --> 00:48:53,636 Speaker 1: bookkeepers who took it on the chin, it turns out. Yeah, 958 00:48:53,716 --> 00:48:56,636 Speaker 1: like, Excel actually did drive bookkeepers out of work and 959 00:48:56,716 --> 00:48:58,956 Speaker 1: it made accountants more powerful. 960 00:49:00,756 --> 00:49:02,996 Speaker 2: Yeah. So, you know, I think within a 961 00:49:03,756 --> 00:49:06,196 Speaker 2: kind of company function, you'll have specific job functions that 962 00:49:06,276 --> 00:49:08,116 Speaker 2: do mostly go away, and then a lot of them 963 00:49:08,636 --> 00:49:11,876 Speaker 2: will evolve. And so the way that AI seems to 964 00:49:11,916 --> 00:49:15,916 Speaker 2: be rolling out in big companies in practice is they 965 00:49:16,036 --> 00:49:18,956 Speaker 2: generally don't lay off a ton of people. They will 966 00:49:19,036 --> 00:49:22,796 Speaker 2: sometimes end outsourcing contracts, but in 967 00:49:22,916 --> 00:49:25,156 Speaker 2: a lot of cases, they don't lay people off. They 968 00:49:25,756 --> 00:49:29,476 Speaker 2: change people's responsibilities. They ask them to do less of 969 00:49:29,556 --> 00:49:31,236 Speaker 2: one thing and a whole lot more of something else.
970 00:49:31,556 --> 00:49:33,636 Speaker 2: And then in some cases that means they don't have 971 00:49:33,676 --> 00:49:34,956 Speaker 2: to do much hiring right now, but they think that 972 00:49:34,996 --> 00:49:36,956 Speaker 2: a layoff would be pretty demoralizing, so they sort of 973 00:49:37,396 --> 00:49:40,516 Speaker 2: grow into the new cost structure that they can support. 974 00:49:41,276 --> 00:49:43,916 Speaker 2: And then in other cases there are companies where they realize, wait, 975 00:49:44,236 --> 00:49:46,836 Speaker 2: we can ship features twice as fast now, and so 976 00:49:46,956 --> 00:49:48,596 Speaker 2: our revenue is going up faster, so we actually need 977 00:49:48,636 --> 00:49:50,796 Speaker 2: more developers because our developers are so much more productive. 978 00:49:55,476 --> 00:50:05,796 Speaker 1: We'll be back in a minute with the lightning round. Okay, 979 00:50:05,876 --> 00:50:10,196 Speaker 1: let's finish with the lightning round. The most interesting thing 980 00:50:10,236 --> 00:50:12,916 Speaker 1: you learned from an earnings call transcript in the last 981 00:50:12,796 --> 00:50:17,276 Speaker 2: year. The most interesting thing from a transcript in the last year, 982 00:50:17,756 --> 00:50:21,676 Speaker 2: I would say, there was a point, this might have 983 00:50:21,716 --> 00:50:23,476 Speaker 2: been a little over a year ago, there was a point at 984 00:50:23,516 --> 00:50:28,076 Speaker 2: which Satya Nadella was talking about Microsoft's AI spending, and 985 00:50:28,276 --> 00:50:30,796 Speaker 2: he said we are still at the point, and I 986 00:50:30,836 --> 00:50:32,596 Speaker 2: think he and Zuckerberg both said something to the same 987 00:50:32,596 --> 00:50:35,196 Speaker 2: effect in the same quarter, which is very exciting for 988 00:50:35,356 --> 00:50:37,676 Speaker 2: Nvidia people. But it was like, we're at the point 989 00:50:37,676 --> 00:50:39,996 Speaker 2: where we see a lot more risk to underspending than 990 00:50:40,036 --> 00:50:42,716 Speaker 2: to overspending on AI specifically. 991 00:50:43,516 --> 00:50:46,316 Speaker 1: That really speaks to your book, right? That really is 992 00:50:46,476 --> 00:50:49,276 Speaker 1: like bubbly as hell in the context of your book, 993 00:50:49,396 --> 00:50:53,956 Speaker 1: like overspending, like the Apollo missions, like the Manhattan Project, 994 00:50:54,036 --> 00:50:56,036 Speaker 1: like the big risk is that we don't spend enough. 995 00:50:56,876 --> 00:50:59,116 Speaker 2: And also they know that their competitors are listening to 996 00:50:59,156 --> 00:51:02,836 Speaker 2: these calls too, so they were also saying that this 997 00:51:02,996 --> 00:51:05,636 Speaker 2: is kind of a winnable fight, that they do think 998 00:51:05,676 --> 00:51:07,876 Speaker 2: that there is a level of capital spending at which 999 00:51:08,276 --> 00:51:10,796 Speaker 2: Microsoft can win simply because they took it more seriously 1000 00:51:10,916 --> 00:51:11,676 Speaker 2: than everybody else. 1001 00:51:11,916 --> 00:51:15,436 Speaker 1: So he's like, yes, we're going to spend billions and 1002 00:51:15,516 --> 00:51:18,916 Speaker 1: billions of dollars on AI because we think we can win. 1003 00:51:20,316 --> 00:51:30,556 Speaker 1: Zuckerberg, implicitly. What's one innovation in history that you wish 1004 00:51:30,636 --> 00:51:31,196 Speaker 1: didn't happen?
1005 00:51:36,436 --> 00:51:40,476 Speaker 2: I wish there were some reason that it was infeasible 1006 00:51:40,636 --> 00:51:45,396 Speaker 2: to have really, really tight feedback loops for consumer facing apps, 1007 00:51:45,436 --> 00:51:46,356 Speaker 2: particularly games. 1008 00:51:48,636 --> 00:51:50,316 Speaker 1: Is that a way of saying you wish games were 1009 00:51:50,436 --> 00:51:51,116 Speaker 1: less addictive? 1010 00:51:51,796 --> 00:51:53,436 Speaker 2: Yeah, I wish games were less addictive, or that they 1011 00:51:53,476 --> 00:51:57,116 Speaker 2: weren't as good at getting more addictive. 1012 00:51:57,116 --> 00:51:58,836 Speaker 2: So I wrote a piece in the newsletter about this recently, 1013 00:51:59,036 --> 00:52:01,596 Speaker 2: because there was that wonderful article on the loneliness economy 1014 00:52:01,756 --> 00:52:03,956 Speaker 2: in the Atlantic a couple of weeks back that was 1015 00:52:04,036 --> 00:52:07,116 Speaker 2: talking about it. One of the pandemic 1016 00:52:07,196 --> 00:52:09,076 Speaker 2: trends that has mean-reverted the least is how much 1017 00:52:09,156 --> 00:52:11,156 Speaker 2: time people spend alone. And I think one of the 1018 00:52:11,156 --> 00:52:13,956 Speaker 2: reasons for that is that all the things you do alone, 1019 00:52:14,316 --> 00:52:17,276 Speaker 2: they are things that produce data for the company that 1020 00:52:17,636 --> 00:52:20,556 Speaker 2: monetizes the time that you spend alone. And so the 1021 00:52:20,676 --> 00:52:23,236 Speaker 2: fact that we all watched a whole lot of Netflix 1022 00:52:23,276 --> 00:52:25,316 Speaker 2: in the spring of twenty twenty means that Netflix has 1023 00:52:25,316 --> 00:52:27,076 Speaker 2: a lot more data on what our preferences are. 1024 00:52:27,756 --> 00:52:30,796 Speaker 1: So they got better at making us want to watch Netflix, 1025 00:52:30,876 --> 00:52:33,596 Speaker 1: and all the video games we played on our phones 1026 00:52:34,076 --> 00:52:37,036 Speaker 1: got better at making us addicted to keep playing video 1027 00:52:37,076 --> 00:52:40,516 Speaker 1: games on our phones. Yeah, that's a bummer. It's a bummer. 1028 00:52:42,676 --> 00:52:44,716 Speaker 1: What was the best thing about dropping out of college 1029 00:52:44,756 --> 00:52:46,596 Speaker 1: and moving to New York City at age eighteen? 1030 00:52:48,116 --> 00:52:53,036 Speaker 2: So I would say that it really meant 1031 00:52:53,076 --> 00:52:56,836 Speaker 2: that I could, and had to, just take full 1032 00:52:56,876 --> 00:53:01,316 Speaker 2: responsibility for outcomes, and that I get to 1033 00:53:01,356 --> 00:53:03,996 Speaker 2: take a lot more credit for what I've done since then, 1034 00:53:04,316 --> 00:53:07,796 Speaker 2: but also get a lot more blame. There isn't 1035 00:53:07,996 --> 00:53:10,516 Speaker 2: really a brand name to fall back on, and so if 1036 00:53:10,556 --> 00:53:12,756 Speaker 2: someone hires me, they can't say this person got a 1037 00:53:12,876 --> 00:53:15,636 Speaker 2: degree from institution X. You know, I didn't even.
I 1038 00:53:15,756 --> 00:53:18,396 Speaker 2: dropped out of a really bad school, too. So there's 1039 00:53:18,516 --> 00:53:21,276 Speaker 2: not even, like, the extra 1040 00:53:21,436 --> 00:53:24,116 Speaker 2: upside of, you know, my startup was so great, I 1041 00:53:24,236 --> 00:53:26,996 Speaker 2: just had to leave Stanford after only a couple of semesters. No, 1042 00:53:27,196 --> 00:53:29,756 Speaker 2: it was Arizona State and I didn't even party. 1043 00:53:29,876 --> 00:53:34,836 Speaker 2: So, yeah, but yeah, it's just 1044 00:53:35,396 --> 00:53:37,916 Speaker 2: being a little more in control of the narrative 1045 00:53:37,996 --> 00:53:41,116 Speaker 2: and also just knowing that it's a lot more 1046 00:53:41,196 --> 00:53:41,436 Speaker 2: up to me. 1047 00:53:42,156 --> 00:53:44,196 Speaker 1: What was the worst thing about dropping out of college 1048 00:53:44,236 --> 00:53:45,596 Speaker 1: and moving to New York at eighteen? 1049 00:53:46,956 --> 00:53:48,796 Speaker 2: So one time I went through a really, really long 1050 00:53:48,916 --> 00:53:51,156 Speaker 2: interview process for a job that I really wanted, and 1051 00:53:52,436 --> 00:53:55,036 Speaker 2: at the end of many, many rounds of interviews and, 1052 00:53:55,196 --> 00:53:58,876 Speaker 2: you know, a work session and lots of stuff, the 1053 00:53:59,396 --> 00:54:01,556 Speaker 2: hiring committee rejected me because I didn't have a degree, and 1054 00:54:01,916 --> 00:54:04,556 Speaker 2: that was on my resume, so that was kind of inconvenient. 1055 00:54:04,916 --> 00:54:08,796 Speaker 2: I guess another downside: it might have been nice 1056 00:54:09,636 --> 00:54:14,196 Speaker 2: to spend more time with fewer obligations and access to 1057 00:54:14,236 --> 00:54:15,076 Speaker 2: a really good library. 1058 00:54:22,036 --> 00:54:25,196 Speaker 1: Burn Hobart is the co-author of Boom: Bubbles and 1059 00:54:25,276 --> 00:54:28,956 Speaker 1: the End of Stagnation. Today's show was produced by Gabriel 1060 00:54:29,036 --> 00:54:32,196 Speaker 1: Hunter Cheng. It was edited by Lydia Jeane Kott and 1061 00:54:32,476 --> 00:54:36,076 Speaker 1: engineered by Sarah Brugier. You can email us at problem 1062 00:54:36,236 --> 00:54:39,636 Speaker 1: at Pushkin dot FM. I'm Jacob Goldstein, and we'll be 1063 00:54:39,716 --> 00:54:42,276 Speaker 1: back next week with another episode of What's Your Problem.