1 00:00:15,356 --> 00:00:22,396 Speaker 1: Pushkin. When I was a kid in the nineteen eighties, 2 00:00:22,836 --> 00:00:25,956 Speaker 1: I lived about forty miles from a nuclear power plant. 3 00:00:26,116 --> 00:00:28,676 Speaker 1: It's called San Onofre. It was right by the freeway, 4 00:00:29,076 --> 00:00:31,596 Speaker 1: and whenever we drove past it, me and my family, 5 00:00:31,916 --> 00:00:35,036 Speaker 1: we would all hold our breath, like, you know, to 6 00:00:35,116 --> 00:00:38,316 Speaker 1: protect ourselves from the radiation or whatever. It was one of 7 00:00:38,316 --> 00:00:42,436 Speaker 1: those ritual family jokes, those things you do a million times, 8 00:00:42,756 --> 00:00:45,636 Speaker 1: not really because they're funny, but because they're just what 9 00:00:45,676 --> 00:00:51,436 Speaker 1: you do. I'm telling you this because that joke, that ritual, 10 00:00:51,516 --> 00:00:54,476 Speaker 1: that holding our breath, it speaks to what the vibes 11 00:00:54,556 --> 00:00:58,156 Speaker 1: were in the eighties about nuclear power, right. That was 12 00:00:58,196 --> 00:01:02,476 Speaker 1: a moment of like peak nuclear fear. There had been 13 00:01:02,476 --> 00:01:06,956 Speaker 1: the Three Mile Island nuclear accident in nineteen seventy nine. Yeah, 14 00:01:06,956 --> 00:01:10,356 Speaker 1: The Simpsons, with Homer Simpson always almost causing a meltdown, 15 00:01:10,436 --> 00:01:13,116 Speaker 1: and then more seriously in the eighties you had the 16 00:01:13,316 --> 00:01:18,596 Speaker 1: Chernobyl nuclear disaster. So we were very scared of nuclear 17 00:01:18,596 --> 00:01:23,196 Speaker 1: power at the time.
But looking back, looking back from today, 18 00:01:23,436 --> 00:01:27,036 Speaker 1: I wonder if maybe we were scared of the wrong thing, 19 00:01:28,036 --> 00:01:32,956 Speaker 1: because today it looks increasingly likely that we may need 20 00:01:33,236 --> 00:01:38,396 Speaker 1: more nuclear power alongside more renewables in order to stop 21 00:01:38,436 --> 00:01:42,996 Speaker 1: burning fossil fuel and contain the risk of climate change. 22 00:01:43,316 --> 00:01:46,676 Speaker 1: So looking back, maybe instead of being afraid of a 23 00:01:46,676 --> 00:01:50,076 Speaker 1: world with nuclear power, we should have been afraid of 24 00:01:50,116 --> 00:01:59,476 Speaker 1: a world without nuclear power. I'm Jacob Goldstein and this 25 00:01:59,556 --> 00:02:01,716 Speaker 1: is What's Your Problem, the show where I talk to 26 00:02:01,756 --> 00:02:05,596 Speaker 1: people who are trying to make technological progress. My guest 27 00:02:05,636 --> 00:02:09,556 Speaker 1: today is Yasser Arafat. He's the chief technology officer at 28 00:02:09,676 --> 00:02:13,036 Speaker 1: Aalo Atomics. Earlier in his career he worked for the 29 00:02:13,076 --> 00:02:16,676 Speaker 1: federal government at the Idaho National Lab, where he designed 30 00:02:16,676 --> 00:02:21,036 Speaker 1: a nuclear microreactor that he called Marvel. Now at Aalo, 31 00:02:21,596 --> 00:02:24,916 Speaker 1: Yasser is trying to commercialize a version of that reactor. 32 00:02:25,836 --> 00:02:29,156 Speaker 1: His problem is this: how can you mass produce nuclear 33 00:02:29,196 --> 00:02:32,996 Speaker 1: reactors in a factory in a way that's safe, scalable, 34 00:02:33,156 --> 00:02:37,116 Speaker 1: and cheap.
We mostly talked about the reactor that Yasser 35 00:02:37,116 --> 00:02:39,956 Speaker 1: has designed to be mass produced in a factory, but 36 00:02:40,076 --> 00:02:43,076 Speaker 1: to start we talked about the on again, off again 37 00:02:43,236 --> 00:02:47,196 Speaker 1: history of nuclear power in the United States. 38 00:02:49,236 --> 00:02:52,316 Speaker 2: Yeah, I mean, the story of nuclear really starts, 39 00:02:52,556 --> 00:02:55,476 Speaker 2: especially in the US, in the fifties, right. The 40 00:02:55,476 --> 00:02:59,196 Speaker 2: Atomic Energy Act was amended, right, to allow the 41 00:02:59,996 --> 00:03:03,476 Speaker 2: nuclear industry to be privatized in nineteen fifty four, and 42 00:03:03,516 --> 00:03:05,836 Speaker 2: that, you know, paved 43 00:03:05,876 --> 00:03:10,276 Speaker 2: the way to the construction of the first commercial power plant, 44 00:03:10,556 --> 00:03:14,476 Speaker 2: I should say, in Shippingport, Pennsylvania, which began operations 45 00:03:14,556 --> 00:03:17,316 Speaker 2: in the late fifties, I think fifty-eight, and 46 00:03:17,356 --> 00:03:21,676 Speaker 2: Shippingport really symbolized the beginning of this new dawn 47 00:03:21,876 --> 00:03:26,436 Speaker 2: of what we call the first Atomic Age. And 48 00:03:26,836 --> 00:03:28,956 Speaker 2: if you pause there for a second, up until then, 49 00:03:29,276 --> 00:03:32,316 Speaker 2: if you think about it, for the last million years 50 00:03:32,396 --> 00:03:38,716 Speaker 2: or so, humanity really used combustion as its primary source 51 00:03:38,756 --> 00:03:40,116 Speaker 2: of power for growth.
52 00:03:40,276 --> 00:03:44,076 Speaker 1: For you know, for most of that time, we burned wood, 53 00:03:44,516 --> 00:03:47,596 Speaker 1: and then for like a brief moment of one hundred, 54 00:03:47,636 --> 00:03:50,876 Speaker 1: two hundred years, three hundred years, we burned coal, a 55 00:03:50,916 --> 00:03:52,636 Speaker 1: little bit of natural gas, a little bit of oil. But 56 00:03:52,836 --> 00:03:53,996 Speaker 1: you're always burning something. 57 00:03:53,996 --> 00:03:57,436 Speaker 2: It's burning something, it's always combustion, right. So that was 58 00:03:57,436 --> 00:04:01,796 Speaker 2: really a pivotal moment, when humanity first unlocked that 59 00:04:02,436 --> 00:04:07,716 Speaker 2: amazing new modern way of creating energy by splitting atoms. 60 00:04:08,116 --> 00:04:10,836 Speaker 2: It was a big pivot moment. And then entered the 61 00:04:10,836 --> 00:04:13,636 Speaker 2: sixties and mid-seventies. So from the sixties to 62 00:04:13,676 --> 00:04:18,756 Speaker 2: the mid-seventies, we call this the golden age of nuclear, right, 63 00:04:19,356 --> 00:04:21,996 Speaker 2: and that's when really, like, we built a ton of 64 00:04:22,036 --> 00:04:25,676 Speaker 2: reactors commercially in the United States, about fifty-five of them. 65 00:04:25,756 --> 00:04:28,156 Speaker 2: You know, up until the mid-seventies, there was a lot 66 00:04:28,196 --> 00:04:31,756 Speaker 2: of optimism about nuclear and a lot of the investments 67 00:04:31,876 --> 00:04:34,676 Speaker 2: went in there. However, when you started approaching 68 00:04:34,756 --> 00:04:38,276 Speaker 2: the mid-seventies, with all these nuclear plants around, 69 00:04:38,876 --> 00:04:42,996 Speaker 2: it also invoked the creation of a regulatory body, right.
70 00:04:43,476 --> 00:04:47,436 Speaker 2: The NRC was formed in the mid-seventies, and, you know, 71 00:04:47,636 --> 00:04:53,316 Speaker 2: new regulations started getting imposed on plants, and automatically, 72 00:04:53,556 --> 00:04:57,436 Speaker 2: you know, the costs went up when regulations became tighter. 73 00:04:57,636 --> 00:05:00,756 Speaker 1: The NRC is the Nuclear Regulatory Commission. 74 00:05:00,276 --> 00:05:03,156 Speaker 2: That's correct, the Nuclear Regulatory Commission. And then right after, 75 00:05:03,436 --> 00:05:05,996 Speaker 2: you know, just a few years later, nineteen seventy nine, 76 00:05:06,036 --> 00:05:09,676 Speaker 2: that's when Three Mile Island happened, right. It was in Pennsylvania. 77 00:05:09,996 --> 00:05:13,076 Speaker 2: We had a partial meltdown of a reactor and there 78 00:05:13,076 --> 00:05:17,316 Speaker 2: was widespread public concern and fear. Sure, nobody died 79 00:05:17,356 --> 00:05:20,436 Speaker 2: from that accident directly, but it really, like, you know, 80 00:05:20,476 --> 00:05:22,676 Speaker 2: shook the public quite a bit and really put a 81 00:05:22,676 --> 00:05:26,836 Speaker 2: lot of emphasis on the potential safety risks, and that 82 00:05:27,116 --> 00:05:32,236 Speaker 2: in turn made the regulatory activities even stricter. 83 00:05:32,556 --> 00:05:38,356 Speaker 1: And so that's basically when new construction of nuclear power 84 00:05:38,396 --> 00:05:42,196 Speaker 1: plants more or less stops in the US after that, right? 85 00:05:42,076 --> 00:05:44,756 Speaker 2: Pretty much. That was the nail in the coffin for decades. 86 00:05:44,796 --> 00:05:45,876 Speaker 2: It stopped, exactly.
87 00:05:46,316 --> 00:05:49,756 Speaker 1: And so you know, it's interesting for me personally, because 88 00:05:50,076 --> 00:05:53,036 Speaker 1: I was growing up in the nineteen eighties, right, 89 00:05:53,556 --> 00:05:57,236 Speaker 1: and that was definitely a time when what we would 90 00:05:57,276 --> 00:06:02,756 Speaker 1: now call the vibes were like anti-nuclear, basically, right. 91 00:06:02,916 --> 00:06:06,916 Speaker 1: Like, nuclear power was this scary thing, and nuclear waste 92 00:06:06,996 --> 00:06:09,676 Speaker 1: was this scary thing that lasted forever. And you have 93 00:06:09,756 --> 00:06:14,516 Speaker 1: Chernobyl in there somewhere, which is like very bad and 94 00:06:14,716 --> 00:06:20,276 Speaker 1: very scary, right, and people did die, right. And 95 00:06:20,476 --> 00:06:23,276 Speaker 1: so, you know, that was what I 96 00:06:23,276 --> 00:06:25,676 Speaker 1: grew up with. And then just in the last few 97 00:06:25,716 --> 00:06:30,876 Speaker 1: years there has been this shift, right. Like, intellectually I 98 00:06:31,036 --> 00:06:37,556 Speaker 1: get now why nuclear power is good. I get, intellectually, 99 00:06:38,276 --> 00:06:43,036 Speaker 1: that certainly coal-fired power plants are super 100 00:06:43,116 --> 00:06:46,876 Speaker 1: dangerous, and literally thousands of people die every year from them. 101 00:06:47,156 --> 00:06:49,356 Speaker 1: They just die in a way that is invisible, right, 102 00:06:49,396 --> 00:06:51,796 Speaker 1: because it's not like there's some accident. It's just that 103 00:06:52,156 --> 00:06:55,796 Speaker 1: coal-fired power plants emit pollutants that clearly are, in 104 00:06:55,836 --> 00:06:58,276 Speaker 1: the aggregate, killing people. We just don't know which people 105 00:06:58,356 --> 00:07:02,516 Speaker 1: and when, right. Like, that seems pretty unambiguous.
So I'm 106 00:07:02,556 --> 00:07:07,556 Speaker 1: at this point now where, like, intellectually I think I'm 107 00:07:07,556 --> 00:07:09,676 Speaker 1: pro-nuclear. But I do have this 108 00:07:09,796 --> 00:07:13,236 Speaker 1: question about tail risk, right? Tail risk seems like a 109 00:07:13,276 --> 00:07:17,596 Speaker 1: thing with nuclear power that I haven't quite figured out. 110 00:07:18,076 --> 00:07:21,676 Speaker 1: And I still have the emotional wariness, right? Can you 111 00:07:21,716 --> 00:07:22,596 Speaker 1: bring me around? 112 00:07:23,196 --> 00:07:26,796 Speaker 2: Sure, and rightfully so. When you've gone through that era, 113 00:07:27,396 --> 00:07:31,316 Speaker 2: that stigma, that feeling, that fear kind of, like, lags. 114 00:07:31,396 --> 00:07:33,876 Speaker 2: It stays there for a very long time. And so, 115 00:07:34,676 --> 00:07:37,756 Speaker 2: you know, if you kind of fast forward, that had 116 00:07:37,796 --> 00:07:44,916 Speaker 2: a real implication for how the energy infrastructure ecosystem 117 00:07:44,956 --> 00:07:47,596 Speaker 2: took shape in the United States. Right? So you see 118 00:07:47,596 --> 00:07:52,116 Speaker 2: a big lag after Chernobyl, obviously TMI and Chernobyl, and 119 00:07:52,116 --> 00:07:55,396 Speaker 2: then the nineteen nineties, and then the two thousands is where 120 00:07:55,596 --> 00:07:59,436 Speaker 2: we started, like, seeing, you know, some murmurs about, like, hey, 121 00:07:59,516 --> 00:08:02,116 Speaker 2: you know, is there renewed interest? And really 122 00:08:02,116 --> 00:08:04,196 Speaker 2: in the two thousands, you know, when people are talking 123 00:08:04,236 --> 00:08:06,796 Speaker 2: about climate change, they start looking around and see, okay, 124 00:08:06,836 --> 00:08:09,516 Speaker 2: what can we do
about it? The 125 00:08:09,516 --> 00:08:13,076 Speaker 2: concerns about climate change and the need for low-carbon 126 00:08:13,196 --> 00:08:16,356 Speaker 2: energy sources renewed some of those interests. Yes, we've 127 00:08:16,356 --> 00:08:19,196 Speaker 2: seen a lot of growth in solar and other renewables, 128 00:08:19,516 --> 00:08:21,076 Speaker 2: but really, at the end of the day, you know, 129 00:08:21,156 --> 00:08:23,316 Speaker 2: the utilities, the customers, they knew in the back of their heads 130 00:08:23,956 --> 00:08:27,716 Speaker 2: they need something dispatchable. They wanted some real clean baseload 131 00:08:27,836 --> 00:08:28,276 Speaker 2: power. 132 00:08:28,436 --> 00:08:32,596 Speaker 1: So dispatchable and baseload basically means always available whenever 133 00:08:32,636 --> 00:08:34,356 Speaker 1: you need it, unlike solar and wind. 134 00:08:34,436 --> 00:08:37,196 Speaker 2: That's correct, that's great. So in two thousand and five, 135 00:08:37,236 --> 00:08:39,996 Speaker 2: you see some policy changes, right. You see the Energy 136 00:08:40,036 --> 00:08:44,916 Speaker 2: Policy Act that provides some incentives to revive the industry. Okay, 137 00:08:45,356 --> 00:08:48,316 Speaker 2: and so that kind of, like, sparked things. You know, you've 138 00:08:48,356 --> 00:08:51,396 Speaker 2: seen, like, you know, after many decades, we've built Plant 139 00:08:51,556 --> 00:08:56,436 Speaker 2: Vogtle. Unit three went operational last year, Unit 140 00:08:56,436 --> 00:08:59,036 Speaker 2: four went online this year, so you know, it's 141 00:08:59,036 --> 00:09:02,356 Speaker 2: a big achievement for nuclear after such a long lag. 142 00:09:02,596 --> 00:09:05,396 Speaker 1: So this is the project in Georgia, like the first 143 00:09:05,716 --> 00:09:09,356 Speaker 1: new nuclear power plant in decades. 144 00:09:09,596 --> 00:09:12,396 Speaker 2: That's correct, that's correct.
The two units, I think there 145 00:09:12,396 --> 00:09:17,476 Speaker 2: were originally two other units being pursued at Summer, but 146 00:09:17,556 --> 00:09:21,196 Speaker 2: then those projects stalled, but these two have continued, and 147 00:09:21,276 --> 00:09:23,516 Speaker 2: then Units three and four just came online and now 148 00:09:23,556 --> 00:09:26,916 Speaker 2: millions of homes are being powered from this clean source 149 00:09:26,956 --> 00:09:30,756 Speaker 2: of energy. However, these are first-of-a-kind units, 150 00:09:30,796 --> 00:09:32,196 Speaker 2: and there's a lot of first-of-a-kind 151 00:09:32,276 --> 00:09:34,876 Speaker 2: risk that went along with them. So it's a mix 152 00:09:34,916 --> 00:09:38,316 Speaker 2: of optimism on one side, that, hey, we just built 153 00:09:38,356 --> 00:09:41,196 Speaker 2: new power plants after so many decades. But on the 154 00:09:41,236 --> 00:09:44,476 Speaker 2: other hand, oh, you know, the costs went up, it 155 00:09:44,516 --> 00:09:47,076 Speaker 2: took longer to build. You know, it's really the 156 00:09:47,116 --> 00:09:50,556 Speaker 2: first of a kind, and that kind of challenge is 157 00:09:50,556 --> 00:09:53,316 Speaker 2: what we are living through right now, right. The 158 00:09:53,396 --> 00:09:57,436 Speaker 2: project costs are high. There's a lot of risks 159 00:09:57,876 --> 00:10:01,476 Speaker 2: and uncertainties around how long it can actually take to 160 00:10:01,476 --> 00:10:04,996 Speaker 2: build one of these. But the good news is, now that 161 00:10:05,076 --> 00:10:07,116 Speaker 2: we've built two of these units, hopefully we'll learn from them 162 00:10:07,276 --> 00:10:09,636 Speaker 2: and we can do it faster and better and cheaper.
163 00:10:09,676 --> 00:10:11,716 Speaker 1: I mean, it's sort of like we never, at 164 00:10:11,796 --> 00:10:14,996 Speaker 1: least in this country, learned how to build a modern 165 00:10:15,076 --> 00:10:18,356 Speaker 1: nuclear plant. Like, we built nuclear plants literally fifty 166 00:10:18,436 --> 00:10:20,276 Speaker 1: years ago, and then we kind of stopped, and now 167 00:10:20,316 --> 00:10:22,996 Speaker 1: we've got to start from not quite zero, but kind 168 00:10:22,996 --> 00:10:23,836 Speaker 1: of scratch again. 169 00:10:24,036 --> 00:10:26,596 Speaker 2: So if you look at the infrastructure, right, we don't 170 00:10:26,596 --> 00:10:29,356 Speaker 2: build big things anymore. 171 00:10:29,436 --> 00:10:32,036 Speaker 1: Much less nuclear power plants. Like, even the tunnel, right? 172 00:10:32,036 --> 00:10:34,356 Speaker 1: They're building a tunnel from New Jersey to New York 173 00:10:34,396 --> 00:10:36,476 Speaker 1: under the Hudson River. It's gonna cost, I don't know, 174 00:10:36,596 --> 00:10:40,076 Speaker 1: fifteen billion dollars or something. That's just a tube under 175 00:10:40,116 --> 00:10:40,916 Speaker 1: the river. 176 00:10:41,796 --> 00:10:43,996 Speaker 2: And it's all common across the board. It 177 00:10:44,036 --> 00:10:48,356 Speaker 2: is because when you build something bespoke and a very 178 00:10:48,436 --> 00:10:52,796 Speaker 2: giant, complex project, we lost that muscle to really execute 179 00:10:52,836 --> 00:10:54,556 Speaker 2: such gigantic projects of this kind.
180 00:10:54,476 --> 00:10:58,356 Speaker 1: Well, so you were walking us very elegantly toward 181 00:10:58,436 --> 00:11:02,956 Speaker 1: the dream of microreactors, right? Like, away from giant 182 00:11:02,996 --> 00:11:06,116 Speaker 1: bespoke projects and toward the dream of a sort of 183 00:11:06,636 --> 00:11:09,356 Speaker 1: factory-built, put-it-on-the-back-of-a-truck 184 00:11:09,836 --> 00:11:11,956 Speaker 1: nuclear reactor, which is in fact what you're working on. 185 00:11:12,236 --> 00:11:12,876 Speaker 2: That's correct. 186 00:11:13,156 --> 00:11:17,476 Speaker 1: So tell me about microreactors, right. Microreactor is this word 187 00:11:17,916 --> 00:11:21,156 Speaker 1: that I've heard, like, smart people say for a few years, 188 00:11:21,236 --> 00:11:23,516 Speaker 1: and I get from the name that it is a 189 00:11:23,556 --> 00:11:27,876 Speaker 1: reactor that is small. But to start, tell me, like, 190 00:11:27,916 --> 00:11:30,396 Speaker 1: what is the dream of microreactors? Why is this what 191 00:11:30,516 --> 00:11:32,956 Speaker 1: smart people talk about when they talk about nuclear power? 192 00:11:33,036 --> 00:11:38,236 Speaker 2: So microreactors are really defined as very small, transportable reactors that 193 00:11:38,356 --> 00:11:43,276 Speaker 2: are between one and, you know, ten or twenty megawatts electric. 194 00:11:43,236 --> 00:11:47,876 Speaker 1: So that's maybe, whatever, less than a tenth the size, 195 00:11:47,876 --> 00:11:51,676 Speaker 1: maybe one hundredth the size, of a power plant. 196 00:11:52,276 --> 00:11:56,676 Speaker 1: Truly micro. Truly. Okay, so they're micro. Like, why is 197 00:11:56,716 --> 00:11:58,956 Speaker 1: that appealing? What's the rationale there? 198 00:11:59,156 --> 00:12:02,956 Speaker 2: So there are three key features that make these small 199 00:12:03,116 --> 00:12:08,756 Speaker 2: reactors attractive, microreactors in general.
First, because of their 200 00:12:08,876 --> 00:12:13,596 Speaker 2: small size, they're envisioned to be fully factory built, 201 00:12:14,036 --> 00:12:18,556 Speaker 2: not smaller components or modules that are then brought to site. 202 00:12:18,596 --> 00:12:21,116 Speaker 2: You build the whole thing in a factory. That's number one. 203 00:12:21,476 --> 00:12:25,836 Speaker 2: And you can also transport them using standard roadways or 204 00:12:26,236 --> 00:12:29,876 Speaker 2: railways or, you know, through the sea. Right? Okay, 205 00:12:30,156 --> 00:12:31,036 Speaker 2: that's number one. 206 00:12:31,516 --> 00:12:32,916 Speaker 1: So you build it in a factory and put it 207 00:12:32,956 --> 00:12:34,996 Speaker 1: on the back of a truck, and that is going 208 00:12:35,076 --> 00:12:41,036 Speaker 1: to be, in theory, wildly cheaper than building a bespoke 209 00:12:41,276 --> 00:12:44,316 Speaker 1: power plant every time. I mean, it's just like 210 00:12:44,396 --> 00:12:46,596 Speaker 1: building a car, right? Like, if you had to 211 00:12:46,596 --> 00:12:49,316 Speaker 1: build a car from scratch every time somebody wanted a car, 212 00:12:49,396 --> 00:12:51,916 Speaker 1: it would literally cost millions of dollars. But if you 213 00:12:51,956 --> 00:12:53,676 Speaker 1: make a thousand of the same car in a factory, 214 00:12:53,756 --> 00:12:55,596 Speaker 1: or one hundred thousand of the same car in a factory, 215 00:12:55,636 --> 00:12:58,036 Speaker 1: it gets wildly cheaper. That's part one of the dream. 216 00:12:58,036 --> 00:13:00,436 Speaker 2: And that's really the main idea, right? When you 217 00:13:00,516 --> 00:13:03,996 Speaker 2: do repetition of the same thing over and over again, 218 00:13:04,316 --> 00:13:06,436 Speaker 2: you can learn how to bring the cost down faster; 219 00:13:06,876 --> 00:13:09,356 Speaker 2: you learn it.
You're building in a controlled environment, and 220 00:13:09,676 --> 00:13:10,436 Speaker 2: you're bringing... 221 00:13:10,196 --> 00:13:14,516 Speaker 1: The industrial revolution. Like, we've known this for hundreds of years. Literally, 222 00:13:14,596 --> 00:13:17,316 Speaker 1: Adam Smith wrote about this in seventeen seventy six. 223 00:13:17,396 --> 00:13:17,756 Speaker 2: That's right. 224 00:13:17,796 --> 00:13:19,956 Speaker 1: If you build things in a factory, they get way cheaper. 225 00:13:20,036 --> 00:13:24,116 Speaker 2: Okay. However, yeah, there are some downsides of a small reactor 226 00:13:24,116 --> 00:13:27,076 Speaker 2: from a physics perspective. You have higher leakage, and the 227 00:13:27,076 --> 00:13:30,596 Speaker 2: economies of scale are against you, so you have to find other 228 00:13:30,756 --> 00:13:33,476 Speaker 2: ways to offset the costs. 229 00:13:33,476 --> 00:13:36,476 Speaker 1: So there's a cost. It doesn't just scale down in 230 00:13:36,516 --> 00:13:41,356 Speaker 1: an elegant way. It gets worse on certain dimensions. 231 00:13:40,796 --> 00:13:43,436 Speaker 2: Like, for example, if you look at a current power plant, 232 00:13:43,636 --> 00:13:46,916 Speaker 2: the water-cooled power plants that are basically the 233 00:13:46,956 --> 00:13:48,956 Speaker 2: infrastructure, you know, the basis of all of the 234 00:13:48,996 --> 00:13:51,596 Speaker 2: nuclear power plants commercially operating today in the US. So 235 00:13:51,636 --> 00:13:54,316 Speaker 2: if you look at those, you have around one hundred 236 00:13:54,476 --> 00:13:58,996 Speaker 2: systems around the nuclear reactor to keep it happy, 237 00:13:59,076 --> 00:14:03,596 Speaker 2: to make it work functionally, operationally, safely. One hundred systems, right.
238 00:14:03,596 --> 00:14:06,716 Speaker 1: One hundred different... Like, when you say systems, what's 239 00:14:06,796 --> 00:14:08,636 Speaker 1: one of the hundred systems you're talking about? 240 00:14:08,676 --> 00:14:12,156 Speaker 2: A chemical and volume control system, or, you know, a high 241 00:14:12,156 --> 00:14:15,516 Speaker 2: pressure injection system for safety. There are various systems that 242 00:14:15,836 --> 00:14:19,156 Speaker 2: ensure that the reactor runs properly, right, huh. 243 00:14:19,196 --> 00:14:21,796 Speaker 1: And so for a microreactor, you cannot build one hundred 244 00:14:21,796 --> 00:14:25,396 Speaker 1: systems for every microreactor, because then you lose all the 245 00:14:25,436 --> 00:14:26,956 Speaker 1: cost benefits you have gained. 246 00:14:27,236 --> 00:14:29,156 Speaker 2: And now all of a sudden you have to think, like, okay, 247 00:14:29,276 --> 00:14:31,236 Speaker 2: is that the right technology to scale down? Because if 248 00:14:31,276 --> 00:14:33,076 Speaker 2: I scale it down, I still end up with one hundred systems. 249 00:14:33,156 --> 00:14:35,796 Speaker 2: They might be smaller, but it's not 250 00:14:35,876 --> 00:14:38,236 Speaker 2: going to help me on economies of scale. Yeah, so 251 00:14:38,236 --> 00:14:40,436 Speaker 2: you have to kind of rethink the problem a little bit. 252 00:14:40,476 --> 00:14:43,316 Speaker 2: So that's number one, factory made; second is transportation. 253 00:14:43,516 --> 00:14:47,916 Speaker 2: The third one is it's self-regulating. Right? If you 254 00:14:47,916 --> 00:14:52,156 Speaker 2: look at a current large-scale conventional power plant, you 255 00:14:52,196 --> 00:14:54,996 Speaker 2: have hundreds of people working in the power plant to 256 00:14:54,996 --> 00:14:55,476 Speaker 2: make sure 257 00:14:55,316 --> 00:15:00,476 Speaker 1: it works. Well, Homer Simpson famously... Well, let's not go there.
258 00:15:01,836 --> 00:15:05,316 Speaker 1: I apologize. Is that annoying? Are 259 00:15:05,316 --> 00:15:07,836 Speaker 1: you tired of that? I'm sorry. It's lazy on my part. 260 00:15:07,996 --> 00:15:11,196 Speaker 2: Yeah, no, I mean, it does portray... I 261 00:15:11,236 --> 00:15:14,956 Speaker 2: mean, The Simpsons. My whole entire generation grew up watching The Simpsons, right, 262 00:15:15,316 --> 00:15:18,516 Speaker 2: and so it portrayed some things about nuclear power plants 263 00:15:18,516 --> 00:15:21,716 Speaker 2: that are not necessarily painting the right picture. 264 00:15:22,716 --> 00:15:26,876 Speaker 1: It's capturing... So The Simpsons launched in the eighties, right? 265 00:15:26,996 --> 00:15:32,156 Speaker 1: So it is capturing that sort of peak anti-nuclear zeitgeist. 266 00:15:32,356 --> 00:15:33,636 Speaker 2: That's right, that's right, that's right. 267 00:15:34,196 --> 00:15:37,276 Speaker 1: So okay, so I apologize, I have derailed us. 268 00:15:37,916 --> 00:15:42,916 Speaker 2: So the third factor that makes a microreactor unique is the ability 269 00:15:43,036 --> 00:15:46,116 Speaker 2: to self-regulate. So instead of needing hundreds of people, 270 00:15:46,636 --> 00:15:51,036 Speaker 2: you need one or two operators to run the system. 271 00:15:51,316 --> 00:15:56,036 Speaker 2: That means the machine itself must be able to ensure 272 00:15:56,156 --> 00:16:00,716 Speaker 2: safe operations without relying on people, or, if there's a 273 00:16:00,796 --> 00:16:04,236 Speaker 2: human error, it kind of regulates itself. 274 00:16:04,516 --> 00:16:11,156 Speaker 1: So you actually came up with an idea, you 275 00:16:11,436 --> 00:16:16,196 Speaker 1: came up with a design for a microreactor, right? 276 00:16:16,356 --> 00:16:18,436 Speaker 1: It was your previous job.
You were 277 00:16:18,436 --> 00:16:20,596 Speaker 1: working for the federal government, right, as a 278 00:16:20,636 --> 00:16:25,036 Speaker 1: researcher at a lab dedicated to figuring out microreactors. 279 00:16:25,076 --> 00:16:27,156 Speaker 1: And as I understand it, there was actually like a 280 00:16:27,196 --> 00:16:30,676 Speaker 1: particular moment when you had an idea, which seems like 281 00:16:30,716 --> 00:16:33,036 Speaker 1: it never actually happens, but I always love it when 282 00:16:33,076 --> 00:16:35,476 Speaker 1: it happens. So tell me about this moment. 283 00:16:35,876 --> 00:16:40,676 Speaker 2: Sure. So, I joined Idaho National Laboratory, 284 00:16:41,316 --> 00:16:43,876 Speaker 2: and they really hired me in to establish, or to 285 00:16:43,956 --> 00:16:48,876 Speaker 2: help them establish, the DOE, Department of Energy, Microreactor Program. Okay, 286 00:16:49,156 --> 00:16:53,036 Speaker 2: and very soon after I helped kind of establish the program, 287 00:16:53,396 --> 00:16:58,876 Speaker 2: I realized, instead of having smaller projects in specific problem areas, 288 00:16:59,276 --> 00:17:02,236 Speaker 2: we need to put them together into a test reactor. 289 00:17:02,916 --> 00:17:06,356 Speaker 2: We have to build a prototype, a real test reactor 290 00:17:06,716 --> 00:17:10,676 Speaker 2: that shows everyone what a microreactor is. How does it operate? 291 00:17:10,716 --> 00:17:13,196 Speaker 2: How many people do we need to operate it? Can 292 00:17:13,276 --> 00:17:16,196 Speaker 2: it be co-located in a neighborhood, for example, and 293 00:17:16,276 --> 00:17:19,996 Speaker 2: operate safely? Now, right after, about a month or so after 294 00:17:20,116 --> 00:17:22,636 Speaker 2: I joined INL,
I realized, let me go ahead 295 00:17:22,636 --> 00:17:25,556 Speaker 2: and pitch this to the Department of Energy. And I 296 00:17:25,596 --> 00:17:27,916 Speaker 2: did that, first to the lab leadership. They liked the idea. 297 00:17:28,676 --> 00:17:30,756 Speaker 2: I went to the Department of Energy. They thought it was 298 00:17:30,796 --> 00:17:35,276 Speaker 2: an important thing to do. And so the question becomes, okay, 299 00:17:35,596 --> 00:17:38,196 Speaker 2: what size, what should be the technology? 300 00:17:38,196 --> 00:17:40,876 Speaker 1: And now you've got to design it, right? Everybody's like, yeah, great, 301 00:17:40,916 --> 00:17:43,276 Speaker 1: go do it. Now you've got to do it, right? What 302 00:17:43,396 --> 00:17:48,156 Speaker 1: is the most basic, like, plain vanilla explanation of what 303 00:17:48,316 --> 00:17:51,476 Speaker 1: is going on in the core of a nuclear power plant, 304 00:17:51,556 --> 00:17:53,636 Speaker 1: just generically, any nuclear power plant? 305 00:17:54,516 --> 00:17:56,796 Speaker 2: So what you're really looking for is, you know, 306 00:17:56,956 --> 00:18:01,516 Speaker 2: you're splitting larger, heavy atoms. In our case, it is 307 00:18:01,636 --> 00:18:06,356 Speaker 2: mostly uranium, right, and there's a specific isotope called uranium 308 00:18:06,356 --> 00:18:09,756 Speaker 2: two thirty five. It's a fissile material. If you hit 309 00:18:09,836 --> 00:18:13,556 Speaker 2: it with a neutron, it splits into, you know, 310 00:18:13,596 --> 00:18:17,556 Speaker 2: fragments of other nuclei and some neutrons and 311 00:18:17,596 --> 00:18:21,356 Speaker 2: some energy. And you also release other neutrons as part 312 00:18:21,396 --> 00:18:25,316 Speaker 2: of that splitting.
So what you want in a nuclear reactor 313 00:18:25,956 --> 00:18:29,596 Speaker 2: is for that secondary neutron to go hit another 314 00:18:29,716 --> 00:18:34,036 Speaker 2: nucleus and then continue on, and that perpetuates into 315 00:18:34,636 --> 00:18:39,756 Speaker 2: a chain reaction, right. And the process of fission, splitting 316 00:18:39,836 --> 00:18:42,676 Speaker 2: up the nucleus, releases a large amount of energy, and that's 317 00:18:42,716 --> 00:18:45,516 Speaker 2: the energy we want to essentially take out of the 318 00:18:45,636 --> 00:18:50,716 Speaker 2: fuel through a coolant and dump into a turbine. 319 00:18:51,076 --> 00:18:53,956 Speaker 1: You capture the energy as heat, and then it's just 320 00:18:54,036 --> 00:18:57,396 Speaker 1: like any other power plant, but instead of burning fossil 321 00:18:57,396 --> 00:19:00,516 Speaker 1: fuel to get the heat, you're splitting uranium atoms 322 00:19:00,156 --> 00:19:03,276 Speaker 2: to get it. Precisely. So after you take the heat away 323 00:19:03,396 --> 00:19:06,596 Speaker 2: and send it to a secondary system, to a turbine, 324 00:19:07,236 --> 00:19:10,116 Speaker 2: it's no different than a coal power plant or a 325 00:19:10,236 --> 00:19:11,196 Speaker 2: natural gas plant. 326 00:19:11,716 --> 00:19:15,436 Speaker 1: And so what is the challenge? What 327 00:19:15,516 --> 00:19:18,236 Speaker 1: is the problem you're trying to avoid in that setting? 328 00:19:19,316 --> 00:19:22,276 Speaker 2: So, I mean, from a reactor physics perspective, you want 329 00:19:22,276 --> 00:19:24,876 Speaker 2: to make sure that when you want heat, 330 00:19:25,716 --> 00:19:29,156 Speaker 2: you can generate a chain reaction to emit 331 00:19:29,236 --> 00:19:31,196 Speaker 2: this heat and capture it and use it in a 332 00:19:31,276 --> 00:19:34,916 Speaker 2: useful way.
You want to be able to control it effectively, right, 333 00:19:35,116 --> 00:19:39,916 Speaker 2: that's what, you know, the whole reactor involves. If you're 334 00:19:39,916 --> 00:19:43,116 Speaker 2: able to control this chain reaction, then you have a functioning, 335 00:19:43,596 --> 00:19:47,076 Speaker 2: you know, power reactor. You don't want an uncontrolled reaction. 336 00:19:47,156 --> 00:19:48,796 Speaker 2: You want to be able to control it so you 337 00:19:48,836 --> 00:19:51,716 Speaker 2: can ensure that you can safely remove this 338 00:19:51,836 --> 00:19:55,596 Speaker 2: heat without breaking anything. That's the whole premise of a 339 00:19:55,676 --> 00:19:56,876 Speaker 2: nuclear reactor, right. 340 00:19:57,076 --> 00:20:00,276 Speaker 1: I mean, an uncontrolled reaction is like a bomb. Right? 341 00:20:00,356 --> 00:20:02,196 Speaker 1: It's like a terrible bomb. 342 00:20:02,356 --> 00:20:03,236 Speaker 2: That's exactly right. 343 00:20:05,276 --> 00:20:08,196 Speaker 1: Coming up after the break, Yasser goes to Walmart and 344 00:20:09,476 --> 00:20:12,556 Speaker 1: winds up designing a new kind of nuclear reactor. 345 00:20:21,156 --> 00:20:24,956 Speaker 2: So these ideas were... You know, when you're a reactor designer, 346 00:20:25,276 --> 00:20:28,276 Speaker 2: you're always thinking about all the various iterations and 347 00:20:28,276 --> 00:20:33,716 Speaker 2: permutations and combinations of what makes a nuclear technology feasible. Right? 348 00:20:33,996 --> 00:20:37,356 Speaker 2: And if you look into it, mostly the combination of 349 00:20:37,396 --> 00:20:42,356 Speaker 2: fuel and coolant used in a reactor defines a nuclear technology, 350 00:20:42,396 --> 00:20:44,796 Speaker 2: and there's, like, you know, about one hundred, one hundred 351 00:20:44,796 --> 00:20:48,396 Speaker 2: and twenty combinations out there.
Mostly we've tried almost every 352 00:20:48,396 --> 00:20:50,236 Speaker 2: combination in tests in the past, right. 353 00:20:50,796 --> 00:20:53,956 Speaker 1: So basically you've got to make the fission reaction happen. 354 00:20:53,996 --> 00:20:55,756 Speaker 1: You need some fuel to do that, and then it's 355 00:20:55,756 --> 00:20:58,276 Speaker 1: going to generate a crazy amount of heat, so you 356 00:20:58,356 --> 00:20:59,996 Speaker 1: got to keep that from getting out of hand with 357 00:20:59,996 --> 00:21:02,596 Speaker 1: the coolant. Like, those are the two things you got 358 00:21:02,596 --> 00:21:02,756 Speaker 1: to do. 359 00:21:03,236 --> 00:21:06,116 Speaker 2: That's every reactor designer's job, to pick that. You're always 360 00:21:06,156 --> 00:21:09,756 Speaker 2: thinking about different technologies, right. It's not really fully formulated; 361 00:21:10,116 --> 00:21:13,356 Speaker 2: it's in your subconscious mind. So the moment I was 362 00:21:13,396 --> 00:21:16,556 Speaker 2: thinking about, let's go build a reactor at INL 363 00:21:16,676 --> 00:21:20,196 Speaker 2: for the microreactor program, I started thinking about what 364 00:21:20,276 --> 00:21:24,436 Speaker 2: should be the technology, and then it really happened 365 00:21:24,756 --> 00:21:28,316 Speaker 2: suddenly. Overnight, I woke up and I said, okay, 366 00:21:28,356 --> 00:21:30,276 Speaker 2: you know what, I think I know what it is, 367 00:21:31,196 --> 00:21:33,876 Speaker 2: but I really have to put that on paper. I 368 00:21:33,996 --> 00:21:37,476 Speaker 2: did go to Walmart, got some colored pencils and a 369 00:21:37,476 --> 00:21:40,956 Speaker 2: big paper, and started sketching up how that system 370 00:21:40,996 --> 00:21:43,836 Speaker 2: was going to look. Now, that's just an idea.
371 00:21:43,956 --> 00:21:47,636 Speaker 2: Obviously we took that idea and really started making the 372 00:21:47,716 --> 00:21:51,556 Speaker 2: requirements to build a reactor. Some things evolved, but fundamentally 373 00:21:51,596 --> 00:21:55,236 Speaker 2: it was the same concept that I sketched up a 374 00:21:55,276 --> 00:21:57,596 Speaker 2: few days before Christmas in twenty nineteen. 375 00:21:58,076 --> 00:22:00,956 Speaker 1: So what was the concept? What was the design? 376 00:22:01,156 --> 00:22:04,196 Speaker 2: So I really looked at all those different iterations and came 377 00:22:04,276 --> 00:22:09,836 Speaker 2: down to what's called a sodium thermal reactor. Right? It 378 00:22:09,916 --> 00:22:13,996 Speaker 2: is basically using uranium zirconium hydride, the same fuel 379 00:22:14,036 --> 00:22:16,836 Speaker 2: that we use in a lot of research reactors around 380 00:22:16,836 --> 00:22:18,796 Speaker 2: the globe. We have a lot of data on it. 381 00:22:18,836 --> 00:22:21,316 Speaker 2: We understand it very well. If you couple that 382 00:22:21,676 --> 00:22:26,556 Speaker 2: with a very highly conductive coolant like sodium, liquid sodium 383 00:22:26,596 --> 00:22:29,396 Speaker 2: in our case, all of a sudden you can have 384 00:22:29,476 --> 00:22:34,116 Speaker 2: a low pressure nuclear reactor with a high power 385 00:22:34,156 --> 00:22:38,276 Speaker 2: density and a low enrichment need. So that really was the 386 00:22:38,316 --> 00:22:42,356 Speaker 2: basis, the fundamental technology choice, for Marvel. 387 00:22:44,996 --> 00:22:46,076 Speaker 1: Why do you call it Marvel? 388 00:22:47,356 --> 00:22:53,436 Speaker 2: Huh? Well, that's because I wanted a name that people 389 00:22:54,356 --> 00:22:57,276 Speaker 2: can remember easily, and that does not sound like a 390 00:22:57,676 --> 00:23:01,996 Speaker 2: scary Greek god. Smart, and it 391 00:23:01,996 --> 00:23:03,156 Speaker 2: can shine the light.
392 00:23:03,356 --> 00:23:04,756 Speaker 1: You know, you don't want to call it... You don't 393 00:23:04,756 --> 00:23:06,556 Speaker 1: want to call it Icarus, right? You know what not to 394 00:23:06,596 --> 00:23:07,956 Speaker 1: call it: nuclear reactor Icarus. 395 00:23:07,996 --> 00:23:10,476 Speaker 2: That's right, that's right. And also, like, it's a... 396 00:23:10,356 --> 00:23:12,956 Speaker 1: Prometheus, don't call it Prometheus. Yeah, what's it an 397 00:23:12,956 --> 00:23:13,516 Speaker 1: acronym for? 398 00:23:13,876 --> 00:23:15,996 Speaker 2: Oh god, it has a very long name, so it's 399 00:23:16,036 --> 00:23:21,436 Speaker 2: just... It's the Microreactor Applications Research, 400 00:23:22,516 --> 00:23:27,036 Speaker 2: Validation and Evaluation project. And it's very... 401 00:23:27,716 --> 00:23:30,196 Speaker 2: it's a very descriptive name if you think about it. 402 00:23:29,916 --> 00:23:33,836 Speaker 1: It could be anything, right, right, right? Yeah, your acronym... 403 00:23:34,516 --> 00:23:38,556 Speaker 1: It was mine, unfortunately. Well, it was sort of peak Marvel, right? 404 00:23:38,596 --> 00:23:40,876 Speaker 1: You said it was twenty nineteen. It really sticks it 405 00:23:40,916 --> 00:23:45,196 Speaker 1: in time as, like, a peak Marvel moment. So okay, 406 00:23:45,276 --> 00:23:49,276 Speaker 1: so you have designed this thing, you get approval for it. 407 00:23:51,556 --> 00:23:54,196 Speaker 1: Let's talk about safety, because you've talked about, you know, 408 00:23:54,276 --> 00:23:56,596 Speaker 1: wanting to engineer it in a way that is both 409 00:23:57,556 --> 00:24:03,076 Speaker 1: economically sensible, right, engineered in a way that 410 00:24:03,796 --> 00:24:05,516 Speaker 1: some company is going to pay to build it, and 411 00:24:05,516 --> 00:24:07,956 Speaker 1: that it makes sense to build it and safely run it.
412 00:24:10,156 --> 00:24:13,516 Speaker 1: And that's complicated, right? It's complicated for a microreactor. So 413 00:24:13,596 --> 00:24:16,716 Speaker 1: how are you dealing with that as you're designing 414 00:24:16,756 --> 00:24:17,436 Speaker 1: this reactor? 415 00:24:17,956 --> 00:24:20,876 Speaker 2: If you look at it and ask the question, 416 00:24:20,956 --> 00:24:26,036 Speaker 2: what is an ideal nuclear reactor, it would be: what 417 00:24:26,156 --> 00:24:30,636 Speaker 2: is the simplest reactor that can have the highest level 418 00:24:30,676 --> 00:24:33,796 Speaker 2: of safety without having to add a ton of systems 419 00:24:34,196 --> 00:24:35,476 Speaker 2: to ensure that it is safe? 420 00:24:35,956 --> 00:24:39,076 Speaker 1: Right? I mean, the dream is just, like, whatever, a 421 00:24:39,116 --> 00:24:41,596 Speaker 1: pile of dirt or something. Right? The dream is that 422 00:24:41,756 --> 00:24:44,516 Speaker 1: it's a glass of water that you could somehow magically 423 00:24:44,556 --> 00:24:46,356 Speaker 1: get power out of. It's like, what's the worst that 424 00:24:46,356 --> 00:24:46,836 Speaker 1: could happen? 425 00:24:46,956 --> 00:24:50,196 Speaker 2: Right? That's right. So there's engineered safety, which really is, 426 00:24:50,796 --> 00:24:53,196 Speaker 2: you know, you have a lot of 427 00:24:53,276 --> 00:24:57,196 Speaker 2: engineered, man-made systems. It's like pressing a brake in 428 00:24:57,236 --> 00:25:00,116 Speaker 2: a car. If you're designing the system, brakes can fail 429 00:25:00,476 --> 00:25:02,636 Speaker 2: sometimes, so you have to kind of have backups for that. 430 00:25:02,996 --> 00:25:05,076 Speaker 2: So there's a lot of additional things that go into it. 431 00:25:05,316 --> 00:25:07,276 Speaker 1: And to be clear, that is sort of the model 432 00:25:07,396 --> 00:25:10,556 Speaker 1: for big utility scale nuclear power plants, right?
They're full 433 00:25:10,596 --> 00:25:14,596 Speaker 1: of highly engineered systems and backups for those systems and 434 00:25:14,676 --> 00:25:17,276 Speaker 1: lots of people there to make sure that all those 435 00:25:17,276 --> 00:25:21,396 Speaker 1: systems are functioning so that you don't have some terrible 436 00:25:21,516 --> 00:25:23,036 Speaker 1: nuclear accident. 437 00:25:22,876 --> 00:25:25,556 Speaker 2: That is correct. And you really engineer those systems to 438 00:25:25,556 --> 00:25:27,916 Speaker 2: make sure they're reliable, and you go through years 439 00:25:27,916 --> 00:25:29,436 Speaker 2: of qualification tests to achieve that. 440 00:25:29,836 --> 00:25:32,396 Speaker 1: And, like, that's just not going to work for a microreactor, right? 441 00:25:32,436 --> 00:25:34,756 Speaker 1: Like, you can't have all that, because it'll be too 442 00:25:34,796 --> 00:25:36,316 Speaker 1: expensive for too little power. 443 00:25:36,556 --> 00:25:40,956 Speaker 2: That's correct. So to really achieve that high safety 444 00:25:40,996 --> 00:25:45,316 Speaker 2: with a smaller number of systems, you want what is called 445 00:25:45,356 --> 00:25:49,316 Speaker 2: inherent safety, right? It is baked into the material physics 446 00:25:49,716 --> 00:25:53,436 Speaker 2: of the fuel. And so we looked around and we said, okay, 447 00:25:53,476 --> 00:25:57,516 Speaker 2: what is the highest inherent safety fuel out there? And 448 00:25:57,556 --> 00:26:01,116 Speaker 2: it really is uranium zirconium hydride. Okay. 449 00:26:00,876 --> 00:26:07,116 Speaker 1: So you choose a fuel that has this elegant property, 450 00:26:07,116 --> 00:26:09,636 Speaker 1: which is, if the chain reaction starts to get out 451 00:26:09,676 --> 00:26:13,196 Speaker 1: of control, the hydrogen that is mixed in with the 452 00:26:13,236 --> 00:26:16,916 Speaker 1: fuel tends to bring it back under control.
Is that 453 00:26:16,956 --> 00:26:19,956 Speaker 1: fair? Okay. So is it the case that with 454 00:26:20,396 --> 00:26:23,756 Speaker 1: the fuel you're using, like, there is physically no way 455 00:26:23,876 --> 00:26:25,796 Speaker 1: the chain reaction could get out of control, or is 456 00:26:25,796 --> 00:26:27,036 Speaker 1: it just way less likely? 457 00:26:27,876 --> 00:26:29,276 Speaker 2: It's way less likely. 458 00:26:29,436 --> 00:26:33,276 Speaker 1: Okay. So in addition to choosing this particular fuel, that 459 00:26:33,356 --> 00:26:35,476 Speaker 1: was one of the things you did to bring this 460 00:26:35,716 --> 00:26:38,036 Speaker 1: higher level of inherent safety. It's clearly not going to 461 00:26:38,076 --> 00:26:39,836 Speaker 1: be enough. Like, what else do you have to do 462 00:26:39,916 --> 00:26:41,036 Speaker 1: in designing this reactor? 463 00:26:41,596 --> 00:26:44,156 Speaker 2: Well, there's a lot, but the second choice is the coolant, 464 00:26:44,396 --> 00:26:47,956 Speaker 2: right? Okay. Coolant is the fluid that takes the heat 465 00:26:47,996 --> 00:26:52,676 Speaker 2: from the core and transfers it to the secondary system, 466 00:26:52,836 --> 00:26:54,916 Speaker 2: where you want to make use of this heat. 467 00:26:55,116 --> 00:26:55,716 Speaker 1: Right, okay. 468 00:26:55,836 --> 00:26:59,276 Speaker 2: And if you look at water today, most existing power 469 00:26:59,316 --> 00:27:02,276 Speaker 2: plants are built with water. Water is very well known, 470 00:27:02,356 --> 00:27:05,236 Speaker 2: you know, all the properties are known. We designed other 471 00:27:05,276 --> 00:27:08,036 Speaker 2: power plants before nuclear. We're very familiar with water. So 472 00:27:08,236 --> 00:27:10,876 Speaker 2: the industry kind of moved in that direction. But if 473 00:27:10,916 --> 00:27:12,436 Speaker 2: you take a step back and you look 474 00:27:12,476 --> 00:27:15,596 Speaker 2: at water.
It has some benefits because it's familiar, but 475 00:27:15,676 --> 00:27:19,076 Speaker 2: it has some cons as well, some 476 00:27:19,156 --> 00:27:22,916 Speaker 2: challenges, because you want the system to be hot to 477 00:27:22,996 --> 00:27:25,436 Speaker 2: extract that heat. But with water, as soon as you 478 00:27:26,236 --> 00:27:29,076 Speaker 2: exceed one hundred degrees Celsius, what does it want to do? 479 00:27:29,356 --> 00:27:31,956 Speaker 2: It wants to boil off. Right, which is just not a 480 00:27:31,996 --> 00:27:36,716 Speaker 2: good thing. So to prevent it from boiling, you pressurize the system. 481 00:27:37,156 --> 00:27:40,356 Speaker 1: Right, adding pressure raises the boiling point. 482 00:27:40,596 --> 00:27:43,516 Speaker 2: That's correct. Now, all of a sudden, 483 00:27:43,556 --> 00:27:46,956 Speaker 2: you need a vessel that is thick. You want 484 00:27:46,996 --> 00:27:50,036 Speaker 2: to make sure, you know, you can keep it 485 00:27:50,076 --> 00:27:52,716 Speaker 2: at the pressurized level. You need a pressurizer. You 486 00:27:52,756 --> 00:27:56,156 Speaker 2: need a thick containment building in case there's a pipe 487 00:27:56,156 --> 00:27:58,716 Speaker 2: break or something. You still have a, you know, thick 488 00:27:58,796 --> 00:28:02,076 Speaker 2: steel-and-concrete-lined containment to hold everything together. It's 489 00:28:02,116 --> 00:28:04,516 Speaker 2: part of the safety case, right? And it also protects 490 00:28:04,596 --> 00:28:08,436 Speaker 2: you from external hazards like a tornado or a missile 491 00:28:08,596 --> 00:28:12,556 Speaker 2: or something else. Right? So really, all of these 492 00:28:12,596 --> 00:28:15,396 Speaker 2: combined make up the overall safety case. So when it 493 00:28:15,436 --> 00:28:18,556 Speaker 2: came for us to choose the coolant, we chose sodium.
494 00:28:18,876 --> 00:28:25,196 Speaker 2: Sodium is many times more thermally conductive than water, and 495 00:28:25,396 --> 00:28:28,196 Speaker 2: when you heat it up, it does not boil away 496 00:28:28,596 --> 00:28:30,756 Speaker 2: at one hundred degrees Celsius. Right, the boiling point of 497 00:28:30,796 --> 00:28:34,316 Speaker 2: sodium is hundreds of degrees higher, much higher than what we 498 00:28:34,356 --> 00:28:40,476 Speaker 2: need for the power generation. Right? So it really gives 499 00:28:40,516 --> 00:28:45,796 Speaker 2: you a non pressurized system, so your vessel walls do 500 00:28:45,836 --> 00:28:49,876 Speaker 2: not have to be these thick forged components that are 501 00:28:49,916 --> 00:28:52,636 Speaker 2: extremely expensive or difficult to make. You can now make 502 00:28:52,676 --> 00:28:57,356 Speaker 2: them as thin walled vessels by simpler manufacturing methods, so 503 00:28:57,396 --> 00:29:00,716 Speaker 2: your costs can go down because you're no longer pressurized. 504 00:29:00,756 --> 00:29:02,756 Speaker 2: You don't need this. And you don't have a large 505 00:29:02,796 --> 00:29:06,756 Speaker 2: amount of fuel, of radioactive material, in the core. All of 506 00:29:06,796 --> 00:29:10,116 Speaker 2: a sudden, with a microreactor using sodium, you can make 507 00:29:10,156 --> 00:29:13,156 Speaker 2: the case to the regulator that you don't need a 508 00:29:14,116 --> 00:29:17,756 Speaker 2: traditional containment. Okay, you still need a confinement, but it 509 00:29:17,756 --> 00:29:20,396 Speaker 2: doesn't need to be, like, you know, extremely, you know, 510 00:29:20,556 --> 00:29:24,836 Speaker 2: several feet of concrete and thick, large steel-lined containment.
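The water-versus-sodium trade-off he describes comes down to boiling points. A minimal sketch with textbook figures (the temperatures below are standard reference approximations plus an assumed outlet temperature, not numbers from the interview):

```python
# Approximate boiling points in degrees Celsius. Textbook values, illustrative only.
BOILING_POINT_C = {
    "water at 1 atm": 100,           # boils far below useful reactor temperatures
    "water at ~155 atm (PWR)": 345,  # rough saturation point in a pressurized-water reactor
    "sodium at 1 atm": 883,          # liquid well past typical outlet temps, no pressurization
}

OUTLET_C = 500  # hypothetical reactor outlet temperature for comparison

for coolant, bp in BOILING_POINT_C.items():
    state = "still liquid" if OUTLET_C < bp else "boils"
    print(f"{coolant}: boils near {bp} C -> {state} at {OUTLET_C} C")
```

Water only stays liquid at useful temperatures under enormous pressure, which is what drives the thick vessels and containment he lists; sodium at atmospheric pressure never gets close to its boiling point.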
511 00:29:25,716 --> 00:29:27,996 Speaker 2: So there's a lot of other systems that you can 512 00:29:28,036 --> 00:29:31,956 Speaker 2: simplify. And what you end up seeing, by just making 513 00:29:31,996 --> 00:29:35,116 Speaker 2: those two choices and the way you design the reactor, 514 00:29:35,676 --> 00:29:39,156 Speaker 2: is that from one hundred systems, like a traditional plant, you can 515 00:29:39,156 --> 00:29:40,636 Speaker 2: bring that down to about twenty. 516 00:29:40,916 --> 00:29:43,716 Speaker 1: And so what does going from one hundred engineered systems 517 00:29:43,756 --> 00:29:45,196 Speaker 1: to twenty do for you? 518 00:29:46,156 --> 00:29:51,956 Speaker 2: So it really reduces the amount of capital expenditure you 519 00:29:52,156 --> 00:29:57,316 Speaker 2: need initially to build a plant. With fewer systems, you need 520 00:29:57,516 --> 00:30:01,876 Speaker 2: a smaller footprint, you need less civil structure. You're paying for 521 00:30:01,956 --> 00:30:06,276 Speaker 2: fewer components and pipes and vessels and formwork and concrete. 522 00:30:06,476 --> 00:30:11,316 Speaker 2: So your cost per kilowatt initially can go down if 523 00:30:11,316 --> 00:30:15,796 Speaker 2: you simplify your plant. And that's really, you know, 524 00:30:15,836 --> 00:30:18,716 Speaker 2: that's one big piece of the puzzle. The 525 00:30:18,756 --> 00:30:20,596 Speaker 2: other big piece of the puzzle, which is really our 526 00:30:20,636 --> 00:30:24,356 Speaker 2: main thesis at Aalo, is, you know, there's one model, 527 00:30:24,396 --> 00:30:27,476 Speaker 2: which is you spend six to ten years to build 528 00:30:27,476 --> 00:30:29,596 Speaker 2: a gigawatt scale plant.
If you get really good at it, 529 00:30:29,636 --> 00:30:32,196 Speaker 2: you can bring it down to, like, five. Right? So 530 00:30:32,396 --> 00:30:35,636 Speaker 2: you spend five years or six years, optimistically, and you 531 00:30:35,676 --> 00:30:39,516 Speaker 2: build a gigawatt scale plant. What we're doing instead is, 532 00:30:39,716 --> 00:30:44,076 Speaker 2: instead of building a single gigawatt scale plant, we're focusing 533 00:30:44,196 --> 00:30:49,836 Speaker 2: on building factories that can produce at least a gigawatt of 534 00:30:50,636 --> 00:30:54,556 Speaker 2: power output every year by making smaller reactors. 535 00:30:54,636 --> 00:30:56,916 Speaker 1: So how many reactors per year would one of these 536 00:30:56,956 --> 00:30:57,636 Speaker 1: factories make? 537 00:30:57,956 --> 00:31:02,036 Speaker 2: So we're trying to build our first pilot scale facility 538 00:31:02,156 --> 00:31:06,636 Speaker 2: here in Austin, Texas, and we're establishing that by the end 539 00:31:06,676 --> 00:31:09,076 Speaker 2: of next year, and that is going to be 540 00:31:09,116 --> 00:31:13,396 Speaker 2: designed to build twenty of these reactors per year. And 541 00:31:13,676 --> 00:31:17,316 Speaker 2: if demand outgrows that, which we believe it will, 542 00:31:17,476 --> 00:31:19,956 Speaker 2: the idea is, with the learning from that, we're going to 543 00:31:20,036 --> 00:31:23,876 Speaker 2: a full factory. A full factory is anticipated to be between 544 00:31:23,876 --> 00:31:25,716 Speaker 2: one hundred to two hundred reactors a year. 545 00:31:26,396 --> 00:31:30,116 Speaker 1: So tell me about what the world looks like if 546 00:31:30,156 --> 00:31:33,636 Speaker 1: it works.
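The factory math here can be sketched in a few lines. The per-unit output below (roughly ten megawatts for an Aalo-1) is an inference from his later comment that fifty units would cover a five hundred megawatt order; it is not a stated specification:

```python
import math

UNIT_MW = 10  # inferred Aalo-1 output: 500 MW order / 50 units (an assumption, not a spec)

def reactors_needed(demand_mw: float, unit_mw: float = UNIT_MW) -> int:
    """How many daisy-chained units cover a given demand."""
    return math.ceil(demand_mw / unit_mw)

def factory_gw_per_year(reactors_per_year: int, unit_mw: float = UNIT_MW) -> float:
    """Factory output per year in gigawatts at a given build rate."""
    return reactors_per_year * unit_mw / 1000

print(reactors_needed(500))       # -> 50 units for a 500 MW customer
print(factory_gw_per_year(20))    # -> 0.2 GW/year at the pilot plant's rate
print(factory_gw_per_year(200))   # -> 2.0 GW/year at the full factory's upper end
```

At the full-factory rate of one hundred to two hundred reactors a year, a ~10 MW unit puts the output at one to two gigawatts of new capacity per year, consistent with the "at least a gigawatt every year" target.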
Like if this idea you have of building 547 00:31:33,636 --> 00:31:38,196 Speaker 1: a factory to build, whatever, a nuclear power plant every 548 00:31:38,916 --> 00:31:43,076 Speaker 1: two days or something, like, how does that work in 549 00:31:43,076 --> 00:31:44,516 Speaker 1: the world? What does 550 00:31:44,516 --> 00:31:46,956 Speaker 1: it look like looking around America in that world? 551 00:31:47,116 --> 00:31:49,796 Speaker 2: You know, we believe that we can actually usher in 552 00:31:50,116 --> 00:31:53,716 Speaker 2: the second atomic age, like, we can grow 553 00:31:53,836 --> 00:31:59,076 Speaker 2: nuclear much more rapidly. So this whole entire energy transition 554 00:31:59,516 --> 00:32:03,156 Speaker 2: is not only fueled by, you know, wanting to 555 00:32:03,196 --> 00:32:06,476 Speaker 2: have, you know, a lower carbon or no carbon energy source, 556 00:32:07,036 --> 00:32:10,276 Speaker 2: but also this massive demand growth that we're seeing 557 00:32:11,116 --> 00:32:13,876 Speaker 2: in the electric sector as well as the industrial sector. 558 00:32:13,476 --> 00:32:18,716 Speaker 1: Electrification plus AI. Plus AI, right. It seems like, yes, 559 00:32:18,756 --> 00:32:21,676 Speaker 1: there's a lot of demand. So, right, so sure, 560 00:32:21,796 --> 00:32:24,996 Speaker 1: it means lots of nuclear power plants. I mean, specifically, 561 00:32:25,116 --> 00:32:27,236 Speaker 1: is it like there's a little nuclear power plant in 562 00:32:27,276 --> 00:32:29,516 Speaker 1: every neighborhood? Is it like, 563 00:32:29,836 --> 00:32:32,436 Speaker 1: you know, utilities will buy ten or twenty of these 564 00:32:32,476 --> 00:32:35,316 Speaker 1: microreactors and sort of put them all, you know, on 565 00:32:35,316 --> 00:32:38,116 Speaker 1: one site? Like, how does it actually work?
566 00:32:38,356 --> 00:32:40,836 Speaker 2: The idea is, you know, we're designing these 567 00:32:40,836 --> 00:32:43,236 Speaker 2: systems so that if you want a single reactor, you can 568 00:32:43,276 --> 00:32:46,356 Speaker 2: have a single reactor, but if you want two, they 569 00:32:46,356 --> 00:32:49,276 Speaker 2: don't share any infrastructure. You can daisy chain as 570 00:32:49,276 --> 00:32:52,156 Speaker 2: many as you want. So if a customer says, hey, 571 00:32:52,236 --> 00:32:55,356 Speaker 2: give me five hundred megawatts, we would provide, you know, 572 00:32:55,436 --> 00:32:58,236 Speaker 2: fifty of these Aalo-1 reactors. Or in the near future, 573 00:32:58,276 --> 00:33:00,956 Speaker 2: when we build our one hundred megawatt system, it'll be 574 00:33:00,996 --> 00:33:04,676 Speaker 2: five of those systems daisy chained next to one another. 575 00:33:07,556 --> 00:33:09,716 Speaker 1: What do you think the first use case will be? 576 00:33:10,076 --> 00:33:13,196 Speaker 2: So when microreactors first came into being, right, 577 00:33:14,036 --> 00:33:16,796 Speaker 2: many years ago, around twenty fourteen, when we 578 00:33:16,796 --> 00:33:18,636 Speaker 2: were really trying to figure out what the market was, 579 00:33:19,476 --> 00:33:24,236 Speaker 2: it really was the remote communities, remote mines, islands. Those 580 00:33:24,276 --> 00:33:27,996 Speaker 2: are areas where the cost of energy is really, really high. 581 00:33:28,116 --> 00:33:30,996 Speaker 2: So when you deploy a first product into the market, 582 00:33:31,636 --> 00:33:34,356 Speaker 2: normally it's high cost, and then you try to lower 583 00:33:34,436 --> 00:33:37,716 Speaker 2: it down and then try to penetrate a broader market. 584 00:33:37,916 --> 00:33:41,316 Speaker 2: That was the entire idea for first generation microreactors. 585 00:33:41,596 --> 00:33:45,676 Speaker 1: And I should ask, do microreactors exist in the world now?
586 00:33:46,596 --> 00:33:50,356 Speaker 2: Well, not in the modern definition, no. We have 587 00:33:50,356 --> 00:33:53,716 Speaker 2: a lot of small reactors, but they're not designed to 588 00:33:53,876 --> 00:33:57,076 Speaker 2: stay small or be mass manufactured. If you look around 589 00:33:57,116 --> 00:34:00,356 Speaker 2: right now, you don't see a factory mass manufacturing 590 00:34:00,796 --> 00:34:03,276 Speaker 2: a bunch of small reactors. The most we see 591 00:34:03,356 --> 00:34:05,796 Speaker 2: is on the nuclear submarine side, where you can make 592 00:34:05,836 --> 00:34:08,396 Speaker 2: maybe one or two reactors a year, but not at 593 00:34:08,396 --> 00:34:09,716 Speaker 2: the scale we're talking about. 594 00:34:09,956 --> 00:34:13,196 Speaker 1: Yes, and that's a very particular use case. 595 00:34:13,356 --> 00:34:15,756 Speaker 2: Yeah. But to come back to your question, where are 596 00:34:15,756 --> 00:34:19,796 Speaker 2: these first applications? The first reactor we're going to build 597 00:34:19,836 --> 00:34:22,996 Speaker 2: from our company is going to be at Idaho National Laboratory. 598 00:34:22,996 --> 00:34:25,996 Speaker 2: It's going to be a single unit, and it's mostly 599 00:34:26,076 --> 00:34:30,116 Speaker 2: because, you know, we want to learn how this thing operates. 600 00:34:31,356 --> 00:34:32,836 Speaker 1: At some point, you got to build one. 601 00:34:33,076 --> 00:34:35,196 Speaker 2: We're going to show the world that we can validate 602 00:34:35,236 --> 00:34:37,796 Speaker 2: the cost. We can, you know, validate the deployment model, 603 00:34:37,836 --> 00:34:41,356 Speaker 2: which we're trying to do with on-site construction in less than sixty days. 604 00:34:41,356 --> 00:34:43,476 Speaker 2: These are very challenging targets. 605 00:34:44,476 --> 00:34:45,716 Speaker 1: Why might it not work? 606 00:34:47,396 --> 00:34:50,716 Speaker 2: So if you look at nuclear fission.
607 00:34:51,836 --> 00:34:54,156 Speaker 1: The fundamental thing you're doing. 608 00:34:54,036 --> 00:34:57,756 Speaker 2: The fundamental thing, right. We know the physics works, we know 609 00:34:57,916 --> 00:35:01,796 Speaker 2: nuclear fission works, we operate reactors today. It's not a 610 00:35:01,796 --> 00:35:04,156 Speaker 2: matter of proving the technology, whether it works or not. Right? 611 00:35:04,236 --> 00:35:08,716 Speaker 2: We've built other advanced reactors before. There's a lot 612 00:35:08,756 --> 00:35:12,836 Speaker 2: of challenge just getting there. But the true challenge, in 613 00:35:12,876 --> 00:35:18,796 Speaker 2: my opinion, is in the scaling of the technology. Can 614 00:35:18,836 --> 00:35:21,716 Speaker 2: we make hundreds of these a year? Can we build 615 00:35:21,756 --> 00:35:25,556 Speaker 2: a factory that can effectively bring down the cost? Can 616 00:35:25,596 --> 00:35:31,236 Speaker 2: we make fuel in large enough quantities to fuel all 617 00:35:31,276 --> 00:35:34,116 Speaker 2: of these reactors? And this is not a traditional fuel type; 618 00:35:34,676 --> 00:35:37,316 Speaker 2: this is an advanced reactor fuel. I mean, it's slightly 619 00:35:37,356 --> 00:35:42,956 Speaker 2: higher enrichment than traditional nuclear reactors. It's a different chemical form. 620 00:35:43,436 --> 00:35:47,756 Speaker 2: So we have to establish the infrastructure to build fuel, to 621 00:35:47,836 --> 00:35:52,156 Speaker 2: build these reactors, as well as the expertise to deploy them, 622 00:35:52,316 --> 00:35:54,996 Speaker 2: like in an IKEA model, right? You get the instructions, 623 00:35:55,676 --> 00:35:59,956 Speaker 2: you get all the modules, a flat pack, that's right. 624 00:36:00,076 --> 00:36:02,596 Speaker 2: You get all the modules on-site and are able to 625 00:36:02,676 --> 00:36:05,196 Speaker 2: quickly assemble that together in a matter of days, not 626 00:36:05,236 --> 00:36:06,196 Speaker 2: years. Right?
627 00:36:06,516 --> 00:36:07,876 Speaker 1: That all sounds so hard. 628 00:36:08,676 --> 00:36:10,716 Speaker 2: It is hard. And so we believe we have a 629 00:36:10,836 --> 00:36:14,316 Speaker 2: very strong team. And we're assembling a strong team not just 630 00:36:14,356 --> 00:36:19,196 Speaker 2: from nuclear but from other industries, like automotive and aerospace 631 00:36:20,196 --> 00:36:23,356 Speaker 2: and chip manufacturing, to understand, you know, what are 632 00:36:23,396 --> 00:36:26,836 Speaker 2: the lessons learned we can bring from those industries 633 00:36:26,996 --> 00:36:31,436 Speaker 2: that have been successful into nuclear, trying to not 634 00:36:31,796 --> 00:36:34,276 Speaker 2: reinvent the wheel all over again. But there's a lot 635 00:36:34,316 --> 00:36:37,076 Speaker 2: of challenges, there's a lot of unknowns, and we're trying 636 00:36:37,076 --> 00:36:41,356 Speaker 2: to diligently solve them, focusing on the most important question 637 00:36:41,436 --> 00:36:41,916 Speaker 2: at a time. 638 00:36:43,236 --> 00:36:47,076 Speaker 1: So I want to just return briefly to the idea 639 00:36:47,156 --> 00:36:52,516 Speaker 1: of tail risk, because 640 00:36:52,636 --> 00:36:54,876 Speaker 1: I don't know how to parse it at some level 641 00:36:55,276 --> 00:36:59,916 Speaker 1: with nuclear power, right? Like, you tell me. One 642 00:36:59,996 --> 00:37:02,356 Speaker 1: version of the question is, what's the worst thing that 643 00:37:02,356 --> 00:37:04,676 Speaker 1: could happen with one of these reactors? 644 00:37:04,956 --> 00:37:09,156 Speaker 2: Okay. So when you go through the regulatory process, this 645 00:37:10,316 --> 00:37:13,236 Speaker 2: is the very question that they ask you.
What is 646 00:37:13,276 --> 00:37:16,676 Speaker 2: the worst thing that can happen, even 647 00:37:16,716 --> 00:37:20,716 Speaker 2: if it's a very, very low probability? What happens? What 648 00:37:20,756 --> 00:37:22,316 Speaker 2: do you do in that scenario? What does 649 00:37:22,316 --> 00:37:25,156 Speaker 2: the recovery look like? What is the consequence of that? 650 00:37:25,756 --> 00:37:28,676 Speaker 2: And the way we are designing our reactors, and I 651 00:37:28,676 --> 00:37:31,236 Speaker 2: can't speak for everyone out there, though most companies 652 00:37:31,276 --> 00:37:34,316 Speaker 2: are doing very similar things, is even in the worst 653 00:37:34,356 --> 00:37:39,676 Speaker 2: worst case scenario, we don't have any release of any 654 00:37:39,756 --> 00:37:42,676 Speaker 2: radioactive material from the reactor to the outside. 655 00:37:43,196 --> 00:37:46,396 Speaker 1: Huh. And is that inherent in the physics? Like, 656 00:37:46,836 --> 00:37:47,716 Speaker 1: how do you know that? 657 00:37:47,796 --> 00:37:47,916 Speaker 2: Like? 658 00:37:48,156 --> 00:37:49,516 Speaker 1: How do you know that with certainty? 659 00:37:50,236 --> 00:37:52,836 Speaker 2: So one question is how do 660 00:37:52,916 --> 00:37:57,476 Speaker 2: we know? The second question is how can we prove it? So, 661 00:37:57,516 --> 00:38:01,396 Speaker 2: how do we know? It's mostly by the data that 662 00:38:01,436 --> 00:38:04,156 Speaker 2: we have on the physics side, as well as the engineering, 663 00:38:04,716 --> 00:38:07,676 Speaker 2: the way we design our reactor. How do we prove it? 664 00:38:08,196 --> 00:38:12,636 Speaker 2: So the proving goes in several stages. Right? The first 665 00:38:12,636 --> 00:38:16,876 Speaker 2: stage is, we're building a full scale non nuclear prototype 666 00:38:17,236 --> 00:38:19,716 Speaker 2: of the reactor starting this year.
It's going to be, 667 00:38:19,916 --> 00:38:22,516 Speaker 2: you know, turning on next year. The purpose of that 668 00:38:22,876 --> 00:38:25,956 Speaker 2: is to collect the data so we can validate some 669 00:38:26,036 --> 00:38:28,716 Speaker 2: of our safety claims. But it's not going to have 670 00:38:28,716 --> 00:38:34,116 Speaker 2: nuclear fuel. But apart from that little disclaimer 671 00:38:34,156 --> 00:38:38,196 Speaker 2: that we don't have nuclear fuel, everything else that 672 00:38:38,476 --> 00:38:41,236 Speaker 2: ensures the performance of the system, the safety of the system, 673 00:38:41,396 --> 00:38:43,156 Speaker 2: we can collect data on. 674 00:38:43,116 --> 00:38:46,036 Speaker 1: So you can kick it and throw things at it and whatever, 675 00:38:46,156 --> 00:38:48,196 Speaker 1: stress test it. 676 00:38:48,436 --> 00:38:52,756 Speaker 2: Exactly. So that's the first stage. The second stage is, you know, 677 00:38:52,796 --> 00:38:55,876 Speaker 2: when you have a reactor, a full-blown, you know, 678 00:38:55,916 --> 00:39:00,396 Speaker 2: physics-based reactor, you have fuel inserted into it and 679 00:39:00,436 --> 00:39:03,436 Speaker 2: you're going to, you know, turn it on. In nuclear terms, 680 00:39:03,436 --> 00:39:07,676 Speaker 2: it's called going critical, meaning you first turn on the 681 00:39:07,716 --> 00:39:11,036 Speaker 2: machine and then you slowly ramp up the power level from 682 00:39:11,076 --> 00:39:14,076 Speaker 2: ten percent power, twenty percent power, thirty. So you don't 683 00:39:14,116 --> 00:39:16,396 Speaker 2: go like, you know, yeah, I've got a reactor and 684 00:39:16,476 --> 00:39:19,676 Speaker 2: I put fuel in and here it goes, one hundred 685 00:39:19,676 --> 00:39:22,436 Speaker 2: percent power. You don't necessarily do that.
You do a 686 00:39:22,556 --> 00:39:28,116 Speaker 2: very stepwise increment, and that is extremely crucial to 687 00:39:28,356 --> 00:39:33,756 Speaker 2: validate the safety characteristics of your reactor. And once we 688 00:39:33,836 --> 00:39:36,316 Speaker 2: have validated those, we do some other tests to ensure 689 00:39:36,316 --> 00:39:39,316 Speaker 2: our safety systems work. And when all of those are done, 690 00:39:39,356 --> 00:39:43,116 Speaker 2: that's when you go full power. Right, So that's really 691 00:39:43,156 --> 00:39:47,196 Speaker 2: how you prove that whatever you've designed has the right 692 00:39:47,276 --> 00:39:51,276 Speaker 2: level of safety that you've designed to. Now, having all 693 00:39:51,356 --> 00:39:57,196 Speaker 2: that said, there are also unknown unknowns, yeah, and that exists in 694 00:39:57,236 --> 00:40:02,116 Speaker 2: almost every technology, and that's something we hope to learn 695 00:40:02,196 --> 00:40:05,916 Speaker 2: more about as we have more of these systems operational. But 696 00:40:06,196 --> 00:40:07,876 Speaker 2: going back to the question, what is the worst thing 697 00:40:07,916 --> 00:40:10,396 Speaker 2: that can happen? Because we have designed this reactor 698 00:40:10,436 --> 00:40:14,156 Speaker 2: with enough margin built into it.
In the worst-case scenario, 699 00:40:14,156 --> 00:40:17,156 Speaker 2: we shut it down and no bad things happen, nothing releases, 700 00:40:17,196 --> 00:40:20,196 Speaker 2: nothing breaks down. And that's a level of safety pedigree 701 00:40:20,596 --> 00:40:22,236 Speaker 2: that we have, akin to what we see in 702 00:40:22,956 --> 00:40:26,796 Speaker 2: research reactors at universities, right, you know, they try to 703 00:40:26,796 --> 00:40:29,076 Speaker 2: pull the control rod as fast as they can and 704 00:40:29,596 --> 00:40:31,956 Speaker 2: you don't see any breaking, you don't see any boiling 705 00:40:31,956 --> 00:40:32,436 Speaker 2: of coolant. 706 00:40:32,996 --> 00:40:36,396 Speaker 1: Yeah. So you're alluding to research reactors in universities, which 707 00:40:36,436 --> 00:40:39,236 Speaker 1: I didn't know about until I was preparing for this interview. So, like, 708 00:40:39,876 --> 00:40:42,476 Speaker 1: is it right that there are nuclear reactors at colleges 709 00:40:42,516 --> 00:40:45,316 Speaker 1: around the country? Like, what is the story with that? 710 00:40:45,316 --> 00:40:48,236 Speaker 2: That's right. I mean, research reactors were really built to 711 00:40:48,836 --> 00:40:51,836 Speaker 2: collect data, to measure nuclear physics data. And if you 712 00:40:51,876 --> 00:40:54,556 Speaker 2: look around all the major engineering schools around the United 713 00:40:54,556 --> 00:40:58,156 Speaker 2: States, and also even beyond, you have research reactors. They're 714 00:40:58,196 --> 00:41:02,716 Speaker 2: called non-power reactors. You've got coolant, you've got fuel, 715 00:41:02,796 --> 00:41:05,636 Speaker 2: you've got all the various instrumentation in place. But it 716 00:41:05,636 --> 00:41:08,116 Speaker 2: does not really go to high temperature because you're not really 717 00:41:08,156 --> 00:41:10,396 Speaker 2: trying to make electricity out of them.
You try 718 00:41:10,516 --> 00:41:15,236 Speaker 2: to generate a chain reaction and measure physics data. Right. 719 00:41:15,276 --> 00:41:18,036 Speaker 1: And they're so safe that they let college students play with them. 720 00:41:17,796 --> 00:41:19,356 Speaker 2: Pretty much. 721 00:41:19,956 --> 00:41:21,996 Speaker 1: And did you say they use the same fuel 722 00:41:22,076 --> 00:41:22,916 Speaker 1: as you were using? 723 00:41:23,396 --> 00:41:23,996 Speaker 2: That's correct. 724 00:41:27,796 --> 00:41:29,796 Speaker 1: We'll be back in a minute with the lightning round. 725 00:41:39,956 --> 00:41:41,756 Speaker 1: So now we're just going to finish with the lightning round, 726 00:41:41,956 --> 00:41:44,316 Speaker 1: which could be quick. It can be a little 727 00:41:44,036 --> 00:41:45,436 Speaker 2: more random, sure, 728 00:41:45,756 --> 00:41:50,436 Speaker 1: than the rest. What's the most underrated subatomic particle? 729 00:41:53,076 --> 00:41:59,396 Speaker 2: Hmm, underrated subatomic particle... the neutron? 730 00:41:59,476 --> 00:42:02,396 Speaker 1: Right? I thought you were going to go straight to neutron. 731 00:42:02,316 --> 00:42:03,636 Speaker 2: No fair. 732 00:42:03,676 --> 00:42:06,076 Speaker 1: No, it's very obvious. That's fair. Okay, good, give me 733 00:42:06,116 --> 00:42:07,556 Speaker 1: a better one, give me a better one. 734 00:42:07,916 --> 00:42:11,156 Speaker 2: Yeah, well, it is certainly the neutron. I have to 735 00:42:11,236 --> 00:42:14,396 Speaker 2: figure it out. 736 00:42:13,716 --> 00:42:16,396 Speaker 1: Because like you don't even think of it, right, 737 00:42:17,076 --> 00:42:23,836 Speaker 1: the positive, the negative. Okay, well, what's the most overrated subatomic particle? 738 00:42:26,196 --> 00:42:34,676 Speaker 2: I think it's, uh, what was that, the proton? Okay, yeah, 739 00:42:34,756 --> 00:42:38,196 Speaker 2: it's really not, okay.
And here's why I say it, right: 740 00:42:38,596 --> 00:42:40,676 Speaker 2: if you look into it, I mean, I'm an energy guy, 741 00:42:40,676 --> 00:42:42,316 Speaker 2: I'm not 742 00:42:42,356 --> 00:42:47,116 Speaker 2: a, you know, reactor physicist per se. But if 743 00:42:47,116 --> 00:42:50,076 Speaker 2: I look on a high level on the application side, 744 00:42:50,756 --> 00:42:54,436 Speaker 2: what gives me energy? Chemical reactions like combustion, where you 745 00:42:54,516 --> 00:42:58,356 Speaker 2: have exchange of electrons giving energy. So electrons have some 746 00:42:58,436 --> 00:43:01,076 Speaker 2: prominence in the world of energy. Sure, when it comes 747 00:43:01,116 --> 00:43:04,956 Speaker 2: to, you know, splitting a nucleus, neutrons play a massive role. 748 00:43:05,836 --> 00:43:08,756 Speaker 2: But protons, they're just there to make sure the world 749 00:43:08,836 --> 00:43:10,916 Speaker 2: holds together and they balance the charge. 750 00:43:11,756 --> 00:43:14,196 Speaker 1: They're just there to keep the electrons. 751 00:43:14,476 --> 00:43:15,796 Speaker 2: Have to keep the electrons around. 752 00:43:17,236 --> 00:43:20,636 Speaker 1: Yeah. What's your favorite fundamental force? 753 00:43:23,956 --> 00:43:26,436 Speaker 2: What's my favorite fundamental force? 754 00:43:26,676 --> 00:43:28,876 Speaker 1: Tired of stupid physics questions? I can ask you other 755 00:43:28,916 --> 00:43:31,716 Speaker 1: stupid questions. You ready? What did you think of? What 756 00:43:31,756 --> 00:43:32,876 Speaker 1: did you think of Oppenheimer? 757 00:43:34,036 --> 00:43:37,756 Speaker 2: I think it's a great movie. I hope you're 758 00:43:37,756 --> 00:43:39,516 Speaker 2: talking about the movie itself, not the actual person. 759 00:43:39,756 --> 00:43:41,796 Speaker 1: I'm talking about the movie, not the actual person. 760 00:43:42,036 --> 00:43:44,476 Speaker 2: Yes, I think it was.
It was really great. 761 00:43:44,876 --> 00:43:48,356 Speaker 1: I've seen you mention that a 762 00:43:48,436 --> 00:43:51,156 Speaker 1: couple of your favorite books are by authors who started 763 00:43:51,196 --> 00:43:56,396 Speaker 1: out anti-nuclear and became pro-nuclear, and so I'm curious, 764 00:43:56,596 --> 00:43:59,476 Speaker 1: what is something that you have changed your mind about? 765 00:44:00,156 --> 00:44:03,836 Speaker 2: One of my earliest mentors at Westinghouse, who hired me 766 00:44:03,916 --> 00:44:06,756 Speaker 2: in the first place, he said, Yasser, you can 767 00:44:06,796 --> 00:44:09,116 Speaker 2: be a techie as much as you want, but unless 768 00:44:09,196 --> 00:44:14,276 Speaker 2: you understand the economic side of engineering, you truly would 769 00:44:14,276 --> 00:44:18,236 Speaker 2: not appreciate the value of what you're building. So don't 770 00:44:18,276 --> 00:44:21,956 Speaker 2: ignore the economic side. Make sure you keep it right 771 00:44:21,996 --> 00:44:24,676 Speaker 2: next to the technology. So that really opened my eyes 772 00:44:24,716 --> 00:44:27,676 Speaker 2: in this whole area of not just advanced reactors, but 773 00:44:27,716 --> 00:44:30,436 Speaker 2: also the economic side of things, to make sure that 774 00:44:30,516 --> 00:44:34,156 Speaker 2: whatever I'm doing should have relevance to society. 775 00:44:34,876 --> 00:44:37,876 Speaker 1: Yeah, I feel like the story of the energy transition 776 00:44:38,156 --> 00:44:42,476 Speaker 1: at this point is basically a technoeconomic story, right.
I 777 00:44:42,476 --> 00:44:46,676 Speaker 1: feel like in many domains, the fundamental technological problems have 778 00:44:47,396 --> 00:44:51,316 Speaker 1: largely been solved, and so it's a question of technoeconomics, 779 00:44:51,356 --> 00:44:53,716 Speaker 1: and I mean people talk about that in, like, green cement, 780 00:44:53,876 --> 00:44:56,076 Speaker 1: they talk about it in batteries, you're talking about it 781 00:44:56,116 --> 00:44:59,916 Speaker 1: in nuclear power. It's interesting how often it comes up. 782 00:44:59,756 --> 00:45:02,796 Speaker 2: Right, and there's so many technologies out there to 783 00:45:02,836 --> 00:45:05,156 Speaker 2: solve problems. But at the end of the day, if 784 00:45:05,156 --> 00:45:08,396 Speaker 2: it's not economical, it's hard to convince people why to 785 00:45:08,436 --> 00:45:10,036 Speaker 2: adopt it versus something else. 786 00:45:15,076 --> 00:45:19,276 Speaker 1: Yasser Arafat is the chief technology officer at Aalo Atomics. 787 00:45:19,876 --> 00:45:23,116 Speaker 1: Today's show was produced by Gabriel Hunter-Cheng. It was 788 00:45:23,396 --> 00:45:26,876 Speaker 1: edited by Lidia Jean Kott and engineered by Sarah Bruguera. 789 00:45:27,556 --> 00:45:31,356 Speaker 1: You can email us at problem at Pushkin dot FM. 790 00:45:31,476 --> 00:45:33,796 Speaker 1: I'm Jacob Goldstein and we'll be back next week with 791 00:45:33,876 --> 00:45:43,436 Speaker 1: another episode of What's Your Problem.