1 00:00:05,960 --> 00:00:10,200 Speaker 1: The year is nineteen seventy. Mario Escamilla had been working 2 00:00:10,240 --> 00:00:13,800 Speaker 1: with Donald Leavitt, otherwise known as Porky, in the Arctic 3 00:00:13,880 --> 00:00:17,120 Speaker 1: for over a month, and he was pretty fed up 4 00:00:17,120 --> 00:00:20,640 Speaker 1: with Porky at this point. Porky had the bad habit 5 00:00:20,680 --> 00:00:25,440 Speaker 1: of getting drunk and wielding butcher cleavers while stealing his 6 00:00:25,520 --> 00:00:29,240 Speaker 1: crewmate's wine, clearly behavior that would be frowned upon in 7 00:00:29,400 --> 00:00:33,080 Speaker 1: just about any culture. So this had happened yet again, 8 00:00:33,280 --> 00:00:36,200 Speaker 1: and Escamilla got all fired up and grabbed his rifle 9 00:00:36,240 --> 00:00:40,280 Speaker 1: and went to confront Porky. Porky was indeed enjoying a 10 00:00:40,320 --> 00:00:43,320 Speaker 1: cocktail with the stolen wine, and he was enjoying that 11 00:00:43,400 --> 00:00:46,800 Speaker 1: cocktail with the station manager, a man named Benny Lightsey. 12 00:00:48,000 --> 00:00:51,760 Speaker 1: Escamilla was arguing with Lightsey about what should be done 13 00:00:51,800 --> 00:00:55,760 Speaker 1: about the whole Porky situation when his rifle accidentally went 14 00:00:55,680 --> 00:00:56,920 Speaker 2: off, killing Lightsey. 15 00:00:57,440 --> 00:01:00,520 Speaker 1: This actually ended up becoming a fairly famous case in 16 00:01:00,560 --> 00:01:03,960 Speaker 1: the world of law because it provided some precedent for 17 00:01:04,040 --> 00:01:08,160 Speaker 1: what happens when you're outside of the jurisdiction of other nations. 18 00:01:08,560 --> 00:01:10,880 Speaker 1: So what happens when you're on a mobile piece of 19 00:01:10,880 --> 00:01:13,960 Speaker 1: ice in the Arctic Ocean? What court do you get 20 00:01:14,000 --> 00:01:16,960 Speaker 1: tried in? And who decides what happens?
Well, in this case, 21 00:01:16,959 --> 00:01:20,440 Speaker 1: it was fairly straightforward because both men were United States citizens, 22 00:01:20,880 --> 00:01:23,520 Speaker 1: so the case was tried in the US and Escamilla 23 00:01:23,640 --> 00:01:27,920 Speaker 1: was ultimately acquitted. What happens if you have a Martian 24 00:01:28,000 --> 00:01:32,760 Speaker 1: settlement and Mario Escamilla is from the United States, for example, 25 00:01:32,800 --> 00:01:35,800 Speaker 1: and Benny Lightsey is from Russia, for example? 26 00:01:36,440 --> 00:01:37,800 Speaker 2: What happens on Mars? 27 00:01:37,880 --> 00:01:41,240 Speaker 1: Who oversees the court case, who decides what the punishment 28 00:01:41,280 --> 00:01:43,840 Speaker 1: should be, and what punishment is fair in a harsh 29 00:01:44,000 --> 00:01:48,200 Speaker 1: environment millions of miles from the home planet? Well, to 30 00:01:48,240 --> 00:01:51,040 Speaker 1: answer some of these questions today, we're talking to doctor 31 00:01:51,240 --> 00:01:55,360 Speaker 1: Erika Nesvold. Her book Off Earth tackles questions about law, 32 00:01:55,600 --> 00:01:57,720 Speaker 1: social justice, and ethics in space. 33 00:01:58,320 --> 00:02:01,280 Speaker 2: Welcome to Daniel and Kelly's Extraordinary Universe. 34 00:02:15,520 --> 00:02:18,639 Speaker 3: Hi. I'm Daniel. I'm a particle physicist, and I'm excited 35 00:02:18,680 --> 00:02:21,440 Speaker 3: for other people to go into space. And welcome to 36 00:02:21,480 --> 00:02:24,840 Speaker 3: the podcast Daniel and Kelly's Extraordinary Universe. 37 00:02:25,480 --> 00:02:28,760 Speaker 1: I'm Kelly Weinersmith. I'm a parasitologist, and I think 38 00:02:28,800 --> 00:02:30,720 Speaker 1: it would be great for other people to settle space, 39 00:02:30,960 --> 00:02:37,280 Speaker 1: depending on how it gets done. I have qualifications.
40 00:02:37,000 --> 00:02:39,200 Speaker 3: And so on today's episode, we're going to be digging 41 00:02:39,240 --> 00:02:41,760 Speaker 3: into the details of what questions we have to ask 42 00:02:41,840 --> 00:02:43,840 Speaker 3: before we settle space and how we handle it once 43 00:02:43,840 --> 00:02:46,320 Speaker 3: we get there, including tricky questions like who gets to 44 00:02:46,320 --> 00:02:49,280 Speaker 3: make space law and what happens when you break it, 45 00:02:49,480 --> 00:02:52,720 Speaker 3: which makes me wonder, Kelly. Down here on Earth, people 46 00:02:52,760 --> 00:02:56,960 Speaker 3: are always breaking laws, you know, jaywalking or parking illegally 47 00:02:57,000 --> 00:03:00,200 Speaker 3: or whatever. So my question for you today is, what 48 00:03:00,360 --> 00:03:02,640 Speaker 3: is the most serious law you've ever broken? 49 00:03:02,960 --> 00:03:03,160 Speaker 2: Oh? 50 00:03:03,200 --> 00:03:06,360 Speaker 3: Wow, on air here, not under oath, but on air. 51 00:03:06,680 --> 00:03:08,480 Speaker 1: That's right, that's right. Well, you know, I feel like 52 00:03:08,480 --> 00:03:10,160 Speaker 1: those things are the same. I have to be honest. 53 00:03:10,560 --> 00:03:13,720 Speaker 1: I think that the most trouble I almost got in, 54 00:03:13,960 --> 00:03:18,360 Speaker 1: it was Thanksgiving, and my three year old had been 55 00:03:18,400 --> 00:03:21,359 Speaker 1: screaming in the car on the ride home from Ohio 56 00:03:21,760 --> 00:03:26,040 Speaker 1: in the slushy snow rain combo for like hours, like 57 00:03:26,160 --> 00:03:29,320 Speaker 1: literally hours, and I went from a sixty five to 58 00:03:29,320 --> 00:03:32,800 Speaker 1: a forty five zone and I didn't realize I had 59 00:03:32,800 --> 00:03:34,440 Speaker 1: gone from a sixty five to a forty five zone.
60 00:03:34,440 --> 00:03:36,040 Speaker 1: I didn't see the forty five sign because I had 61 00:03:36,040 --> 00:03:38,480 Speaker 1: no brain left at that point. And if you go 62 00:03:38,640 --> 00:03:42,120 Speaker 1: over twenty then you get your license taken away. 63 00:03:42,400 --> 00:03:42,920 Speaker 3: Wow. 64 00:03:43,000 --> 00:03:46,480 Speaker 1: So the police officer, he walked up to the car 65 00:03:46,520 --> 00:03:48,720 Speaker 1: and he heard the screaming and he went, 66 00:03:49,000 --> 00:03:52,400 Speaker 2: oh, like, maybe he's also a parent. 67 00:03:52,560 --> 00:03:54,440 Speaker 1: And all I know is that the ticket said I 68 00:03:54,440 --> 00:03:57,400 Speaker 1: had gone nineteen over, oh, which meant I got to 69 00:03:57,440 --> 00:04:00,880 Speaker 1: keep my license. And I was more careful about driving 70 00:04:00,960 --> 00:04:04,680 Speaker 1: when my kiddo was screaming after that. So basically I 71 00:04:04,720 --> 00:04:08,200 Speaker 1: am a wimp and I have not done too many dangerous, scary, 72 00:04:08,440 --> 00:04:09,680 Speaker 1: illegal things in my lifetime. 73 00:04:09,720 --> 00:04:11,120 Speaker 2: But what about you? Okay, of course it's going to 74 00:04:11,160 --> 00:04:13,880 Speaker 2: be your turn now. 75 00:04:13,920 --> 00:04:15,360 Speaker 3: No, I need to follow up on your story. Does that 76 00:04:15,440 --> 00:04:17,960 Speaker 3: mean that you can commit any felony you want? But 77 00:04:17,960 --> 00:04:20,680 Speaker 3: if there's a screaming baby, they'll downgrade it to a misdemeanor. 78 00:04:21,080 --> 00:04:24,000 Speaker 3: They're like, murder one? No, but her kid was crying, 79 00:04:24,040 --> 00:04:25,560 Speaker 3: so we'll just call it manslaughter. 80 00:04:25,800 --> 00:04:29,960 Speaker 1: You know, I haven't done the appropriate experimental like design 81 00:04:30,080 --> 00:04:32,320 Speaker 1: to test that hypothesis.
I mean I can imagine it 82 00:04:32,360 --> 00:04:35,320 Speaker 1: going the other way. Sometimes babies screaming just make people angry. 83 00:04:35,520 --> 00:04:37,560 Speaker 1: I could imagine, you know, somebody being like, no, I 84 00:04:37,560 --> 00:04:39,120 Speaker 1: want to write you a ticket for an extra five 85 00:04:39,160 --> 00:04:42,400 Speaker 1: miles an hour, because I'm just not happy about hearing 86 00:04:42,400 --> 00:04:44,880 Speaker 1: this baby screaming. So I wouldn't want to risk it. 87 00:04:44,920 --> 00:04:48,239 Speaker 3: But what do you think? Well, I've planned on that, actually. 88 00:04:48,320 --> 00:04:50,200 Speaker 3: Maybe the greatest crime I've committed that I haven't been 89 00:04:50,200 --> 00:04:53,840 Speaker 3: caught for is importing illegal cheeses. You know, we used 90 00:04:53,880 --> 00:04:56,160 Speaker 3: to go back and forth to CERN all the time when 91 00:04:56,200 --> 00:04:58,440 Speaker 3: our kids were young, and we would come back from 92 00:04:58,520 --> 00:05:00,880 Speaker 3: France and there'd be our favorite cheeses and we'd want 93 00:05:00,880 --> 00:05:02,160 Speaker 3: to have them, and so we'd pack them in our 94 00:05:02,160 --> 00:05:05,160 Speaker 3: suitcase even though they're illegal to import. And our plan 95 00:05:05,240 --> 00:05:07,320 Speaker 3: was always, if we get searched, you know, just rely 96 00:05:07,440 --> 00:05:11,120 Speaker 3: on our toddlers to cry and scream and to get 97 00:05:11,160 --> 00:05:13,080 Speaker 3: us out of it. So I'm glad to hear that 98 00:05:13,240 --> 00:05:15,400 Speaker 3: works sometimes. That's great. Oh wait, I forgot. 99 00:05:15,520 --> 00:05:17,560 Speaker 2: I also committed a dairy related crime. 100 00:05:18,000 --> 00:05:18,440 Speaker 3: Wow. 101 00:05:18,560 --> 00:05:21,920 Speaker 1: I was in Norway during the butter shortage that happened 102 00:05:21,920 --> 00:05:23,880 Speaker 1: a while ago.
It was like some import rules and 103 00:05:23,920 --> 00:05:26,200 Speaker 1: there wasn't enough butter, and so me and the folks 104 00:05:26,200 --> 00:05:28,680 Speaker 1: that I was working with, we snuck across the border 105 00:05:28,680 --> 00:05:31,280 Speaker 1: to Sweden and bought a bunch of butter and smuggled 106 00:05:31,279 --> 00:05:34,440 Speaker 1: it back across for our dinner. I also have engaged 107 00:05:34,440 --> 00:05:35,719 Speaker 1: in dairy related crimes. 108 00:05:36,200 --> 00:05:38,160 Speaker 3: I did not realize today's podcast was going to be 109 00:05:38,160 --> 00:05:40,760 Speaker 3: about international dairy smuggling, but there we go. 110 00:05:41,120 --> 00:05:43,359 Speaker 2: There you go. It may or may not come up again, 111 00:05:43,880 --> 00:05:45,080 Speaker 2: or maybe that was just 112 00:05:45,080 --> 00:05:47,200 Speaker 3: us. And maybe in the future folks in space 113 00:05:47,240 --> 00:05:50,240 Speaker 3: settlements are going to be smuggling cheeses and butters from 114 00:05:50,279 --> 00:05:51,560 Speaker 3: one habitat to another. 115 00:05:51,720 --> 00:05:53,840 Speaker 1: Those are going to be some stinky cheeses if you're 116 00:05:53,839 --> 00:05:55,280 Speaker 1: taking them from Earth and they've got a six month 117 00:05:55,360 --> 00:05:58,400 Speaker 1: transit, though. But some people are into that. But these 118 00:05:58,440 --> 00:06:01,919 Speaker 1: are some sort of complicated situations. If somebody is committing 119 00:06:01,920 --> 00:06:04,520 Speaker 1: crimes on Mars, what's the right court for that to 120 00:06:04,560 --> 00:06:06,240 Speaker 1: be tried in?
And what happens if you have people 121 00:06:06,240 --> 00:06:09,080 Speaker 1: from different countries even now, like it's complicated when you 122 00:06:09,120 --> 00:06:11,720 Speaker 1: have people committing crimes against one another and they're from 123 00:06:11,760 --> 00:06:14,680 Speaker 1: different countries, especially if they're in a place like Antarctica 124 00:06:14,720 --> 00:06:17,320 Speaker 1: that's supposed to, you know, belong to no nation. But 125 00:06:17,400 --> 00:06:20,559 Speaker 1: that's just the tip of the ethical iceberg for stuff 126 00:06:20,560 --> 00:06:22,720 Speaker 1: that needs to get dealt with. And so today we've 127 00:06:22,720 --> 00:06:25,160 Speaker 1: got this amazing guest who can talk about loads of 128 00:06:25,200 --> 00:06:28,200 Speaker 1: different ethical problems associated with why we go to space, 129 00:06:28,240 --> 00:06:29,720 Speaker 1: how many people we send to space. 130 00:06:30,200 --> 00:06:31,680 Speaker 2: There's just a lot to think about. 131 00:06:31,720 --> 00:06:34,000 Speaker 3: That's right, and so these are really fun questions that 132 00:06:34,040 --> 00:06:36,040 Speaker 3: we do need to think about before we build out 133 00:06:36,120 --> 00:06:39,640 Speaker 3: humanity's future into space. But before we hear from an 134 00:06:39,680 --> 00:06:42,280 Speaker 3: expert who's thought a lot about this and written an 135 00:06:42,279 --> 00:06:45,360 Speaker 3: excellent book on the topic, we were curious what tricky topics 136 00:06:45,360 --> 00:06:48,240 Speaker 3: the person on the street anticipated we might run into 137 00:06:48,400 --> 00:06:49,920 Speaker 3: when we decide to settle space. 138 00:06:50,600 --> 00:06:53,000 Speaker 4: I think the ethical problems of all the bodies we're 139 00:06:53,000 --> 00:06:58,480 Speaker 4: going to harm by exposing them to space environments.
I think probably 140 00:06:58,560 --> 00:07:01,800 Speaker 4: the ethics of international law are going to be really bad. 141 00:07:01,800 --> 00:07:05,440 Speaker 4: They're already bad on the oceans. So space is just 142 00:07:05,520 --> 00:07:09,120 Speaker 4: space ocean, but worse. I mean, there's just the ongoing 143 00:07:09,160 --> 00:07:11,920 Speaker 4: problem, which is not new to space, but is worse in space, 144 00:07:12,520 --> 00:07:16,200 Speaker 4: where we take actions that have huge implications for like 145 00:07:16,280 --> 00:07:20,640 Speaker 4: way later human generations, you know, and we're not always 146 00:07:20,640 --> 00:07:23,760 Speaker 4: doing right by people two generations down the road, much 147 00:07:23,840 --> 00:07:24,800 Speaker 4: less one hundred 148 00:07:24,480 --> 00:07:25,480 Speaker 2: generations down the road. 149 00:07:25,800 --> 00:07:27,600 Speaker 4: And so I would really hope that we're thinking at 150 00:07:27,680 --> 00:07:29,720 Speaker 4: least one hundred generations down the road when we do 151 00:07:29,800 --> 00:07:30,400 Speaker 4: space stuff. 152 00:07:30,800 --> 00:07:34,040 Speaker 3: The allocation of resources: who do 153 00:07:34,000 --> 00:07:37,160 Speaker 5: you save first if there's a disaster, be it like 154 00:07:37,760 --> 00:07:40,920 Speaker 5: during space travel or once you actually land on the planet 155 00:07:41,000 --> 00:07:44,760 Speaker 5: they're aiming for? Like, do you prioritize your engineers and 156 00:07:44,800 --> 00:07:46,960 Speaker 5: scientists, the people who can actually keep things running? 157 00:07:47,000 --> 00:07:49,680 Speaker 6: Do you prioritize the children, the women who you 158 00:07:49,800 --> 00:07:53,160 Speaker 6: plan to have children with, you know? Do you just 159 00:07:53,240 --> 00:07:57,520 Speaker 6: draw straws and hope for the best?
Another one, what 160 00:07:57,560 --> 00:07:59,560 Speaker 6: do you do if you run into any indigenous people 161 00:07:59,920 --> 00:08:03,600 Speaker 6: when you get there? And also, when do people become what 162 00:08:03,640 --> 00:08:05,200 Speaker 6: we consider indigenous? 163 00:08:05,240 --> 00:08:07,960 Speaker 5: You know, like if somebody landed on the planet fifty 164 00:08:08,040 --> 00:08:10,800 Speaker 5: or one hundred years before, do they have a better 165 00:08:11,000 --> 00:08:15,080 Speaker 5: claim despite whatever governments have declared on the way there? 166 00:08:15,840 --> 00:08:16,840 Speaker 7: The value of 167 00:08:16,920 --> 00:08:23,520 Speaker 6: human lives and the cost benefits of exploring space. 168 00:08:24,600 --> 00:08:28,480 Speaker 7: One I think has to do with economic class, and 169 00:08:28,560 --> 00:08:32,439 Speaker 7: I think the disparity between poor and rich, for example, 170 00:08:32,480 --> 00:08:36,680 Speaker 7: would be even greater than it is based on decisions 171 00:08:36,679 --> 00:08:40,920 Speaker 7: of who gets to settle in these new colonies. Would 172 00:08:40,960 --> 00:08:44,480 Speaker 7: it be considered a consequence in sending the poor out 173 00:08:44,520 --> 00:08:47,440 Speaker 7: there, or would it be considered a benefit and the 174 00:08:47,520 --> 00:08:49,880 Speaker 7: rich would go out there? Either way, I think the 175 00:08:49,880 --> 00:08:54,439 Speaker 7: disparity would be great and huge. I also think it 176 00:08:54,480 --> 00:09:02,400 Speaker 7: would cause political tensions as well between countries over who 177 00:09:02,480 --> 00:09:07,480 Speaker 7: gets to colonize these new areas in space. 178 00:09:07,840 --> 00:09:09,800 Speaker 3: So, Kelly, who did you ask these questions of? 179 00:09:10,120 --> 00:09:13,160 Speaker 1: I asked these questions of friends of mine on Facebook.
180 00:09:13,200 --> 00:09:15,720 Speaker 1: I asked, who wants to answer some anonymous questions for me? 181 00:09:15,760 --> 00:09:17,760 Speaker 1: And then I texted them the question and they left 182 00:09:17,800 --> 00:09:18,920 Speaker 1: me a voicemail with the answer. 183 00:09:19,400 --> 00:09:21,160 Speaker 3: Definitely a scientific sample for sure. 184 00:09:21,320 --> 00:09:25,440 Speaker 1: Yeah, yeah, these are all Kelly's social network answers. None 185 00:09:25,480 --> 00:09:28,440 Speaker 1: of them cited my book though, so they can't be 186 00:09:28,480 --> 00:09:30,920 Speaker 1: that smart. But no, no, no, I thought these were 187 00:09:31,040 --> 00:09:33,200 Speaker 1: all great and interesting answers. 188 00:09:33,240 --> 00:09:33,920 Speaker 2: What did you think? 189 00:09:34,360 --> 00:09:36,360 Speaker 3: Yeah, I think these are pretty well informed. They would 190 00:09:36,360 --> 00:09:39,080 Speaker 3: be better informed if they'd read your excellent book, of course, 191 00:09:39,400 --> 00:09:41,079 Speaker 3: but they give us a good preview of what kind 192 00:09:41,120 --> 00:09:43,360 Speaker 3: of things people are thinking about, what kind of tricky 193 00:09:43,440 --> 00:09:46,840 Speaker 3: questions there are when humanity decides to move into the stars. 194 00:09:47,400 --> 00:09:49,720 Speaker 1: And if you would like us to expand out past 195 00:09:49,880 --> 00:09:52,520 Speaker 1: Kelly's social network, please feel free to get in touch 196 00:09:52,559 --> 00:09:55,960 Speaker 1: with us at questions at danielandkelly dot org. 197 00:09:56,320 --> 00:09:58,599 Speaker 3: We'll send you the questions for future episodes, and you 198 00:09:58,640 --> 00:10:00,839 Speaker 3: can hear your voice on the show. 199 00:10:00,800 --> 00:10:02,880 Speaker 1: We would love to have your voice featured on the show.
200 00:10:03,200 --> 00:10:06,400 Speaker 1: All right, so, without further ado, we invite Erika 201 00:10:06,440 --> 00:10:07,040 Speaker 1: onto the show. 202 00:10:07,200 --> 00:10:09,080 Speaker 3: Let's do it. All right, here we go. 203 00:10:12,480 --> 00:10:15,480 Speaker 1: On today's show, we have doctor Erika Nesvold. She is 204 00:10:15,520 --> 00:10:19,400 Speaker 1: an astrophysicist and a developer for Universe Sandbox, which she 205 00:10:19,520 --> 00:10:24,679 Speaker 1: calls astronomy educational software masquerading as a video game. I'm in. 206 00:10:25,280 --> 00:10:25,680 Speaker 5: She made a 207 00:10:25,720 --> 00:10:28,760 Speaker 1: podcast called Making New Worlds, where she interviewed a bunch 208 00:10:28,800 --> 00:10:31,680 Speaker 1: of experts from a variety of different backgrounds about questions 209 00:10:31,679 --> 00:10:35,400 Speaker 1: related to ethics and justice in space, and expanded those 210 00:10:35,440 --> 00:10:39,040 Speaker 1: conversations into a book called Off Earth: Ethical Questions and 211 00:10:39,120 --> 00:10:43,480 Speaker 1: Quandaries for Living in Outer Space. She also, because apparently 212 00:10:43,520 --> 00:10:46,720 Speaker 1: she can do everything, co founded a nonprofit called the 213 00:10:46,920 --> 00:10:50,520 Speaker 1: JustSpace Alliance, which advocates for a more inclusive and 214 00:10:50,559 --> 00:10:53,520 Speaker 1: ethical future in space while also striving to make the 215 00:10:53,520 --> 00:10:56,200 Speaker 1: world more just and equitable back here on Earth today. 216 00:10:56,720 --> 00:10:57,480 Speaker 2: Thanks for being on 217 00:10:57,400 --> 00:10:59,720 Speaker 8: the show, Erika. Thank you for having me. 218 00:11:00,200 --> 00:11:02,080 Speaker 1: It's good to be chatting with you again. We've met 219 00:11:02,600 --> 00:11:06,040 Speaker 1: IRL once, which was super fun.
It was fun, Yes, 220 00:11:06,120 --> 00:11:09,000 Speaker 1: talking about how people should maybe take birth in space 221 00:11:09,040 --> 00:11:10,840 Speaker 1: a little bit more seriously, or that was the part 222 00:11:10,840 --> 00:11:12,440 Speaker 1: that I remembered most strongly. 223 00:11:12,720 --> 00:11:13,040 Speaker 8: That's right. 224 00:11:13,320 --> 00:11:16,640 Speaker 1: Okay. So let's start with ethics around going to space 225 00:11:16,679 --> 00:11:19,880 Speaker 1: at all. So when I was getting involved in this literature, 226 00:11:19,920 --> 00:11:22,320 Speaker 1: I found a lot of arguments that weren't super convincing. 227 00:11:22,760 --> 00:11:24,440 Speaker 1: So let's start with some of the arguments that you 228 00:11:24,440 --> 00:11:26,440 Speaker 1: think are least convincing, and then work our way up 229 00:11:26,440 --> 00:11:28,480 Speaker 1: to the argument that you think is the most convincing 230 00:11:28,520 --> 00:11:29,960 Speaker 1: for why we should be settling space. 231 00:11:30,440 --> 00:11:32,040 Speaker 8: Yeah, I mean this is sort of where I got 232 00:11:32,040 --> 00:11:34,920 Speaker 8: into these questions in the first place. I've always been 233 00:11:34,920 --> 00:11:36,960 Speaker 8: a big fan of space and even a big fan 234 00:11:37,040 --> 00:11:40,000 Speaker 8: of human space exploration and settlement, and so I've been 235 00:11:40,040 --> 00:11:44,480 Speaker 8: sort of steeped in this space settlement advocacy rhetoric since 236 00:11:44,520 --> 00:11:46,839 Speaker 8: I was a kid. And then I started really digging 237 00:11:46,840 --> 00:11:48,800 Speaker 8: into it and thinking about whether I agree with all 238 00:11:48,800 --> 00:11:51,600 Speaker 8: of these arguments, and I started off my book really 239 00:11:51,640 --> 00:11:54,360 Speaker 8: digging into them, because they come in a few different flavors.
240 00:11:54,600 --> 00:11:57,480 Speaker 8: There's the people who say that we need to go 241 00:11:57,520 --> 00:11:59,240 Speaker 8: to space so that we don't have all of our 242 00:11:59,240 --> 00:12:01,760 Speaker 8: eggs in one basket. We need to become a multiplanetary 243 00:12:01,760 --> 00:12:05,360 Speaker 8: species because our planet where we all live is potentially 244 00:12:05,400 --> 00:12:08,200 Speaker 8: a weak point. It's fragile. We've seen species get wiped 245 00:12:08,240 --> 00:12:13,080 Speaker 8: out here before, like the dinosaurs and various different climate changes, 246 00:12:13,200 --> 00:12:15,840 Speaker 8: and we're more and more aware all the time of 247 00:12:15,840 --> 00:12:17,600 Speaker 8: all of the different things that could kill off all 248 00:12:17,600 --> 00:12:20,480 Speaker 8: the humans, most of which rely on the humans being 249 00:12:20,520 --> 00:12:22,000 Speaker 8: all in the same place. So if we go to 250 00:12:22,040 --> 00:12:25,160 Speaker 8: different planets and spread out, then we've got a backup 251 00:12:25,480 --> 00:12:29,320 Speaker 8: if Earth's biosphere collapses. And to me, I find that 252 00:12:29,360 --> 00:12:32,680 Speaker 8: one to be the most convincing argument. Unfortunately, it's also 253 00:12:32,720 --> 00:12:36,520 Speaker 8: a very powerful emotional argument, because then if you have 254 00:12:36,520 --> 00:12:39,600 Speaker 8: any sort of pushback against either space settlement as a 255 00:12:39,600 --> 00:12:42,880 Speaker 8: whole or specific parts of space settlement, or if you 256 00:12:43,000 --> 00:12:45,400 Speaker 8: just say, for example, maybe we should slow down space 257 00:12:45,440 --> 00:12:47,920 Speaker 8: settlement and think about what we're doing here, the opposition 258 00:12:48,000 --> 00:12:49,679 Speaker 8: to that tends to be, oh, do you want all 259 00:12:49,800 --> 00:12:50,720 Speaker 8: humans to die?
260 00:12:50,800 --> 00:12:50,959 Speaker 5: Then? 261 00:12:51,520 --> 00:12:54,760 Speaker 8: So that sort of false dichotomy, where your criticisms are 262 00:12:54,800 --> 00:12:56,640 Speaker 8: on one side and on the other side is the 263 00:12:56,679 --> 00:12:58,960 Speaker 8: extinction of the human race, makes it hard to 264 00:12:58,960 --> 00:13:02,240 Speaker 8: have reasonable conversations. But then beyond that, there's some more 265 00:13:02,280 --> 00:13:06,480 Speaker 8: interesting and also very emotional arguments. People who argue that 266 00:13:06,720 --> 00:13:10,080 Speaker 8: we have to expand always and continuously without stopping, because 267 00:13:10,080 --> 00:13:14,199 Speaker 8: otherwise our culture or our scientific knowledge, if we stop growing, 268 00:13:14,440 --> 00:13:17,640 Speaker 8: collapses. We stagnate, we stop knowing how to invent 269 00:13:17,679 --> 00:13:20,600 Speaker 8: new things, we stop knowing how to do art. And 270 00:13:20,679 --> 00:13:23,520 Speaker 8: that's the problem with pop music these days, is not 271 00:13:23,679 --> 00:13:26,760 Speaker 8: enough expansion. And not only is this just a very 272 00:13:27,679 --> 00:13:32,120 Speaker 8: common and often recycled argument you hear between generations a 273 00:13:32,160 --> 00:13:34,600 Speaker 8: lot about what's wrong with the kids these days, but 274 00:13:34,880 --> 00:13:37,040 Speaker 8: this is also the kind of argument that you've seen 275 00:13:37,040 --> 00:13:40,120 Speaker 8: throughout history, especially in the US.
There's an argument that 276 00:13:40,160 --> 00:13:44,280 Speaker 8: once the frontier of European settlement reached the west coast 277 00:13:44,320 --> 00:13:45,679 Speaker 8: of the US and the frontier went away, that 278 00:13:45,760 --> 00:13:49,120 Speaker 8: America somehow became less of itself, because our culture had 279 00:13:49,160 --> 00:13:51,200 Speaker 8: been built on having a frontier, and so we have 280 00:13:51,280 --> 00:13:54,640 Speaker 8: to find this new frontier out in space. I don't 281 00:13:54,679 --> 00:13:57,720 Speaker 8: find that one particularly convincing for a number of reasons. 282 00:13:57,800 --> 00:14:00,520 Speaker 8: I also think that the idea that we need to 283 00:14:00,559 --> 00:14:04,160 Speaker 8: always expand continuously or something bad will happen leads us 284 00:14:04,200 --> 00:14:06,640 Speaker 8: to making bad decisions, trying to grow too fast, trying 285 00:14:06,679 --> 00:14:08,880 Speaker 8: to consume too fast, and that we should in fact 286 00:14:09,000 --> 00:14:12,240 Speaker 8: be learning how to balance, how to live sustainably with 287 00:14:12,280 --> 00:14:14,520 Speaker 8: our environment. And there's plenty of cultures who are more 288 00:14:14,559 --> 00:14:17,079 Speaker 8: focused on that, so maybe we should amplify their voices. 289 00:14:17,320 --> 00:14:19,840 Speaker 3: You don't think we need space cowboys to stay who 290 00:14:19,880 --> 00:14:20,200 Speaker 3: we are? 291 00:14:21,000 --> 00:14:23,560 Speaker 8: I mean, I don't have a problem with the aesthetic. 292 00:14:23,480 --> 00:14:26,720 Speaker 3: Space cowboy rhinestones on the spaceship kind of thing. 293 00:14:27,880 --> 00:14:29,800 Speaker 8: Listen, you're just selling me on it the more you talk 294 00:14:29,840 --> 00:14:30,680 Speaker 8: about it.
295 00:14:30,720 --> 00:14:32,560 Speaker 1: What I love about those arguments, though, is that the people 296 00:14:32,560 --> 00:14:34,920 Speaker 1: who are making those stagnation arguments, and I'll just go 297 00:14:34,920 --> 00:14:36,560 Speaker 1: ahead and say that I do have Robert Zubrin in 298 00:14:36,560 --> 00:14:39,120 Speaker 1: mind right now, are also the ones who are like, well, 299 00:14:39,160 --> 00:14:42,520 Speaker 1: now is the time, because SpaceX is dropping the cost 300 00:14:42,520 --> 00:14:44,520 Speaker 1: of launching to space, and it's like all of this 301 00:14:44,680 --> 00:14:48,240 Speaker 1: amazing technological growth, and like how AI is helping us, 302 00:14:48,520 --> 00:14:50,640 Speaker 1: is what's getting us there. And so where is the 303 00:14:50,680 --> 00:14:53,360 Speaker 1: stagnation that they're talking about anyway? 304 00:14:53,360 --> 00:14:55,520 Speaker 8: Okay, no, no, that's fair. I mean the problem here, 305 00:14:55,560 --> 00:14:56,960 Speaker 8: and the problem with a lot of these arguments, is 306 00:14:56,960 --> 00:14:58,480 Speaker 8: even if you have people off living on the space 307 00:14:58,520 --> 00:15:00,560 Speaker 8: frontier somewhere, there are still going to be some of us 308 00:15:00,600 --> 00:15:02,480 Speaker 8: living on Earth, needing to learn how to 309 00:15:02,520 --> 00:15:04,200 Speaker 8: live sustainably with the Earth. So we have to figure 310 00:15:04,200 --> 00:15:07,000 Speaker 8: out that part anyway.
So then another big one that 311 00:15:07,000 --> 00:15:10,360 Speaker 8: you see a lot, especially in really strong space settlement 312 00:15:10,400 --> 00:15:13,280 Speaker 8: advocates, is the idea that it's our destiny, that it's 313 00:15:13,280 --> 00:15:17,480 Speaker 8: not just fulfilling our biological or psychological or whatever drive 314 00:15:17,800 --> 00:15:21,160 Speaker 8: to go out and explore and expand, but it's somehow inevitable, 315 00:15:21,280 --> 00:15:24,200 Speaker 8: and if we try to stop ourselves, we're working against nature. 316 00:15:24,320 --> 00:15:26,520 Speaker 8: This is, to me, a much less strong argument. It 317 00:15:26,640 --> 00:15:29,240 Speaker 8: tends to go with very poetic language, so it's appealing, 318 00:15:29,360 --> 00:15:32,080 Speaker 8: you know, emotionally in that sense, and you hear this 319 00:15:32,160 --> 00:15:34,640 Speaker 8: the most in the more flowery marketing talk. But I 320 00:15:34,640 --> 00:15:38,200 Speaker 8: don't think it's a good rational argument. And often when 321 00:15:38,200 --> 00:15:40,720 Speaker 8: I'm talking to space settlement advocates, I try to point 322 00:15:40,720 --> 00:15:42,640 Speaker 8: that out to them. I'm like, listen, you've got a 323 00:15:42,640 --> 00:15:45,480 Speaker 8: great little, you know, video that you've put together with 324 00:15:45,520 --> 00:15:49,040 Speaker 8: stirring music and images of space vistas and talk about 325 00:15:49,080 --> 00:15:52,240 Speaker 8: how it's humanity's destiny. But that's actually not the best 326 00:15:52,320 --> 00:15:54,320 Speaker 8: way to sell it to people who aren't sure.
You 327 00:15:54,360 --> 00:15:57,800 Speaker 8: need to explain how space settlement's going to benefit everyone, 328 00:15:58,000 --> 00:16:01,280 Speaker 8: including the people back on Earth, and you can't use 329 00:16:01,280 --> 00:16:04,040 Speaker 8: this sort of argument from nature, because even if it 330 00:16:04,080 --> 00:16:06,800 Speaker 8: were true that we have some sort of innate drive 331 00:16:06,920 --> 00:16:09,240 Speaker 8: to expand and explore, which, as far as I understand, 332 00:16:09,680 --> 00:16:12,520 Speaker 8: the science is still out on that, we don't just 333 00:16:12,640 --> 00:16:14,960 Speaker 8: do things because our genetics tell us to. Like, you 334 00:16:15,160 --> 00:16:17,720 Speaker 8: and I have evolved past that, and we have to 335 00:16:17,760 --> 00:16:19,920 Speaker 8: think about our decisions instead of just saying, oh, well, 336 00:16:20,000 --> 00:16:22,440 Speaker 8: you know, it's our innate drive, so we don't need 337 00:16:22,480 --> 00:16:23,960 Speaker 8: to examine this plan at all. 338 00:16:24,120 --> 00:16:25,680 Speaker 3: Well, can I jump in and speak up for that 339 00:16:26,280 --> 00:16:29,840 Speaker 3: irrational motivation? Because that's the one that resonates the most 340 00:16:29,880 --> 00:16:32,560 Speaker 3: with me, and rational or not, like, I want to 341 00:16:32,600 --> 00:16:35,480 Speaker 3: know what's out there in the universe. I want us 342 00:16:35,520 --> 00:16:38,480 Speaker 3: to explore the edges of the galaxy. I want us 343 00:16:38,680 --> 00:16:40,840 Speaker 3: to figure out how to go from galaxy to galaxy.
344 00:16:40,840 --> 00:16:43,320 Speaker 3: And it seems to me like step one of that 345 00:16:43,720 --> 00:16:46,880 Speaker 3: dream is to go from planet to planet, right, And 346 00:16:46,960 --> 00:16:51,440 Speaker 3: so my desire doesn't come from like a genetic disposition 347 00:16:51,520 --> 00:16:54,800 Speaker 3: to conquer every square inch of land on every surface, 348 00:16:54,840 --> 00:16:57,280 Speaker 3: but just to know how the universe works, Like I 349 00:16:57,320 --> 00:16:58,800 Speaker 3: feel like it would be a shame for us to 350 00:16:58,800 --> 00:17:01,200 Speaker 3: be trapped on this rock for ever, and I don't 351 00:17:01,240 --> 00:17:03,320 Speaker 3: know how it's going to happen, but eventually has to 352 00:17:03,360 --> 00:17:06,120 Speaker 3: happen if we're going to figure stuff out. Maybe that's irrational, 353 00:17:06,200 --> 00:17:09,520 Speaker 3: but can't irrational motivations drive our decision sometimes? I mean, 354 00:17:09,800 --> 00:17:14,159 Speaker 3: my whole research program is probably useless and impractical and 355 00:17:14,200 --> 00:17:17,920 Speaker 3: therefore irrational. Does that mean that we shouldn't do art 356 00:17:18,040 --> 00:17:19,400 Speaker 3: and culture and stuff like that? 357 00:17:19,800 --> 00:17:21,639 Speaker 8: To be honest, I have the exact same drive. I 358 00:17:21,720 --> 00:17:23,639 Speaker 8: also would love to be out there and want us 359 00:17:23,640 --> 00:17:25,560 Speaker 8: to be exploring and want us to poke around and 360 00:17:25,600 --> 00:17:28,040 Speaker 8: see what's going on, And that certainly motivates me as 361 00:17:28,080 --> 00:17:31,080 Speaker 8: a scientist. And there's nothing wrong with emotional motivations. Let 362 00:17:31,080 --> 00:17:33,320 Speaker 8: me say it that way instead of irrational. 
The key here, 363 00:17:33,359 --> 00:17:36,720 Speaker 8: I think is not to try to use an emotional 364 00:17:36,800 --> 00:17:40,960 Speaker 8: argument as if it's a rational argument, I think. And 365 00:17:41,160 --> 00:17:43,560 Speaker 8: the other key I think is it's important to identify 366 00:17:43,600 --> 00:17:47,320 Speaker 8: when there's multiple motivations going on, especially if your individual 367 00:17:47,359 --> 00:17:51,000 Speaker 8: motivation is maybe different from the organization's argument that they're 368 00:17:51,000 --> 00:17:53,320 Speaker 8: poting forward. I think a lot of people are very 369 00:17:53,480 --> 00:17:56,119 Speaker 8: driven by these emotional feelings that they want to be 370 00:17:56,200 --> 00:17:58,159 Speaker 8: out there, they want to see humanity out there, and 371 00:17:58,200 --> 00:17:59,960 Speaker 8: then they go off and try to cook up a 372 00:18:00,080 --> 00:18:02,800 Speaker 8: good marketing case for it that doesn't acknowledge that they're 373 00:18:02,840 --> 00:18:04,880 Speaker 8: being driven emotionally, I think we need both. 374 00:18:05,119 --> 00:18:06,600 Speaker 3: And let me be clear, I don't want to be 375 00:18:06,680 --> 00:18:08,600 Speaker 3: out there. I want humanity to be out there, but 376 00:18:08,680 --> 00:18:10,680 Speaker 3: I'm staying here on Earth like I'm a total wimp. 377 00:18:10,760 --> 00:18:12,480 Speaker 3: There's no way you're getting me in the spaceship. 378 00:18:14,320 --> 00:18:16,040 Speaker 8: And the other important thing, like I'm not trying to 379 00:18:16,040 --> 00:18:19,520 Speaker 8: say some arguments count and some don't. I think it's 380 00:18:19,600 --> 00:18:22,320 Speaker 8: just really important to figure out what your motivations are, 381 00:18:22,600 --> 00:18:25,120 Speaker 8: not just because then you can decide, like the binary 382 00:18:25,200 --> 00:18:27,840 Speaker 8: yes or no, should you do this? 
But our motivations 383 00:18:27,880 --> 00:18:30,879 Speaker 8: will drive our decision making in every aspect of our planning, 384 00:18:31,119 --> 00:18:33,199 Speaker 8: and so it's important to be very self aware of 385 00:18:33,240 --> 00:18:35,720 Speaker 8: that and aware of the people that you're working with 386 00:18:35,760 --> 00:18:38,280 Speaker 8: so that you don't end up having an obstacle in 387 00:18:38,320 --> 00:18:40,520 Speaker 8: your decision making because you didn't realize one of you 388 00:18:40,600 --> 00:18:43,080 Speaker 8: is trying to make a trillion dollars from space mining 389 00:18:43,080 --> 00:18:45,560 Speaker 8: and the other one just wants to retire in space. 390 00:18:45,640 --> 00:18:47,520 Speaker 8: That's going to lead to some different decision making. 391 00:18:48,280 --> 00:18:52,280 Speaker 3: Well, I think most people probably feel like space exploration 392 00:18:52,480 --> 00:18:55,840 Speaker 3: is overall good, and maybe there's different motivations. And I 393 00:18:55,880 --> 00:18:57,720 Speaker 3: totally agree with you that we should be upfront about 394 00:18:57,760 --> 00:19:00,159 Speaker 3: what they are, But I feel like the context for 395 00:19:00,520 --> 00:19:03,680 Speaker 3: understanding which these arguments are good or bad or where 396 00:19:03,680 --> 00:19:05,800 Speaker 3: they come from is that there are arguments on the 397 00:19:05,800 --> 00:19:08,480 Speaker 3: other side, right, that there are reasons to not just 398 00:19:08,640 --> 00:19:12,199 Speaker 3: rush headlong into space exploration. And that might come as 399 00:19:12,240 --> 00:19:14,639 Speaker 3: a surprise to some folks, Like what's an argument for 400 00:19:15,000 --> 00:19:17,720 Speaker 3: not settling space? What's an example of a reason why 401 00:19:17,800 --> 00:19:21,640 Speaker 3: we should not be building colonies on Mars Organmede. 
402 00:19:21,760 --> 00:19:24,199 Speaker 8: That's a very important question, and I actually think that 403 00:19:24,280 --> 00:19:27,919 Speaker 8: all of these arguments for space have their opposite argument. 404 00:19:28,160 --> 00:19:30,879 Speaker 8: So one argument for space settlement of space expression that 405 00:19:30,880 --> 00:19:32,600 Speaker 8: I didn't list is just that we could make money 406 00:19:32,640 --> 00:19:35,879 Speaker 8: doing it. But it also costs an enormous amount of 407 00:19:35,880 --> 00:19:38,600 Speaker 8: money to do any of these projects, especially the ones 408 00:19:38,640 --> 00:19:41,439 Speaker 8: that are being proposed for the future. And so you know, 409 00:19:41,520 --> 00:19:45,720 Speaker 8: from a very pragmatic financial perspective, we can say, as 410 00:19:45,880 --> 00:19:49,600 Speaker 8: a species, as humanity, or as individual investors, is it 411 00:19:49,640 --> 00:19:51,560 Speaker 8: worth the huge amount of time and money that we 412 00:19:51,640 --> 00:19:54,200 Speaker 8: need to invest. Will we get that back or will 413 00:19:54,240 --> 00:19:56,600 Speaker 8: we just kind of be pouring it down the drain 414 00:19:56,960 --> 00:19:58,879 Speaker 8: and then a few rockets will explode in the industry 415 00:19:58,880 --> 00:20:00,760 Speaker 8: will collapse. That's a decision. 416 00:20:01,040 --> 00:20:04,720 Speaker 3: Should we be building more elementary schools or funding space explorations. 417 00:20:04,760 --> 00:20:07,359 Speaker 8: Yeah, is there something else we could be putting that 418 00:20:07,440 --> 00:20:10,960 Speaker 8: money and brain power into on Earth that would help 419 00:20:11,280 --> 00:20:14,040 Speaker 8: better ensure that humanity is going to survive, better ensure 420 00:20:14,080 --> 00:20:16,400 Speaker 8: that people on Earth will have better lives. 421 00:20:16,240 --> 00:20:19,879 Speaker 3: More parasitology grants, for example, exactly. 
422 00:20:19,400 --> 00:20:20,280 Speaker 8: The most important. 423 00:20:20,480 --> 00:20:22,960 Speaker 2: That's right. I'm glad we all agree. 424 00:20:23,240 --> 00:20:25,560 Speaker 8: And on the existential risk idea, the idea that if 425 00:20:25,600 --> 00:20:28,520 Speaker 8: we don't escape the planet, then our species risks being 426 00:20:28,560 --> 00:20:31,600 Speaker 8: killed off, has its reverse argument as well. There are 427 00:20:31,680 --> 00:20:34,000 Speaker 8: people Daniel Dudney is one of them. If he wrote 428 00:20:34,040 --> 00:20:35,760 Speaker 8: a book called Dark Skies in the past few years 429 00:20:35,920 --> 00:20:39,760 Speaker 8: who argues that trying to settle space will increase the 430 00:20:39,880 --> 00:20:42,760 Speaker 8: risk that we will kill ourselves off. Because he makes 431 00:20:42,760 --> 00:20:47,440 Speaker 8: this political science argument that more activity in space will 432 00:20:47,440 --> 00:20:51,399 Speaker 8: mean more nations arguing with each other about what's happening 433 00:20:51,440 --> 00:20:54,960 Speaker 8: in space, and more weapons in space, potentially weapons of 434 00:20:55,040 --> 00:20:59,119 Speaker 8: mass destruction. We could destabilize the global balance and end 435 00:20:59,200 --> 00:21:00,840 Speaker 8: up in a nuclear war that would kill us off 436 00:21:00,880 --> 00:21:03,920 Speaker 8: because we tried to leave the Earth, and so there 437 00:21:03,960 --> 00:21:07,320 Speaker 8: are risks. There are risks that going to space could 438 00:21:07,359 --> 00:21:09,920 Speaker 8: lead to nuclear war. There's also just risks that the 439 00:21:10,000 --> 00:21:13,280 Speaker 8: opportunity cost, we could be spending that money elsewhere. 
There's 440 00:21:13,400 --> 00:21:16,800 Speaker 8: risks that if we do it too fast without deliberate thought, 441 00:21:16,880 --> 00:21:19,080 Speaker 8: that will just make a mess of things, and that 442 00:21:19,160 --> 00:21:21,080 Speaker 8: maybe it would be better for everyone in space and 443 00:21:21,119 --> 00:21:23,280 Speaker 8: on Earth if we took it a little slower. And 444 00:21:23,359 --> 00:21:25,200 Speaker 8: so I've heard all these arguments, and I think it's 445 00:21:25,200 --> 00:21:27,800 Speaker 8: really important. Even if you're someone like me who thinks 446 00:21:28,000 --> 00:21:31,080 Speaker 8: I'm pro space settlement, I think it's it's important to 447 00:21:31,119 --> 00:21:33,639 Speaker 8: listen to all these arguments, including the people who say no, 448 00:21:33,760 --> 00:21:35,920 Speaker 8: we're not ready to go to space, because they tend 449 00:21:35,920 --> 00:21:39,359 Speaker 8: to have the best and most observant criticisms of the plans, 450 00:21:39,400 --> 00:21:41,360 Speaker 8: so they can poke holes in your big plans, which 451 00:21:41,359 --> 00:21:41,880 Speaker 8: is important. 452 00:21:41,920 --> 00:21:44,080 Speaker 3: Well, that makes sense as much as I'm, for example, 453 00:21:44,400 --> 00:21:48,280 Speaker 3: an advocate of exploring the universe and potentially meeting aliens. 454 00:21:48,359 --> 00:21:50,960 Speaker 3: Is also the risk that if you advertise your location 455 00:21:51,040 --> 00:21:52,760 Speaker 3: to meet aliens, they're like, oh, why don't we come 456 00:21:52,760 --> 00:21:54,840 Speaker 3: and kill you and eat you or nuq from orbit. 457 00:21:54,960 --> 00:21:56,919 Speaker 3: So yeah, absolutely there are risks. 458 00:21:56,640 --> 00:21:58,919 Speaker 8: There well, and even scientists. I'm a scientist too, I'm 459 00:21:58,960 --> 00:22:02,240 Speaker 8: motivated by no and stuff. 
But we have plenty of history, 460 00:22:02,359 --> 00:22:06,200 Speaker 8: especially in non astronomy fields of examples where learning stuff 461 00:22:06,240 --> 00:22:08,399 Speaker 8: and being able to build new stuff led us to 462 00:22:08,600 --> 00:22:10,720 Speaker 8: cause a lot of harm from the discoveries and the 463 00:22:10,720 --> 00:22:13,479 Speaker 8: technologies made, which to me isn't an argument to not 464 00:22:13,560 --> 00:22:16,800 Speaker 8: do science, but it's an argument to think about all 465 00:22:16,880 --> 00:22:19,080 Speaker 8: the ethics and human rights stuff at the same time 466 00:22:19,200 --> 00:22:20,800 Speaker 8: that we're doing the science and technology. 467 00:22:21,400 --> 00:22:23,119 Speaker 1: The rockets that took us to the Moon are the 468 00:22:23,160 --> 00:22:25,960 Speaker 1: same rockets that we're being dropped on England during World 469 00:22:26,000 --> 00:22:29,960 Speaker 1: War two and are puplicated historically exactly. 470 00:22:30,600 --> 00:22:33,520 Speaker 3: So we're not saying, hey, space is bad, our exploration 471 00:22:33,640 --> 00:22:36,320 Speaker 3: is bad. We're saying there's nuance here. We should understand 472 00:22:36,320 --> 00:22:38,919 Speaker 3: the motivation of our arguments. We should acknowledge that there 473 00:22:38,920 --> 00:22:41,040 Speaker 3: are two sides of this, and we should proceed carefully 474 00:22:41,320 --> 00:22:44,840 Speaker 3: rather than just rushing headlong into it and launching stuff 475 00:22:44,840 --> 00:22:45,400 Speaker 3: because we can. 476 00:22:45,760 --> 00:22:49,320 Speaker 8: That is my arch Other arguments exist. 477 00:23:07,040 --> 00:23:10,119 Speaker 1: I think it's reasonable to assume that probably this stuff 478 00:23:10,160 --> 00:23:12,280 Speaker 1: is going to move forward at some point no matter 479 00:23:12,359 --> 00:23:14,800 Speaker 1: what the argument is. 
Right now, most of the people 480 00:23:14,840 --> 00:23:17,880 Speaker 1: who have been to space are highly trained, highly fit 481 00:23:18,280 --> 00:23:22,359 Speaker 1: astronauts which were picked by governments or super rich people, 482 00:23:22,720 --> 00:23:25,600 Speaker 1: or the people picked by super rich people as space 483 00:23:25,640 --> 00:23:29,200 Speaker 1: tourism kind of opens up. Right now, the sieve seems 484 00:23:29,240 --> 00:23:31,359 Speaker 1: to be like money or being you know, cream of 485 00:23:31,359 --> 00:23:35,120 Speaker 1: the crop intellectual, physical, But presumably settlements at some point 486 00:23:35,280 --> 00:23:37,280 Speaker 1: are going to be just the rest of us, people 487 00:23:37,359 --> 00:23:39,120 Speaker 1: like me who would never get it, who could never 488 00:23:39,200 --> 00:23:41,359 Speaker 1: afford it, and could never like get picked by NASA. 489 00:23:41,760 --> 00:23:44,160 Speaker 1: So when that stage comes, what are the important things 490 00:23:44,160 --> 00:23:46,600 Speaker 1: that we need to think about for who is going 491 00:23:46,640 --> 00:23:48,280 Speaker 1: to be there in the first settlement and how we 492 00:23:48,359 --> 00:23:49,199 Speaker 1: pick these people? 493 00:23:49,400 --> 00:23:51,320 Speaker 2: And it sounds complicated. 494 00:23:51,480 --> 00:23:53,359 Speaker 8: Yeah, deciding who even gets to go to space is 495 00:23:53,400 --> 00:23:55,080 Speaker 8: a problem we do with now. As you said, the 496 00:23:55,119 --> 00:23:58,800 Speaker 8: increase in private spaceflight really felt like a huge opening 497 00:23:58,840 --> 00:24:00,600 Speaker 8: of the field because suddenly you just have to be 498 00:24:00,960 --> 00:24:03,439 Speaker 8: a fit military test pilot or picked by NASA. 
Now 499 00:24:03,440 --> 00:24:05,280 Speaker 8: we have this whole other set of people who get 500 00:24:05,320 --> 00:24:07,120 Speaker 8: to go to space, who don't necessarily need to meet 501 00:24:07,119 --> 00:24:11,199 Speaker 8: the fitness requirements. But this new pool not necessarily that 502 00:24:11,280 --> 00:24:13,719 Speaker 8: big either, as you point out, it's still pretty limited, 503 00:24:14,000 --> 00:24:16,920 Speaker 8: and we can think about how we'd want to keep 504 00:24:16,960 --> 00:24:20,040 Speaker 8: expanding that, and we can think about, well, so the 505 00:24:20,080 --> 00:24:22,240 Speaker 8: people we've picked so far have been for specific purposes. 506 00:24:22,320 --> 00:24:25,760 Speaker 8: NASA is trying to accomplish certain missions, mostly science exploration, 507 00:24:25,880 --> 00:24:28,679 Speaker 8: technology testing. They pick people who can do that. The 508 00:24:29,040 --> 00:24:32,920 Speaker 8: private space flight people have their own motivations. In this stage, 509 00:24:32,920 --> 00:24:35,640 Speaker 8: it seems to be just raw raw space exploration is good. 510 00:24:35,680 --> 00:24:38,000 Speaker 8: Let's send the first whatever person to space on private 511 00:24:38,000 --> 00:24:40,280 Speaker 8: space flight. That's great. But if we were going to 512 00:24:40,280 --> 00:24:42,679 Speaker 8: build a space settlement that we wanted it to last, 513 00:24:43,400 --> 00:24:46,080 Speaker 8: then that leads to bigger questions. 
That's much different than 514 00:24:46,119 --> 00:24:48,399 Speaker 8: what NASA is trying to do, and so we have 515 00:24:48,440 --> 00:24:50,959 Speaker 8: to ask questions like, maybe we don't need the strict 516 00:24:51,800 --> 00:24:55,159 Speaker 8: physical and medical requirements because people are going to age 517 00:24:55,480 --> 00:24:57,280 Speaker 8: and get sick and injured in space, so we'll have 518 00:24:57,320 --> 00:24:59,480 Speaker 8: to deal with that anyway. Maybe we need to think 519 00:24:59,480 --> 00:25:03,399 Speaker 8: about reproductive fitness if we want to have reproduction in 520 00:25:03,440 --> 00:25:06,760 Speaker 8: space and perpetuated space colony with multiple generations. But then 521 00:25:07,520 --> 00:25:11,479 Speaker 8: asking prospective space settlers about their fertility and their interest 522 00:25:11,560 --> 00:25:13,840 Speaker 8: in having children. That goes against a lot of our 523 00:25:13,880 --> 00:25:18,320 Speaker 8: ethics and our arguments for inclusivity today, and that leads 524 00:25:18,359 --> 00:25:20,400 Speaker 8: to a lot of murky questions as well. And then 525 00:25:20,400 --> 00:25:22,639 Speaker 8: there's just who do you pick? Do you pick a 526 00:25:22,640 --> 00:25:25,160 Speaker 8: bunch of astrophysicists I'm in a fan of that plant, 527 00:25:25,160 --> 00:25:27,080 Speaker 8: but I also would like to be living on Mars 528 00:25:27,119 --> 00:25:30,399 Speaker 8: with maybe some dentists and plumbers and teachers for the 529 00:25:30,520 --> 00:25:34,120 Speaker 8: kids and artists as well. And so all of these 530 00:25:34,200 --> 00:25:37,360 Speaker 8: questions get us back to questions about, well, what kind 531 00:25:37,359 --> 00:25:40,480 Speaker 8: of person do we actually value in society which has 532 00:25:40,520 --> 00:25:42,800 Speaker 8: this dark other side of what kind of person don't 533 00:25:42,840 --> 00:25:45,760 Speaker 8: revalue in society? 
And do we want to intentionally exclude them? 534 00:25:45,800 --> 00:25:48,879 Speaker 8: What kind of a were creating a dystopia here? Accidentally? 535 00:25:49,080 --> 00:25:50,600 Speaker 8: There's tons of questions like that. 536 00:25:51,200 --> 00:25:54,560 Speaker 3: It's really interesting how this stuff is decided. I feel 537 00:25:54,600 --> 00:25:58,639 Speaker 3: like a lot of times in brand new technologies, we 538 00:25:58,720 --> 00:26:02,840 Speaker 3: have decisions being made mostly government right funding agencies because 539 00:26:02,880 --> 00:26:06,760 Speaker 3: there isn't initially a commercial return, but eventually things get 540 00:26:06,760 --> 00:26:09,399 Speaker 3: picked up by private industry, and then decisions are being 541 00:26:09,400 --> 00:26:12,720 Speaker 3: made basically because of capitalism. Right, So where do you 542 00:26:12,760 --> 00:26:15,920 Speaker 3: think we are there and how should we be making 543 00:26:15,920 --> 00:26:19,240 Speaker 3: these decisions? Should we let capitalism decide basically who gets 544 00:26:19,280 --> 00:26:21,919 Speaker 3: to go live on Mars? Or should we have like 545 00:26:22,240 --> 00:26:26,480 Speaker 3: you know, table of experts and government advisors figuring this 546 00:26:26,520 --> 00:26:28,960 Speaker 3: stuff out for the benefit of society as a whole. 547 00:26:29,240 --> 00:26:31,280 Speaker 3: Would it happen if it's led by the government, like, 548 00:26:31,440 --> 00:26:34,280 Speaker 3: at some point it needs to become capitalism driven, I 549 00:26:34,280 --> 00:26:36,359 Speaker 3: feel like, but I don't really know anything about it. 550 00:26:36,400 --> 00:26:39,560 Speaker 3: Tell me what's your thoughts about that transition for space settlement. 
551 00:26:40,000 --> 00:26:42,760 Speaker 8: There's definitely a lot of arguments right now that capitalism 552 00:26:42,840 --> 00:26:44,560 Speaker 8: is the only way to get us into space because 553 00:26:44,560 --> 00:26:46,879 Speaker 8: you need that sort of incentive to innovate and to 554 00:26:46,920 --> 00:26:48,960 Speaker 8: take risks, and government didn't have it, which is why 555 00:26:49,000 --> 00:26:51,760 Speaker 8: we're not all living in space right now, so capitalism 556 00:26:51,800 --> 00:26:53,480 Speaker 8: is the way to do it. I think that those 557 00:26:53,560 --> 00:26:56,680 Speaker 8: arguments are so popular right now because capitalism is what's 558 00:26:56,760 --> 00:27:00,520 Speaker 8: driving current space exploration, along with the existing government. They 559 00:27:00,520 --> 00:27:03,600 Speaker 8: haven't gone away, but I'm not convinced that they're going 560 00:27:03,640 --> 00:27:06,800 Speaker 8: to finish the job. I'm not convinced that capitalism is 561 00:27:06,840 --> 00:27:08,760 Speaker 8: going to be what gets us into space. And I 562 00:27:08,760 --> 00:27:11,360 Speaker 8: think it's possible that generations from now, what finally gets 563 00:27:11,400 --> 00:27:13,040 Speaker 8: us a permanent space settlement is going to be some 564 00:27:13,119 --> 00:27:17,080 Speaker 8: other political and economic system, and by then they'll be saying, well, 565 00:27:17,119 --> 00:27:18,920 Speaker 8: you know, communism is the only way we could have 566 00:27:18,960 --> 00:27:21,479 Speaker 8: done this, and this is why we need a tyrannical government. 567 00:27:21,480 --> 00:27:21,639 Speaker 4: You know. 568 00:27:21,800 --> 00:27:23,920 Speaker 8: They could have their own argument for why their system 569 00:27:24,160 --> 00:27:26,280 Speaker 8: is perfect. 
And the interesting thing is that each of 570 00:27:26,280 --> 00:27:28,159 Speaker 8: these systems is going to come up with a different 571 00:27:28,160 --> 00:27:31,879 Speaker 8: way to pick who gets to go live in space. 572 00:27:31,920 --> 00:27:34,480 Speaker 8: So for now, maybe it's well, whoever can pay, whoever 573 00:27:34,480 --> 00:27:38,160 Speaker 8: can pay for a ticket on a SpaceX passenger starliner 574 00:27:38,280 --> 00:27:41,560 Speaker 8: to Mars gets to go. We don't care whether their 575 00:27:41,600 --> 00:27:44,760 Speaker 8: dentists or not. We don't care that maybe it'll be 576 00:27:45,000 --> 00:27:47,440 Speaker 8: a bunch of particularly wealthy people and none of them 577 00:27:47,480 --> 00:27:50,280 Speaker 8: know how to farm, for example, and that could lead 578 00:27:50,280 --> 00:27:51,879 Speaker 8: to its own troubles. This is the sort of thing 579 00:27:51,880 --> 00:27:54,240 Speaker 8: we've seen historically happen on Earth. If you had a 580 00:27:54,280 --> 00:27:56,200 Speaker 8: bunch of rich colonists and nobody who knows how to 581 00:27:56,240 --> 00:27:58,760 Speaker 8: do the work. That causes a lot of problems, or 582 00:27:59,160 --> 00:28:01,840 Speaker 8: it could end up we swing back towards some sort 583 00:28:01,840 --> 00:28:05,520 Speaker 8: of international government efforts, and they take a long time 584 00:28:05,560 --> 00:28:07,000 Speaker 8: to decide who to send, and they have a lot 585 00:28:07,040 --> 00:28:09,359 Speaker 8: of arguments and committees about it, and they try to 586 00:28:09,400 --> 00:28:14,360 Speaker 8: balance everything fairly. And there's other risks with that as well, 587 00:28:14,400 --> 00:28:17,120 Speaker 8: and including the fact that maybe that goes slower than 588 00:28:17,160 --> 00:28:19,159 Speaker 8: doing it in a capitalist way. So notice here that 589 00:28:19,200 --> 00:28:21,560 Speaker 8: I'm not saying here's the correct way for it. 
I'm 590 00:28:21,600 --> 00:28:24,560 Speaker 8: saying this is another example of how your motivation will 591 00:28:24,680 --> 00:28:27,639 Speaker 8: drive the choices that you make. It could come down to, 592 00:28:28,400 --> 00:28:31,320 Speaker 8: we've developed all the technology, but haven't invested in a 593 00:28:31,359 --> 00:28:34,480 Speaker 8: space settlement yet. But now the asteroid's coming and we 594 00:28:34,560 --> 00:28:37,560 Speaker 8: have to evacuate some amount of people to perpetuate the 595 00:28:37,640 --> 00:28:40,000 Speaker 8: human species. And then you have all sorts of what 596 00:28:40,040 --> 00:28:44,640 Speaker 8: we call lifeboat ethics Titanic level, like the ship disaster 597 00:28:44,800 --> 00:28:47,360 Speaker 8: Titanic level, with decisions about who we put on the 598 00:28:47,400 --> 00:28:50,360 Speaker 8: lifeboat and what's fair, who do we save, how do 599 00:28:50,400 --> 00:28:53,560 Speaker 8: we make sure that it's enough dentists, whatnot, and so 600 00:28:53,560 --> 00:28:55,880 Speaker 8: that would cause a whole different set of decisions. 601 00:28:56,640 --> 00:28:59,880 Speaker 3: I love the dentists are your standard like pretty use 602 00:29:00,280 --> 00:29:02,800 Speaker 3: but not very exciting kind of categories. 603 00:29:02,880 --> 00:29:04,920 Speaker 8: It's really important to me. The dentists are in space. 604 00:29:04,960 --> 00:29:07,600 Speaker 3: We love you dentists. It's important. But I want to 605 00:29:07,600 --> 00:29:09,880 Speaker 3: follow up on the historical comments you made about what 606 00:29:09,920 --> 00:29:12,840 Speaker 3: we've learned, Like clearly we've done this kind of stuff 607 00:29:12,840 --> 00:29:15,240 Speaker 3: in the past. 
You know, we gave away land to 608 00:29:15,240 --> 00:29:17,920 Speaker 3: people to encourage them to move west, or we stole 609 00:29:18,000 --> 00:29:20,960 Speaker 3: land and then gave it to people, and even further back, 610 00:29:21,120 --> 00:29:25,440 Speaker 3: kings and queens funded expeditions to the New World because 611 00:29:25,480 --> 00:29:28,440 Speaker 3: they thought there were returns there. What have we learned 612 00:29:28,440 --> 00:29:31,480 Speaker 3: from our many mistakes and successes in the past. 613 00:29:31,640 --> 00:29:34,160 Speaker 8: Some of us have learned from over stays is the 614 00:29:34,200 --> 00:29:37,160 Speaker 8: key I think the benefit we have in thinking about 615 00:29:37,600 --> 00:29:40,520 Speaker 8: expanding human civilization into space. We have all these case 616 00:29:40,520 --> 00:29:43,280 Speaker 8: studies to look at lessons learned over the history of 617 00:29:43,320 --> 00:29:45,080 Speaker 8: the human species, and the benefit we have is that 618 00:29:45,120 --> 00:29:47,760 Speaker 8: we have all of human history to learn from. So 619 00:29:47,800 --> 00:29:49,600 Speaker 8: to me, it's a waste when we don't do that. 620 00:29:50,000 --> 00:29:52,920 Speaker 8: And at the moment, I can say from experience that scientists, 621 00:29:52,920 --> 00:29:55,440 Speaker 8: at least in the US, don't get a lot of 622 00:29:55,560 --> 00:29:58,760 Speaker 8: education in history beyond you know, basic grade school level. 623 00:29:59,040 --> 00:30:01,760 Speaker 8: We tend to just beyond it and think it's irrelevant. 624 00:30:01,760 --> 00:30:05,920 Speaker 8: But now we're trying to, you know, reinvent colonial sized societies, 625 00:30:06,480 --> 00:30:08,960 Speaker 8: and without having a reference, we're just going to repeat 626 00:30:09,000 --> 00:30:11,520 Speaker 8: all the same mistakes. 
That's my concern, and this is 627 00:30:11,520 --> 00:30:13,880 Speaker 8: part of what got me going with the podcast I created. 628 00:30:13,920 --> 00:30:17,560 Speaker 8: I decided maybe I should talk to actual colonial historians 629 00:30:17,920 --> 00:30:19,840 Speaker 8: and ask them their thoughts, and you know, I should 630 00:30:19,840 --> 00:30:22,560 Speaker 8: read up because as an American, this is mostly the 631 00:30:22,560 --> 00:30:25,760 Speaker 8: colonial history that I think about. So there's examples in 632 00:30:25,800 --> 00:30:28,920 Speaker 8: the colonization of North America, places like Jamestown, which is 633 00:30:28,920 --> 00:30:31,360 Speaker 8: a colony that failed a bunch before it finally stuck. 634 00:30:31,880 --> 00:30:35,040 Speaker 8: And the historians I read this, a lot went wrong 635 00:30:35,240 --> 00:30:37,720 Speaker 8: with Jamestown. One of the issues is that they had 636 00:30:37,720 --> 00:30:41,200 Speaker 8: a lot of more well off colonists and not a 637 00:30:41,240 --> 00:30:43,440 Speaker 8: lot of laborers to do the work, and so they 638 00:30:43,480 --> 00:30:46,240 Speaker 8: struggled to do the basic work. They struggled to satisfy 639 00:30:46,360 --> 00:30:48,880 Speaker 8: the company that sent them. The company wanted them to 640 00:30:49,280 --> 00:30:51,720 Speaker 8: make stuff for them to sell, cut lumber, et cetera. 641 00:30:52,000 --> 00:30:54,120 Speaker 8: Didn't give them enough time to build their farms and such. 642 00:30:54,440 --> 00:30:57,440 Speaker 8: That's something you can imagining happening in a space settlement. 643 00:30:58,040 --> 00:31:00,040 Speaker 8: All sorts of lessons that we can learn from the 644 00:31:00,080 --> 00:31:02,600 Speaker 8: pass and not just to North America but around the world. 
645 00:31:02,640 --> 00:31:04,640 Speaker 8: And it's just important that we bothered to do that 646 00:31:04,640 --> 00:31:06,600 Speaker 8: because it's going to save us so much time and 647 00:31:06,680 --> 00:31:10,360 Speaker 8: pain to at least recognize what could go wrong before 648 00:31:10,360 --> 00:31:12,240 Speaker 8: we repeat the same path in space. 649 00:31:12,560 --> 00:31:16,440 Speaker 1: What do you think are the most relevant historical analog So, 650 00:31:16,520 --> 00:31:19,280 Speaker 1: like you know, Daniel mentioned settling of the West, and 651 00:31:19,320 --> 00:31:21,040 Speaker 1: I have read people who would love to see like 652 00:31:21,080 --> 00:31:24,000 Speaker 1: a space Homestead Act sort of thing. I read a 653 00:31:24,000 --> 00:31:25,960 Speaker 1: bunch of that literature and then I decided, well, this 654 00:31:26,000 --> 00:31:27,320 Speaker 1: isn't going to be a big part of a city 655 00:31:27,320 --> 00:31:31,760 Speaker 1: on Mars because Antarctica and the deep seabed are more analogous. 656 00:31:31,800 --> 00:31:32,880 Speaker 1: But what do you think, And of course, if you 657 00:31:32,880 --> 00:31:34,480 Speaker 1: talk to a lot of space settlement people, they'll tell 658 00:31:34,480 --> 00:31:36,320 Speaker 1: you that the Outer Space Treaty is going to get 659 00:31:36,320 --> 00:31:38,160 Speaker 1: thrown out the window and we can have a Homestead Act. 660 00:31:38,200 --> 00:31:39,800 Speaker 1: So what do you think is the most interesting to 661 00:31:39,800 --> 00:31:42,280 Speaker 1: study historically as it relates to settlement. 662 00:31:42,840 --> 00:31:44,600 Speaker 8: I think there's two big categories of it. 
One, I 663 00:31:44,600 --> 00:31:48,640 Speaker 8: think we should be looking at the Western countries colonization 664 00:31:49,000 --> 00:31:50,720 Speaker 8: of as much of the world as they could grab, 665 00:31:50,800 --> 00:31:53,720 Speaker 8: and the history of that, what they tried, what went 666 00:31:54,400 --> 00:31:57,320 Speaker 8: well and what didn't, and in particularly what went well 667 00:31:57,360 --> 00:32:01,040 Speaker 8: for them and didn't, but all also what went well 668 00:32:01,080 --> 00:32:03,560 Speaker 8: and didn't for the less powerful people in those societies, 669 00:32:03,640 --> 00:32:05,040 Speaker 8: obviously the indigenous people. 670 00:32:05,440 --> 00:32:05,960 Speaker 2: Over time? 671 00:32:06,000 --> 00:32:09,800 Speaker 8: What is colonization with that particular attitude of colonization, What 672 00:32:09,840 --> 00:32:11,480 Speaker 8: does that do to the environment, What does that do 673 00:32:11,560 --> 00:32:15,160 Speaker 8: for future generations, what does that do for marginalized labor workforces? 674 00:32:15,200 --> 00:32:18,200 Speaker 8: And how can we not go that way in the future. 675 00:32:18,360 --> 00:32:20,200 Speaker 8: I think it's also really crucial, and I don't think 676 00:32:20,200 --> 00:32:23,120 Speaker 8: there's enough of this yet. 
In talks about space ethics, 677 00:32:23,400 --> 00:32:26,120 Speaker 8: I think it's really crucial that we learn from the 678 00:32:26,160 --> 00:32:29,680 Speaker 8: parts of human history where humans migrated and expanded into 679 00:32:29,720 --> 00:32:33,120 Speaker 8: places where no humans were living yet, much earlier than colonization, 680 00:32:33,640 --> 00:32:36,960 Speaker 8: learn how that went well and didn't, because Western people 681 00:32:37,000 --> 00:32:39,840 Speaker 8: didn't invent moving to a new place and building a 682 00:32:39,840 --> 00:32:43,360 Speaker 8: new community there. They just did it after everyone already 683 00:32:43,400 --> 00:32:45,959 Speaker 8: had inhabited those places, and decided to do it all 684 00:32:45,960 --> 00:32:49,680 Speaker 8: over again to grab their resources. And so learning more from, 685 00:32:49,800 --> 00:32:53,800 Speaker 8: for example, indigenous cultures that expanded across North America, South America, 686 00:32:53,840 --> 00:33:00,200 Speaker 8: Australia, and how they learned to coexist in a new, desolate, isolated, 687 00:33:00,880 --> 00:33:05,040 Speaker 8: harsh environment, and not only just the practical, how 688 00:33:05,080 --> 00:33:07,800 Speaker 8: did they interact with the environments, but how their cultures 689 00:33:07,840 --> 00:33:12,600 Speaker 8: developed around sustainability or settling conflicts within their communities and 690 00:33:12,640 --> 00:33:15,360 Speaker 8: with other communities. I think that's all really valuable because 691 00:33:15,720 --> 00:33:19,440 Speaker 8: they have had communities and societies in these places for 692 00:33:19,600 --> 00:33:23,680 Speaker 8: tens of thousands of years, much longer than the postcolonization era 693 00:33:23,760 --> 00:33:25,600 Speaker 8: has existed.
And I think there's so much we can 694 00:33:25,680 --> 00:33:28,719 Speaker 8: learn from how that worked, and to just present an 695 00:33:28,760 --> 00:33:31,320 Speaker 8: alternative to colonization, because a lot of times, if you say, 696 00:33:31,320 --> 00:33:34,560 Speaker 8: maybe we don't want to use this wild West colonization 697 00:33:34,640 --> 00:33:37,760 Speaker 8: model in space, Westerners kind of shrug and go, okay, well, 698 00:33:37,840 --> 00:33:39,959 Speaker 8: you know, so you want me to not go to space? 699 00:33:40,080 --> 00:33:41,880 Speaker 8: There are actually other options here. 700 00:33:42,840 --> 00:33:47,360 Speaker 1: So I've read a bunch of astronaut memoirs, and reading 701 00:33:47,400 --> 00:33:50,280 Speaker 1: those memoirs makes it very clear to me that astronauts 702 00:33:50,320 --> 00:33:53,600 Speaker 1: are humans too, and that humans, I know, right, and 703 00:33:53,640 --> 00:33:56,320 Speaker 1: that humans go to space and they come back and 704 00:33:56,400 --> 00:33:59,880 Speaker 1: still do bad human things sometimes, which makes me wonder 705 00:34:00,640 --> 00:34:03,840 Speaker 1: what is governance and crime and punishment and stuff like 706 00:34:03,840 --> 00:34:07,440 Speaker 1: that going to look like on a Martian settlement. What 707 00:34:07,480 --> 00:34:09,680 Speaker 1: are your thoughts? Who's going to be in control in 708 00:34:09,719 --> 00:34:12,280 Speaker 1: an early settlement when a crime happens, and what should 709 00:34:12,280 --> 00:34:13,320 Speaker 1: we do about that stuff? 710 00:34:13,880 --> 00:34:14,160 Speaker 2: Yeah? 711 00:34:14,200 --> 00:34:16,840 Speaker 8: I think this is another question that hasn't been studied 712 00:34:16,920 --> 00:34:20,839 Speaker 8: enough except by the reliable science fiction writers. We've been 713 00:34:20,920 --> 00:34:23,319 Speaker 8: thinking about this stuff for ages.
We think a lot 713 00:34:23,360 --> 00:34:25,279 Speaker 8: about how we're going to survive in space and the 714 00:34:25,360 --> 00:34:27,879 Speaker 8: legal stuff about you know, the Homestead Act, who gets 715 00:34:27,880 --> 00:34:30,560 Speaker 8: to own Mars, et cetera. But I don't think we've 716 00:34:30,560 --> 00:34:32,680 Speaker 8: thought enough about the idea that, you know, hell is 717 00:34:32,719 --> 00:34:36,160 Speaker 8: other people, right, and so maybe our biggest dangers in 718 00:34:36,239 --> 00:34:38,359 Speaker 8: space will be the people that we go up there with. 719 00:34:38,880 --> 00:34:42,560 Speaker 8: And that's the extreme argument for it. There's also just 720 00:34:42,560 --> 00:34:44,120 Speaker 8: the more minor: you know, what if you just have 721 00:34:44,560 --> 00:34:48,279 Speaker 8: a disagreement with a neighbor over something, whose turn it 722 00:34:48,320 --> 00:34:50,920 Speaker 8: is to take out the trash, or you stole my 723 00:34:51,400 --> 00:34:54,000 Speaker 8: space suit, whatnot? What do we do about it? And 724 00:34:54,080 --> 00:34:56,120 Speaker 8: here on Earth we have laws, we have legal systems, 725 00:34:56,520 --> 00:35:01,000 Speaker 8: we have various different ways to address crime, but we 726 00:35:01,040 --> 00:35:03,280 Speaker 8: need something in space. We can't just assume that everyone 727 00:35:03,360 --> 00:35:07,239 Speaker 8: will just perfectly get along. But it's unclear how 728 00:35:07,239 --> 00:35:08,919 Speaker 8: a lot of that's going to work. The Outer Space 729 00:35:08,960 --> 00:35:11,320 Speaker 8: Treaty says people in space have to follow the laws 730 00:35:11,360 --> 00:35:14,600 Speaker 8: of their countries back home.
That sort of handles the 731 00:35:14,640 --> 00:35:17,000 Speaker 8: issue of whose laws do we follow, but it's going 732 00:35:17,080 --> 00:35:20,120 Speaker 8: to get more complicated as people are living more long term, 733 00:35:20,200 --> 00:35:22,719 Speaker 8: especially if we move farther and farther away where it 734 00:35:22,760 --> 00:35:24,799 Speaker 8: takes a long time to call back to Earth. You know, 735 00:35:25,120 --> 00:35:27,719 Speaker 8: unless you've brought a judge and lawyers with you, you're going 736 00:35:27,800 --> 00:35:30,840 Speaker 8: to have to settle your disagreements where you are. And 737 00:35:30,880 --> 00:35:33,000 Speaker 8: then there's the bigger question, Okay, what happens if you 738 00:35:33,040 --> 00:35:35,839 Speaker 8: decide that someone is guilty of whatever you've decided is 739 00:35:35,880 --> 00:35:38,120 Speaker 8: a crime in your space settlement, what do you do 740 00:35:38,160 --> 00:35:42,120 Speaker 8: with them? And when I ask people this, their knee-jerk 741 00:35:42,200 --> 00:35:45,280 Speaker 8: reaction is to say throw them out the airlock, 742 00:35:45,960 --> 00:35:50,480 Speaker 8: because that is what all the movies do. But if 743 00:35:50,520 --> 00:35:52,880 Speaker 8: you push a little farther, you can usually get people 744 00:35:52,880 --> 00:35:56,960 Speaker 8: to agree that maybe not every crime should carry capital punishment. 745 00:35:57,719 --> 00:36:00,279 Speaker 8: And then the next thing that they usually say is, well, 746 00:36:00,320 --> 00:36:03,920 Speaker 8: we'll put them in space prison. But that also is tricky. 747 00:36:04,120 --> 00:36:06,920 Speaker 8: We don't have space prisons yet. We would have to 748 00:36:06,920 --> 00:36:10,160 Speaker 8: build them.
That's going to take a lot of time, effort, money, 749 00:36:11,280 --> 00:36:15,200 Speaker 8: space, like literal room, especially if we're doing something like 750 00:36:15,320 --> 00:36:18,000 Speaker 8: digging underground tunnels to live on Mars or something. It's a 751 00:36:18,040 --> 00:36:21,480 Speaker 8: lot to dig another tunnel. You have to feed them 752 00:36:21,560 --> 00:36:25,000 Speaker 8: and give them air and radiation protection and water and 753 00:36:25,040 --> 00:36:28,840 Speaker 8: heat and power. Someone has to guard them. Maybe you've removed 754 00:36:28,880 --> 00:36:31,200 Speaker 8: that person from the labor pool and maybe you don't 755 00:36:31,239 --> 00:36:33,960 Speaker 8: have a lot of labor to go around. Is this 756 00:36:34,040 --> 00:36:36,840 Speaker 8: actually what we want to do? And usually at that 757 00:36:36,880 --> 00:36:39,360 Speaker 8: point in the conversation, they say, Okay, well, what's the alternative, 758 00:36:39,480 --> 00:36:42,880 Speaker 8: because this is another case where Americans in particular don't 759 00:36:42,880 --> 00:36:45,919 Speaker 8: really know of any other way to handle crime because 760 00:36:45,960 --> 00:36:49,279 Speaker 8: we're a very prison focused culture. But again, there's so 761 00:36:49,400 --> 00:36:53,440 Speaker 8: many examples on Earth of ways that especially small communities 762 00:36:54,400 --> 00:36:59,680 Speaker 8: address behavior that's not desired, let's call it that, in their communities, 763 00:36:59,800 --> 00:37:02,800 Speaker 8: and in a way that makes sure that the victim 764 00:37:02,800 --> 00:37:05,480 Speaker 8: of a crime feels safe and is made whole, the 765 00:37:05,600 --> 00:37:09,080 Speaker 8: person who committed the crime is unlikely to do it again, 766 00:37:09,480 --> 00:37:11,520 Speaker 8: and that the community as a whole feels safe. 767 00:37:11,760 --> 00:37:14,560 Speaker 8: And there's ways to do that without building a prison.
769 00:37:14,880 --> 00:37:16,680 Speaker 8: And so that's what I've been encouraging, so that we 770 00:37:16,760 --> 00:37:19,000 Speaker 8: try to think about that ahead of time, because no 771 00:37:19,040 --> 00:37:20,520 Speaker 8: matter what we come up with, we need to come 772 00:37:20,560 --> 00:37:21,960 Speaker 8: up with it ahead of time. We don't want to 773 00:37:21,960 --> 00:37:24,480 Speaker 8: wait till the last minute here and then have to 774 00:37:24,520 --> 00:37:27,840 Speaker 8: decide under duress, because I think that'll probably lead to 775 00:37:28,239 --> 00:37:30,040 Speaker 8: unpleasant outcomes for everyone involved. 776 00:37:30,320 --> 00:37:31,360 Speaker 2: Everyone out the airlock. 777 00:37:32,400 --> 00:37:35,240 Speaker 8: Yeah, I mean it's clean. I guess it always looks 778 00:37:35,239 --> 00:37:37,640 Speaker 8: really dramatic in the movies. I get the appeal, but 779 00:37:38,160 --> 00:37:40,799 Speaker 8: that's a quick way to kill everyone off, and that's 780 00:37:40,840 --> 00:37:42,520 Speaker 8: not usually the goal of a space settlement. 781 00:37:42,680 --> 00:37:45,960 Speaker 3: And again, if business is involved, are we giving businesses 782 00:37:46,040 --> 00:37:50,759 Speaker 3: the right to set laws in territories they control? You know, 783 00:37:50,920 --> 00:37:52,920 Speaker 3: that might make sense, but I feel a little weird 784 00:37:52,920 --> 00:37:56,520 Speaker 3: about giving Elon Musk the right to execute whoever he thinks 785 00:37:57,000 --> 00:37:59,040 Speaker 3: is not, you know, following his laws. 786 00:37:59,520 --> 00:38:02,239 Speaker 8: Yeah, everything I was talking about were examples of two 787 00:38:02,280 --> 00:38:05,160 Speaker 8: people in the community sort of hurting each other. There's 788 00:38:05,200 --> 00:38:07,560 Speaker 8: also the protections you might need to protect an individual 789 00:38:07,719 --> 00:38:11,480 Speaker 8: from the community.
And that might mean there's an oppressed 790 00:38:11,480 --> 00:38:14,360 Speaker 8: group and the government of the space settlement is oppressing 791 00:38:14,400 --> 00:38:16,600 Speaker 8: a minority group, for example, how do we protect those people? 792 00:38:16,640 --> 00:38:19,719 Speaker 8: It might mean it's a labor workforce that's getting exploited 793 00:38:19,719 --> 00:38:22,080 Speaker 8: by the company that runs the space settlement. What can 794 00:38:22,120 --> 00:38:24,839 Speaker 8: they do about it? You can't really quit and go 795 00:38:24,960 --> 00:38:28,280 Speaker 8: home when you're in a space settlement. It's really probably 796 00:38:28,280 --> 00:38:29,840 Speaker 8: going to be a lot harder to go on strike 797 00:38:30,160 --> 00:38:33,400 Speaker 8: or protest if you're in a place where the government 798 00:38:33,520 --> 00:38:35,920 Speaker 8: or the employer can just turn down the oxygen and 799 00:38:36,000 --> 00:38:38,120 Speaker 8: make you all go to sleep. And so there are those 800 00:38:38,200 --> 00:38:40,640 Speaker 8: levels of legal protection needed as well. 801 00:38:41,280 --> 00:38:43,320 Speaker 1: So I was thinking about writing about punishment for a 802 00:38:43,320 --> 00:38:45,319 Speaker 1: city on Mars, and we didn't end up deciding to 803 00:38:45,320 --> 00:38:47,960 Speaker 1: tackle it, but we did read about how you get 804 00:38:48,000 --> 00:38:51,640 Speaker 1: people in line in communal settings, and like the Hutterites, 805 00:38:51,680 --> 00:38:55,440 Speaker 1: that's a religious group. They're very insular.
They do trade 805 00:38:55,440 --> 00:38:58,360 Speaker 1: with the outside world, but they have a very elaborate, 806 00:38:58,400 --> 00:39:01,960 Speaker 1: like, shunning procedure for people who don't follow the rules, where 807 00:39:02,000 --> 00:39:03,960 Speaker 1: if you're an adult who's not following the rules, sometimes 808 00:39:04,000 --> 00:39:06,319 Speaker 1: you have to like sleep in the children's house as 809 00:39:06,360 --> 00:39:08,960 Speaker 1: a way of just sort of like showing your temporary demotion. 810 00:39:09,520 --> 00:39:11,680 Speaker 1: And in a community where like if you get kicked out, 811 00:39:12,160 --> 00:39:14,360 Speaker 1: you no longer get to see maybe your parents, or 812 00:39:14,400 --> 00:39:17,879 Speaker 1: your kids, or your wife. Shunning is a very good 813 00:39:17,960 --> 00:39:21,319 Speaker 1: method of punishment that doesn't require a prison, and it 814 00:39:21,440 --> 00:39:25,080 Speaker 1: usually is coupled with, almost immediately when they repent and 815 00:39:25,120 --> 00:39:27,080 Speaker 1: stop doing it, you open your arms and welcome them 816 00:39:27,080 --> 00:39:29,640 Speaker 1: back in, because otherwise, how psychologically awful would it be 817 00:39:30,000 --> 00:39:32,240 Speaker 1: to be stuck in a community where everybody is shunning 818 00:39:32,280 --> 00:39:34,480 Speaker 1: you and treating you awfully. Do you have like other 819 00:39:34,520 --> 00:39:36,400 Speaker 1: examples of how it works in other communities? 820 00:39:37,360 --> 00:39:39,120 Speaker 8: Yeah, I think you make a good point there that 821 00:39:39,320 --> 00:39:41,720 Speaker 8: leaving open the door for redemption and rejoining the community
823 00:39:41,800 --> 00:39:44,000 Speaker 8: is going to be super vital, because yeah, otherwise the 824 00:39:44,040 --> 00:39:46,399 Speaker 8: person just can't fit back in and you've lost them, 825 00:39:46,680 --> 00:39:48,799 Speaker 8: whether you're throwing them out an airlock or just 826 00:39:48,840 --> 00:39:50,880 Speaker 8: locking them up forever. One of the examples I was 827 00:39:50,920 --> 00:39:54,680 Speaker 8: given during my podcast was about a community in Ethiopia, 828 00:39:54,840 --> 00:39:57,080 Speaker 8: and I'm not Ethiopian, so it'll be difficult for me 829 00:39:57,120 --> 00:39:59,320 Speaker 8: to describe it completely accurately, but she told a beautiful 830 00:39:59,320 --> 00:40:01,600 Speaker 8: story about how when someone in the community harms someone else, 831 00:40:02,040 --> 00:40:05,440 Speaker 8: everyone gathers together under a great tree, and individually, 832 00:40:05,480 --> 00:40:07,239 Speaker 8: each member of the community goes to this person and 833 00:40:07,239 --> 00:40:09,359 Speaker 8: says, do you remember when I broke my leg 834 00:40:09,360 --> 00:40:10,560 Speaker 8: and you helped me bring the crops in? Do you 835 00:40:10,560 --> 00:40:12,560 Speaker 8: remember when I was sick and you brought me medicine?
836 00:40:12,640 --> 00:40:14,799 Speaker 8: And reminds the person of how they are a member 837 00:40:14,840 --> 00:40:17,359 Speaker 8: of the community, a valued member of the community, and 838 00:40:17,440 --> 00:40:20,240 Speaker 8: that they are still accepted, but that they've done harm 839 00:40:20,280 --> 00:40:22,640 Speaker 8: and here's how you can address this harm and be 840 00:40:22,680 --> 00:40:24,840 Speaker 8: welcome back in, which is the sort of thing, again, 841 00:40:24,920 --> 00:40:28,279 Speaker 8: as an American, as a Westerner, just sounds too good 842 00:40:28,320 --> 00:40:32,120 Speaker 8: to be true, like it can't possibly work. But there 843 00:40:32,280 --> 00:40:34,359 Speaker 8: is a lot of work being done in the US 844 00:40:34,360 --> 00:40:37,600 Speaker 8: in particular on what's called restorative justice, which is this 845 00:40:37,719 --> 00:40:40,879 Speaker 8: idea that you can figure out ways to make sure 846 00:40:40,880 --> 00:40:42,600 Speaker 8: the victim feels safe and is made whole, and make 847 00:40:42,640 --> 00:40:46,239 Speaker 8: sure that the perpetrator acknowledges and addresses what they've done 848 00:40:46,440 --> 00:40:49,560 Speaker 8: without prison. It's just so hard to describe what that 849 00:40:49,600 --> 00:40:53,720 Speaker 8: looks like in very simple terms because it usually takes effort, 850 00:40:53,719 --> 00:40:57,719 Speaker 8: it takes conversation. The community itself needs to help facilitate 851 00:40:57,719 --> 00:41:00,640 Speaker 8: that conversation, and it looks different for every crime, in 852 00:41:00,719 --> 00:41:06,399 Speaker 8: every interaction, and so it's much more complicated than just saying, well, 853 00:41:06,440 --> 00:41:09,080 Speaker 8: lock them up, but it tends to have better outcomes, 854 00:41:09,120 --> 00:41:10,799 Speaker 8: and there's a lot of research being done on that 855 00:41:11,200 --> 00:41:12,680 Speaker 8: in the US right now.
I think some of this 856 00:41:12,719 --> 00:41:15,239 Speaker 8: will get easier in a small community. I think a 857 00:41:15,280 --> 00:41:18,719 Speaker 8: lot of it is related to culture, and it's hard 858 00:41:18,800 --> 00:41:22,200 Speaker 8: to predict how a culture of a space settlement is 859 00:41:22,239 --> 00:41:24,720 Speaker 8: going to evolve away from the culture we have today. 860 00:41:25,120 --> 00:41:28,239 Speaker 8: Being isolated and being so interdependent on other humans, I 861 00:41:28,239 --> 00:41:31,960 Speaker 8: think will make things like shunning be a bigger threat 862 00:41:32,160 --> 00:41:35,680 Speaker 8: and help people not engage in harmful behavior to others. 863 00:41:35,719 --> 00:41:37,680 Speaker 8: But I don't know. These sci-fi authors have imagined 864 00:41:37,719 --> 00:41:39,839 Speaker 8: it going the other way and had it make us 865 00:41:39,840 --> 00:41:42,320 Speaker 8: worse people, or make it more likely that some tyrant 866 00:41:42,400 --> 00:41:44,719 Speaker 8: will take over and keep everyone in line. It's not clear 867 00:41:44,760 --> 00:41:45,520 Speaker 8: yet how it's going to go. 868 00:41:45,800 --> 00:41:48,280 Speaker 3: Well, what's the sort of worst crime that's been committed 869 00:41:48,320 --> 00:41:49,160 Speaker 3: in space so far? 870 00:41:49,800 --> 00:41:53,680 Speaker 8: I don't think, off the top of my head, Kelly 871 00:41:53,719 --> 00:41:56,239 Speaker 8: can help me here, that there has been anything officially 872 00:41:56,280 --> 00:42:00,279 Speaker 8: considered a legal crime committed in space so far. People 873 00:42:00,360 --> 00:42:03,920 Speaker 8: often point to an event with the Skylab crew 874 00:42:04,800 --> 00:42:07,400 Speaker 8: that was at the time called by the newspapers a 875 00:42:07,440 --> 00:42:09,880 Speaker 8: strike, a protest.
It was said that the crew was 876 00:42:09,920 --> 00:42:12,279 Speaker 8: mad at NASA and turned off the radio and said 877 00:42:12,280 --> 00:42:13,759 Speaker 8: they weren't going to work for the day. That's not 878 00:42:13,920 --> 00:42:16,640 Speaker 8: actually how it went down. It was much less dramatic 879 00:42:16,680 --> 00:42:19,759 Speaker 8: than that. It wasn't a mutiny, for example, And I 880 00:42:19,800 --> 00:42:25,000 Speaker 8: think so far it's been mostly professional disagreements at that level. 881 00:42:25,400 --> 00:42:29,640 Speaker 1: I think there was one astronaut whose partner, I think 882 00:42:29,680 --> 00:42:32,000 Speaker 1: they had like broken up, and she went into her 883 00:42:32,040 --> 00:42:32,880 Speaker 1: partner's banking accounts. 884 00:42:33,440 --> 00:42:35,600 Speaker 8: Oh, that's right, from space, from. 885 00:42:35,520 --> 00:42:37,640 Speaker 1: Space, and so it was like a small law that 886 00:42:37,760 --> 00:42:39,600 Speaker 1: was broken in space. Well, it feels small to me 887 00:42:39,680 --> 00:42:41,960 Speaker 1: relative to murder, but a smaller law that was broken 888 00:42:42,000 --> 00:42:42,360 Speaker 1: in space. 889 00:42:42,400 --> 00:42:43,920 Speaker 2: And then there was an. 890 00:42:43,800 --> 00:42:47,759 Speaker 1: International incident where one of the Soyuzes, so these 891 00:42:47,760 --> 00:42:50,360 Speaker 1: are like the vehicles that bring people from Russia or 892 00:42:50,440 --> 00:42:53,480 Speaker 1: Kazakhstan to the International Space Station, was found to have 893 00:42:53,560 --> 00:42:55,160 Speaker 1: a hole drilled in the side. 894 00:42:54,920 --> 00:42:56,360 Speaker 8: Of it, a tiny damage. 895 00:42:56,440 --> 00:42:57,200 Speaker 2: Yeah, yeah, And.
896 00:42:57,160 --> 00:42:59,440 Speaker 1: It wasn't clear if that was because a technician who 897 00:42:59,480 --> 00:43:02,440 Speaker 1: was building this drilled the hole and then was like ah, crud, 898 00:43:02,520 --> 00:43:04,839 Speaker 1: and then like patched it poorly and the patch fell 899 00:43:04,880 --> 00:43:07,480 Speaker 1: off in space. But Russia was saying that it was 900 00:43:07,480 --> 00:43:09,640 Speaker 1: one of the astronauts on the US side who was 901 00:43:09,640 --> 00:43:11,759 Speaker 1: having a mental health crisis and wanted to go home, 902 00:43:12,080 --> 00:43:14,359 Speaker 1: so they drilled a hole to try to force an 903 00:43:14,360 --> 00:43:15,320 Speaker 1: emergency return. 904 00:43:15,719 --> 00:43:17,759 Speaker 8: Yeah. I don't think that was proven though, that's right. 905 00:43:17,840 --> 00:43:20,440 Speaker 1: No, no, no, it wasn't and all of the US astronauts 906 00:43:20,440 --> 00:43:21,640 Speaker 1: are like, no, absolutely, that's not what. 907 00:43:21,560 --> 00:43:22,160 Speaker 2: It was happening. 908 00:43:22,520 --> 00:43:24,759 Speaker 1: But I think what's interesting about that is, you know, 909 00:43:24,800 --> 00:43:30,400 Speaker 1: the International Space Station has multiple nations represented, and depending 910 00:43:30,440 --> 00:43:33,440 Speaker 1: on which module you're in, there's different jurisdictions and depending 911 00:43:33,440 --> 00:43:35,520 Speaker 1: on what country the astronauts came from, and in the end, 912 00:43:35,560 --> 00:43:37,279 Speaker 1: they kind of weren't really able to figure out what 913 00:43:37,320 --> 00:43:39,520 Speaker 1: to do about it. And so I think if anything 914 00:43:39,920 --> 00:43:42,160 Speaker 1: more serious happened, I don't know what they would do 915 00:43:42,280 --> 00:43:44,280 Speaker 1: or how that would be dealt with.
It gets complicated, 916 00:43:44,320 --> 00:43:46,000 Speaker 1: I think when you've got multiple nations together. 917 00:43:46,360 --> 00:43:48,960 Speaker 8: Yeah. I also think the space agencies are incentivized not 918 00:43:49,080 --> 00:43:51,520 Speaker 8: to talk about stories like this. I mean, they're going 919 00:43:51,600 --> 00:43:53,360 Speaker 8: to report it, especially you know in the US, but 920 00:43:53,520 --> 00:43:54,799 Speaker 8: the news is going to get their hands on it 921 00:43:54,800 --> 00:43:57,359 Speaker 8: and probably blow it out of proportion. But it's not 922 00:43:57,400 --> 00:43:59,560 Speaker 8: like NASA wants to get out the idea that this 923 00:43:59,760 --> 00:44:01,920 Speaker 8: kind of thing is happening in space. I'm not saying they're covering 924 00:44:01,960 --> 00:44:04,640 Speaker 8: up crimes. There's a lot of things about astronaut privacy 925 00:44:04,640 --> 00:44:08,160 Speaker 8: that NASA really keeps more private. There have been some 926 00:44:08,640 --> 00:44:13,719 Speaker 8: cases in analog situations. So they do these experiments down 927 00:44:13,719 --> 00:44:16,840 Speaker 8: on Earth where they seal people up in a small 928 00:44:16,880 --> 00:44:19,719 Speaker 8: habitat for a certain amount of time to test crew 929 00:44:19,840 --> 00:44:22,120 Speaker 8: dynamics and also things like can they recycle their food 930 00:44:22,120 --> 00:44:24,960 Speaker 8: and things like that, or they go out to an isolated 931 00:44:24,960 --> 00:44:27,200 Speaker 8: place like the Antarctic or the Arctic and live in 932 00:44:27,200 --> 00:44:29,440 Speaker 8: an isolated environment for a while to pretend they're on Mars.
933 00:44:29,760 --> 00:44:32,400 Speaker 8: There have been cases there where there's been interpersonal conflict 934 00:44:32,560 --> 00:44:35,560 Speaker 8: up to the level of sexual harassment and conflicts like that, 935 00:44:35,680 --> 00:44:37,960 Speaker 8: and so I don't think there's any reason to assume 936 00:44:38,000 --> 00:44:40,280 Speaker 8: that kind of behavior won't happen in space as well. 937 00:44:40,680 --> 00:44:42,279 Speaker 1: I do love your point that we need to figure 938 00:44:42,280 --> 00:44:45,000 Speaker 1: out what our plan is ahead of time. I was 939 00:44:45,040 --> 00:44:49,160 Speaker 1: reading about naval punishment and like, they didn't have space 940 00:44:49,200 --> 00:44:51,959 Speaker 1: on the ship for a prison, and they didn't want 941 00:44:52,000 --> 00:44:54,480 Speaker 1: to have a member of the crew not working and 942 00:44:54,520 --> 00:44:56,640 Speaker 1: then maybe another member of the crew watching them not 943 00:44:56,760 --> 00:44:59,200 Speaker 1: work and making sure they don't. And so the flogging 944 00:44:59,360 --> 00:45:01,680 Speaker 1: was their answer. Like, if you hit someone, you beat 945 00:45:01,719 --> 00:45:03,759 Speaker 1: them quickly, then they can return right back to work, 946 00:45:04,040 --> 00:45:06,759 Speaker 1: and it's a severe, immediate punishment. And so I think 947 00:45:06,800 --> 00:45:08,799 Speaker 1: it's important for us to decide which one of these 948 00:45:08,800 --> 00:45:11,560 Speaker 1: things do we want to be trying when we move 949 00:45:11,600 --> 00:45:13,759 Speaker 1: out into space, and which punishment regime do we want 950 00:45:13,760 --> 00:45:14,359 Speaker 1: to take with us. 951 00:45:14,440 --> 00:45:16,239 Speaker 3: I just want to say that flogging works really well 952 00:45:16,239 --> 00:45:18,160 Speaker 3: in my research group as the motivation for...
953 00:45:20,120 --> 00:45:21,960 Speaker 1: We have a book on our shelf called In Defense 954 00:45:22,000 --> 00:45:24,680 Speaker 1: of Flogging. It's an economist's take on why it's an 955 00:45:24,680 --> 00:45:27,880 Speaker 1: efficient punishment system where maybe you'd rather get flogged than 956 00:45:27,960 --> 00:45:31,000 Speaker 1: spend five years in prison, or be thrown 957 00:45:31,000 --> 00:45:34,080 Speaker 1: out the airlock. So anyway, lots of options for punishment. 958 00:45:34,400 --> 00:45:36,200 Speaker 1: Let's take a break, and when we come back, we're 959 00:45:36,200 --> 00:45:37,560 Speaker 1: going to talk about babies in space. 960 00:45:53,680 --> 00:45:58,560 Speaker 2: All right, space babies, Erica, ethical or not? 961 00:45:59,680 --> 00:46:04,960 Speaker 8: I find space babies to be the cruelest creatures. The 962 00:46:05,040 --> 00:46:09,000 Speaker 8: idea of having babies in space is obviously a fascinating one, 963 00:46:09,000 --> 00:46:12,280 Speaker 8: and it's also crucial to the whole idea of having 964 00:46:12,280 --> 00:46:14,680 Speaker 8: a permanent space settlement. You can't have a space settlement 965 00:46:15,040 --> 00:46:18,320 Speaker 8: stand on its own if you have to keep shipping 966 00:46:18,440 --> 00:46:24,480 Speaker 8: humans from Earth. So the question of whether we can 967 00:46:24,520 --> 00:46:27,880 Speaker 8: produce more humans in space is an important one, 968 00:46:27,920 --> 00:46:31,160 Speaker 8: both scientifically and ethically. So the scientific part is the 969 00:46:31,160 --> 00:46:34,440 Speaker 8: tough one. We don't actually know if humans can have 970 00:46:34,600 --> 00:46:37,000 Speaker 8: babies in space. We don't know that about any stage 971 00:46:37,000 --> 00:46:40,759 Speaker 8: of the process, about conceiving, about growing a fetus in 972 00:46:40,920 --> 00:46:44,480 Speaker 8: utero in a healthy way.
We don't know how hard 973 00:46:44,640 --> 00:46:47,440 Speaker 8: or easy delivering a baby would be in space. And 974 00:46:47,480 --> 00:46:49,920 Speaker 8: we don't know if children can grow in a healthy 975 00:46:49,960 --> 00:46:53,680 Speaker 8: way in the absence of, for example, Earth's gravitational field, 976 00:46:53,880 --> 00:46:56,240 Speaker 8: or if all this radiation exposure you get in space 977 00:46:56,320 --> 00:46:58,400 Speaker 8: is going to cause problems all along the way. So 978 00:46:58,560 --> 00:47:02,200 Speaker 8: tons of interesting scientific questions there, but we also don't 979 00:47:02,200 --> 00:47:06,760 Speaker 8: know how to even ethically study those questions, because most 980 00:47:07,000 --> 00:47:10,640 Speaker 8: bioethicists and medical researchers will tell you it's very ethically 981 00:47:10,719 --> 00:47:14,600 Speaker 8: murky to perform science experiments on pregnant people and fetuses, 982 00:47:14,719 --> 00:47:19,040 Speaker 8: especially the fetuses that can't consent. So it's tricky, and 983 00:47:19,320 --> 00:47:22,640 Speaker 8: I think that there's a good chance that the ethicists 984 00:47:22,680 --> 00:47:26,080 Speaker 8: will be arguing about this while some space tourists are 985 00:47:26,200 --> 00:47:28,960 Speaker 8: just making it happen someday, and we're going to end 986 00:47:29,040 --> 00:47:33,840 Speaker 8: up with this experiment happening unmonitored and unprotected. So it 987 00:47:33,880 --> 00:47:35,799 Speaker 8: could be that we can't do it. It could be 988 00:47:35,800 --> 00:47:37,719 Speaker 8: that humans can't reproduce in space. That's going to put 989 00:47:37,719 --> 00:47:40,520 Speaker 8: a big damper on people's plans for conquering the universe, 990 00:47:40,719 --> 00:47:43,920 Speaker 8: and we will have to reanalyze all of our plans 991 00:47:43,920 --> 00:47:46,440 Speaker 8: for space settlement.
It's also possible there's some middle ground 992 00:47:46,440 --> 00:47:48,160 Speaker 8: here where we'll be able to do it, but maybe 993 00:47:48,160 --> 00:47:50,640 Speaker 8: it'll be a lot more difficult, maybe it'd be less likely. 994 00:47:50,719 --> 00:47:54,480 Speaker 8: Maybe we'll need some sort of extra technology help, like 995 00:47:54,640 --> 00:47:58,879 Speaker 8: artificial gravity or radiation shielding or artificial wombs, which could 996 00:47:58,960 --> 00:48:02,160 Speaker 8: turn into a thing where wealthy space settlers can afford 997 00:48:02,200 --> 00:48:05,960 Speaker 8: to have children, or you have to get a special permit 998 00:48:06,040 --> 00:48:09,000 Speaker 8: from the space government to use the artificial womb and 999 00:48:09,040 --> 00:48:11,759 Speaker 8: have babies, which could cause a lot of issues. And 1000 00:48:11,840 --> 00:48:14,759 Speaker 8: maybe it'll be super easy and we'll be superfertile in 1001 00:48:14,840 --> 00:48:17,640 Speaker 8: space and have babies everywhere, which will lead to other problems, 1002 00:48:17,920 --> 00:48:21,319 Speaker 8: like maybe we have an overpopulated space settlement and we 1003 00:48:21,400 --> 00:48:23,160 Speaker 8: have to figure out what to do about that, and 1004 00:48:23,239 --> 00:48:26,200 Speaker 8: boy do we have examples of that on Earth, and 1005 00:48:26,239 --> 00:48:29,960 Speaker 8: how it has led to governments trying to control fertility 1006 00:48:30,600 --> 00:48:34,640 Speaker 8: and punishing people for having too many children or forcing 1007 00:48:34,680 --> 00:48:37,360 Speaker 8: them to be sterilized so that they can't have more children. 1008 00:48:37,560 --> 00:48:42,120 Speaker 8: Often throughout history this has in particular led to people 1009 00:48:42,160 --> 00:48:46,520 Speaker 8: deciding to limit other groups' fertilities, but only for certain people.
1010 00:48:46,560 --> 00:48:49,640 Speaker 8: We want to limit population growth, but only for these 1011 00:48:49,680 --> 00:48:52,279 Speaker 8: people we don't think should be reproducing, and that's just 1012 00:48:52,280 --> 00:48:55,760 Speaker 8: eugenics and leads to all sorts of other really terrible 1013 00:48:55,760 --> 00:48:58,719 Speaker 8: outcomes that could happen in space. And in space we 1014 00:48:58,760 --> 00:49:01,560 Speaker 8: could go the other way too and have underpopulation 1015 00:49:02,040 --> 00:49:04,880 Speaker 8: and have questions about whether we can force people to 1016 00:49:04,960 --> 00:49:08,480 Speaker 8: have children, which also has a few parallels throughout history 1017 00:49:08,520 --> 00:49:13,000 Speaker 8: as well, and leads to things like outlawing abortion, birth control, 1018 00:49:13,520 --> 00:49:15,520 Speaker 8: knowledge about contraception, and things like that. 1019 00:49:15,760 --> 00:49:17,520 Speaker 3: It feels to me like space is sort of putting 1020 00:49:17,520 --> 00:49:20,160 Speaker 3: a spotlight on what's already kind of a tricky question, 1021 00:49:20,640 --> 00:49:23,080 Speaker 3: you know, about whether it's ethical to have kids. We 1022 00:49:23,160 --> 00:49:27,319 Speaker 3: try to avoid ever restricting people from having kids, but 1023 00:49:27,400 --> 00:49:30,920 Speaker 3: you could also always just ask questions like, some people, 1024 00:49:30,960 --> 00:49:33,680 Speaker 3: in their living conditions, should they be having kids? You know, 1025 00:49:33,760 --> 00:49:36,759 Speaker 3: if I want to like set up camp near a volcano, 1026 00:49:37,120 --> 00:49:39,440 Speaker 3: is it ethical for me to have kids, knowing that 1027 00:49:39,480 --> 00:49:42,279 Speaker 3: the kids could be like bathed in lava one day 1028 00:49:42,320 --> 00:49:45,400 Speaker 3: as they sleep? Nobody tells me I can't have kids 1029 00:49:45,480 --> 00:49:47,680 Speaker 3: or I shouldn't have kids.
Lots of people out there 1030 00:49:47,680 --> 00:49:50,640 Speaker 3: are terrible parents, but nobody like gave them an exam 1031 00:49:50,680 --> 00:49:53,160 Speaker 3: and said, hey, are you going to be a reasonable parent? 1032 00:49:53,960 --> 00:49:56,560 Speaker 3: Why shouldn't we just apply the same like laissez-faire 1033 00:49:56,719 --> 00:49:59,040 Speaker 3: thing to space and say, hey, you're going to 1034 00:49:59,120 --> 00:50:01,760 Speaker 3: have kids in space. Maybe they won't just take correctly, 1035 00:50:01,800 --> 00:50:04,600 Speaker 3: maybe they'll get cancer, maybe they'll end up nine feet tall. 1036 00:50:04,719 --> 00:50:05,399 Speaker 3: It's on you. 1037 00:50:06,239 --> 00:50:09,960 Speaker 8: Yeah, if you think about what we owe to the children, 1038 00:50:10,120 --> 00:50:12,960 Speaker 8: like what rights do they have here? That does raise 1039 00:50:13,040 --> 00:50:15,560 Speaker 8: interesting questions. Maybe we're making them more at risk for 1040 00:50:16,200 --> 00:50:19,480 Speaker 8: radiation-induced illness, like you mentioned with your lava example. 1041 00:50:19,880 --> 00:50:22,400 Speaker 8: Maybe they can grow and be healthy and have healthy 1042 00:50:22,440 --> 00:50:24,840 Speaker 8: lives on Mars, but because they're growing in a lower 1043 00:50:24,880 --> 00:50:28,080 Speaker 8: gravity environment, they can't ever return to Earth because they'll 1044 00:50:28,120 --> 00:50:31,799 Speaker 8: collapse and die. Is there something unethical about that? Are 1045 00:50:31,840 --> 00:50:33,920 Speaker 8: we somehow taking away their birthright to go back to 1046 00:50:33,960 --> 00:50:36,000 Speaker 8: the planet that they evolved on? Or will they not 1047 00:50:36,160 --> 00:50:38,279 Speaker 8: care because they're not from Earth and they're too cool 1048 00:50:38,320 --> 00:50:39,879 Speaker 8: for Earth, and it's not a big deal?
1049 00:50:40,000 --> 00:50:42,240 Speaker 3: I mean, Kelly's decided to have her kids in Virginia. 1050 00:50:42,320 --> 00:50:44,800 Speaker 3: Now they have to tell people they're from Virginia and 1051 00:50:44,920 --> 00:50:46,719 Speaker 3: overcome that their whole life, when they could have 1052 00:50:46,719 --> 00:50:49,040 Speaker 3: grown up in California. How do you deal with that, Kelly? 1053 00:50:49,840 --> 00:50:52,799 Speaker 1: It's beautiful here and I think they're very happy, and 1054 00:50:52,880 --> 00:50:54,640 Speaker 1: I feel bad for your California kids. 1055 00:50:54,719 --> 00:50:59,560 Speaker 3: Well, clearly they're brainwashed. Propaganda is everywhere. 1056 00:50:59,520 --> 00:51:01,799 Speaker 8: But there are examples of this on Earth where people have to 1057 00:51:02,200 --> 00:51:05,000 Speaker 8: make the really tough decision about whether to become refugees 1058 00:51:05,320 --> 00:51:07,240 Speaker 8: in certain parts of the world. Are they going to decide 1059 00:51:07,239 --> 00:51:09,520 Speaker 8: to take their kids on this really dangerous journey, like 1060 00:51:09,560 --> 00:51:12,160 Speaker 8: a rocket trip to a new place they've never seen before, 1061 00:51:12,160 --> 00:51:14,320 Speaker 8: where potentially they're never going to be able to return 1062 00:51:14,680 --> 00:51:17,799 Speaker 8: to their home country? That's a similar decision, and on 1063 00:51:17,840 --> 00:51:20,360 Speaker 8: Earth we let parents make that decision. They have to 1064 00:51:20,400 --> 00:51:22,359 Speaker 8: make that decision for their kids if their kids can't 1065 00:51:22,360 --> 00:51:25,120 Speaker 8: make it, and generally what we hope is that the 1066 00:51:25,200 --> 00:51:27,439 Speaker 8: parents are working in the best interest of the kids.
1067 00:51:27,440 --> 00:51:29,400 Speaker 8: In those cases, it's usually because the kids are at 1068 00:51:29,440 --> 00:51:32,160 Speaker 8: risk if they stay where they are. It's sort of 1069 00:51:32,200 --> 00:51:35,040 Speaker 8: a different idea we're talking about with space, because at the moment, 1070 00:51:35,080 --> 00:51:37,920 Speaker 8: there's no asteroid headed for the Earth. So most people 1071 00:51:37,920 --> 00:51:39,600 Speaker 8: who want to go raise kids in space, it's just 1072 00:51:39,640 --> 00:51:42,680 Speaker 8: because they want to, not because they necessarily know that 1073 00:51:42,719 --> 00:51:45,600 Speaker 8: it's going to give their kids a better life. But 1074 00:51:46,000 --> 00:51:48,120 Speaker 8: you're right, we have a bit more of a hands-off 1075 00:51:48,120 --> 00:51:50,680 Speaker 8: attitude here on Earth, and we let people raise 1076 00:51:50,719 --> 00:51:53,560 Speaker 8: their kids how they want, but not completely. There's plenty 1077 00:51:53,560 --> 00:51:56,560 Speaker 8: of limitations that the state puts on what people do 1078 00:51:56,680 --> 00:52:00,399 Speaker 8: with their kids in different ways. In different places in the US, 1079 00:52:00,440 --> 00:52:04,080 Speaker 8: they're required to have an education, sometimes that's better monitored 1080 00:52:04,120 --> 00:52:06,960 Speaker 8: than in other places, and in other places around the world, 1081 00:52:07,000 --> 00:52:10,759 Speaker 8: there's more restrictions on whether you can or can't make 1082 00:52:10,840 --> 00:52:13,719 Speaker 8: decisions about your own ability to have kids.
And this 1083 00:52:13,760 --> 00:52:17,600 Speaker 8: all comes back to the balance between someone's own bodily 1084 00:52:17,640 --> 00:52:20,240 Speaker 8: autonomy, that they should be able to decide what happens 1085 00:52:20,239 --> 00:52:24,560 Speaker 8: with their body or not, versus what society needs or 1086 00:52:24,600 --> 00:52:27,399 Speaker 8: wants from them, which, when it comes to population-level 1087 00:52:27,440 --> 00:52:29,960 Speaker 8: stuff, is affected by whether people are having kids or not. 1088 00:52:30,920 --> 00:52:33,959 Speaker 1: So we've kind of scratched the surface of the many 1089 00:52:34,000 --> 00:52:36,560 Speaker 1: interesting ethical questions that you brought up in your book. 1090 00:52:36,960 --> 00:52:41,000 Speaker 1: Sometimes it can feel sort of overwhelming and helpless to 1091 00:52:41,040 --> 00:52:43,439 Speaker 1: think about all of this stuff. What are some options 1092 00:52:43,440 --> 00:52:44,880 Speaker 1: for things that we can do now to try to 1093 00:52:44,880 --> 00:52:47,000 Speaker 1: make sure that when we do settle space, if we get 1094 00:52:47,000 --> 00:52:48,759 Speaker 1: to that point, we do it in an 1095 00:52:48,760 --> 00:52:49,719 Speaker 1: ethical and just way? 1096 00:52:50,239 --> 00:52:53,280 Speaker 8: Well, just having these conversations at all is super important, 1097 00:52:53,400 --> 00:52:56,239 Speaker 8: because the alternative of just giving it a go and 1098 00:52:56,239 --> 00:52:58,680 Speaker 8: seeing what happens, I think, is not the best choice. 1099 00:52:58,680 --> 00:53:00,480 Speaker 8: So any level of conversation... 1100 00:53:00,360 --> 00:53:02,480 Speaker 3: More conversation! Launch, baby, launch. 1101 00:53:04,480 --> 00:53:06,920 Speaker 8: Any level of conversation is good. Encouraging people to 1102 00:53:06,920 --> 00:53:10,520 Speaker 8: have these conversations at the decision-maker level is very important.
1103 00:53:11,120 --> 00:53:14,000 Speaker 8: Reading Kelly's book. I think purchasing and reading Kelly's 1104 00:53:13,640 --> 00:53:16,240 Speaker 2: book is super important, along with yours. 1105 00:53:16,719 --> 00:53:21,480 Speaker 8: Yeah, absolutely, buy books about this topic. And as an individual, 1106 00:53:21,600 --> 00:53:24,520 Speaker 8: I think definitely trying to learn more about history, if 1107 00:53:24,520 --> 00:53:27,120 Speaker 8: you're interested in space, is very important. It's not immediately 1108 00:53:27,120 --> 00:53:29,160 Speaker 8: obvious that that should be the case, but as we've talked about, 1109 00:53:29,160 --> 00:53:32,960 Speaker 8: there's plenty of examples from history, and not just the 1110 00:53:33,000 --> 00:53:35,480 Speaker 8: history of your part of the world. I think North Americans, 1111 00:53:35,760 --> 00:53:37,480 Speaker 8: like myself, focus a little too much on the history 1112 00:53:37,520 --> 00:53:40,040 Speaker 8: of North American colonization, but again, there's all of human history 1113 00:53:40,080 --> 00:53:43,000 Speaker 8: to learn from. Let's do that. I think anyone who's 1114 00:53:43,320 --> 00:53:47,040 Speaker 8: working anywhere near space, whether you're in an astronomy department 1115 00:53:47,080 --> 00:53:49,840 Speaker 8: at a university, or you're working for a space startup, or 1116 00:53:49,840 --> 00:53:52,920 Speaker 8: you're working in government regulation, there's plenty of ways you 1117 00:53:52,960 --> 00:53:56,000 Speaker 8: can work towards making your company more ethical, making them 1118 00:53:56,000 --> 00:53:57,919 Speaker 8: make the right decisions. But I also think it's really 1119 00:53:57,960 --> 00:54:00,799 Speaker 8: important to look at your organization and see whether it 1120 00:54:00,800 --> 00:54:02,759 Speaker 8: embodies the kind of values you want to 1121 00:54:02,800 --> 00:54:05,680 Speaker 8: see in a space settlement.
Is labor being exploited? 1122 00:54:05,719 --> 00:54:09,040 Speaker 8: Are there people being marginalized, or is everyone being treated well? 1123 00:54:09,080 --> 00:54:13,160 Speaker 8: Because whatever values we have in our society today are 1124 00:54:13,360 --> 00:54:16,560 Speaker 8: what's going to get pushed forward into space in the future. 1125 00:54:16,600 --> 00:54:18,960 Speaker 8: So the better we can treat each other now on Earth, 1126 00:54:18,960 --> 00:54:21,880 Speaker 8: the easier it'll be to transfer those norms into space. 1127 00:54:21,960 --> 00:54:25,000 Speaker 8: And just as important as building a good legal 1128 00:54:25,040 --> 00:54:27,640 Speaker 8: system or safe technologies is making sure that the 1129 00:54:27,719 --> 00:54:29,560 Speaker 8: values we want to see in our descendants in the 1130 00:54:29,560 --> 00:54:33,000 Speaker 8: future are what we're actively working towards for ourselves now. 1131 00:54:33,400 --> 00:54:35,680 Speaker 8: We can't just wait and say, well, they'll figure it out, 1132 00:54:35,680 --> 00:54:37,200 Speaker 8: they'll be better in space. We don't need to do 1133 00:54:37,239 --> 00:54:39,120 Speaker 8: that hard work. We have to do it for them. 1134 00:54:39,200 --> 00:54:40,680 Speaker 3: Yeah, I agree. And I think a lot of people 1135 00:54:40,719 --> 00:54:43,120 Speaker 3: see space settlement in their minds as some sort of 1136 00:54:43,200 --> 00:54:45,880 Speaker 3: utopia where we go there and we don't bring any 1137 00:54:45,880 --> 00:54:48,520 Speaker 3: of the problems we have here, and that we somehow 1138 00:54:48,640 --> 00:54:51,359 Speaker 3: all magically get along. But we all know that we're 1139 00:54:51,400 --> 00:54:53,360 Speaker 3: going to make the same mistakes.
Somebody's going to invent 1140 00:54:53,400 --> 00:54:57,000 Speaker 3: white chocolate even if we don't bring it with us, right? Like, 1141 00:54:57,640 --> 00:55:00,239 Speaker 3: this is just who we are as humans. I 1142 00:55:00,280 --> 00:55:02,000 Speaker 3: agree with you. We need to acknowledge that and face 1143 00:55:02,040 --> 00:55:04,280 Speaker 3: those problems and figure some of this stuff out before it happens. 1144 00:55:04,480 --> 00:55:06,640 Speaker 8: That's right. Space is only a blank slate till we 1145 00:55:06,680 --> 00:55:07,160 Speaker 8: show up. 1146 00:55:07,719 --> 00:55:08,200 Speaker 2: That's right. 1147 00:55:08,680 --> 00:55:11,359 Speaker 1: So if folks want to stay in touch with Erica, 1148 00:55:11,480 --> 00:55:14,359 Speaker 1: they can go to erikanesvold dot com. That's E 1149 00:55:14,520 --> 00:55:17,080 Speaker 1: R I K A N E S V O L D 1150 00:55:17,520 --> 00:55:19,440 Speaker 1: dot com, and you can sign up for her newsletter 1151 00:55:19,520 --> 00:55:23,279 Speaker 1: to stay up to date on Erica updates. Absolutely well, 1152 00:55:23,320 --> 00:55:25,120 Speaker 1: thank you so much for being on the show, Erica. 1153 00:55:25,200 --> 00:55:26,759 Speaker 1: I was going to say it was a lot of fun, 1154 00:55:26,800 --> 00:55:29,239 Speaker 1: but maybe space ethics isn't always supposed to be fun. 1155 00:55:29,280 --> 00:55:31,600 Speaker 1: But it was certainly very informative and I enjoyed talking 1156 00:55:31,640 --> 00:55:31,879 Speaker 1: to you. 1157 00:55:31,960 --> 00:55:33,760 Speaker 8: I had a great time talking with you too. 1158 00:55:33,719 --> 00:55:36,360 Speaker 3: And I think the takeaway message is not that we shouldn't 1159 00:55:36,400 --> 00:55:38,520 Speaker 3: settle space or that we can't settle space, but that 1160 00:55:38,600 --> 00:55:41,120 Speaker 3: we have to do it thoughtfully and carefully.
We shouldn't 1161 00:55:41,160 --> 00:55:44,319 Speaker 3: just rush into it and do it because it seems fun, right? 1162 00:55:44,360 --> 00:55:46,960 Speaker 3: So we don't have to feel like we're being pessimists 1163 00:55:47,000 --> 00:55:48,240 Speaker 3: about it. We're just being thoughtful. 1164 00:55:48,440 --> 00:55:50,800 Speaker 8: That's right. And we'll leave the white chocolate behind. 1165 00:55:54,120 --> 00:55:54,640 Speaker 3: Thank you. 1166 00:55:55,960 --> 00:55:57,279 Speaker 2: That is the right note to end on. 1167 00:56:03,880 --> 00:56:07,440 Speaker 1: Daniel and Kelly's Extraordinary Universe is produced by iHeartRadio. 1168 00:56:07,640 --> 00:56:10,200 Speaker 2: We would love to hear from you. We really would. 1169 00:56:10,360 --> 00:56:13,120 Speaker 3: We want to know what questions you have about this 1170 00:56:13,320 --> 00:56:15,000 Speaker 3: extraordinary universe. 1171 00:56:15,080 --> 00:56:18,040 Speaker 1: We want to know your thoughts on recent shows, suggestions 1172 00:56:18,080 --> 00:56:19,040 Speaker 1: for future shows. 1173 00:56:19,160 --> 00:56:21,520 Speaker 2: If you contact us, we will get back to you. 1174 00:56:21,719 --> 00:56:25,239 Speaker 3: We really mean it. We answer every message. Email us 1175 00:56:25,280 --> 00:56:28,040 Speaker 3: at questions at danielandkelly dot 1176 00:56:27,920 --> 00:56:29,560 Speaker 2: org, or you can find us on social media. 1177 00:56:29,680 --> 00:56:33,520 Speaker 1: We have accounts on X, Instagram, and Blue Sky, and on 1178 00:56:33,560 --> 00:56:34,480 Speaker 1: all of those platforms 1179 00:56:34,560 --> 00:56:37,440 Speaker 2: you can find us at D and K Universe. 1180 00:56:37,600 --> 00:56:39,160 Speaker 3: Don't be shy, write to us.